├── .env.example
├── .github
│   ├── FUNDING.yml
│   └── workflows
│       └── release-please.yml
├── .gitignore
├── CHANGELOG.md
├── README.md
├── demo.gif
├── index.js
├── license
├── logo.png
├── package-lock.json
└── package.json

/.env.example:
--------------------------------------------------------------------------------
OPENAI_API_KEY=
GOOGLE_SEARCH_API_KEY=
GOOGLE_SEARCH_ID=
--------------------------------------------------------------------------------
/.github/FUNDING.yml:
--------------------------------------------------------------------------------
github: tobiasbueschel
--------------------------------------------------------------------------------
/.github/workflows/release-please.yml:
--------------------------------------------------------------------------------
on:
  push:
    branches:
      - main
name: release-please
jobs:
  release-please:
    runs-on: ubuntu-latest
    permissions:
      contents: write
      pull-requests: write
    steps:
      - uses: google-github-actions/release-please-action@v3
        id: release
        with:
          release-type: node
          package-name: search-gpt
          pull-request-title-pattern: "chore: release ${version}"
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
node_modules/
.env
.DS_Store
--------------------------------------------------------------------------------
/CHANGELOG.md:
--------------------------------------------------------------------------------
# Changelog

## [1.2.0](https://github.com/tobiasbueschel/search-gpt/compare/v1.1.2...v1.2.0) (2023-04-11)

### Features

* integrate gpt-3-encoder for token count approximation ([c5aa3ca](https://github.com/tobiasbueschel/search-gpt/commit/c5aa3cac0f5dc054e50cfab48186a22cb3f00f87))

## [1.1.2](https://github.com/tobiasbueschel/search-gpt/compare/v1.1.1...v1.1.2) (2023-03-26)

### Bug Fixes

* timeout if page can't be crawled ([8e48f80](https://github.com/tobiasbueschel/search-gpt/commit/8e48f808e615974b5cedd0f548c3861b851e5e81)), closes [#5](https://github.com/tobiasbueschel/search-gpt/issues/5)

## [1.1.1](https://github.com/tobiasbueschel/search-gpt/compare/v1.1.0...v1.1.1) (2023-03-04)

### Bug Fixes

* correct bin command in readme & update demo ([6ba03ec](https://github.com/tobiasbueschel/search-gpt/commit/6ba03ece68a135bf57a1059eeb66de99728fdb8a))

## [1.1.0](https://github.com/tobiasbueschel/search-gpt/compare/v1.0.1...v1.1.0) (2023-03-04)

### Features

* improve console logs to provide context ([6e444a5](https://github.com/tobiasbueschel/search-gpt/commit/6e444a5ed4dd3b2d5467c106b16488d067f781a1))

### Bug Fixes

* start bin script with /usr/bin/env node ([df59012](https://github.com/tobiasbueschel/search-gpt/commit/df5901223f67bfc7220eef197e1ffa089d40f760))

## [1.0.1](https://github.com/tobiasbueschel/search-gpt/compare/v1.0.0...v1.0.1) (2023-03-04)

### Bug Fixes

* remove hyphen from bin command and strict mode ([6fdfdec](https://github.com/tobiasbueschel/search-gpt/commit/6fdfdecc4ee781bac37abcfe616b44858269cc36))

## 1.0.0 (2023-03-04)

### Features

* implement first version of search-gpt ([9df67ec](https://github.com/tobiasbueschel/search-gpt/commit/9df67eca938346be16473f9fea697b1a8688cc93))

## 1.0.0 (2023-03-03)

### Features

* implement first version of search-gpt ([d6f9e79](https://github.com/tobiasbueschel/search-gpt/commit/d6f9e79167b887bd81fac0ad5b228da4cfbe7cfe))
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
<div align="center">
  <img src="./logo.png" alt="SearchGPT">

  <h1>SearchGPT</h1>

  <p>Connecting ChatGPT with the Internet</p>
</div>

You want to try ChatGPT with Internet connectivity so that you can ask about events beyond 2021, but you don't have access to AI-enabled Bing and don't want to wait for Google's Bard? SearchGPT gives you this functionality today: it crawls the Internet for information and feeds the results back to ChatGPT.

![SearchGPT Demo](./demo.gif)

## Usage

The easiest way to get started with search-gpt is to run the following:

```sh
export OPENAI_API_KEY=
export GOOGLE_SEARCH_API_KEY=
export GOOGLE_SEARCH_ID=

npx search-gpt
```

Alternatively, you can install it globally:

```sh
npm install --global search-gpt

# Run SearchGPT with this command
searchgpt
```

Ensure you have your own [Google Search API key](https://developers.google.com/custom-search/v1/introduction), [Programmable Search Engine](https://programmablesearchengine.google.com/controlpanel/all), and [OpenAI API key](https://platform.openai.com/) before running the CLI.

Once the CLI starts, it will prompt you to enter a question. Simply type in your query, and the AI assistant will search the web and generate a response.

## How it works

This is a proof of concept and far from a proper implementation (e.g., Microsoft's [Prometheus Model](https://techcrunch.com/2023/02/07/openais-next-generation-ai-model-is-behind-microsofts-new-search)). I wanted to experiment with how easy it is to crawl certain search engines and then feed the results into a large language model (LLM) such as GPT-3.5. Apart from querying Google Search, one could also integrate other APIs to crawl data and then feed it into the LLM.

```mermaid
flowchart LR
    A[User enters question] --> B[Search Google]
    A --> C[Search Twitter, not implemented yet]
    A --> D[Search other engines]
    B --> E[Search results handed to ChatGPT]
    E --> F[ChatGPT uses this context to provide an answer]
```

Please note: the current implementation feeds Google Search results to `gpt-3.5-turbo` and does not include previous messages in subsequent queries to avoid exceeding the token limit.
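
Concretely, the request that ends up at OpenAI boils down to the following sketch (simplified from `index.js`; `context` holds the crawled page text plus search snippets, and `userPrompt` is the question typed into the CLI):

```js
// Simplified excerpt of the flow in index.js (error handling and CLI output omitted).
const messages = [
  {
    role: "system",
    content: `You are my AI assistant and I want you to assume today is ${new Date().toDateString()}.`,
  },
  // The search results are injected as an assistant message...
  { role: "assistant", content: context },
  // ...and the user message asks the model to answer based on that context.
  {
    role: "user",
    content: `With the information in the assistant's last message, answer this in the same language: ${userPrompt}`,
  },
];

const response = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({ model: "gpt-3.5-turbo", messages }),
});
const { choices } = await response.json();
console.log(choices[0].message.content);
```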

## License

This project is licensed under the [MIT license](./license).
--------------------------------------------------------------------------------
/demo.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tobiasbueschel/search-gpt/37b2374a1347c9b03e813e2c3d69f412d77d69e4/demo.gif
--------------------------------------------------------------------------------
/index.js:
--------------------------------------------------------------------------------
#!/usr/bin/env node
import { compile } from "html-to-text";
import readline from "readline";
import chalk from "chalk";
import fetch from "node-fetch";
import { encode } from "gpt-3-encoder";
import * as dotenv from "dotenv";
dotenv.config();

const { env } = process;
const OPENAI_API_KEY = env.OPENAI_API_KEY;
const GOOGLE_SEARCH_API_KEY = env.GOOGLE_SEARCH_API_KEY;
const GOOGLE_SEARCH_ID = env.GOOGLE_SEARCH_ID;

if (!OPENAI_API_KEY || !GOOGLE_SEARCH_API_KEY || !GOOGLE_SEARCH_ID) {
  console.error(
    "Please ensure you set up your .env file with the correct API keys"
  );
  process.exit(1);
}

const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout,
});

const convert = compile({
  preserveNewlines: false,
  wordwrap: false,
  // The main content of a website will typically be found in the main element
  baseElements: { selectors: ["main"] },
  selectors: [
    {
      selector: "a",
      options: { ignoreHref: true },
    },
  ],
});

async function startCli() {
  rl.question(
    chalk.bgHex("#00A67E").white("🧠 Ask me anything:") + " ",
    async (userPrompt) => {
      await searchGPT(userPrompt);
      startCli();
    }
  );
}

async function searchGPT(userPrompt) {
  process.stdout.write(chalk.dim("> Starting Google Search..."));

  // Step 1: perform the Google Search.
  // We crawl the first accessible page among the top results, as it often contains the answer to the query.
  // As a fallback, we also include the snippets of all search results in case the answer is not
  // already included in the crawled page.
  const searchResults = await getGoogleSearchResults(userPrompt);
  const [context, urlReference] =
    (await getTextOfSearchResults(searchResults)) || [];

  // Step 2: build up the chat messages by providing the search result context and the user prompt
  const chatMessages = [
    {
      role: "system",
      content: `You are my AI assistant and I want you to assume today is ${new Date().toDateString()}.`,
    },
    {
      role: "assistant",
      content: context,
    },
    {
      role: "user",
      content: `With the information in the assistant's last message, answer this in the same language: ${userPrompt}`,
    },
  ];

  // Step 3: reach out to OpenAI to answer the original user prompt with the attached context
  const finalResponse = await getOpenAIChatCompletion(chatMessages);

  process.stdout.clearLine(0);
  process.stdout.cursorTo(0);

  console.log("\n" + chalk.green("> ") + chalk.white(finalResponse));
  console.log(chalk.dim(`> Know more: ${urlReference}` + "\n"));

  return finalResponse;
}

/**
 * Crawl the first Google Search result and extract its main content.
 * If the first result is not accessible, try the next one, and so on.
 */
async function getTextOfSearchResults(searchResults) {
  try {
    let urlReference = "";

    // Get all Google Search snippets, clean them up by removing "..." and add them to the text context
    let context = searchResults.items.reduce(
      (allPages, currentPage) =>
        `${allPages} ${currentPage.snippet.replaceAll("...", " ")}`,
      ""
    );

    // Loop over searchResults.items until we find a page that is accessible;
    // stop after trying 5 pages or once we reach the end of searchResults.items.
    for (let i = 0; i < searchResults.items.length && i < 5; i++) {
      const urlToCheck = searchResults.items[i].link;

      process.stdout.clearLine(0);
      process.stdout.cursorTo(0);
      process.stdout.write(chalk.dim(`> Checking: ${urlToCheck}`));

      // Fetch the HTML of the page & extract its main content. If we get a non-200 code,
      // or the fetch request gets stuck for more than 5 seconds, we try the next page.
      const response = await Promise.race([
        fetch(urlToCheck),
        new Promise((resolve) => setTimeout(() => resolve(undefined), 5000)),
      ]);

      if (!response?.ok) {
        continue;
      }

      // Get the full text from the raw HTML and remove any new lines from it as we don't need them
      const fullText = convert(await response.text())
        .replaceAll("\n", " ")
        .trim();
      context = fullText + context;
      urlReference = urlToCheck;

      break;
    }

    // We must stay below the maximum token count of OpenAI's API:
    // "Depending on the model used, requests can use up to 4096 tokens shared between prompt and completion".
    // Therefore, the following allows roughly 3000 tokens for the prompt and leaves the rest for the completion.
    // - https://help.openai.com/en/articles/4936856-what-are-tokens-and-how-to-count-them
    // - https://platform.openai.com/docs/api-reference/chat/create
    // Note: the encoder below targets GPT-3, so the count is only an approximation for other models.
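    // Rule of thumb (an approximation, not a guarantee from the docs): English text averages
    // roughly 4 characters per token, so 3000 characters usually encode to well under 1000
    // tokens. And since a token always spans at least one character, slicing the context down
    // to 3000 characters below can never exceed 3000 tokens.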
    const maxPromptTokenLength = 3000;
    const encoded = encode(context);

    if (encoded.length > maxPromptTokenLength) {
      context = context.slice(0, maxPromptTokenLength);
    }

    return [context, urlReference];
  } catch (error) {
    console.error(error);
  }
}

/**
 * Fetch the first page of Google Search results
 */
async function getGoogleSearchResults(searchTerm) {
  const response = await makeFetch(
    `https://www.googleapis.com/customsearch/v1?key=${GOOGLE_SEARCH_API_KEY}&cx=${GOOGLE_SEARCH_ID}&q=${encodeURIComponent(
      searchTerm
    )}`
  );
  const data = await response.json();
  return data;
}

/**
 * Call OpenAI's chat API to answer the user's prompt with the context from Google Search
 */
async function getOpenAIChatCompletion(previousChat) {
  const response = await makeFetch(
    "https://api.openai.com/v1/chat/completions",
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${OPENAI_API_KEY}`,
      },
      body: JSON.stringify({
        model: "gpt-3.5-turbo",
        messages: previousChat,
      }),
    }
  );

  const { choices } = await response.json();
  return choices[0].message.content;
}

/**
 * Helper function to make fetch requests
 */
async function makeFetch(url, options = {}) {
  try {
    const response = await fetch(url, options);
    // The Promise returned from fetch() won't reject on HTTP error statuses such as 404 or 500.
    if (response.ok) {
      return response;
    }
    // For all other status codes (e.g., 404, 500), log the error.
    console.error(
      `The ${options.method ?? "GET"} ${url} request failed with code: ${
        response.status
      } and message: ${response.statusText}`
    );
  } catch (error) {
    // If the request is rejected, e.g., due to a network failure, log the error.
    console.error(
      `The ${options.method ?? "GET"} ${url} request failed due to a network error`,
      error
    );
  }
}

startCli();
--------------------------------------------------------------------------------
/license:
--------------------------------------------------------------------------------
MIT License

Copyright (c) Tobias Büschel (github.com/tobiasbueschel)

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
--------------------------------------------------------------------------------
/logo.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tobiasbueschel/search-gpt/37b2374a1347c9b03e813e2c3d69f412d77d69e4/logo.png
--------------------------------------------------------------------------------
/package-lock.json:
--------------------------------------------------------------------------------
{
  "name": "search-gpt",
  "version": "1.2.0",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "": {
      "name": "search-gpt",
      "version": "1.1.2",
      "license": "MIT",
      "dependencies": {
        "chalk": "^5.2.0",
        "dotenv": "^16.0.3",
        "gpt-3-encoder": "^1.1.4",
        "html-to-text": "^9.0.4",
        "node-fetch": "^3.3.0"
      },
      "bin": {
        "searchgpt": "index.js"
      },
      "engines": {
        "node": ">=18"
      }
    },
    "node_modules/@selderee/plugin-htmlparser2": {
      "version": "0.10.0",
      "resolved": "https://registry.npmjs.org/@selderee/plugin-htmlparser2/-/plugin-htmlparser2-0.10.0.tgz",
      "integrity": "sha512-gW69MEamZ4wk1OsOq1nG1jcyhXIQcnrsX5JwixVw/9xaiav8TCyjESAruu1Rz9yyInhgBXxkNwMeygKnN2uxNA==",
      "dependencies": {
        "domhandler": "^5.0.3",
        "selderee": "^0.10.0"
      },
      "funding": {
        "url": "https://ko-fi.com/killymxi"
      }
    },
    "node_modules/chalk": {
      "version": "5.2.0",
      "resolved": "https://registry.npmjs.org/chalk/-/chalk-5.2.0.tgz",
      "integrity": "sha512-ree3Gqw/nazQAPuJJEy+avdl7QfZMcUvmHIKgEZkGL+xOBzRvup5Hxo6LHuMceSxOabuJLJm5Yp/92R9eMmMvA==",
      "engines": {
        "node": "^12.17.0 || ^14.13 || >=16.0.0"
      },
      "funding": {
        "url": "https://github.com/chalk/chalk?sponsor=1"
      }
    },
    "node_modules/data-uri-to-buffer": {
      "version": "4.0.1",
      "resolved": "https://registry.npmjs.org/data-uri-to-buffer/-/data-uri-to-buffer-4.0.1.tgz",
      "integrity": "sha512-0R9ikRb668HB7QDxT1vkpuUBtqc53YyAwMwGeUFKRojY/NWKvdZ+9UYtRfGmhqNbRkTSVpMbmyhXipFFv2cb/A==",
      "engines": {
        "node": ">= 12"
      }
    },
    "node_modules/deepmerge": {
      "version": "4.3.0",
      "resolved": "https://registry.npmjs.org/deepmerge/-/deepmerge-4.3.0.tgz",
      "integrity": "sha512-z2wJZXrmeHdvYJp/Ux55wIjqo81G5Bp4c+oELTW+7ar6SogWHajt5a9gO3s3IDaGSAXjDk0vlQKN3rms8ab3og==",
      "engines": {
        "node": ">=0.10.0"
      }
    },
    "node_modules/dom-serializer": {
      "version": "2.0.0",
      "resolved": "https://registry.npmjs.org/dom-serializer/-/dom-serializer-2.0.0.tgz",
      "integrity": "sha512-wIkAryiqt/nV5EQKqQpo3SToSOV9J0DnbJqwK7Wv/Trc92zIAYZ4FlMu+JPFW1DfGFt81ZTCGgDEabffXeLyJg==",
      "dependencies": {
        "domelementtype": "^2.3.0",
        "domhandler": "^5.0.2",
        "entities": "^4.2.0"
      },
      "funding": {
        "url": "https://github.com/cheeriojs/dom-serializer?sponsor=1"
      }
    },
    "node_modules/domelementtype": {
      "version": "2.3.0",
      "resolved": "https://registry.npmjs.org/domelementtype/-/domelementtype-2.3.0.tgz",
      "integrity": "sha512-OLETBj6w0OsagBwdXnPdN0cnMfF9opN69co+7ZrbfPGrdpPVNBUj02spi6B1N7wChLQiPn4CSH/zJvXw56gmHw==",
      "funding": [
        {
          "type": "github",
          "url": "https://github.com/sponsors/fb55"
        }
      ]
    },
    "node_modules/domhandler": {
      "version": "5.0.3",
      "resolved": "https://registry.npmjs.org/domhandler/-/domhandler-5.0.3.tgz",
      "integrity": "sha512-cgwlv/1iFQiFnU96XXgROh8xTeetsnJiDsTc7TYCLFd9+/WNkIqPTxiM/8pSd8VIrhXGTf1Ny1q1hquVqDJB5w==",
      "dependencies": {
        "domelementtype": "^2.3.0"
      },
      "engines": {
        "node": ">= 4"
      },
      "funding": {
        "url": "https://github.com/fb55/domhandler?sponsor=1"
      }
    },
    "node_modules/domutils": {
      "version": "3.0.1",
      "resolved": "https://registry.npmjs.org/domutils/-/domutils-3.0.1.tgz",
      "integrity": "sha512-z08c1l761iKhDFtfXO04C7kTdPBLi41zwOZl00WS8b5eiaebNpY00HKbztwBq+e3vyqWNwWF3mP9YLUeqIrF+Q==",
      "dependencies": {
        "dom-serializer": "^2.0.0",
        "domelementtype": "^2.3.0",
        "domhandler": "^5.0.1"
      },
      "funding": {
        "url": "https://github.com/fb55/domutils?sponsor=1"
      }
    },
    "node_modules/dotenv": {
      "version": "16.0.3",
      "resolved": "https://registry.npmjs.org/dotenv/-/dotenv-16.0.3.tgz",
      "integrity": "sha512-7GO6HghkA5fYG9TYnNxi14/7K9f5occMlp3zXAuSxn7CKCxt9xbNWG7yF8hTCSUchlfWSe3uLmlPfigevRItzQ==",
      "engines": {
        "node": ">=12"
      }
    },
    "node_modules/entities": {
      "version": "4.4.0",
      "resolved": "https://registry.npmjs.org/entities/-/entities-4.4.0.tgz",
      "integrity": "sha512-oYp7156SP8LkeGD0GF85ad1X9Ai79WtRsZ2gxJqtBuzH+98YUV6jkHEKlZkMbcrjJjIVJNIDP/3WL9wQkoPbWA==",
      "engines": {
        "node": ">=0.12"
      },
      "funding": {
        "url": "https://github.com/fb55/entities?sponsor=1"
      }
    },
    "node_modules/fetch-blob": {
      "version": "3.2.0",
      "resolved": "https://registry.npmjs.org/fetch-blob/-/fetch-blob-3.2.0.tgz",
      "integrity": "sha512-7yAQpD2UMJzLi1Dqv7qFYnPbaPx7ZfFK6PiIxQ4PfkGPyNyl2Ugx+a/umUonmKqjhM4DnfbMvdX6otXq83soQQ==",
      "funding": [
        {
          "type": "github",
          "url": "https://github.com/sponsors/jimmywarting"
        },
        {
          "type": "paypal",
          "url": "https://paypal.me/jimmywarting"
        }
      ],
      "dependencies": {
        "node-domexception": "^1.0.0",
        "web-streams-polyfill": "^3.0.3"
      },
      "engines": {
        "node": "^12.20 || >= 14.13"
      }
    },
    "node_modules/formdata-polyfill": {
      "version": "4.0.10",
      "resolved": "https://registry.npmjs.org/formdata-polyfill/-/formdata-polyfill-4.0.10.tgz",
      "integrity": "sha512-buewHzMvYL29jdeQTVILecSaZKnt/RJWjoZCF5OW60Z67/GmSLBkOFM7qh1PI3zFNtJbaZL5eQu1vLfazOwj4g==",
      "dependencies": {
        "fetch-blob": "^3.1.2"
      },
      "engines": {
        "node": ">=12.20.0"
      }
    },
    "node_modules/gpt-3-encoder": {
      "version": "1.1.4",
      "resolved": "https://registry.npmjs.org/gpt-3-encoder/-/gpt-3-encoder-1.1.4.tgz",
      "integrity": "sha512-fSQRePV+HUAhCn7+7HL7lNIXNm6eaFWFbNLOOGtmSJ0qJycyQvj60OvRlH7mee8xAMjBDNRdMXlMwjAbMTDjkg=="
    },
    "node_modules/html-to-text": {
      "version": "9.0.4",
      "resolved": "https://registry.npmjs.org/html-to-text/-/html-to-text-9.0.4.tgz",
      "integrity": "sha512-ckrQ5N2yZS7qSgKxUbqrBZ02NxD5cSy7KuYjCNIf+HWbdzY3fbjYjQsoRIl6TiaZ4+XWOi0ggFP8/pmgCK/o+A==",
      "dependencies": {
        "@selderee/plugin-htmlparser2": "^0.10.0",
        "deepmerge": "^4.3.0",
        "dom-serializer": "^2.0.0",
        "htmlparser2": "^8.0.1",
        "selderee": "^0.10.0"
      },
      "engines": {
        "node": ">=14"
      }
    },
    "node_modules/htmlparser2": {
      "version": "8.0.1",
      "resolved": "https://registry.npmjs.org/htmlparser2/-/htmlparser2-8.0.1.tgz",
      "integrity": "sha512-4lVbmc1diZC7GUJQtRQ5yBAeUCL1exyMwmForWkRLnwyzWBFxN633SALPMGYaWZvKe9j1pRZJpauvmxENSp/EA==",
      "funding": [
        "https://github.com/fb55/htmlparser2?sponsor=1",
        {
          "type": "github",
          "url": "https://github.com/sponsors/fb55"
        }
      ],
      "dependencies": {
        "domelementtype": "^2.3.0",
        "domhandler": "^5.0.2",
        "domutils": "^3.0.1",
        "entities": "^4.3.0"
      }
    },
    "node_modules/leac": {
      "version": "0.6.0",
      "resolved": "https://registry.npmjs.org/leac/-/leac-0.6.0.tgz",
      "integrity": "sha512-y+SqErxb8h7nE/fiEX07jsbuhrpO9lL8eca7/Y1nuWV2moNlXhyd59iDGcRf6moVyDMbmTNzL40SUyrFU/yDpg==",
      "funding": {
        "url": "https://ko-fi.com/killymxi"
      }
    },
    "node_modules/node-domexception": {
      "version": "1.0.0",
      "resolved": "https://registry.npmjs.org/node-domexception/-/node-domexception-1.0.0.tgz",
      "integrity": "sha512-/jKZoMpw0F8GRwl4/eLROPA3cfcXtLApP0QzLmUT/HuPCZWyB7IY9ZrMeKw2O/nFIqPQB3PVM9aYm0F312AXDQ==",
      "funding": [
        {
          "type": "github",
          "url": "https://github.com/sponsors/jimmywarting"
        },
        {
          "type": "github",
          "url": "https://paypal.me/jimmywarting"
        }
      ],
      "engines": {
        "node": ">=10.5.0"
      }
    },
    "node_modules/node-fetch": {
      "version": "3.3.0",
      "resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-3.3.0.tgz",
      "integrity": "sha512-BKwRP/O0UvoMKp7GNdwPlObhYGB5DQqwhEDQlNKuoqwVYSxkSZCSbHjnFFmUEtwSKRPU4kNK8PbDYYitwaE3QA==",
      "dependencies": {
        "data-uri-to-buffer": "^4.0.0",
        "fetch-blob": "^3.1.4",
        "formdata-polyfill": "^4.0.10"
      },
      "engines": {
        "node": "^12.20.0 || ^14.13.1 || >=16.0.0"
      },
      "funding": {
        "type": "opencollective",
        "url": "https://opencollective.com/node-fetch"
      }
    },
    "node_modules/parseley": {
      "version": "0.11.0",
      "resolved": "https://registry.npmjs.org/parseley/-/parseley-0.11.0.tgz",
      "integrity": "sha512-VfcwXlBWgTF+unPcr7yu3HSSA6QUdDaDnrHcytVfj5Z8azAyKBDrYnSIfeSxlrEayndNcLmrXzg+Vxbo6DWRXQ==",
      "dependencies": {
        "leac": "^0.6.0",
        "peberminta": "^0.8.0"
      },
      "funding": {
        "url": "https://ko-fi.com/killymxi"
      }
    },
    "node_modules/peberminta": {
      "version": "0.8.0",
      "resolved": "https://registry.npmjs.org/peberminta/-/peberminta-0.8.0.tgz",
      "integrity": "sha512-YYEs+eauIjDH5nUEGi18EohWE0nV2QbGTqmxQcqgZ/0g+laPCQmuIqq7EBLVi9uim9zMgfJv0QBZEnQ3uHw/Tw==",
      "funding": {
        "url": "https://ko-fi.com/killymxi"
      }
    },
    "node_modules/selderee": {
      "version": "0.10.0",
      "resolved": "https://registry.npmjs.org/selderee/-/selderee-0.10.0.tgz",
      "integrity": "sha512-DEL/RW/f4qLw/NrVg97xKaEBC8IpzIG2fvxnzCp3Z4yk4jQ3MXom+Imav9wApjxX2dfS3eW7x0DXafJr85i39A==",
      "dependencies": {
        "parseley": "^0.11.0"
      },
      "funding": {
        "url": "https://ko-fi.com/killymxi"
      }
    },
    "node_modules/web-streams-polyfill": {
      "version": "3.2.1",
      "resolved": "https://registry.npmjs.org/web-streams-polyfill/-/web-streams-polyfill-3.2.1.tgz",
      "integrity": "sha512-e0MO3wdXWKrLbL0DgGnUV7WHVuw9OUvL4hjgnPkIeEvESk74gAITi5G606JtZPp39cd8HA9VQzCIvA49LpPN5Q==",
      "engines": {
        "node": ">= 8"
      }
    }
  }
}
--------------------------------------------------------------------------------
/package.json:
--------------------------------------------------------------------------------
{
  "name": "search-gpt",
  "version": "1.2.0",
  "description": "Connecting the internet with ChatGPT",
  "license": "MIT",
  "repository": "https://github.com/tobiasbueschel/search-gpt",
  "author": "https://github.com/tobiasbueschel",
  "bugs": {
    "url": "https://github.com/tobiasbueschel/search-gpt/issues"
  },
  "bin": {
    "searchgpt": "index.js"
  },
  "engines": {
    "node": ">=18"
  },
  "type": "module",
  "keywords": [
    "chatgpt",
    "gpt3.5",
    "google-search",
    "cli",
    "generativeai",
    "llm"
  ],
  "dependencies": {
    "chalk": "^5.2.0",
    "dotenv": "^16.0.3",
    "gpt-3-encoder": "^1.1.4",
    "html-to-text": "^9.0.4",
    "node-fetch": "^3.3.0"
  }
}
--------------------------------------------------------------------------------