├── .gitignore ├── LICENSE ├── README.md ├── api ├── __init__.py ├── demo_web_app.py ├── gpt.py ├── requirements.txt └── ui_config.py ├── docs ├── getting-started.md └── priming.md ├── examples ├── moral_of_story.py ├── run_analogies_app.py ├── run_blank_example.py ├── run_command_to_email_app.py ├── run_general_knowledge_q_and_a_app.py ├── run_latex_app.py └── run_recipe_app.py ├── package.json ├── public ├── index.html ├── manifest.json └── robots.txt ├── src ├── App.css ├── App.js ├── App.test.js ├── index.css ├── index.js ├── logo.svg ├── serviceWorker.js └── setupTests.js └── yarn.lock /.gitignore: -------------------------------------------------------------------------------- 1 | # See https://help.github.com/articles/ignoring-files/ for more about ignoring files. 2 | *.cfg 3 | *venv* 4 | 5 | *pycache* 6 | 7 | # dependencies 8 | /node_modules 9 | /.pnp 10 | .pnp.js 11 | 12 | # testing 13 | /coverage 14 | 15 | # production 16 | /build 17 | 18 | # misc 19 | .DS_Store 20 | .env.local 21 | .env.development.local 22 | .env.test.local 23 | .env.production.local 24 | .env 25 | 26 | npm-debug.log* 27 | yarn-debug.log* 28 | yarn-error.log* 29 | /env -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2020 Shreya Shankar 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # GPT-3 Sandbox: Turn your ideas into demos in a matter of minutes 2 | 3 | Initial release date: 19 July 2020 4 | 5 | Note that this repository is not under any active development; just basic maintenance. 6 | 7 | ## Description 8 | 9 | The goal of this project is to enable users to create cool web demos using the newly released OpenAI GPT-3 API **with just a few lines of Python.** 10 | 11 | This project addresses the following issues: 12 | 13 | 1. Automatically formatting a user's inputs and outputs so that the model can effectively pattern-match 14 | 2. 
Creating a web app for a user to deploy locally and showcase their idea
15 | 
16 | Here's a quick example of priming GPT to convert English to LaTeX:
17 | 
18 | ```
19 | # Construct GPT object and show some examples
20 | gpt = GPT(engine="davinci",
21 |           temperature=0.5,
22 |           max_tokens=100)
23 | gpt.add_example(Example('Two plus two equals four', '2 + 2 = 4'))
24 | gpt.add_example(Example('The integral from zero to infinity', '\\int_0^{\\infty}'))
25 | gpt.add_example(Example('The gradient of x squared plus two times x with respect to x', '\\nabla_x x^2 + 2x'))
26 | gpt.add_example(Example('The log of two times x', '\\log{2x}'))
27 | gpt.add_example(Example('x squared plus y squared plus equals z squared', 'x^2 + y^2 = z^2'))
28 | 
29 | # Define UI configuration
30 | config = UIConfig(description="Text to equation",
31 |                   button_text="Translate",
32 |                   placeholder="x squared plus 2 times x")
33 | 
34 | demo_web_app(gpt, config)
35 | ```
36 | 
37 | Running this code as a Python script automatically launches a web app where you can test new inputs and outputs. There are already several example scripts in the `examples` directory.
38 | 
39 | You can also prime GPT from the UI. For that, pass `show_example_form=True` to `UIConfig` along with the other parameters.
40 | 
41 | Technical details: the backend is in Flask, and the frontend is in React. Note that this repository is currently not intended for production use.
42 | 
43 | ## Background
44 | 
45 | GPT-3 ([Brown et al.](https://arxiv.org/abs/2005.14165)) is OpenAI's latest language model. It incrementally builds on model architectures designed in [previous](https://arxiv.org/abs/1706.03762) [research](https://arxiv.org/abs/1810.04805) studies, but its key advance is that it's extremely good at "few-shot" learning. There's a [lot](https://twitter.com/sharifshameem/status/1282676454690451457) [it](https://twitter.com/jsngr/status/1284511080715362304?s=20) [can](https://twitter.com/paraschopra/status/1284801028676653060?s=20) [do](https://www.gwern.net/GPT-3), but one of the biggest pain points is in "priming," or seeding, the model with some inputs such that the model can intelligently create new outputs. Many people have ideas for GPT-3 but struggle to make them work, since priming is a new paradigm of machine learning. Additionally, it takes a nontrivial amount of web development to spin up a demo to showcase a cool idea. We built this project to make our own idea generation easier to experiment with.
46 | 
47 | This [developer toolkit](https://www.notion.so/API-Developer-Toolkit-49595ed6ffcd413e93ebff10d7e70fe7) has some great resources for those experimenting with the API, including sample prompts.
48 | 
49 | ## Requirements
50 | 
51 | Coding-wise, you only need Python. But for the app to run, you will need:
52 | 
53 | * API key from the OpenAI API beta invite
54 | * Python 3
55 | * `yarn`
56 | * Node 16
57 | 
58 | Instructions to install Python 3 are [here](https://realpython.com/installing-python/), instructions to install `yarn` are [here](https://classic.yarnpkg.com/en/docs/install/#mac-stable), and we recommend using nvm to install (and manage) Node (instructions are [here](https://github.com/nvm-sh/nvm)).
59 | 
60 | ## Setup
61 | 
62 | First, clone or fork this repository. Then to set up your virtual environment, do the following:
63 | 
64 | 1. Create a virtual environment in the root directory: `python -m venv $ENV_NAME`
2. 
Activate the virtual environment: `source $ENV_NAME/bin/activate` (for macOS, Unix, or Linux users) or `.\ENV_NAME\Scripts\activate` (for Windows users)
66 | 3. Install requirements: `pip install -r api/requirements.txt`
67 | 4. To add your secret key: create a file anywhere on your computer called `openai.cfg` with the contents `OPENAI_KEY=$YOUR_SECRET_KEY`, where `$YOUR_SECRET_KEY` looks something like `'sk-somerandomcharacters'` (including quotes). If you are unsure what your secret key is, navigate to the [API Keys page](https://beta.openai.com/account/api-keys) and click "Copy" next to a token displayed under "Secret Key". If there is none, click on "Create new secret key" and then copy it.
68 | 5. Set your environment variable to read the secret key: run `export OPENAI_CONFIG=/path/to/config/openai.cfg` (for macOS, Unix, or Linux users) or `set OPENAI_CONFIG=/path/to/config/openai.cfg` (for Windows users)
69 | 6. Run `yarn install` in the root directory
70 | 
71 | If you are a Windows user, to run the demos, you will need to modify the following line inside `api/demo_web_app.py`:
72 | `subprocess.Popen(["yarn", "start"])` to `subprocess.Popen(["yarn", "start"], shell=True)`.
73 | 
74 | To verify that your environment is set up properly, run one of the scripts in the `examples` directory:
75 | `python examples/run_latex_app.py`.
76 | 
77 | A new tab should pop up in your browser, and you should be able to interact with the UI! To stop the app, press Ctrl-C in your terminal.
78 | 
79 | To create your own example, check out the ["getting started" docs](https://github.com/shreyashankar/gpt3-sandbox/blob/master/docs/getting-started.md).
80 | 
81 | ## Interactive Priming
82 | 
83 | The real power of GPT-3 is in its ability to learn to specialize in tasks given a few examples. However, priming can at times be more of an art than a science. Using the GPT and Example classes, you can easily experiment with different priming examples and immediately see their effect on GPT-3's performance. Below is an example showing how it incrementally improves at translating English to LaTeX as we feed it more examples in the Python interpreter:
84 | 
85 | ```
86 | >>> from api import GPT, Example, set_openai_key
87 | >>> gpt = GPT()
88 | >>> set_openai_key(key)
89 | >>> prompt = "integral from a to b of f of x"
90 | >>> print(gpt.get_top_reply(prompt))
91 | output: integral from a to be of f of x
92 | 
93 | >>> gpt.add_example(Example("Two plus two equals four", "2 + 2 = 4"))
94 | >>> print(gpt.get_top_reply(prompt))
95 | output:
96 | 
97 | >>> gpt.add_example(Example('The integral from zero to infinity', '\\int_0^{\\infty}'))
98 | >>> print(gpt.get_top_reply(prompt))
99 | output: \int_a^b f(x) dx
100 | 
101 | ```
102 | 
103 | ## Contributions
104 | 
105 | We actively encourage people to contribute by adding their own examples or even adding functionalities to the modules. Please make a pull request if you would like to add something, or create an issue if you have a question. We will update the contributors list on a regular basis.
106 | 
107 | Please *do not* leave your secret key in plaintext in your pull request! 
108 | 
109 | ## Authors
110 | 
111 | The following authors have committed 20 lines or more (ordered according to the GitHub contributors page):
112 | 
113 | * Shreya Shankar
114 | * Bora Uyumazturk
115 | * Devin Stein
116 | * Gulan
117 | * Michael Lavelle
118 | 
119 | 
120 | 
--------------------------------------------------------------------------------
/api/__init__.py:
--------------------------------------------------------------------------------
1 | """Exports GPT, Example, and UIConfig classes."""
2 | 
3 | from .gpt import *
4 | from .ui_config import *
5 | from .demo_web_app import demo_web_app
6 | 
--------------------------------------------------------------------------------
/api/demo_web_app.py:
--------------------------------------------------------------------------------
1 | """Runs the web app given a GPT object and UI configuration."""
2 | 
3 | from http import HTTPStatus
4 | import json
5 | import subprocess
6 | import openai
7 | 
8 | from flask import Flask, request, Response
9 | 
10 | from .gpt import set_openai_key, Example
11 | from .ui_config import UIConfig
12 | 
13 | CONFIG_VAR = "OPENAI_CONFIG"
14 | KEY_NAME = "OPENAI_KEY"
15 | 
16 | 
17 | def demo_web_app(gpt, config=UIConfig()):
18 |     """Creates Flask app to serve the React app."""
19 |     app = Flask(__name__)
20 | 
21 |     app.config.from_envvar(CONFIG_VAR)
22 |     set_openai_key(app.config[KEY_NAME])
23 | 
24 |     @app.route("/params", methods=["GET"])
25 |     def get_params():
26 |         # pylint: disable=unused-variable
27 |         response = config.json()
28 |         return response
29 | 
30 |     def error(err_msg, status_code):
31 |         return Response(json.dumps({"error": err_msg}), status=status_code)
32 | 
33 |     def get_example(example_id):
34 |         """Gets a single example or all the examples."""
35 |         # return all examples
36 |         if not example_id:
37 |             return json.dumps(gpt.get_all_examples())
38 | 
39 |         example = gpt.get_example(example_id)
40 |         if not example:
41 |             return error("id not found", HTTPStatus.NOT_FOUND)
42 |         return json.dumps(example.as_dict())
43 | 
44 |     def post_example():
45 |         """Adds an empty example."""
46 |         new_example = Example("", "")
47 |         gpt.add_example(new_example)
48 |         return json.dumps(gpt.get_all_examples())
49 | 
50 |     def put_example(args, example_id):
51 |         """Modifies an existing example."""
52 |         if not example_id:
53 |             return error("id required", HTTPStatus.BAD_REQUEST)
54 | 
55 |         example = gpt.get_example(example_id)
56 |         if not example:
57 |             return error("id not found", HTTPStatus.NOT_FOUND)
58 | 
59 |         if "input" in args:
60 |             example.input = args["input"]
61 |         if "output" in args:
62 |             example.output = args["output"]
63 | 
64 |         # update the example
65 |         gpt.add_example(example)
66 |         return json.dumps(example.as_dict())
67 | 
68 |     def delete_example(example_id):
69 |         """Deletes an example."""
70 |         if not example_id:
71 |             return error("id required", HTTPStatus.BAD_REQUEST)
72 | 
73 |         gpt.delete_example(example_id)
74 |         return json.dumps(gpt.get_all_examples())
75 | 
76 |     @app.route(
77 |         "/examples",
78 |         methods=["GET", "POST"],
79 |         defaults={"example_id": ""},
80 |     )
81 |     @app.route(
82 |         "/examples/<example_id>",
83 |         methods=["GET", "PUT", "DELETE"],
84 |     )
85 |     def examples(example_id):
86 |         method = request.method
87 |         args = request.json
88 |         if method == "GET":
89 |             return get_example(example_id)
90 |         if method == "POST":
91 |             return post_example()
92 |         if method == "PUT":
93 |             return put_example(args, example_id)
94 |         if method == "DELETE":
95 |             return delete_example(example_id)
96 |         return error("Not implemented", HTTPStatus.NOT_IMPLEMENTED)
97 | 98 | @app.route("/translate", methods=["GET", "POST"]) 99 | def translate(): 100 | # pylint: disable=unused-variable 101 | prompt = request.json["prompt"] 102 | response = gpt.submit_request(prompt) 103 | offset = 0 104 | if not gpt.append_output_prefix_to_query: 105 | offset = len(gpt.output_prefix) 106 | return {'text': response['choices'][0]['text'][offset:]} 107 | 108 | subprocess.Popen(["yarn", "start"]) 109 | app.run() 110 | -------------------------------------------------------------------------------- /api/gpt.py: -------------------------------------------------------------------------------- 1 | """Creates the Example and GPT classes for a user to interface with the OpenAI 2 | API.""" 3 | 4 | import openai 5 | import uuid 6 | 7 | 8 | def set_openai_key(key): 9 | """Sets OpenAI key.""" 10 | openai.api_key = key 11 | 12 | 13 | class Example: 14 | """Stores an input, output pair and formats it to prime the model.""" 15 | def __init__(self, inp, out): 16 | self.input = inp 17 | self.output = out 18 | self.id = uuid.uuid4().hex 19 | 20 | def get_input(self): 21 | """Returns the input of the example.""" 22 | return self.input 23 | 24 | def get_output(self): 25 | """Returns the intended output of the example.""" 26 | return self.output 27 | 28 | def get_id(self): 29 | """Returns the unique ID of the example.""" 30 | return self.id 31 | 32 | def as_dict(self): 33 | return { 34 | "input": self.get_input(), 35 | "output": self.get_output(), 36 | "id": self.get_id(), 37 | } 38 | 39 | 40 | class GPT: 41 | """The main class for a user to interface with the OpenAI API. 42 | 43 | A user can add examples and set parameters of the API request. 44 | """ 45 | def __init__(self, 46 | engine='davinci', 47 | temperature=0.5, 48 | max_tokens=100, 49 | input_prefix="input: ", 50 | input_suffix="\n", 51 | output_prefix="output: ", 52 | output_suffix="\n\n", 53 | append_output_prefix_to_query=False): 54 | self.examples = {} 55 | self.engine = engine 56 | self.temperature = temperature 57 | self.max_tokens = max_tokens 58 | self.input_prefix = input_prefix 59 | self.input_suffix = input_suffix 60 | self.output_prefix = output_prefix 61 | self.output_suffix = output_suffix 62 | self.append_output_prefix_to_query = append_output_prefix_to_query 63 | self.stop = (output_suffix + input_prefix).strip() 64 | 65 | def add_example(self, ex): 66 | """Adds an example to the object. 67 | 68 | Example must be an instance of the Example class. 69 | """ 70 | assert isinstance(ex, Example), "Please create an Example object." 
71 | self.examples[ex.get_id()] = ex 72 | 73 | def delete_example(self, id): 74 | """Delete example with the specific id.""" 75 | if id in self.examples: 76 | del self.examples[id] 77 | 78 | def get_example(self, id): 79 | """Get a single example.""" 80 | return self.examples.get(id, None) 81 | 82 | def get_all_examples(self): 83 | """Returns all examples as a list of dicts.""" 84 | return {k: v.as_dict() for k, v in self.examples.items()} 85 | 86 | def get_prime_text(self): 87 | """Formats all examples to prime the model.""" 88 | return "".join( 89 | [self.format_example(ex) for ex in self.examples.values()]) 90 | 91 | def get_engine(self): 92 | """Returns the engine specified for the API.""" 93 | return self.engine 94 | 95 | def get_temperature(self): 96 | """Returns the temperature specified for the API.""" 97 | return self.temperature 98 | 99 | def get_max_tokens(self): 100 | """Returns the max tokens specified for the API.""" 101 | return self.max_tokens 102 | 103 | def craft_query(self, prompt): 104 | """Creates the query for the API request.""" 105 | q = self.get_prime_text( 106 | ) + self.input_prefix + prompt + self.input_suffix 107 | if self.append_output_prefix_to_query: 108 | q = q + self.output_prefix 109 | 110 | return q 111 | 112 | def submit_request(self, prompt): 113 | """Calls the OpenAI API with the specified parameters.""" 114 | response = openai.Completion.create(engine=self.get_engine(), 115 | prompt=self.craft_query(prompt), 116 | max_tokens=self.get_max_tokens(), 117 | temperature=self.get_temperature(), 118 | top_p=1, 119 | n=1, 120 | stream=False, 121 | stop=self.stop) 122 | return response 123 | 124 | def get_top_reply(self, prompt): 125 | """Obtains the best result as returned by the API.""" 126 | response = self.submit_request(prompt) 127 | return response['choices'][0]['text'] 128 | 129 | def format_example(self, ex): 130 | """Formats the input, output pair.""" 131 | return self.input_prefix + ex.get_input( 132 | ) + self.input_suffix + self.output_prefix + ex.get_output( 133 | ) + self.output_suffix 134 | -------------------------------------------------------------------------------- /api/requirements.txt: -------------------------------------------------------------------------------- 1 | astroid==2.4.2 2 | certifi==2020.6.20 3 | chardet==3.0.4 4 | click==7.1.2 5 | Flask==1.1.2 6 | idna==2.10 7 | itsdangerous==1.1.0 8 | Jinja2==2.11.3 9 | MarkupSafe==1.1.1 10 | openai==0.2.4 11 | pylint==2.5.3 12 | python-dotenv==0.14.0 13 | requests==2.27.1 14 | six==1.15.0 15 | urllib3==1.26.5 16 | Werkzeug==2.2.3 17 | -------------------------------------------------------------------------------- /api/ui_config.py: -------------------------------------------------------------------------------- 1 | """Class to store customized UI parameters.""" 2 | 3 | 4 | class UIConfig(): 5 | """Stores customized UI parameters.""" 6 | 7 | def __init__(self, description='Description', 8 | button_text='Submit', 9 | placeholder='Default placeholder', 10 | show_example_form=False): 11 | self.description = description 12 | self.button_text = button_text 13 | self.placeholder = placeholder 14 | self.show_example_form = show_example_form 15 | 16 | def get_description(self): 17 | """Returns the input of the example.""" 18 | return self.description 19 | 20 | def get_button_text(self): 21 | """Returns the intended output of the example.""" 22 | return self.button_text 23 | 24 | def get_placeholder(self): 25 | """Returns the intended output of the example.""" 26 | return self.placeholder 27 | 28 
|     def get_show_example_form(self):
29 |         """Returns whether editable example form is shown."""
30 |         return self.show_example_form
31 | 
32 |     def json(self):
33 |         """Used to send the parameter values to the API."""
34 |         return {"description": self.description,
35 |                 "button_text": self.button_text,
36 |                 "placeholder": self.placeholder,
37 |                 "show_example_form": self.show_example_form}
38 | 
--------------------------------------------------------------------------------
/docs/getting-started.md:
--------------------------------------------------------------------------------
1 | # Getting Started
2 | 
3 | ## Creating a GPT-3 Powered Web App
4 | 
5 | Note: This is a work in progress, but the essential functions are described here.
6 | 
7 | First, you will want to create a `GPT` object, which optionally accepts the parameters `engine`, `temperature`, and `max_tokens` (otherwise it defaults to the values in the following snippet):
8 | 
9 | ```
10 | from api import GPT
11 | 
12 | gpt = GPT(engine="davinci",
13 |           temperature=0.5,
14 |           max_tokens=100)
15 | ```
16 | 
17 | Since we're mainly interested in constructing a demo, we do not provide an interface for you to change other parameters. Feel free to fork the repository and make as many changes to the code as you would like.
18 | 
19 | Once the `GPT` object is created, you need to "prime" it with several examples. The goal of these examples is to show the model some patterns that you are hoping it will recognize. The `Example` constructor accepts an input string and a corresponding output string. To construct an `Example`, you can run the following code:
20 | 
21 | ```
22 | from api import Example
23 | 
24 | ex = Example(inp="Hello", out="Hola")
25 | ```
26 | 
27 | After constructing some examples, you can add them to your `GPT` object by calling the `add_example` method, which only accepts an `Example`:
28 | 
29 | ```
30 | gpt.add_example(ex)
31 | ```
32 | 
33 | Finally, once you've added all of your examples, it's time to run the demo! But first, in order to customize the web app to your idea, you can optionally create a `UIConfig` with `description`, `button_text`, and `placeholder` (text initially shown in the input box) parameters:
34 | 
35 | ```
36 | from api import UIConfig
37 | 
38 | config = UIConfig(description="Analogies generator",
39 |                   button_text="Generate",
40 |                   placeholder="Memes are like")
41 | ```
42 | 
43 | Now you can run the web app! Call `demo_web_app` with your `GPT` and (optional) `UIConfig` objects:
44 | 
45 | ```
46 | from api import demo_web_app
47 | 
48 | demo_web_app(gpt, config)
49 | ```
50 | 
51 | Save this Python script to a file and run it as you would any other Python file:
52 | 
53 | `python path_to_file.py`
54 | 
55 | in your shell. A web app should pop up in your browser in a few seconds, and you should be able to interact with your primed model. Please open any issues if you have questions!
56 | 
--------------------------------------------------------------------------------
/docs/priming.md:
--------------------------------------------------------------------------------
1 | ## Priming
2 | 
3 | As smart as GPT-3 is, it still doesn't excel at most tasks out of the box. It
4 | benefits greatly from seeing a few examples, a process we like to refer to as "priming".
5 | Finding a set of examples which focuses GPT-3 on your specific use case will inevitably require
6 | a bit of trial-and-error. 
To make this step easier, we designed our GPT interface to allow for
7 | easy testing and exploration using the Python interactive interpreter. Below we walk you through
8 | an example of how to do so, again using the English to LaTeX use case.
9 | 
10 | First, open up your Python interpreter by running `python` or `python3`. Next, you'll need to
11 | import the necessary items from the `api` package. You'll need the `GPT` class,
12 | the `Example` class, and `set_openai_key`:
13 | 
14 | ```
15 | >>> from api import GPT, Example, set_openai_key
16 | ```
17 | 
18 | Next, you'll want to set your OpenAI key to gain access to the beta.
19 | 
20 | ```
21 | >>> set_openai_key("YOUR_OPENAI_KEY") # omit the Bearer, it should look like "sk-..."
22 | ```
23 | 
24 | Next, initialize your GPT class. You have the option of setting a few of the query
25 | parameters such as `engine` and `temperature`, but we'll just go with the default
26 | setup for simplicity.
27 | 
28 | ```
29 | >>> gpt = GPT()
30 | ```
31 | 
32 | Now we're ready to give it a prompt and see how it does! You can conveniently get
33 | GPT-3's response using the `get_top_reply` method.
34 | 
35 | ```
36 | >>> print(gpt.get_top_reply("sum from one to infinity of one over n squared"))
37 | 
38 | output: n squared over n
39 | 
40 | ```
41 | 
42 | Clearly this needs some priming. To prime, you call `add_example` on your `gpt` object,
43 | feeding it an instance of the `Example` class. Let's add a few examples and try again.
44 | 
45 | ```
46 | >>> gpt.add_example(Example("four y plus three x cubed", "4y + 3x^3"))
47 | >>> gpt.add_example(Example("integral from a to b", "\\int_a^b"))
48 | >>> print(gpt.get_top_reply("sum from one to infinity of one over n squared"))
49 | output: 1/n^2
50 | 
51 | ```
52 | 
53 | Better, but not quite there. Let's give it an expression with a
54 | sum and then see what happens:
55 | 
56 | ```
57 | >>> gpt.add_example(Example("sum from zero to twelve of i", "\\sum_{i=0}^{12} i"))
58 | >>> print(gpt.get_top_reply("sum from one to infinity of one over n squared"))
59 | output: \sum_{n=1}^\infty \frac{1}{n^2}
60 | 
61 | >>> print(gpt.get_top_reply("sum from one to infinity of one over two to the n"))
62 | output: \sum_{n=1}^\infty \frac{1}{2^n}
63 | ```
64 | 
65 | Finally, it works! Now go and see what other crazy stuff you can do with GPT-3!
66 | 
67 | 
68 | 
69 | 
70 | 
71 | 
--------------------------------------------------------------------------------
/examples/moral_of_story.py:
--------------------------------------------------------------------------------
1 | import os
2 | import sys
3 | sys.path.append(os.path.dirname(os.path.dirname(os.path.realpath(__file__))))
4 | 
5 | from api import GPT, Example, UIConfig
6 | from api import demo_web_app
7 | 
8 | gpt = GPT(temperature=0.5, max_tokens=500)
9 | 
10 | gpt.add_example(Example(
11 |     "A boy named John was upset. His father found him crying.When his father asked John why he was crying, he said that he had a lot of problems in his life.His father simply smiled and asked him to get a potato, an egg, and some coffee beans. He placed them in three bowls.He then asked John to feel their texture and then fill each bowl with water.John did as he had been told. 
His father then boiled all three bowls.Once the bowls had cooled down, John’s father asked him to feel the texture of the different food items again.John noticed that the potato had become soft and its skin was peeling off easily; the egg had become harder and tougher; the coffee beans had completely changed and filled the bowl of water with aroma and flavour.", 12 | "Life will always have problems and pressures, like the boiling water in the story. It’s how you respond and react to these problems that counts the most!" 13 | )) 14 | 15 | gpt.add_example(Example( 16 | "Once upon a time in a circus, five elephants that performed circus tricks. They were kept tied up with weak rope that they could’ve easily escaped, but did not.One day, a man visiting the circus asked the ringmaster: “Why haven’t these elephants broken the rope and run away?”The ringmaster replied: “From when they were young, the elephants were made to believe that they were not strong enough to break the ropes and escape.”It was because of this belief that they did not even try to break the ropes now.", 17 | "Don’t give in to the limitations of society. Believe that you can achieve everything you want to!" 18 | )) 19 | 20 | gpt.add_example(Example( 21 | "A long time ago, there lived a king in Greece named Midas.He was extremely wealthy and had all the gold he could ever need. He also had a daughter whom he loved very much.One day, Midas saw a Satyr (an angel) who was stuck and was in trouble. Midas helped the Satyr and asked for his wish to be granted in return.The Satyr agreed and Midas wished for everything he touched to be turned to gold. His wish was granted.Extremely excited, Midas went home to his wife and daughter touching pebbles, rocks, and plants on the way, which turned into gold.As his daughter hugged him, she turned into a golden statue.Having learnt his lesson, Midas begged the Satyr to reverse the spell who granted that everything would go back to their original state.", 22 | "Stay content and grateful with what you have. Greed will not get you anywhere." 23 | )) 24 | 25 | 26 | 27 | 28 | config = UIConfig(description= "Describe the moral of the short story", 29 | button_text= "Get Moral", 30 | placeholder= "This popular story is about a hare (an animal belonging to the rabbit family), which is known to move quickly and a tortoise, which is known to move slower.The story began when the hare who has won many races proposed a race with the tortoise. The hare simply wanted to prove that he was the best and have the satisfaction of beating him.The tortoise agreed and the race began.The hare got a head-start but became overconfident towards the end of the race. His ego made him believe that he could win the race even if he rested for a while.And so, he took a nap right near the finish line.Meanwhile, the tortoise walked slowly but extremely determined and dedicated. 
He did not give up for a second and kept persevering despite the odds not being in his favour.While the hare was asleep, the tortoise crossed the finish line and won the race!The best part was that the tortoise did not gloat or put the hare down!") 31 | 32 | demo_web_app(gpt, config) 33 | -------------------------------------------------------------------------------- /examples/run_analogies_app.py: -------------------------------------------------------------------------------- 1 | """Idea taken from https://www.notion.so/Analogies-Generator-9b046963f52f446b9bef84aa4e416a4c""" 2 | 3 | import os 4 | import sys 5 | sys.path.append(os.path.dirname(os.path.dirname(os.path.realpath(__file__)))) 6 | 7 | from api import GPT, Example, UIConfig 8 | from api import demo_web_app 9 | 10 | 11 | # Construct GPT object and show some examples 12 | gpt = GPT(engine="davinci", 13 | temperature=0.5, 14 | max_tokens=100) 15 | 16 | gpt.add_example(Example('Neural networks are like', 17 | 'genetic algorithms in that both are systems that learn from experience.')) 18 | gpt.add_example(Example('Social media is like', 19 | 'a market in that both are systems that coordinate the actions of many individuals.')) 20 | gpt.add_example(Example( 21 | 'A2E is like', 'lipofuscin in that both are byproducts of the normal operation of a system.')) 22 | gpt.add_example(Example('Haskell is like', 23 | 'LISP in that both are functional languages.')) 24 | gpt.add_example(Example('Quaternions are like', 25 | 'matrices in that both are used to represent rotations in three dimensions.')) 26 | gpt.add_example(Example('Quaternions are like', 27 | 'octonions in that both are examples of non-commutative algebra.')) 28 | 29 | # Define UI configuration 30 | config = UIConfig(description="Analogies generator", 31 | button_text="Generate", 32 | placeholder="Memes are like") 33 | 34 | demo_web_app(gpt, config) 35 | -------------------------------------------------------------------------------- /examples/run_blank_example.py: -------------------------------------------------------------------------------- 1 | import os 2 | import sys 3 | 4 | sys.path.append(os.path.dirname(os.path.dirname(os.path.realpath(__file__)))) 5 | 6 | from api import GPT, Example, UIConfig 7 | from api import demo_web_app 8 | 9 | 10 | # Construct GPT object and show some examples 11 | gpt = GPT(engine="davinci", temperature=0.5, max_tokens=100) 12 | 13 | gpt.add_example(Example("Who are you?", "I'm an example.")) 14 | gpt.add_example(Example("What are you?", "I'm an example.")) 15 | 16 | # Define UI configuration 17 | config = UIConfig( 18 | description="Prompt", 19 | button_text="Result", 20 | placeholder="Where are you?", 21 | show_example_form=True, 22 | ) 23 | 24 | demo_web_app(gpt, config) 25 | -------------------------------------------------------------------------------- /examples/run_command_to_email_app.py: -------------------------------------------------------------------------------- 1 | """Idea taken from https://www.notion.so/Sentence-Email-Generator-a36d269ce8e94cc58daf723f8ba8fe3e""" 2 | 3 | import os 4 | import sys 5 | sys.path.append(os.path.dirname(os.path.dirname(os.path.realpath(__file__)))) 6 | 7 | from api import GPT, Example, UIConfig 8 | from api import demo_web_app 9 | 10 | 11 | # Construct GPT object and show some examples 12 | gpt = GPT(engine="davinci", 13 | temperature=0.4, 14 | max_tokens=60) 15 | 16 | gpt.add_example(Example('Thank John for the book.', 17 | 'Dear John, Thank you so much for the book. I really appreciate it. 
I hope to hang out soon. Your friend, Sarah.')) 18 | 19 | gpt.add_example(Example('Tell TechCorp I appreciate the great service.', 20 | 'To Whom it May Concern, I want you to know that I appreciate the great service at TechCorp. The staff is outstanding and I enjoy every visit. Sincerely, Bill Johnson')) 21 | 22 | gpt.add_example(Example('Invoice Kelly Watkins $500 for design consultation.', 23 | 'Dear Ms. Watkins, This is my invoice for $500 for design consultation. It was a pleasure to work with you. Sincerely, Emily Fields')) 24 | 25 | gpt.add_example(Example('Invite Amanda and Paul to the company event Friday night.', 26 | 'Dear Amanda and Paul, I hope this finds you doing well. I want to invite you to our company event on Friday night. It will be a great opportunity for networking and there will be food and drinks. Should be fun. Best, Ryan')) 27 | 28 | # Define UI configuration 29 | config = UIConfig(description="Command to email generator", 30 | button_text="Generate", 31 | placeholder="Ask RAM Co. if they have new storage units in stock.") 32 | 33 | demo_web_app(gpt, config) 34 | -------------------------------------------------------------------------------- /examples/run_general_knowledge_q_and_a_app.py: -------------------------------------------------------------------------------- 1 | import os 2 | import sys 3 | sys.path.append(os.path.dirname(os.path.dirname(os.path.realpath(__file__)))) 4 | 5 | from api import demo_web_app 6 | from api import GPT, Example, UIConfig 7 | 8 | 9 | question_prefix = 'Q: ' 10 | question_suffix = "\n" 11 | answer_prefix = "A: " 12 | answer_suffix = "\n\n" 13 | 14 | 15 | # Construct GPT object and show some examples 16 | gpt = GPT(engine="davinci", 17 | temperature=0.5, 18 | max_tokens=100, 19 | input_prefix=question_prefix, 20 | input_suffix=question_suffix, 21 | output_prefix=answer_prefix, 22 | output_suffix=answer_suffix, 23 | append_output_prefix_to_query=True) 24 | 25 | gpt.add_example(Example('What is human life expectancy in the United States?', 26 | 'Human life expectancy in the United States is 78 years.')) 27 | gpt.add_example( 28 | Example('Who was president of the United States in 1955?', 'Dwight D. Eisenhower was president of the United States in 1955.')) 29 | gpt.add_example(Example( 30 | 'What party did he belong to?', 'He belonged to the Republican Party.')) 31 | gpt.add_example(Example('Who was president of the United States before George W. Bush?', 32 | 'Bill Clinton was president of the United States before George W. 
Bush.')) 33 | gpt.add_example(Example('In what year was the Coronation of Queen Elizabeth?', 34 | 'The Coronation of Queen Elizabeth was in 1953.')) 35 | 36 | 37 | # Define UI configuration 38 | config = UIConfig(description="Question to Answer", 39 | button_text="Answer", 40 | placeholder="Who wrote the song 'Hey Jude'?") 41 | 42 | demo_web_app(gpt, config) 43 | -------------------------------------------------------------------------------- /examples/run_latex_app.py: -------------------------------------------------------------------------------- 1 | import os 2 | import sys 3 | sys.path.append(os.path.dirname(os.path.dirname(os.path.realpath(__file__)))) 4 | 5 | from api import GPT, Example, UIConfig 6 | from api import demo_web_app 7 | 8 | 9 | # Construct GPT object and show some examples 10 | gpt = GPT(engine="davinci", 11 | temperature=0.5, 12 | max_tokens=100) 13 | 14 | gpt.add_example(Example('Two plus two equals four', '2 + 2 = 4')) 15 | gpt.add_example( 16 | Example('The integral from zero to infinity', '\\int_0^{\\infty}')) 17 | gpt.add_example(Example( 18 | 'The gradient of x squared plus two times x with respect to x', '\\nabla_x x^2 + 2x')) 19 | gpt.add_example(Example('The log of two times x', '\\log{2x}')) 20 | gpt.add_example( 21 | Example('x squared plus y squared plus equals z squared', 'x^2 + y^2 = z^2')) 22 | gpt.add_example( 23 | Example('The sum from zero to twelve of i squared', '\\sum_{i=0}^{12} i^2')) 24 | gpt.add_example(Example('E equals m times c squared', 'E = mc^2')) 25 | gpt.add_example(Example('H naught of t', 'H_0(t)')) 26 | gpt.add_example(Example('f of n equals 1 over (b-a) if n is 0 otherwise 5', 27 | 'f(n) = \\begin{cases} 1/(b-a) &\\mbox{if } n \\equiv 0 \\\ # 5 \\end{cases}')) 28 | 29 | # Define UI configuration 30 | config = UIConfig(description="Text to equation", 31 | button_text="Translate", 32 | placeholder="x squared plus 2 times x") 33 | 34 | demo_web_app(gpt, config) 35 | -------------------------------------------------------------------------------- /examples/run_recipe_app.py: -------------------------------------------------------------------------------- 1 | import os 2 | import sys 3 | sys.path.append(os.path.dirname(os.path.dirname(os.path.realpath(__file__)))) 4 | 5 | from api import GPT, Example, UIConfig 6 | from api import demo_web_app 7 | 8 | gpt = GPT(temperature=0.5, max_tokens=500) 9 | 10 | gpt.add_example(Example( 11 | "how to roast eggplant", 12 | "How do you cook eggplant in the oven? Well, there are a couple ways. To roast whole eggplants in the oven, leave the skin on and roast at 400 degrees F (200 degrees C) until the skin gets wrinkly and begins to collapse in on the softened fruit. This method will also produce velvety smooth eggplant dips or spreads." 13 | )) 14 | 15 | gpt.add_example(Example( 16 | "how to bake eggplant", 17 | "To bake eggplant, you'll cut the eggplant into rounds or strips and prepare them as the recipe indicates -- for example, you can dredge them in egg and breadcrumbs or simply brush them with olive oil and bake them in a 350 degree F oven." 18 | )) 19 | 20 | gpt.add_example(Example( 21 | "how to make puerto rican steamed rice", 22 | "Bring vegetable oil, water, and salt to a boil in a saucepan over high heat. Add rice, and cook until the water has just about cooked out; stir. Reduce heat to medium-low. Cover, and cook for 20 to 25 minutes. Stir again, and serve. Rice may be a little sticky and may stick to bottom of pot." 
23 | )) 24 | 25 | gpt.add_example(Example( 26 | "how to make oatmeal peanut butter cookies", 27 | "Preheat oven to 350 degrees F (175 degrees C). In a large bowl, cream together shortening, margarine, brown sugar, white sugar, and peanut butter until smooth. Beat in the eggs one at a time until well blended. Combine the flour, baking soda, and salt; stir into the creamed mixture. Mix in the oats until just combined. Drop by teaspoonfuls onto ungreased cookie sheets. Bake for 10 to 15 minutes in the preheated oven, or until just light brown. Don't over-bake. Cool and store in an airtight container." 28 | )) 29 | 30 | 31 | config = UIConfig(description= "How to cook stuff", 32 | button_text= "show me", 33 | placeholder= "how to make a breakfast burrito") 34 | 35 | demo_web_app(gpt, config) 36 | -------------------------------------------------------------------------------- /package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "gpt3-sandbox", 3 | "version": "0.1.0", 4 | "private": true, 5 | "dependencies": { 6 | "@testing-library/jest-dom": "^4.2.4", 7 | "@testing-library/react": "^9.3.2", 8 | "@testing-library/user-event": "^7.1.2", 9 | "axios": "^0.21.2", 10 | "bootstrap": "^4.5.0", 11 | "lodash": "^4.17.21", 12 | "react": "^16.13.1", 13 | "react-bootstrap": "^1.2.2", 14 | "react-dom": "^16.13.1", 15 | "react-latex": "^2.0.0", 16 | "react-latex-next": "^1.2.0", 17 | "react-scripts": "^3.4.1" 18 | }, 19 | "scripts": { 20 | "start": "react-scripts start", 21 | "start-api": "cd api && venv/bin/flask run --no-debugger", 22 | "build": "react-scripts build", 23 | "test": "react-scripts test", 24 | "eject": "react-scripts eject" 25 | }, 26 | "eslintConfig": { 27 | "extends": "react-app" 28 | }, 29 | "browserslist": { 30 | "production": [ 31 | ">0.2%", 32 | "not dead", 33 | "not op_mini all" 34 | ], 35 | "development": [ 36 | "last 1 chrome version", 37 | "last 1 firefox version", 38 | "last 1 safari version" 39 | ] 40 | }, 41 | "proxy": "http://localhost:5000" 42 | } 43 | -------------------------------------------------------------------------------- /public/index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 11 | 15 | 16 | 24 | GPT-3 Sandbox 25 | 26 | 27 | 28 |
29 | 37 | 38 | -------------------------------------------------------------------------------- /public/manifest.json: -------------------------------------------------------------------------------- 1 | { 2 | "short_name": "GPT-3 Sandbox", 3 | "name": "GPT-3 Sandbox", 4 | "icons": [], 5 | "start_url": ".", 6 | "display": "standalone", 7 | "theme_color": "#000000", 8 | "background_color": "#ffffff" 9 | } 10 | -------------------------------------------------------------------------------- /public/robots.txt: -------------------------------------------------------------------------------- 1 | # https://www.robotstxt.org/robotstxt.html 2 | User-agent: * 3 | Disallow: 4 | -------------------------------------------------------------------------------- /src/App.css: -------------------------------------------------------------------------------- 1 | .App { 2 | text-align: center; 3 | } 4 | 5 | .App-logo { 6 | height: 40vmin; 7 | pointer-events: none; 8 | } 9 | 10 | @media (prefers-reduced-motion: no-preference) { 11 | .App-logo { 12 | animation: App-logo-spin infinite 20s linear; 13 | } 14 | } 15 | 16 | .App-header { 17 | background-color: #282c34; 18 | min-height: 100vh; 19 | display: flex; 20 | flex-direction: column; 21 | align-items: center; 22 | justify-content: center; 23 | font-size: calc(10px + 2vmin); 24 | color: white; 25 | } 26 | 27 | .App-link { 28 | color: #61dafb; 29 | } 30 | 31 | @keyframes App-logo-spin { 32 | from { 33 | transform: rotate(0deg); 34 | } 35 | to { 36 | transform: rotate(360deg); 37 | } 38 | } 39 | 40 | #loading { 41 | margin-left: 5px; 42 | } 43 | -------------------------------------------------------------------------------- /src/App.js: -------------------------------------------------------------------------------- 1 | import React from "react"; 2 | import { Form, Button, Row, Col } from "react-bootstrap"; 3 | import axios from "axios"; 4 | import { debounce } from "lodash"; 5 | 6 | import "bootstrap/dist/css/bootstrap.min.css"; 7 | 8 | const UI_PARAMS_API_URL = "/params"; 9 | const TRANSLATE_API_URL = "/translate"; 10 | const EXAMPLE_API_URL = "/examples"; 11 | 12 | const DEBOUNCE_INPUT = 250; 13 | 14 | class App extends React.Component { 15 | constructor(props) { 16 | super(props); 17 | this.state = { 18 | output: "", 19 | input: "", 20 | buttonText: "Submit", 21 | description: "Description", 22 | showExampleForm: false, 23 | examples: {}, 24 | }; 25 | // Bind the event handlers 26 | this.handleInputChange = this.handleInputChange.bind(this); 27 | this.handleClick = this.handleClick.bind(this); 28 | } 29 | 30 | componentDidMount() { 31 | // Call API for the UI params 32 | axios 33 | .get(UI_PARAMS_API_URL) 34 | .then( 35 | ({ 36 | data: { placeholder, button_text, description, show_example_form }, 37 | }) => { 38 | this.setState({ 39 | input: placeholder, 40 | buttonText: button_text, 41 | description: description, 42 | showExampleForm: show_example_form, 43 | }); 44 | if (this.state.showExampleForm) { 45 | axios.get(EXAMPLE_API_URL).then(({ data: examples }) => { 46 | this.setState({ examples }); 47 | }); 48 | } 49 | } 50 | ); 51 | const load = document.getElementById("loading"); 52 | load.style.visibility = "visible"; 53 | } 54 | 55 | updateExample(id, body) { 56 | axios.put(`${EXAMPLE_API_URL}/${id}`, body); 57 | } 58 | 59 | debouncedUpdateExample = debounce(this.updateExample, DEBOUNCE_INPUT); 60 | 61 | handleExampleChange = (id, field) => (e) => { 62 | const text = e.target.value; 63 | 64 | let body = { [field]: text }; 65 | let examples = { 
...this.state.examples }; 66 | examples[id][field] = text; 67 | 68 | this.setState({ examples }); 69 | this.debouncedUpdateExample(id, body); 70 | }; 71 | 72 | handleExampleDelete = (id) => (e) => { 73 | e.preventDefault(); 74 | axios.delete(`${EXAMPLE_API_URL}/${id}`).then(({ data: examples }) => { 75 | this.setState({ examples }); 76 | }); 77 | }; 78 | 79 | handleExampleAdd = (e) => { 80 | e.preventDefault(); 81 | axios.post(EXAMPLE_API_URL).then(({ data: examples }) => { 82 | this.setState({ examples }); 83 | }); 84 | }; 85 | 86 | handleInputChange(e) { 87 | this.setState({ input: e.target.value }); 88 | } 89 | 90 | handleClick(e) { 91 | e.preventDefault(); 92 | const load = document.getElementById("loading"); 93 | load.style.visibility = "visible"; 94 | let body = { 95 | prompt: this.state.input, 96 | }; 97 | axios.post(TRANSLATE_API_URL, body).then(({ data: { text } }) => { 98 | this.setState({ output: text }); 99 | load.style.visibility = "hidden"; 100 | }); 101 | } 102 | 103 | render() { 104 | const showExampleForm = this.state.showExampleForm; 105 | return ( 106 |
107 | 108 | 109 |
119 |
120 | 121 | {showExampleForm && ( 122 |
123 |

Examples

124 | {Object.values(this.state.examples).map((example) => ( 125 | 126 | 130 | 131 | Example Input 132 | 133 | 134 | 144 | 145 | 146 | 150 | 151 | Example Output 152 | 153 | 154 | 164 | 165 | 166 | 167 | 168 | 176 | 177 | 178 | 179 | ))} 180 | 181 | 182 | 189 | 190 | 191 |
192 | )} 193 | {this.state.description} 194 | 201 |
202 | 203 | 206 |
211 | Loading... 212 |
213 |
214 |
221 | {this.state.output} 222 |
223 |
224 | 225 |
226 | ); 227 | } 228 | } 229 | 230 | export default App; 231 | -------------------------------------------------------------------------------- /src/App.test.js: -------------------------------------------------------------------------------- 1 | import React from 'react'; 2 | import { render } from '@testing-library/react'; 3 | import App from './App'; 4 | 5 | test('renders learn react link', () => { 6 | const { getByText } = render(); 7 | const linkElement = getByText(/learn react/i); 8 | expect(linkElement).toBeInTheDocument(); 9 | }); 10 | -------------------------------------------------------------------------------- /src/index.css: -------------------------------------------------------------------------------- 1 | body { 2 | margin: 0; 3 | font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen', 4 | 'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue', 5 | sans-serif; 6 | -webkit-font-smoothing: antialiased; 7 | -moz-osx-font-smoothing: grayscale; 8 | } 9 | 10 | code { 11 | font-family: source-code-pro, Menlo, Monaco, Consolas, 'Courier New', 12 | monospace; 13 | } 14 | -------------------------------------------------------------------------------- /src/index.js: -------------------------------------------------------------------------------- 1 | import React from 'react'; 2 | import ReactDOM from 'react-dom'; 3 | import './index.css'; 4 | import App from './App'; 5 | import * as serviceWorker from './serviceWorker'; 6 | 7 | ReactDOM.render( 8 | 9 | 10 | , 11 | document.getElementById('root') 12 | ); 13 | 14 | // If you want your app to work offline and load faster, you can change 15 | // unregister() to register() below. Note this comes with some pitfalls. 16 | // Learn more about service workers: https://bit.ly/CRA-PWA 17 | serviceWorker.unregister(); 18 | -------------------------------------------------------------------------------- /src/logo.svg: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | -------------------------------------------------------------------------------- /src/serviceWorker.js: -------------------------------------------------------------------------------- 1 | // This optional code is used to register a service worker. 2 | // register() is not called by default. 3 | 4 | // This lets the app load faster on subsequent visits in production, and gives 5 | // it offline capabilities. However, it also means that developers (and users) 6 | // will only see deployed updates on subsequent visits to a page, after all the 7 | // existing tabs open on the page have been closed, since previously cached 8 | // resources are updated in the background. 9 | 10 | // To learn more about the benefits of this model and instructions on how to 11 | // opt-in, read https://bit.ly/CRA-PWA 12 | 13 | const isLocalhost = Boolean( 14 | window.location.hostname === 'localhost' || 15 | // [::1] is the IPv6 localhost address. 16 | window.location.hostname === '[::1]' || 17 | // 127.0.0.0/8 are considered localhost for IPv4. 18 | window.location.hostname.match( 19 | /^127(?:\.(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}$/ 20 | ) 21 | ); 22 | 23 | export function register(config) { 24 | if (process.env.NODE_ENV === 'production' && 'serviceWorker' in navigator) { 25 | // The URL constructor is available in all browsers that support SW. 
26 | const publicUrl = new URL(process.env.PUBLIC_URL, window.location.href); 27 | if (publicUrl.origin !== window.location.origin) { 28 | // Our service worker won't work if PUBLIC_URL is on a different origin 29 | // from what our page is served on. This might happen if a CDN is used to 30 | // serve assets; see https://github.com/facebook/create-react-app/issues/2374 31 | return; 32 | } 33 | 34 | window.addEventListener('load', () => { 35 | const swUrl = `${process.env.PUBLIC_URL}/service-worker.js`; 36 | 37 | if (isLocalhost) { 38 | // This is running on localhost. Let's check if a service worker still exists or not. 39 | checkValidServiceWorker(swUrl, config); 40 | 41 | // Add some additional logging to localhost, pointing developers to the 42 | // service worker/PWA documentation. 43 | navigator.serviceWorker.ready.then(() => { 44 | console.log( 45 | 'This web app is being served cache-first by a service ' + 46 | 'worker. To learn more, visit https://bit.ly/CRA-PWA' 47 | ); 48 | }); 49 | } else { 50 | // Is not localhost. Just register service worker 51 | registerValidSW(swUrl, config); 52 | } 53 | }); 54 | } 55 | } 56 | 57 | function registerValidSW(swUrl, config) { 58 | navigator.serviceWorker 59 | .register(swUrl) 60 | .then(registration => { 61 | registration.onupdatefound = () => { 62 | const installingWorker = registration.installing; 63 | if (installingWorker == null) { 64 | return; 65 | } 66 | installingWorker.onstatechange = () => { 67 | if (installingWorker.state === 'installed') { 68 | if (navigator.serviceWorker.controller) { 69 | // At this point, the updated precached content has been fetched, 70 | // but the previous service worker will still serve the older 71 | // content until all client tabs are closed. 72 | console.log( 73 | 'New content is available and will be used when all ' + 74 | 'tabs for this page are closed. See https://bit.ly/CRA-PWA.' 75 | ); 76 | 77 | // Execute callback 78 | if (config && config.onUpdate) { 79 | config.onUpdate(registration); 80 | } 81 | } else { 82 | // At this point, everything has been precached. 83 | // It's the perfect time to display a 84 | // "Content is cached for offline use." message. 85 | console.log('Content is cached for offline use.'); 86 | 87 | // Execute callback 88 | if (config && config.onSuccess) { 89 | config.onSuccess(registration); 90 | } 91 | } 92 | } 93 | }; 94 | }; 95 | }) 96 | .catch(error => { 97 | console.error('Error during service worker registration:', error); 98 | }); 99 | } 100 | 101 | function checkValidServiceWorker(swUrl, config) { 102 | // Check if the service worker can be found. If it can't reload the page. 103 | fetch(swUrl, { 104 | headers: { 'Service-Worker': 'script' }, 105 | }) 106 | .then(response => { 107 | // Ensure service worker exists, and that we really are getting a JS file. 108 | const contentType = response.headers.get('content-type'); 109 | if ( 110 | response.status === 404 || 111 | (contentType != null && contentType.indexOf('javascript') === -1) 112 | ) { 113 | // No service worker found. Probably a different app. Reload the page. 114 | navigator.serviceWorker.ready.then(registration => { 115 | registration.unregister().then(() => { 116 | window.location.reload(); 117 | }); 118 | }); 119 | } else { 120 | // Service worker found. Proceed as normal. 121 | registerValidSW(swUrl, config); 122 | } 123 | }) 124 | .catch(() => { 125 | console.log( 126 | 'No internet connection found. App is running in offline mode.' 
127 | ); 128 | }); 129 | } 130 | 131 | export function unregister() { 132 | if ('serviceWorker' in navigator) { 133 | navigator.serviceWorker.ready 134 | .then(registration => { 135 | registration.unregister(); 136 | }) 137 | .catch(error => { 138 | console.error(error.message); 139 | }); 140 | } 141 | } 142 | -------------------------------------------------------------------------------- /src/setupTests.js: -------------------------------------------------------------------------------- 1 | // jest-dom adds custom jest matchers for asserting on DOM nodes. 2 | // allows you to do things like: 3 | // expect(element).toHaveTextContent(/react/i) 4 | // learn more: https://github.com/testing-library/jest-dom 5 | import '@testing-library/jest-dom/extend-expect'; 6 | --------------------------------------------------------------------------------