├── .gitignore ├── README.md ├── backend ├── .env.example ├── Apple_90_5weeks.csv ├── Microsoft_30_5weeks.csv ├── Pipfile ├── Pipfile.lock ├── README.md ├── Tesla_30_5weeks.csv ├── Tesla_90_5weeks.csv ├── best_weights │ ├── AAPL_wsize30_sc_dict.p │ ├── AAPL_wsize60_sc_dict.p │ ├── AAPL_wsize90_sc_dict.p │ ├── AMZN_wsize30_sc_dict.p │ ├── AMZN_wsize60_sc_dict.p │ ├── AMZN_wsize90_sc_dict.p │ ├── GOOGL_wsize30_sc_dict.p │ ├── GOOGL_wsize60_sc_dict.p │ ├── GOOGL_wsize90_sc_dict.p │ ├── MSFT_wsize30_sc_dict.p │ ├── MSFT_wsize60_sc_dict.p │ ├── MSFT_wsize90_sc_dict.p │ ├── TSLA_wsize30_sc_dict.p │ ├── TSLA_wsize60_sc_dict.p │ ├── TSLA_wsize90_sc_dict.p │ ├── best_weights_AAPL_wsize30.hdf5 │ ├── best_weights_AAPL_wsize60.hdf5 │ ├── best_weights_AAPL_wsize90.hdf5 │ ├── best_weights_AMZN_wsize30.hdf5 │ ├── best_weights_AMZN_wsize60.hdf5 │ ├── best_weights_AMZN_wsize90.hdf5 │ ├── best_weights_GOOGL_wsize30.hdf5 │ ├── best_weights_GOOGL_wsize60.hdf5 │ ├── best_weights_GOOGL_wsize90.hdf5 │ ├── best_weights_MSFT_wsize30.hdf5 │ ├── best_weights_MSFT_wsize60.hdf5 │ ├── best_weights_MSFT_wsize90.hdf5 │ ├── best_weights_TSLA_wsize30.hdf5 │ ├── best_weights_TSLA_wsize60.hdf5 │ └── best_weights_TSLA_wsize90.hdf5 ├── data_proc.py ├── extrapolate_backend.py ├── extrapolate_predict_keras_old.py ├── finBert │ ├── model │ │ └── config.json │ └── sentiment.py ├── main_keras.py ├── models_keras.py ├── predict_keras.py ├── runs.sh ├── server.py ├── starter_code.py └── utils.py └── frontend ├── README.md ├── package.json ├── public ├── favicon.ico ├── index.html └── manifest.json ├── src ├── App.css ├── App.js ├── components │ ├── Financial │ │ ├── Financial.css │ │ ├── Financial.js │ │ └── combined1.csv │ ├── Misc │ │ └── Landing.js │ ├── Prediction │ │ ├── Prediction.css │ │ └── Prediction.js │ └── Sentiment │ │ ├── Sentiment.css │ │ └── Sentiment.js ├── index.css └── index.js └── yarn.lock /.gitignore: -------------------------------------------------------------------------------- 1 | # dependencies 2 | frontend/node_modules 3 | /.pnp 4 | .pnp.js 5 | backend/reddit 6 | backend/stock_data 7 | backend/twitter 8 | 9 | # testing 10 | /coverage 11 | 12 | # production 13 | /build 14 | 15 | # misc 16 | .DS_Store 17 | *.bin 18 | 19 | 20 | npm-debug.log* 21 | yarn-debug.log* 22 | yarn-error.log* 23 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Stock Visualization 2 | 3 | # React + D3 + Flask Project 4 | 5 | CSE 6242 project Stock price prediction tool 6 | 7 | React + D3 + Flask project. 8 | Note: you will need to download files from the provided links in order to make the code run. 9 | 10 | ![alt text](https://i.imgur.com/YuE6Vns.png) 11 | 12 | ## Description 13 | 14 | This package contains all the code necessary to run our Stock Visualization UI. It contains a backend written in Python Flask that contains our machine learning models and weights. It also contains a frontend written in React that has our visualizations. 15 | 16 | The backend contains our trained machine learning models which are LSTMs with different window sizes trained on financial and sentiment data for five stocks. We extract sentiment with the help of finBERT. 17 | 18 | The frontend contains visualizations that will help someone analyze stock price versus sentiment, predict future stock prices by giving future predictions of sentiment, and help determine sentiment over the past week on Twitter for a certain stock. 19 | 20 | ## Installation 21 | 22 | ### Backend 23 | 24 | 1. Download the pytorch_model.bin file from here 25 | https://drive.google.com/file/d/1BZjW13BIMty_WhPx7uabzC7Kp6wPGtpK/view?usp=sharing 26 | 27 | 2. Place the downloaded file in the backend/finBert/model/ directory (relatively backend/finBert/model). This model is around 400 MB. 28 | This directory should now have 2 files config.json, pytorch_model.bin 29 | 3. Download the archive.zip file from here 30 | https://drive.google.com/file/d/1ApRVntmOTog9XyfVavP7SuaFFlGOMUul/view?usp=sharing 31 | 32 | 4. Unzip the archive folder and move the 3 directories twitter, reddit, stock_data 33 | into the backend directory. These 3 directories are around 1.3 GB total. 34 | 35 | 5. The backend server is run using pipenv please ensure you have 36 | python 3.7 and pipenv on your computer 37 | https://www.python.org/downloads/release/python-370/ 38 | https://pypi.org/project/pipenv/ 39 | 40 | 6. Enter backend directory `cd backend` 41 | 7. Rename .env.example to .env `mv .env.example .env` 42 | 8. Run `pipenv --python 3.7` 43 | 9. Run `pipenv install` 44 | 10. Run `pipenv run dev` (Ignore any errors related to GPU) 45 | 11. Now the backend server is running on http://localhost:5000 46 | 47 | #### NOTE 48 | 49 | This repo uses a .env file in the backend folder which contains my credentials in order to use the Twitter API. 50 | These credentials will be valid until May 7, 2021. After this date you will need to apply for a twitter developer 51 | account, create a project, and generate your own credentials to store in the .env file. 52 | https://cran.r-project.org/web/packages/rtweet/vignettes/auth.html 53 | 54 | ### Frontend 55 | 56 | Please make sure you have Node.js with npm and yarn installed and it is updated to the most 57 | recent stable release. 58 | 59 | Open Terminal 60 | 61 | #### With yarn 62 | 63 | 1. Enter frontend directory `cd frontend` 64 | 2. Run `yarn install` 65 | 3. Run `yarn start` 66 | 4. It might take a couple of min to start, if the window does not open automatically navigate to localhost:3000. 67 | 68 | ## Execution 69 | 70 | The first graph is the Financial dashboard. Here the user can select a company and view detailed information about the stock price and sentiment on a particular day. They can see the breakdown of sentiments from tweets that day and can adjust the time period to get a detailed view. 71 | 72 | The second graph is the Prediction graph. Here the user can experiment with different sentiment related settings to analyze predictions about the future price of a stock. The user can fine tune sentiment on a week by week basis for a certain company and see our predicted average price for that week given the user inputs. The user can see a graph that shows our prediction model for the past year to inspire confidence in the model and can then see the prediction. This graph simulates the current date as being the start of January 2020. 73 | 74 | The third graph is the Recent Sentiment graph. This allows the user to enter any stock ticker and see how sentiment is about that ticker on Twitter over the past 7 days. This allows users to get an idea about how sentiment looks right now. This graph is limited because we are using the free version of twitter API which only returns 100 results per request and we are filtering by their definition of “popular” tweets which have a certain number of favorites. This is done to ensure we are not only analyzing 100 tweets from the current day with little to none activity. However, expanding this functionality with the correct Twitter API credentials (pro or educational account) would expand the number of results returned from these queries, but getting the sentiment using our model would take longer. 75 | 76 | #### Note to use the third graph please use UPPERCASE stock symbols with no special characters such as '$' 77 | 78 | ## Instructions to load data and running training data 79 | 80 | All the data will now be at this link: https://drive.google.com/drive/folders/1RQlCXTDjg-_fbt9_nhTIWSsGnP4_sh4K?usp=sharing 81 | 82 | To get the data: 83 | 84 | 1. Download the twitter(`company_tweets_2015_to_2019.csv`, e.g. `TSLA_tweets_2015_to_2019.csv`) and reddit data(`company_reddit_2015_to_2019.csv`, e.g. `TSLA_reddit_2015_to_2019.csv`) from google drive and put it in the same path as starter_code. 85 | 2. Run `starter_code.py company` or any other company (should enter the stock name). It creates two directories (`twitter/company/` and `reddit/company/`) and adds the data for each year to those directories. 86 | 3. Run `load_data.py` with the following arguments: `load_data(stock_name, stock_path, reddit_path, twitter_path, year_list, return_columns_list)` 87 | 88 | - For example, for getting data of TSLA for years 2015 to 1018 and the desired outputs listed below, run `load_data("TSLA", "stock_data/tesla/", "reddit/TSLA/", "twitter/TSLA/", [2015, 2016, 2017, 2018], ['Close','positive_reddit','negative_reddit', 'neutral_reddit', 'count_reddit','comment_num_reddit', 'positive_twitter', 'negative_twitter','neutral_twitter', 'count_twitter', 'comment_num_twitter','retweet_num_twitter', 'like_num_twitter'])` 89 | - If you want all the columns in the returned array, pass an empty list `[]` as `return_columns_list` in `load_data()`. 90 | 91 | After setting up the data and you know the stock data path, reddit data path and twitter data path for the companies you want to train models on: 92 | 93 | 1. Use the training command: `python3 main_keras.py --window_size 90 --company TSLA --stockdir stock_data/tesla/ --redditdir reddit/TSLA/ --twitterdir twitter/TSLA/ --trainyears 2015 2016 2017 2018 --testyears 2019` . This example trains a model on TESLA with window size 90 for 2015 - 2018 and evaluates on 2019. 94 | 2. Use the prediction command: `python3 predict_keras.py --window_size 90 --company TSLA --modelpath weights/best_weights_TSLA_wsize90.hdf5 --scpath weights/TSLA_wsize90_sc_dict.p --stockdir stock_data/tesla/ --redditdir reddit/TSLA/ --twitterdir twitter/TSLA/ --years 2019` . Make sure to give the right model path and scaler path for the arguments provided. 95 | 3. For predicting into the future, you can use the extrapolate_backend script: `python3 extrapolate_backend.py --window_size 90 --company TSLA --modelpath weights/best_weights_TSLA_wsize90.hdf5 --scpath weights/TSLA_wsize90_sc_dict.p --fdpath Tesla_5_weeks.csv --stockdir stock_data/tesla/ --redditdir reddit/TSLA/ --twitterdir twitter/TSLA/ --years 2019` 96 | 97 | #### Note 98 | 99 | Our D3 visualizations uses code from 100 | https://bl.ocks.org/mbostock/34f08d5e11952a80609169b7917d4172 101 | https://gist.github.com/EfratVil/92f894ac0ba265192411e73f633a3e2f 102 | http://bl.ocks.org/Potherca/b9f8b3d0a24e0b20f16d 103 | http://bl.ocks.org/williaster/10ef968ccfdc71c30ef8 104 | https://github.com/arnauddri/d3-stock 105 | https://www.d3-graph-gallery.com/graph/line_basic.html 106 | https://bl.ocks.org/ProQuestionAsker/8382f70af7f4a7355827c6dc4ee8817d 107 | -------------------------------------------------------------------------------- /backend/.env.example: -------------------------------------------------------------------------------- 1 | consumer_key= 'LYdupuQBh5JIc4jCr32frQoKQ' 2 | consumer_secret= '2PKkzIFBE16W1OaCQweQqBdoAH7xe1LhAuwWhod762w8pFKYub' 3 | access_token= '2410009376-kCT02w7N2Tzarqwcqj65wNrJYXWsPRUfT3LabuT' 4 | access_token_secret= 'ZVMZAw6jJCNB7Y9LAO37IPN72apZPMFxjJe9aaKB61o5L' 5 | TOKENIZERS_PARALLELISM=(true | false) -------------------------------------------------------------------------------- /backend/Apple_90_5weeks.csv: -------------------------------------------------------------------------------- 1 | Week,Reddit Sentiment,Number of Reddit Posts,Twitter Sentiment,Number of Tweets,Tweet Activity 2 | 1,0.5,0.5,1,0.5,0.5 3 | 2,0.5,1,0.5,0.5,0.5 4 | 3,0.5,0.5,1,0.5,0.5 5 | 4,0.5,0.5,0.5,1,1 6 | 5,1,1,0.5,0.5,0.5 7 | -------------------------------------------------------------------------------- /backend/Microsoft_30_5weeks.csv: -------------------------------------------------------------------------------- 1 | Week,Reddit Sentiment,Number of Reddit Posts,Twitter Sentiment,Number of Tweets,Tweet Activity 2 | 1,0.5,0.5,0.5,0.5,0.5 3 | 2,0.5,0.5,0.5,0.5,0.5 4 | 3,0.5,0.5,0.5,0.5,0.5 5 | 4,0.5,0.5,0.5,0.5,0.5 6 | 5,0.5,0.5,0.5,0.5,0.5 7 | -------------------------------------------------------------------------------- /backend/Pipfile: -------------------------------------------------------------------------------- 1 | [[source]] 2 | url = "https://pypi.org/simple" 3 | verify_ssl = true 4 | name = "pypi" 5 | 6 | [packages] 7 | flask = "*" 8 | flask-cors = "*" 9 | tweepy = "*" 10 | pandas = "*" 11 | python-dotenv = "*" 12 | nltk = "*" 13 | transformers = "*" 14 | torch = "*" 15 | matplotlib = "*" 16 | keras = "*" 17 | numpy = "*" 18 | tensorflow = "*" 19 | scikit-learn = "*" 20 | 21 | [dev-packages] 22 | 23 | [requires] 24 | python_version = "3.7" 25 | 26 | [scripts] 27 | dev = "python server.py" 28 | -------------------------------------------------------------------------------- /backend/Pipfile.lock: -------------------------------------------------------------------------------- 1 | { 2 | "_meta": { 3 | "hash": { 4 | "sha256": "9be6c6c709af891552639d272071fe3191ef893671b5044f793e8aeeebd8b8a3" 5 | }, 6 | "pipfile-spec": 6, 7 | "requires": { 8 | "python_version": "3.7" 9 | }, 10 | "sources": [ 11 | { 12 | "name": "pypi", 13 | "url": "https://pypi.org/simple", 14 | "verify_ssl": true 15 | } 16 | ] 17 | }, 18 | "default": { 19 | "absl-py": { 20 | "hashes": [ 21 | "sha256:afe94e3c751ff81aad55d33ab6e630390da32780110b5af72ae81ecff8418d9e", 22 | "sha256:b44f68984a5ceb2607d135a615999b93924c771238a63920d17d3387b0d229d5" 23 | ], 24 | "version": "==0.12.0" 25 | }, 26 | "astunparse": { 27 | "hashes": [ 28 | "sha256:5ad93a8456f0d084c3456d059fd9a92cce667963232cbf763eac3bc5b7940872", 29 | "sha256:c2652417f2c8b5bb325c885ae329bdf3f86424075c4fd1a128674bc6fba4b8e8" 30 | ], 31 | "version": "==1.6.3" 32 | }, 33 | "cachetools": { 34 | "hashes": [ 35 | "sha256:1d9d5f567be80f7c07d765e21b814326d78c61eb0c3a637dffc0e5d1796cb2e2", 36 | "sha256:f469e29e7aa4cff64d8de4aad95ce76de8ea1125a16c68e0d93f65c3c3dc92e9" 37 | ], 38 | "markers": "python_version ~= '3.5'", 39 | "version": "==4.2.1" 40 | }, 41 | "certifi": { 42 | "hashes": [ 43 | "sha256:1a4995114262bffbc2413b159f2a1a480c969de6e6eb13ee966d470af86af59c", 44 | "sha256:719a74fb9e33b9bd44cc7f3a8d94bc35e4049deebe19ba7d8e108280cfd59830" 45 | ], 46 | "version": "==2020.12.5" 47 | }, 48 | "chardet": { 49 | "hashes": [ 50 | "sha256:0d6f53a15db4120f2b08c94f11e7d93d2c911ee118b6b30a04ec3ee8310179fa", 51 | "sha256:f864054d66fd9118f2e67044ac8981a54775ec5b67aed0441892edb553d21da5" 52 | ], 53 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", 54 | "version": "==4.0.0" 55 | }, 56 | "click": { 57 | "hashes": [ 58 | "sha256:d2b5255c7c6349bc1bd1e59e08cd12acbbd63ce649f2588755783aa94dfb6b1a", 59 | "sha256:dacca89f4bfadd5de3d7489b7c8a566eee0d3676333fbb50030263894c38c0dc" 60 | ], 61 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", 62 | "version": "==7.1.2" 63 | }, 64 | "cycler": { 65 | "hashes": [ 66 | "sha256:1d8a5ae1ff6c5cf9b93e8811e581232ad8920aeec647c37316ceac982b08cb2d", 67 | "sha256:cd7b2d1018258d7247a71425e9f26463dfb444d411c39569972f4ce586b0c9d8" 68 | ], 69 | "version": "==0.10.0" 70 | }, 71 | "filelock": { 72 | "hashes": [ 73 | "sha256:18d82244ee114f543149c66a6e0c14e9c4f8a1044b5cdaadd0f82159d6a6ff59", 74 | "sha256:929b7d63ec5b7d6b71b0fa5ac14e030b3f70b75747cef1b10da9b879fef15836" 75 | ], 76 | "version": "==3.0.12" 77 | }, 78 | "flask": { 79 | "hashes": [ 80 | "sha256:4efa1ae2d7c9865af48986de8aeb8504bf32c7f3d6fdc9353d34b21f4b127060", 81 | "sha256:8a4fdd8936eba2512e9c85df320a37e694c93945b33ef33c89946a340a238557" 82 | ], 83 | "index": "pypi", 84 | "version": "==1.1.2" 85 | }, 86 | "flask-cors": { 87 | "hashes": [ 88 | "sha256:74efc975af1194fc7891ff5cd85b0f7478be4f7f59fe158102e91abb72bb4438", 89 | "sha256:b60839393f3b84a0f3746f6cdca56c1ad7426aa738b70d6c61375857823181de" 90 | ], 91 | "index": "pypi", 92 | "version": "==3.0.10" 93 | }, 94 | "flatbuffers": { 95 | "hashes": [ 96 | "sha256:63bb9a722d5e373701913e226135b28a6f6ac200d5cc7b4d919fa38d73b44610", 97 | "sha256:9e9ef47fa92625c4721036e7c4124182668dc6021d9e7c73704edd395648deb9" 98 | ], 99 | "version": "==1.12" 100 | }, 101 | "gast": { 102 | "hashes": [ 103 | "sha256:8f46f5be57ae6889a4e16e2ca113b1703ef17f2b0abceb83793eaba9e1351a45", 104 | "sha256:b881ef288a49aa81440d2c5eb8aeefd4c2bb8993d5f50edae7413a85bfdb3b57" 105 | ], 106 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 107 | "version": "==0.3.3" 108 | }, 109 | "google-auth": { 110 | "hashes": [ 111 | "sha256:010f011c4e27d3d5eb01106fba6aac39d164842dfcd8709955c4638f5b11ccf8", 112 | "sha256:f30a672a64d91cc2e3137765d088c5deec26416246f7a9e956eaf69a8d7ed49c" 113 | ], 114 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'", 115 | "version": "==1.29.0" 116 | }, 117 | "google-auth-oauthlib": { 118 | "hashes": [ 119 | "sha256:09832c6e75032f93818edf1affe4746121d640c625a5bef9b5c96af676e98eee", 120 | "sha256:0e92aacacfb94978de3b7972cf4b0f204c3cd206f74ddd0dc0b31e91164e6317" 121 | ], 122 | "markers": "python_version >= '3.6'", 123 | "version": "==0.4.4" 124 | }, 125 | "google-pasta": { 126 | "hashes": [ 127 | "sha256:4612951da876b1a10fe3960d7226f0c7682cf901e16ac06e473b267a5afa8954", 128 | "sha256:b32482794a366b5366a32c92a9a9201b107821889935a02b3e51f6b432ea84ed", 129 | "sha256:c9f2c8dfc8f96d0d5808299920721be30c9eec37f2389f28904f454565c8a16e" 130 | ], 131 | "version": "==0.2.0" 132 | }, 133 | "grpcio": { 134 | "hashes": [ 135 | "sha256:01d3046fe980be25796d368f8fc5ff34b7cf5e1444f3789a017a7fe794465639", 136 | "sha256:07b430fa68e5eecd78e2ad529ab80f6a234b55fc1b675fe47335ccbf64c6c6c8", 137 | "sha256:0e3edd8cdb71809d2455b9dbff66b4dd3d36c321e64bfa047da5afdfb0db332b", 138 | "sha256:0f3f09269ffd3fded430cd89ba2397eabbf7e47be93983b25c187cdfebb302a7", 139 | "sha256:1376a60f9bfce781b39973f100b5f67e657b5be479f2fd8a7d2a408fc61c085c", 140 | "sha256:14c0f017bfebbc18139551111ac58ecbde11f4bc375b73a53af38927d60308b6", 141 | "sha256:182c64ade34c341398bf71ec0975613970feb175090760ab4f51d1e9a5424f05", 142 | "sha256:1ada89326a364a299527c7962e5c362dbae58c67b283fe8383c4d952b26565d5", 143 | "sha256:1ce6f5ff4f4a548c502d5237a071fa617115df58ea4b7bd41dac77c1ab126e9c", 144 | "sha256:1d384a61f96a1fc6d5d3e0b62b0a859abc8d4c3f6d16daba51ebf253a3e7df5d", 145 | "sha256:25959a651420dd4a6fd7d3e8dee53f4f5fd8c56336a64963428e78b276389a59", 146 | "sha256:28677f057e2ef11501860a7bc15de12091d40b95dd0fddab3c37ff1542e6b216", 147 | "sha256:378fe80ec5d9353548eb2a8a43ea03747a80f2e387c4f177f2b3ff6c7d898753", 148 | "sha256:3afb058b6929eba07dba9ae6c5b555aa1d88cb140187d78cc510bd72d0329f28", 149 | "sha256:4396b1d0f388ae875eaf6dc05cdcb612c950fd9355bc34d38b90aaa0665a0d4b", 150 | "sha256:4775bc35af9cd3b5033700388deac2e1d611fa45f4a8dcb93667d94cb25f0444", 151 | "sha256:5bddf9d53c8df70061916c3bfd2f468ccf26c348bb0fb6211531d895ed5e4c72", 152 | "sha256:6d869a3e8e62562b48214de95e9231c97c53caa7172802236cd5d60140d7cddd", 153 | "sha256:6f7947dad606c509d067e5b91a92b250aa0530162ab99e4737090f6b17eb12c4", 154 | "sha256:7cda998b7b551503beefc38db9be18c878cfb1596e1418647687575cdefa9273", 155 | "sha256:99bac0e2c820bf446662365df65841f0c2a55b0e2c419db86eaf5d162ddae73e", 156 | "sha256:9c0d8f2346c842088b8cbe3e14985b36e5191a34bf79279ba321a4bf69bd88b7", 157 | "sha256:a8004b34f600a8a51785e46859cd88f3386ef67cccd1cfc7598e3d317608c643", 158 | "sha256:ac7028d363d2395f3d755166d0161556a3f99500a5b44890421ccfaaf2aaeb08", 159 | "sha256:be98e3198ec765d0a1e27f69d760f69374ded8a33b953dcfe790127731f7e690", 160 | "sha256:c31e8a219650ddae1cd02f5a169e1bffe66a429a8255d3ab29e9363c73003b62", 161 | "sha256:c4966d746dccb639ef93f13560acbe9630681c07f2b320b7ec03fe2c8f0a1f15", 162 | "sha256:c58825a3d8634cd634d8f869afddd4d5742bdb59d594aea4cea17b8f39269a55", 163 | "sha256:ce617e1c4a39131f8527964ac9e700eb199484937d7a0b3e52655a3ba50d5fb9", 164 | "sha256:e28e4c0d4231beda5dee94808e3a224d85cbaba3cfad05f2192e6f4ec5318053", 165 | "sha256:e467af6bb8f5843f5a441e124b43474715cfb3981264e7cd227343e826dcc3ce", 166 | "sha256:e6786f6f7be0937614577edcab886ddce91b7c1ea972a07ef9972e9f9ecbbb78", 167 | "sha256:e811ce5c387256609d56559d944a974cc6934a8eea8c76e7c86ec388dc06192d", 168 | "sha256:ec10d5f680b8e95a06f1367d73c5ddcc0ed04a3f38d6e4c9346988fb0cea2ffa", 169 | "sha256:ef9bd7fdfc0a063b4ed0efcab7906df5cae9bbcf79d05c583daa2eba56752b00", 170 | "sha256:f03dfefa9075dd1c6c5cc27b1285c521434643b09338d8b29e1d6a27b386aa82", 171 | "sha256:f12900be4c3fd2145ba94ab0d80b7c3d71c9e6414cfee2f31b1c20188b5c281f", 172 | "sha256:f53f2dfc8ff9a58a993e414a016c8b21af333955ae83960454ad91798d467c7b", 173 | "sha256:f7d508691301027033215d3662dab7e178f54d5cca2329f26a71ae175d94b83f" 174 | ], 175 | "version": "==1.32.0" 176 | }, 177 | "h5py": { 178 | "hashes": [ 179 | "sha256:063947eaed5f271679ed4ffa36bb96f57bc14f44dd4336a827d9a02702e6ce6b", 180 | "sha256:13c87efa24768a5e24e360a40e0bc4c49bcb7ce1bb13a3a7f9902cec302ccd36", 181 | "sha256:16ead3c57141101e3296ebeed79c9c143c32bdd0e82a61a2fc67e8e6d493e9d1", 182 | "sha256:3dad1730b6470fad853ef56d755d06bb916ee68a3d8272b3bab0c1ddf83bb99e", 183 | "sha256:51ae56894c6c93159086ffa2c94b5b3388c0400548ab26555c143e7cfa05b8e5", 184 | "sha256:54817b696e87eb9e403e42643305f142cd8b940fe9b3b490bbf98c3b8a894cf4", 185 | "sha256:549ad124df27c056b2e255ea1c44d30fb7a17d17676d03096ad5cd85edb32dc1", 186 | "sha256:64f74da4a1dd0d2042e7d04cf8294e04ddad686f8eba9bb79e517ae582f6668d", 187 | "sha256:6998be619c695910cb0effe5eb15d3a511d3d1a5d217d4bd0bebad1151ec2262", 188 | "sha256:6ef7ab1089e3ef53ca099038f3c0a94d03e3560e6aff0e9d6c64c55fb13fc681", 189 | "sha256:769e141512b54dee14ec76ed354fcacfc7d97fea5a7646b709f7400cf1838630", 190 | "sha256:79b23f47c6524d61f899254f5cd5e486e19868f1823298bc0c29d345c2447172", 191 | "sha256:7be5754a159236e95bd196419485343e2b5875e806fe68919e087b6351f40a70", 192 | "sha256:84412798925dc870ffd7107f045d7659e60f5d46d1c70c700375248bf6bf512d", 193 | "sha256:86868dc07b9cc8cb7627372a2e6636cdc7a53b7e2854ad020c9e9d8a4d3fd0f5", 194 | "sha256:8bb1d2de101f39743f91512a9750fb6c351c032e5cd3204b4487383e34da7f75", 195 | "sha256:a5f82cd4938ff8761d9760af3274acf55afc3c91c649c50ab18fcff5510a14a5", 196 | "sha256:aac4b57097ac29089f179bbc2a6e14102dd210618e94d77ee4831c65f82f17c0", 197 | "sha256:bffbc48331b4a801d2f4b7dac8a72609f0b10e6e516e5c480a3e3241e091c878", 198 | "sha256:c0d4b04bbf96c47b6d360cd06939e72def512b20a18a8547fa4af810258355d5", 199 | "sha256:c54a2c0dd4957776ace7f95879d81582298c5daf89e77fb8bee7378f132951de", 200 | "sha256:cbf28ae4b5af0f05aa6e7551cee304f1d317dbed1eb7ac1d827cee2f1ef97a99", 201 | "sha256:d35f7a3a6cefec82bfdad2785e78359a0e6a5fbb3f605dd5623ce88082ccd681", 202 | "sha256:d3c59549f90a891691991c17f8e58c8544060fdf3ccdea267100fa5f561ff62f", 203 | "sha256:d7ae7a0576b06cb8e8a1c265a8bc4b73d05fdee6429bffc9a26a6eb531e79d72", 204 | "sha256:ecf4d0b56ee394a0984de15bceeb97cbe1fe485f1ac205121293fc44dcf3f31f", 205 | "sha256:f0e25bb91e7a02efccb50aba6591d3fe2c725479e34769802fcdd4076abfa917", 206 | "sha256:f23951a53d18398ef1344c186fb04b26163ca6ce449ebd23404b153fd111ded9", 207 | "sha256:ff7d241f866b718e4584fa95f520cb19405220c501bd3a53ee11871ba5166ea2" 208 | ], 209 | "version": "==2.10.0" 210 | }, 211 | "idna": { 212 | "hashes": [ 213 | "sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6", 214 | "sha256:b97d804b1e9b523befed77c48dacec60e6dcb0b5391d57af6a65a312a90648c0" 215 | ], 216 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 217 | "version": "==2.10" 218 | }, 219 | "importlib-metadata": { 220 | "hashes": [ 221 | "sha256:8c501196e49fb9df5df43833bdb1e4328f64847763ec8a50703148b73784d581", 222 | "sha256:d7eb1dea6d6a6086f8be21784cc9e3bcfa55872b52309bc5fad53a8ea444465d" 223 | ], 224 | "markers": "python_version < '3.8' and python_version < '3.8'", 225 | "version": "==4.0.1" 226 | }, 227 | "itsdangerous": { 228 | "hashes": [ 229 | "sha256:321b033d07f2a4136d3ec762eac9f16a10ccd60f53c0c91af90217ace7ba1f19", 230 | "sha256:b12271b2047cb23eeb98c8b5622e2e5c5e9abd9784a153e9d8ef9cb4dd09d749" 231 | ], 232 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 233 | "version": "==1.1.0" 234 | }, 235 | "jinja2": { 236 | "hashes": [ 237 | "sha256:03e47ad063331dd6a3f04a43eddca8a966a26ba0c5b7207a9a9e4e08f1b29419", 238 | "sha256:a6d58433de0ae800347cab1fa3043cebbabe8baa9d29e668f1c768cb87a333c6" 239 | ], 240 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", 241 | "version": "==2.11.3" 242 | }, 243 | "joblib": { 244 | "hashes": [ 245 | "sha256:9c17567692206d2f3fb9ecf5e991084254fe631665c450b443761c4186a613f7", 246 | "sha256:feeb1ec69c4d45129954f1b7034954241eedfd6ba39b5e9e4b6883be3332d5e5" 247 | ], 248 | "markers": "python_version >= '3.6'", 249 | "version": "==1.0.1" 250 | }, 251 | "keras": { 252 | "hashes": [ 253 | "sha256:05e2faf6885f7899482a7d18fc00ba9655fe2c9296a35ad96949a07a9c27d1bb", 254 | "sha256:fedd729b52572fb108a98e3d97e1bac10a81d3917d2103cc20ab2a5f03beb973" 255 | ], 256 | "index": "pypi", 257 | "version": "==2.4.3" 258 | }, 259 | "keras-preprocessing": { 260 | "hashes": [ 261 | "sha256:7b82029b130ff61cc99b55f3bd27427df4838576838c5b2f65940e4fcec99a7b", 262 | "sha256:add82567c50c8bc648c14195bf544a5ce7c1f76761536956c3d2978970179ef3" 263 | ], 264 | "version": "==1.1.2" 265 | }, 266 | "kiwisolver": { 267 | "hashes": [ 268 | "sha256:0cd53f403202159b44528498de18f9285b04482bab2a6fc3f5dd8dbb9352e30d", 269 | "sha256:1e1bc12fb773a7b2ffdeb8380609f4f8064777877b2225dec3da711b421fda31", 270 | "sha256:225e2e18f271e0ed8157d7f4518ffbf99b9450fca398d561eb5c4a87d0986dd9", 271 | "sha256:232c9e11fd7ac3a470d65cd67e4359eee155ec57e822e5220322d7b2ac84fbf0", 272 | "sha256:31dfd2ac56edc0ff9ac295193eeaea1c0c923c0355bf948fbd99ed6018010b72", 273 | "sha256:33449715e0101e4d34f64990352bce4095c8bf13bed1b390773fc0a7295967b3", 274 | "sha256:401a2e9afa8588589775fe34fc22d918ae839aaaf0c0e96441c0fdbce6d8ebe6", 275 | "sha256:44a62e24d9b01ba94ae7a4a6c3fb215dc4af1dde817e7498d901e229aaf50e4e", 276 | "sha256:50af681a36b2a1dee1d3c169ade9fdc59207d3c31e522519181e12f1b3ba7000", 277 | "sha256:563c649cfdef27d081c84e72a03b48ea9408c16657500c312575ae9d9f7bc1c3", 278 | "sha256:5989db3b3b34b76c09253deeaf7fbc2707616f130e166996606c284395da3f18", 279 | "sha256:5a7a7dbff17e66fac9142ae2ecafb719393aaee6a3768c9de2fd425c63b53e21", 280 | "sha256:5c3e6455341008a054cccee8c5d24481bcfe1acdbc9add30aa95798e95c65621", 281 | "sha256:5f6ccd3dd0b9739edcf407514016108e2280769c73a85b9e59aa390046dbf08b", 282 | "sha256:72c99e39d005b793fb7d3d4e660aed6b6281b502e8c1eaf8ee8346023c8e03bc", 283 | "sha256:78751b33595f7f9511952e7e60ce858c6d64db2e062afb325985ddbd34b5c131", 284 | "sha256:834ee27348c4aefc20b479335fd422a2c69db55f7d9ab61721ac8cd83eb78882", 285 | "sha256:8be8d84b7d4f2ba4ffff3665bcd0211318aa632395a1a41553250484a871d454", 286 | "sha256:950a199911a8d94683a6b10321f9345d5a3a8433ec58b217ace979e18f16e248", 287 | "sha256:a357fd4f15ee49b4a98b44ec23a34a95f1e00292a139d6015c11f55774ef10de", 288 | "sha256:a53d27d0c2a0ebd07e395e56a1fbdf75ffedc4a05943daf472af163413ce9598", 289 | "sha256:acef3d59d47dd85ecf909c359d0fd2c81ed33bdff70216d3956b463e12c38a54", 290 | "sha256:b38694dcdac990a743aa654037ff1188c7a9801ac3ccc548d3341014bc5ca278", 291 | "sha256:b9edd0110a77fc321ab090aaa1cfcaba1d8499850a12848b81be2222eab648f6", 292 | "sha256:c08e95114951dc2090c4a630c2385bef681cacf12636fb0241accdc6b303fd81", 293 | "sha256:c5518d51a0735b1e6cee1fdce66359f8d2b59c3ca85dc2b0813a8aa86818a030", 294 | "sha256:c8fd0f1ae9d92b42854b2979024d7597685ce4ada367172ed7c09edf2cef9cb8", 295 | "sha256:ca3820eb7f7faf7f0aa88de0e54681bddcb46e485beb844fcecbcd1c8bd01689", 296 | "sha256:cf8b574c7b9aa060c62116d4181f3a1a4e821b2ec5cbfe3775809474113748d4", 297 | "sha256:d3155d828dec1d43283bd24d3d3e0d9c7c350cdfcc0bd06c0ad1209c1bbc36d0", 298 | "sha256:f8d6f8db88049a699817fd9178782867bf22283e3813064302ac59f61d95be05", 299 | "sha256:fd34fbbfbc40628200730bc1febe30631347103fc8d3d4fa012c21ab9c11eca9" 300 | ], 301 | "markers": "python_version >= '3.6'", 302 | "version": "==1.3.1" 303 | }, 304 | "markdown": { 305 | "hashes": [ 306 | "sha256:31b5b491868dcc87d6c24b7e3d19a0d730d59d3e46f4eea6430a321bed387a49", 307 | "sha256:96c3ba1261de2f7547b46a00ea8463832c921d3f9d6aba3f255a6f71386db20c" 308 | ], 309 | "markers": "python_version >= '3.6'", 310 | "version": "==3.3.4" 311 | }, 312 | "markupsafe": { 313 | "hashes": [ 314 | "sha256:00bc623926325b26bb9605ae9eae8a215691f33cae5df11ca5424f06f2d1f473", 315 | "sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161", 316 | "sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235", 317 | "sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5", 318 | "sha256:13d3144e1e340870b25e7b10b98d779608c02016d5184cfb9927a9f10c689f42", 319 | "sha256:195d7d2c4fbb0ee8139a6cf67194f3973a6b3042d742ebe0a9ed36d8b6f0c07f", 320 | "sha256:22c178a091fc6630d0d045bdb5992d2dfe14e3259760e713c490da5323866c39", 321 | "sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff", 322 | "sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b", 323 | "sha256:2beec1e0de6924ea551859edb9e7679da6e4870d32cb766240ce17e0a0ba2014", 324 | "sha256:3b8a6499709d29c2e2399569d96719a1b21dcd94410a586a18526b143ec8470f", 325 | "sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1", 326 | "sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e", 327 | "sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183", 328 | "sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66", 329 | "sha256:596510de112c685489095da617b5bcbbac7dd6384aeebeda4df6025d0256a81b", 330 | "sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1", 331 | "sha256:6788b695d50a51edb699cb55e35487e430fa21f1ed838122d722e0ff0ac5ba15", 332 | "sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1", 333 | "sha256:6f1e273a344928347c1290119b493a1f0303c52f5a5eae5f16d74f48c15d4a85", 334 | "sha256:6fffc775d90dcc9aed1b89219549b329a9250d918fd0b8fa8d93d154918422e1", 335 | "sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e", 336 | "sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b", 337 | "sha256:7c1699dfe0cf8ff607dbdcc1e9b9af1755371f92a68f706051cc8c37d447c905", 338 | "sha256:7fed13866cf14bba33e7176717346713881f56d9d2bcebab207f7a036f41b850", 339 | "sha256:84dee80c15f1b560d55bcfe6d47b27d070b4681c699c572af2e3c7cc90a3b8e0", 340 | "sha256:88e5fcfb52ee7b911e8bb6d6aa2fd21fbecc674eadd44118a9cc3863f938e735", 341 | "sha256:8defac2f2ccd6805ebf65f5eeb132adcf2ab57aa11fdf4c0dd5169a004710e7d", 342 | "sha256:98bae9582248d6cf62321dcb52aaf5d9adf0bad3b40582925ef7c7f0ed85fceb", 343 | "sha256:98c7086708b163d425c67c7a91bad6e466bb99d797aa64f965e9d25c12111a5e", 344 | "sha256:9add70b36c5666a2ed02b43b335fe19002ee5235efd4b8a89bfcf9005bebac0d", 345 | "sha256:9bf40443012702a1d2070043cb6291650a0841ece432556f784f004937f0f32c", 346 | "sha256:a6a744282b7718a2a62d2ed9d993cad6f5f585605ad352c11de459f4108df0a1", 347 | "sha256:acf08ac40292838b3cbbb06cfe9b2cb9ec78fce8baca31ddb87aaac2e2dc3bc2", 348 | "sha256:ade5e387d2ad0d7ebf59146cc00c8044acbd863725f887353a10df825fc8ae21", 349 | "sha256:b00c1de48212e4cc9603895652c5c410df699856a2853135b3967591e4beebc2", 350 | "sha256:b1282f8c00509d99fef04d8ba936b156d419be841854fe901d8ae224c59f0be5", 351 | "sha256:b1dba4527182c95a0db8b6060cc98ac49b9e2f5e64320e2b56e47cb2831978c7", 352 | "sha256:b2051432115498d3562c084a49bba65d97cf251f5a331c64a12ee7e04dacc51b", 353 | "sha256:b7d644ddb4dbd407d31ffb699f1d140bc35478da613b441c582aeb7c43838dd8", 354 | "sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6", 355 | "sha256:bf5aa3cbcfdf57fa2ee9cd1822c862ef23037f5c832ad09cfea57fa846dec193", 356 | "sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f", 357 | "sha256:caabedc8323f1e93231b52fc32bdcde6db817623d33e100708d9a68e1f53b26b", 358 | "sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f", 359 | "sha256:cdb132fc825c38e1aeec2c8aa9338310d29d337bebbd7baa06889d09a60a1fa2", 360 | "sha256:d53bc011414228441014aa71dbec320c66468c1030aae3a6e29778a3382d96e5", 361 | "sha256:d73a845f227b0bfe8a7455ee623525ee656a9e2e749e4742706d80a6065d5e2c", 362 | "sha256:d9be0ba6c527163cbed5e0857c451fcd092ce83947944d6c14bc95441203f032", 363 | "sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7", 364 | "sha256:e8313f01ba26fbbe36c7be1966a7b7424942f670f38e666995b88d012765b9be", 365 | "sha256:feb7b34d6325451ef96bc0e36e1a6c0c1c64bc1fbec4b854f4529e51887b1621" 366 | ], 367 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 368 | "version": "==1.1.1" 369 | }, 370 | "matplotlib": { 371 | "hashes": [ 372 | "sha256:1f83a32e4b6045191f9d34e4dc68c0a17c870b57ef9cca518e516da591246e79", 373 | "sha256:2eee37340ca1b353e0a43a33da79d0cd4bcb087064a0c3c3d1329cdea8fbc6f3", 374 | "sha256:53ceb12ef44f8982b45adc7a0889a7e2df1d758e8b360f460e435abe8a8cd658", 375 | "sha256:574306171b84cd6854c83dc87bc353cacc0f60184149fb00c9ea871eca8c1ecb", 376 | "sha256:7561fd541477d41f3aa09457c434dd1f7604f3bd26d7858d52018f5dfe1c06d1", 377 | "sha256:7a54efd6fcad9cb3cd5ef2064b5a3eeb0b63c99f26c346bdcf66e7c98294d7cc", 378 | "sha256:7f16660edf9a8bcc0f766f51c9e1b9d2dc6ceff6bf636d2dbd8eb925d5832dfd", 379 | "sha256:81e6fe8b18ef5be67f40a1d4f07d5a4ed21d3878530193898449ddef7793952f", 380 | "sha256:84a10e462120aa7d9eb6186b50917ed5a6286ee61157bfc17c5b47987d1a9068", 381 | "sha256:84d4c4f650f356678a5d658a43ca21a41fca13f9b8b00169c0b76e6a6a948908", 382 | "sha256:86dc94e44403fa0f2b1dd76c9794d66a34e821361962fe7c4e078746362e3b14", 383 | "sha256:90dbc007f6389bcfd9ef4fe5d4c78c8d2efe4e0ebefd48b4f221cdfed5672be2", 384 | "sha256:9f374961a3996c2d1b41ba3145462c3708a89759e604112073ed6c8bdf9f622f", 385 | "sha256:a18cc1ab4a35b845cf33b7880c979f5c609fd26c2d6e74ddfacb73dcc60dd956", 386 | "sha256:a97781453ac79409ddf455fccf344860719d95142f9c334f2a8f3fff049ffec3", 387 | "sha256:a989022f89cda417f82dbf65e0a830832afd8af743d05d1414fb49549287ff04", 388 | "sha256:ac2a30a09984c2719f112a574b6543ccb82d020fd1b23b4d55bf4759ba8dd8f5", 389 | "sha256:be4430b33b25e127fc4ea239cc386389de420be4d63e71d5359c20b562951ce1", 390 | "sha256:c45e7bf89ea33a2adaef34774df4e692c7436a18a48bcb0e47a53e698a39fa39" 391 | ], 392 | "index": "pypi", 393 | "version": "==3.4.1" 394 | }, 395 | "nltk": { 396 | "hashes": [ 397 | "sha256:240e23ab1ab159ef9940777d30c7c72d7e76d91877099218a7585370c11f6b9e", 398 | "sha256:57d556abed621ab9be225cc6d2df1edce17572efb67a3d754630c9f8381503eb" 399 | ], 400 | "index": "pypi", 401 | "version": "==3.6.2" 402 | }, 403 | "numpy": { 404 | "hashes": [ 405 | "sha256:012426a41bc9ab63bb158635aecccc7610e3eff5d31d1eb43bc099debc979d94", 406 | "sha256:06fab248a088e439402141ea04f0fffb203723148f6ee791e9c75b3e9e82f080", 407 | "sha256:0eef32ca3132a48e43f6a0f5a82cb508f22ce5a3d6f67a8329c81c8e226d3f6e", 408 | "sha256:1ded4fce9cfaaf24e7a0ab51b7a87be9038ea1ace7f34b841fe3b6894c721d1c", 409 | "sha256:2e55195bc1c6b705bfd8ad6f288b38b11b1af32f3c8289d6c50d47f950c12e76", 410 | "sha256:2ea52bd92ab9f768cc64a4c3ef8f4b2580a17af0a5436f6126b08efbd1838371", 411 | "sha256:36674959eed6957e61f11c912f71e78857a8d0604171dfd9ce9ad5cbf41c511c", 412 | "sha256:384ec0463d1c2671170901994aeb6dce126de0a95ccc3976c43b0038a37329c2", 413 | "sha256:39b70c19ec771805081578cc936bbe95336798b7edf4732ed102e7a43ec5c07a", 414 | "sha256:400580cbd3cff6ffa6293df2278c75aef2d58d8d93d3c5614cd67981dae68ceb", 415 | "sha256:43d4c81d5ffdff6bae58d66a3cd7f54a7acd9a0e7b18d97abb255defc09e3140", 416 | "sha256:50a4a0ad0111cc1b71fa32dedd05fa239f7fb5a43a40663269bb5dc7877cfd28", 417 | "sha256:603aa0706be710eea8884af807b1b3bc9fb2e49b9f4da439e76000f3b3c6ff0f", 418 | "sha256:6149a185cece5ee78d1d196938b2a8f9d09f5a5ebfbba66969302a778d5ddd1d", 419 | "sha256:759e4095edc3c1b3ac031f34d9459fa781777a93ccc633a472a5468587a190ff", 420 | "sha256:7fb43004bce0ca31d8f13a6eb5e943fa73371381e53f7074ed21a4cb786c32f8", 421 | "sha256:811daee36a58dc79cf3d8bdd4a490e4277d0e4b7d103a001a4e73ddb48e7e6aa", 422 | "sha256:8b5e972b43c8fc27d56550b4120fe6257fdc15f9301914380b27f74856299fea", 423 | "sha256:99abf4f353c3d1a0c7a5f27699482c987cf663b1eac20db59b8c7b061eabd7fc", 424 | "sha256:a0d53e51a6cb6f0d9082decb7a4cb6dfb33055308c4c44f53103c073f649af73", 425 | "sha256:a12ff4c8ddfee61f90a1633a4c4afd3f7bcb32b11c52026c92a12e1325922d0d", 426 | "sha256:a4646724fba402aa7504cd48b4b50e783296b5e10a524c7a6da62e4a8ac9698d", 427 | "sha256:a76f502430dd98d7546e1ea2250a7360c065a5fdea52b2dffe8ae7180909b6f4", 428 | "sha256:a9d17f2be3b427fbb2bce61e596cf555d6f8a56c222bd2ca148baeeb5e5c783c", 429 | "sha256:ab83f24d5c52d60dbc8cd0528759532736b56db58adaa7b5f1f76ad551416a1e", 430 | "sha256:aeb9ed923be74e659984e321f609b9ba54a48354bfd168d21a2b072ed1e833ea", 431 | "sha256:c843b3f50d1ab7361ca4f0b3639bf691569493a56808a0b0c54a051d260b7dbd", 432 | "sha256:cae865b1cae1ec2663d8ea56ef6ff185bad091a5e33ebbadd98de2cfa3fa668f", 433 | "sha256:cc6bd4fd593cb261332568485e20a0712883cf631f6f5e8e86a52caa8b2b50ff", 434 | "sha256:cf2402002d3d9f91c8b01e66fbb436a4ed01c6498fffed0e4c7566da1d40ee1e", 435 | "sha256:d051ec1c64b85ecc69531e1137bb9751c6830772ee5c1c426dbcfe98ef5788d7", 436 | "sha256:d6631f2e867676b13026e2846180e2c13c1e11289d67da08d71cacb2cd93d4aa", 437 | "sha256:dbd18bcf4889b720ba13a27ec2f2aac1981bd41203b3a3b27ba7a33f88ae4827", 438 | "sha256:df609c82f18c5b9f6cb97271f03315ff0dbe481a2a02e56aeb1b1a985ce38e60" 439 | ], 440 | "index": "pypi", 441 | "version": "==1.19.5" 442 | }, 443 | "oauthlib": { 444 | "hashes": [ 445 | "sha256:bee41cc35fcca6e988463cacc3bcb8a96224f470ca547e697b604cc697b2f889", 446 | "sha256:df884cd6cbe20e32633f1db1072e9356f53638e4361bef4e8b03c9127c9328ea" 447 | ], 448 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 449 | "version": "==3.1.0" 450 | }, 451 | "opt-einsum": { 452 | "hashes": [ 453 | "sha256:2455e59e3947d3c275477df7f5205b30635e266fe6dc300e3d9f9646bfcea147", 454 | "sha256:59f6475f77bbc37dcf7cd748519c0ec60722e91e63ca114e68821c0c54a46549" 455 | ], 456 | "markers": "python_version >= '3.5'", 457 | "version": "==3.3.0" 458 | }, 459 | "packaging": { 460 | "hashes": [ 461 | "sha256:5b327ac1320dc863dca72f4514ecc086f31186744b84a230374cc1fd776feae5", 462 | "sha256:67714da7f7bc052e064859c05c595155bd1ee9f69f76557e21f051443c20947a" 463 | ], 464 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 465 | "version": "==20.9" 466 | }, 467 | "pandas": { 468 | "hashes": [ 469 | "sha256:167693a80abc8eb28051fbd184c1b7afd13ce2c727a5af47b048f1ea3afefff4", 470 | "sha256:2111c25e69fa9365ba80bbf4f959400054b2771ac5d041ed19415a8b488dc70a", 471 | "sha256:298f0553fd3ba8e002c4070a723a59cdb28eda579f3e243bc2ee397773f5398b", 472 | "sha256:2b063d41803b6a19703b845609c0b700913593de067b552a8b24dd8eeb8c9895", 473 | "sha256:2cb7e8f4f152f27dc93f30b5c7a98f6c748601ea65da359af734dd0cf3fa733f", 474 | "sha256:52d2472acbb8a56819a87aafdb8b5b6d2b3386e15c95bde56b281882529a7ded", 475 | "sha256:612add929bf3ba9d27b436cc8853f5acc337242d6b584203f207e364bb46cb12", 476 | "sha256:649ecab692fade3cbfcf967ff936496b0cfba0af00a55dfaacd82bdda5cb2279", 477 | "sha256:68d7baa80c74aaacbed597265ca2308f017859123231542ff8a5266d489e1858", 478 | "sha256:8d4c74177c26aadcfb4fd1de6c1c43c2bf822b3e0fc7a9b409eeaf84b3e92aaa", 479 | "sha256:971e2a414fce20cc5331fe791153513d076814d30a60cd7348466943e6e909e4", 480 | "sha256:9db70ffa8b280bb4de83f9739d514cd0735825e79eef3a61d312420b9f16b758", 481 | "sha256:b730add5267f873b3383c18cac4df2527ac4f0f0eed1c6cf37fcb437e25cf558", 482 | "sha256:bd659c11a4578af740782288cac141a322057a2e36920016e0fc7b25c5a4b686", 483 | "sha256:c601c6fdebc729df4438ec1f62275d6136a0dd14d332fc0e8ce3f7d2aadb4dd6", 484 | "sha256:d0877407359811f7b853b548a614aacd7dea83b0c0c84620a9a643f180060950" 485 | ], 486 | "index": "pypi", 487 | "version": "==1.2.4" 488 | }, 489 | "pillow": { 490 | "hashes": [ 491 | "sha256:01425106e4e8cee195a411f729cff2a7d61813b0b11737c12bd5991f5f14bcd5", 492 | "sha256:031a6c88c77d08aab84fecc05c3cde8414cd6f8406f4d2b16fed1e97634cc8a4", 493 | "sha256:083781abd261bdabf090ad07bb69f8f5599943ddb539d64497ed021b2a67e5a9", 494 | "sha256:0d19d70ee7c2ba97631bae1e7d4725cdb2ecf238178096e8c82ee481e189168a", 495 | "sha256:0e04d61f0064b545b989126197930807c86bcbd4534d39168f4aa5fda39bb8f9", 496 | "sha256:12e5e7471f9b637762453da74e390e56cc43e486a88289995c1f4c1dc0bfe727", 497 | "sha256:22fd0f42ad15dfdde6c581347eaa4adb9a6fc4b865f90b23378aa7914895e120", 498 | "sha256:238c197fc275b475e87c1453b05b467d2d02c2915fdfdd4af126145ff2e4610c", 499 | "sha256:3b570f84a6161cf8865c4e08adf629441f56e32f180f7aa4ccbd2e0a5a02cba2", 500 | "sha256:463822e2f0d81459e113372a168f2ff59723e78528f91f0bd25680ac185cf797", 501 | "sha256:4d98abdd6b1e3bf1a1cbb14c3895226816e666749ac040c4e2554231068c639b", 502 | "sha256:5afe6b237a0b81bd54b53f835a153770802f164c5570bab5e005aad693dab87f", 503 | "sha256:5b70110acb39f3aff6b74cf09bb4169b167e2660dabc304c1e25b6555fa781ef", 504 | "sha256:5cbf3e3b1014dddc45496e8cf38b9f099c95a326275885199f427825c6522232", 505 | "sha256:624b977355cde8b065f6d51b98497d6cd5fbdd4f36405f7a8790e3376125e2bb", 506 | "sha256:63728564c1410d99e6d1ae8e3b810fe012bc440952168af0a2877e8ff5ab96b9", 507 | "sha256:66cc56579fd91f517290ab02c51e3a80f581aba45fd924fcdee01fa06e635812", 508 | "sha256:6c32cc3145928c4305d142ebec682419a6c0a8ce9e33db900027ddca1ec39178", 509 | "sha256:8bb1e155a74e1bfbacd84555ea62fa21c58e0b4e7e6b20e4447b8d07990ac78b", 510 | "sha256:95d5ef984eff897850f3a83883363da64aae1000e79cb3c321915468e8c6add5", 511 | "sha256:a013cbe25d20c2e0c4e85a9daf438f85121a4d0344ddc76e33fd7e3965d9af4b", 512 | "sha256:a787ab10d7bb5494e5f76536ac460741788f1fbce851068d73a87ca7c35fc3e1", 513 | "sha256:a7d5e9fad90eff8f6f6106d3b98b553a88b6f976e51fce287192a5d2d5363713", 514 | "sha256:aac00e4bc94d1b7813fe882c28990c1bc2f9d0e1aa765a5f2b516e8a6a16a9e4", 515 | "sha256:b91c36492a4bbb1ee855b7d16fe51379e5f96b85692dc8210831fbb24c43e484", 516 | "sha256:c03c07ed32c5324939b19e36ae5f75c660c81461e312a41aea30acdd46f93a7c", 517 | "sha256:c5236606e8570542ed424849f7852a0ff0bce2c4c8d0ba05cc202a5a9c97dee9", 518 | "sha256:c6b39294464b03457f9064e98c124e09008b35a62e3189d3513e5148611c9388", 519 | "sha256:cb7a09e173903541fa888ba010c345893cd9fc1b5891aaf060f6ca77b6a3722d", 520 | "sha256:d68cb92c408261f806b15923834203f024110a2e2872ecb0bd2a110f89d3c602", 521 | "sha256:dc38f57d8f20f06dd7c3161c59ca2c86893632623f33a42d592f097b00f720a9", 522 | "sha256:e98eca29a05913e82177b3ba3d198b1728e164869c613d76d0de4bde6768a50e", 523 | "sha256:f217c3954ce5fd88303fc0c317af55d5e0204106d86dea17eb8205700d47dec2" 524 | ], 525 | "markers": "python_version >= '3.6'", 526 | "version": "==8.2.0" 527 | }, 528 | "protobuf": { 529 | "hashes": [ 530 | "sha256:0277f62b1e42210cafe79a71628c1d553348da81cbd553402a7f7549c50b11d0", 531 | "sha256:07eec4e2ccbc74e95bb9b3afe7da67957947ee95bdac2b2e91b038b832dd71f0", 532 | "sha256:1c0e9e56202b9dccbc094353285a252e2b7940b74fdf75f1b4e1b137833fabd7", 533 | "sha256:1f0b5d156c3df08cc54bc2c8b8b875648ea4cd7ebb2a9a130669f7547ec3488c", 534 | "sha256:2dc0e8a9e4962207bdc46a365b63a3f1aca6f9681a5082a326c5837ef8f4b745", 535 | "sha256:3053f13207e7f13dc7be5e9071b59b02020172f09f648e85dc77e3fcb50d1044", 536 | "sha256:4a054b0b5900b7ea7014099e783fb8c4618e4209fffcd6050857517b3f156e18", 537 | "sha256:510e66491f1a5ac5953c908aa8300ec47f793130097e4557482803b187a8ee05", 538 | "sha256:5ff9fa0e67fcab442af9bc8d4ec3f82cb2ff3be0af62dba047ed4187f0088b7d", 539 | "sha256:90270fe5732c1f1ff664a3bd7123a16456d69b4e66a09a139a00443a32f210b8", 540 | "sha256:a0a08c6b2e6d6c74a6eb5bf6184968eefb1569279e78714e239d33126e753403", 541 | "sha256:c5566f956a26cda3abdfacc0ca2e21db6c9f3d18f47d8d4751f2209d6c1a5297", 542 | "sha256:dab75b56a12b1ceb3e40808b5bd9dfdaef3a1330251956e6744e5b6ed8f8830b", 543 | "sha256:efa4c4d4fc9ba734e5e85eaced70e1b63fb3c8d08482d839eb838566346f1737", 544 | "sha256:f17b352d7ce33c81773cf81d536ca70849de6f73c96413f17309f4b43ae7040b", 545 | "sha256:f42c2f5fb67da5905bfc03733a311f72fa309252bcd77c32d1462a1ad519521e", 546 | "sha256:f6077db37bfa16494dca58a4a02bfdacd87662247ad6bc1f7f8d13ff3f0013e1", 547 | "sha256:f80afc0a0ba13339bbab25ca0409e9e2836b12bb012364c06e97c2df250c3343", 548 | "sha256:f9cadaaa4065d5dd4d15245c3b68b967b3652a3108e77f292b58b8c35114b56c", 549 | "sha256:fad4f971ec38d8df7f4b632c819bf9bbf4f57cfd7312cf526c69ce17ef32436a" 550 | ], 551 | "version": "==3.15.8" 552 | }, 553 | "pyasn1": { 554 | "hashes": [ 555 | "sha256:014c0e9976956a08139dc0712ae195324a75e142284d5f87f1a87ee1b068a359", 556 | "sha256:03840c999ba71680a131cfaee6fab142e1ed9bbd9c693e285cc6aca0d555e576", 557 | "sha256:0458773cfe65b153891ac249bcf1b5f8f320b7c2ce462151f8fa74de8934becf", 558 | "sha256:08c3c53b75eaa48d71cf8c710312316392ed40899cb34710d092e96745a358b7", 559 | "sha256:39c7e2ec30515947ff4e87fb6f456dfc6e84857d34be479c9d4a4ba4bf46aa5d", 560 | "sha256:5c9414dcfede6e441f7e8f81b43b34e834731003427e5b09e4e00e3172a10f00", 561 | "sha256:6e7545f1a61025a4e58bb336952c5061697da694db1cae97b116e9c46abcf7c8", 562 | "sha256:78fa6da68ed2727915c4767bb386ab32cdba863caa7dbe473eaae45f9959da86", 563 | "sha256:7ab8a544af125fb704feadb008c99a88805126fb525280b2270bb25cc1d78a12", 564 | "sha256:99fcc3c8d804d1bc6d9a099921e39d827026409a58f2a720dcdb89374ea0c776", 565 | "sha256:aef77c9fb94a3ac588e87841208bdec464471d9871bd5050a287cc9a475cd0ba", 566 | "sha256:e89bf84b5437b532b0803ba5c9a5e054d21fec423a89952a74f87fa2c9b7bce2", 567 | "sha256:fec3e9d8e36808a28efb59b489e4528c10ad0f480e57dcc32b4de5c9d8c9fdf3" 568 | ], 569 | "version": "==0.4.8" 570 | }, 571 | "pyasn1-modules": { 572 | "hashes": [ 573 | "sha256:0845a5582f6a02bb3e1bde9ecfc4bfcae6ec3210dd270522fee602365430c3f8", 574 | "sha256:0fe1b68d1e486a1ed5473f1302bd991c1611d319bba158e98b106ff86e1d7199", 575 | "sha256:15b7c67fabc7fc240d87fb9aabf999cf82311a6d6fb2c70d00d3d0604878c811", 576 | "sha256:426edb7a5e8879f1ec54a1864f16b882c2837bfd06eee62f2c982315ee2473ed", 577 | "sha256:65cebbaffc913f4fe9e4808735c95ea22d7a7775646ab690518c056784bc21b4", 578 | "sha256:905f84c712230b2c592c19470d3ca8d552de726050d1d1716282a1f6146be65e", 579 | "sha256:a50b808ffeb97cb3601dd25981f6b016cbb3d31fbf57a8b8a87428e6158d0c74", 580 | "sha256:a99324196732f53093a84c4369c996713eb8c89d360a496b599fb1a9c47fc3eb", 581 | "sha256:b80486a6c77252ea3a3e9b1e360bc9cf28eaac41263d173c032581ad2f20fe45", 582 | "sha256:c29a5e5cc7a3f05926aff34e097e84f8589cd790ce0ed41b67aed6857b26aafd", 583 | "sha256:cbac4bc38d117f2a49aeedec4407d23e8866ea4ac27ff2cf7fb3e5b570df19e0", 584 | "sha256:f39edd8c4ecaa4556e989147ebf219227e2cd2e8a43c7e7fcb1f1c18c5fd6a3d", 585 | "sha256:fe0644d9ab041506b62782e92b06b8c68cca799e1a9636ec398675459e031405" 586 | ], 587 | "version": "==0.2.8" 588 | }, 589 | "pyparsing": { 590 | "hashes": [ 591 | "sha256:c203ec8783bf771a155b207279b9bccb8dea02d8f0c9e5f8ead507bc3246ecc1", 592 | "sha256:ef9d7589ef3c200abe66653d3f1ab1033c3c419ae9b9bdb1240a85b024efc88b" 593 | ], 594 | "markers": "python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2'", 595 | "version": "==2.4.7" 596 | }, 597 | "pysocks": { 598 | "hashes": [ 599 | "sha256:08e69f092cc6dbe92a0fdd16eeb9b9ffbc13cadfe5ca4c7bd92ffb078b293299", 600 | "sha256:2725bd0a9925919b9b51739eea5f9e2bae91e83288108a9ad338b2e3a4435ee5", 601 | "sha256:3f8804571ebe159c380ac6de37643bb4685970655d3bba243530d6558b799aa0" 602 | ], 603 | "version": "==1.7.1" 604 | }, 605 | "python-dateutil": { 606 | "hashes": [ 607 | "sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c", 608 | "sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a" 609 | ], 610 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'", 611 | "version": "==2.8.1" 612 | }, 613 | "python-dotenv": { 614 | "hashes": [ 615 | "sha256:471b782da0af10da1a80341e8438fca5fadeba2881c54360d5fd8d03d03a4f4a", 616 | "sha256:49782a97c9d641e8a09ae1d9af0856cc587c8d2474919342d5104d85be9890b2" 617 | ], 618 | "index": "pypi", 619 | "version": "==0.17.0" 620 | }, 621 | "pytz": { 622 | "hashes": [ 623 | "sha256:83a4a90894bf38e243cf052c8b58f381bfe9a7a483f6a9cab140bc7f702ac4da", 624 | "sha256:eb10ce3e7736052ed3623d49975ce333bcd712c7bb19a58b9e2089d4057d0798" 625 | ], 626 | "version": "==2021.1" 627 | }, 628 | "pyyaml": { 629 | "hashes": [ 630 | "sha256:08682f6b72c722394747bddaf0aa62277e02557c0fd1c42cb853016a38f8dedf", 631 | "sha256:0f5f5786c0e09baddcd8b4b45f20a7b5d61a7e7e99846e3c799b05c7c53fa696", 632 | "sha256:129def1b7c1bf22faffd67b8f3724645203b79d8f4cc81f674654d9902cb4393", 633 | "sha256:294db365efa064d00b8d1ef65d8ea2c3426ac366c0c4368d930bf1c5fb497f77", 634 | "sha256:3b2b1824fe7112845700f815ff6a489360226a5609b96ec2190a45e62a9fc922", 635 | "sha256:3bd0e463264cf257d1ffd2e40223b197271046d09dadf73a0fe82b9c1fc385a5", 636 | "sha256:4465124ef1b18d9ace298060f4eccc64b0850899ac4ac53294547536533800c8", 637 | "sha256:49d4cdd9065b9b6e206d0595fee27a96b5dd22618e7520c33204a4a3239d5b10", 638 | "sha256:4e0583d24c881e14342eaf4ec5fbc97f934b999a6828693a99157fde912540cc", 639 | "sha256:5accb17103e43963b80e6f837831f38d314a0495500067cb25afab2e8d7a4018", 640 | "sha256:607774cbba28732bfa802b54baa7484215f530991055bb562efbed5b2f20a45e", 641 | "sha256:6c78645d400265a062508ae399b60b8c167bf003db364ecb26dcab2bda048253", 642 | "sha256:72a01f726a9c7851ca9bfad6fd09ca4e090a023c00945ea05ba1638c09dc3347", 643 | "sha256:74c1485f7707cf707a7aef42ef6322b8f97921bd89be2ab6317fd782c2d53183", 644 | "sha256:895f61ef02e8fed38159bb70f7e100e00f471eae2bc838cd0f4ebb21e28f8541", 645 | "sha256:8c1be557ee92a20f184922c7b6424e8ab6691788e6d86137c5d93c1a6ec1b8fb", 646 | "sha256:bb4191dfc9306777bc594117aee052446b3fa88737cd13b7188d0e7aa8162185", 647 | "sha256:bfb51918d4ff3d77c1c856a9699f8492c612cde32fd3bcd344af9be34999bfdc", 648 | "sha256:c20cfa2d49991c8b4147af39859b167664f2ad4561704ee74c1de03318e898db", 649 | "sha256:cb333c16912324fd5f769fff6bc5de372e9e7a202247b48870bc251ed40239aa", 650 | "sha256:d2d9808ea7b4af864f35ea216be506ecec180628aced0704e34aca0b040ffe46", 651 | "sha256:d483ad4e639292c90170eb6f7783ad19490e7a8defb3e46f97dfe4bacae89122", 652 | "sha256:dd5de0646207f053eb0d6c74ae45ba98c3395a571a2891858e87df7c9b9bd51b", 653 | "sha256:e1d4970ea66be07ae37a3c2e48b5ec63f7ba6804bdddfdbd3cfd954d25a82e63", 654 | "sha256:e4fac90784481d221a8e4b1162afa7c47ed953be40d31ab4629ae917510051df", 655 | "sha256:fa5ae20527d8e831e8230cbffd9f8fe952815b2b7dae6ffec25318803a7528fc", 656 | "sha256:fd7f6999a8070df521b6384004ef42833b9bd62cfee11a09bda1079b4b704247", 657 | "sha256:fdc842473cd33f45ff6bce46aea678a54e3d21f1b61a7750ce3c498eedfe25d6", 658 | "sha256:fe69978f3f768926cfa37b867e3843918e012cf83f680806599ddce33c2c68b0" 659 | ], 660 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'", 661 | "version": "==5.4.1" 662 | }, 663 | "regex": { 664 | "hashes": [ 665 | "sha256:01afaf2ec48e196ba91b37451aa353cb7eda77efe518e481707e0515025f0cd5", 666 | "sha256:11d773d75fa650cd36f68d7ca936e3c7afaae41b863b8c387a22aaa78d3c5c79", 667 | "sha256:18c071c3eb09c30a264879f0d310d37fe5d3a3111662438889ae2eb6fc570c31", 668 | "sha256:1e1c20e29358165242928c2de1482fb2cf4ea54a6a6dea2bd7a0e0d8ee321500", 669 | "sha256:281d2fd05555079448537fe108d79eb031b403dac622621c78944c235f3fcf11", 670 | "sha256:314d66636c494ed9c148a42731b3834496cc9a2c4251b1661e40936814542b14", 671 | "sha256:32e65442138b7b76dd8173ffa2cf67356b7bc1768851dded39a7a13bf9223da3", 672 | "sha256:339456e7d8c06dd36a22e451d58ef72cef293112b559010db3d054d5560ef439", 673 | "sha256:3916d08be28a1149fb97f7728fca1f7c15d309a9f9682d89d79db75d5e52091c", 674 | "sha256:3a9cd17e6e5c7eb328517969e0cb0c3d31fd329298dd0c04af99ebf42e904f82", 675 | "sha256:47bf5bf60cf04d72bf6055ae5927a0bd9016096bf3d742fa50d9bf9f45aa0711", 676 | "sha256:4c46e22a0933dd783467cf32b3516299fb98cfebd895817d685130cc50cd1093", 677 | "sha256:4c557a7b470908b1712fe27fb1ef20772b78079808c87d20a90d051660b1d69a", 678 | "sha256:52ba3d3f9b942c49d7e4bc105bb28551c44065f139a65062ab7912bef10c9afb", 679 | "sha256:563085e55b0d4fb8f746f6a335893bda5c2cef43b2f0258fe1020ab1dd874df8", 680 | "sha256:598585c9f0af8374c28edd609eb291b5726d7cbce16be6a8b95aa074d252ee17", 681 | "sha256:619d71c59a78b84d7f18891fe914446d07edd48dc8328c8e149cbe0929b4e000", 682 | "sha256:67bdb9702427ceddc6ef3dc382455e90f785af4c13d495f9626861763ee13f9d", 683 | "sha256:6d1b01031dedf2503631d0903cb563743f397ccaf6607a5e3b19a3d76fc10480", 684 | "sha256:741a9647fcf2e45f3a1cf0e24f5e17febf3efe8d4ba1281dcc3aa0459ef424dc", 685 | "sha256:7c2a1af393fcc09e898beba5dd59196edaa3116191cc7257f9224beaed3e1aa0", 686 | "sha256:7d9884d86dd4dd489e981d94a65cd30d6f07203d90e98f6f657f05170f6324c9", 687 | "sha256:90f11ff637fe8798933fb29f5ae1148c978cccb0452005bf4c69e13db951e765", 688 | "sha256:919859aa909429fb5aa9cf8807f6045592c85ef56fdd30a9a3747e513db2536e", 689 | "sha256:96fcd1888ab4d03adfc9303a7b3c0bd78c5412b2bfbe76db5b56d9eae004907a", 690 | "sha256:97f29f57d5b84e73fbaf99ab3e26134e6687348e95ef6b48cfd2c06807005a07", 691 | "sha256:980d7be47c84979d9136328d882f67ec5e50008681d94ecc8afa8a65ed1f4a6f", 692 | "sha256:a91aa8619b23b79bcbeb37abe286f2f408d2f2d6f29a17237afda55bb54e7aac", 693 | "sha256:ade17eb5d643b7fead300a1641e9f45401c98eee23763e9ed66a43f92f20b4a7", 694 | "sha256:b9c3db21af35e3b3c05764461b262d6f05bbca08a71a7849fd79d47ba7bc33ed", 695 | "sha256:bd28bc2e3a772acbb07787c6308e00d9626ff89e3bfcdebe87fa5afbfdedf968", 696 | "sha256:bf5824bfac591ddb2c1f0a5f4ab72da28994548c708d2191e3b87dd207eb3ad7", 697 | "sha256:c0502c0fadef0d23b128605d69b58edb2c681c25d44574fc673b0e52dce71ee2", 698 | "sha256:c38c71df845e2aabb7fb0b920d11a1b5ac8526005e533a8920aea97efb8ec6a4", 699 | "sha256:ce15b6d103daff8e9fee13cf7f0add05245a05d866e73926c358e871221eae87", 700 | "sha256:d3029c340cfbb3ac0a71798100ccc13b97dddf373a4ae56b6a72cf70dfd53bc8", 701 | "sha256:e512d8ef5ad7b898cdb2d8ee1cb09a8339e4f8be706d27eaa180c2f177248a10", 702 | "sha256:e8e5b509d5c2ff12f8418006d5a90e9436766133b564db0abaec92fd27fcee29", 703 | "sha256:ee54ff27bf0afaf4c3b3a62bcd016c12c3fdb4ec4f413391a90bd38bc3624605", 704 | "sha256:fa4537fb4a98fe8fde99626e4681cc644bdcf2a795038533f9f711513a862ae6", 705 | "sha256:fd45ff9293d9274c5008a2054ecef86a9bfe819a67c7be1afb65e69b405b3042" 706 | ], 707 | "version": "==2021.4.4" 708 | }, 709 | "requests": { 710 | "extras": [ 711 | "socks" 712 | ], 713 | "hashes": [ 714 | "sha256:27973dd4a904a4f13b263a19c866c13b92a39ed1c964655f025f3f8d3d75b804", 715 | "sha256:c210084e36a42ae6b9219e00e48287def368a26d03a048ddad7bfee44f75871e" 716 | ], 717 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", 718 | "version": "==2.25.1" 719 | }, 720 | "requests-oauthlib": { 721 | "hashes": [ 722 | "sha256:7f71572defaecd16372f9006f33c2ec8c077c3cfa6f5911a9a90202beb513f3d", 723 | "sha256:b4261601a71fd721a8bd6d7aa1cc1d6a8a93b4a9f5e96626f8e4d91e8beeaa6a", 724 | "sha256:fa6c47b933f01060936d87ae9327fead68768b69c6c9ea2109c48be30f2d4dbc" 725 | ], 726 | "version": "==1.3.0" 727 | }, 728 | "rsa": { 729 | "hashes": [ 730 | "sha256:78f9a9bf4e7be0c5ded4583326e7461e3a3c5aae24073648b4bdfa797d78c9d2", 731 | "sha256:9d689e6ca1b3038bc82bf8d23e944b6b6037bc02301a574935b2dd946e0353b9" 732 | ], 733 | "markers": "python_version >= '3.6'", 734 | "version": "==4.7.2" 735 | }, 736 | "sacremoses": { 737 | "hashes": [ 738 | "sha256:58176cc28391830789b763641d0f458819bebe88681dac72b41a19c0aedc07e9", 739 | "sha256:fa93db44bc04542553ba6090818b892f603d02aa0d681e6c5c3023baf17e8564" 740 | ], 741 | "version": "==0.0.45" 742 | }, 743 | "scikit-learn": { 744 | "hashes": [ 745 | "sha256:0567a2d29ad08af98653300c623bd8477b448fe66ced7198bef4ed195925f082", 746 | "sha256:087dfede39efb06ab30618f9ab55a0397f29c38d63cd0ab88d12b500b7d65fd7", 747 | "sha256:1adf483e91007a87171d7ce58c34b058eb5dab01b5fee6052f15841778a8ecd8", 748 | "sha256:259ec35201e82e2db1ae2496f229e63f46d7f1695ae68eef9350b00dc74ba52f", 749 | "sha256:3c4f07f47c04e81b134424d53c3f5e16dfd7f494e44fd7584ba9ce9de2c5e6c1", 750 | "sha256:4562dcf4793e61c5d0f89836d07bc37521c3a1889da8f651e2c326463c4bd697", 751 | "sha256:4ddd2b6f7449a5d539ff754fa92d75da22de261fd8fdcfb3596799fadf255101", 752 | "sha256:54be0a60a5a35005ad69c75902e0f5c9f699db4547ead427e97ef881c3242e6f", 753 | "sha256:5580eba7345a4d3b097be2f067cc71a306c44bab19e8717a30361f279c929bea", 754 | "sha256:7b04691eb2f41d2c68dbda8d1bd3cb4ef421bdc43aaa56aeb6c762224552dfb6", 755 | "sha256:826b92bf45b8ad80444814e5f4ac032156dd481e48d7da33d611f8fe96d5f08b", 756 | "sha256:83b21ff053b1ff1c018a2d24db6dd3ea339b1acfbaa4d9c881731f43748d8b3b", 757 | "sha256:8772b99d683be8f67fcc04789032f1b949022a0e6880ee7b75a7ec97dbbb5d0b", 758 | "sha256:895dbf2030aa7337649e36a83a007df3c9811396b4e2fa672a851160f36ce90c", 759 | "sha256:8aa1b3ac46b80eaa552b637eeadbbce3be5931e4b5002b964698e33a1b589e1e", 760 | "sha256:9599a3f3bf33f73fed0fe06d1dfa4e6081365a58c1c807acb07271be0dce9733", 761 | "sha256:99349d77f54e11f962d608d94dfda08f0c9e5720d97132233ebdf35be2858b2d", 762 | "sha256:9a24d1ccec2a34d4cd3f2a1f86409f3f5954cc23d4d2270ba0d03cf018aa4780", 763 | "sha256:9bed8a1ef133c8e2f13966a542cb8125eac7f4b67dcd234197c827ba9c7dd3e0", 764 | "sha256:9c6097b6a9b2bafc5e0f31f659e6ab5e131383209c30c9e978c5b8abdac5ed2a", 765 | "sha256:9dfa564ef27e8e674aa1cc74378416d580ac4ede1136c13dd555a87996e13422", 766 | "sha256:a0334a1802e64d656022c3bfab56a73fbd6bf4b1298343f3688af2151810bbdf", 767 | "sha256:a29460499c1e62b7a830bb57ca42e615375a6ab1bcad053cd25b493588348ea8", 768 | "sha256:a36e159a0521e13bbe15ca8c8d038b3a1dd4c7dad18d276d76992e03b92cf643", 769 | "sha256:abe835a851610f87201819cb315f8d554e1a3e8128912783a31e87264ba5ffb7", 770 | "sha256:c13ebac42236b1c46397162471ea1c46af68413000e28b9309f8c05722c65a09", 771 | "sha256:c3deb3b19dd9806acf00cf0d400e84562c227723013c33abefbbc3cf906596e9", 772 | "sha256:c658432d8a20e95398f6bb95ff9731ce9dfa343fdf21eea7ec6a7edfacd4b4d9", 773 | "sha256:c7f4eb77504ac586d8ac1bde1b0c04b504487210f95297235311a0ab7edd7e38", 774 | "sha256:d54dbaadeb1425b7d6a66bf44bee2bb2b899fe3e8850b8e94cfb9c904dcb46d0", 775 | "sha256:ddb52d088889f5596bc4d1de981f2eca106b58243b6679e4782f3ba5096fd645", 776 | "sha256:ed9d65594948678827f4ff0e7ae23344e2f2b4cabbca057ccaed3118fdc392ca", 777 | "sha256:fab31f48282ebf54dd69f6663cd2d9800096bad1bb67bbc9c9ac84eb77b41972" 778 | ], 779 | "index": "pypi", 780 | "version": "==0.24.1" 781 | }, 782 | "scipy": { 783 | "hashes": [ 784 | "sha256:03f1fd3574d544456325dae502facdf5c9f81cbfe12808a5e67a737613b7ba8c", 785 | "sha256:0c81ea1a95b4c9e0a8424cf9484b7b8fa7ef57169d7bcc0dfcfc23e3d7c81a12", 786 | "sha256:1fba8a214c89b995e3721670e66f7053da82e7e5d0fe6b31d8e4b19922a9315e", 787 | "sha256:37f4c2fb904c0ba54163e03993ce3544c9c5cde104bcf90614f17d85bdfbb431", 788 | "sha256:50e5bcd9d45262725e652611bb104ac0919fd25ecb78c22f5282afabd0b2e189", 789 | "sha256:6ca1058cb5bd45388041a7c3c11c4b2bd58867ac9db71db912501df77be2c4a4", 790 | "sha256:77f7a057724545b7e097bfdca5c6006bed8580768cd6621bb1330aedf49afba5", 791 | "sha256:816951e73d253a41fa2fd5f956f8e8d9ac94148a9a2039e7db56994520582bf2", 792 | "sha256:96620240b393d155097618bcd6935d7578e85959e55e3105490bbbf2f594c7ad", 793 | "sha256:993c86513272bc84c451349b10ee4376652ab21f312b0554fdee831d593b6c02", 794 | "sha256:adf7cee8e5c92b05f2252af498f77c7214a2296d009fc5478fc432c2f8fb953b", 795 | "sha256:bc52d4d70863141bb7e2f8fd4d98e41d77375606cde50af65f1243ce2d7853e8", 796 | "sha256:c1d3f771c19af00e1a36f749bd0a0690cc64632783383bc68f77587358feb5a4", 797 | "sha256:d744657c27c128e357de2f0fd532c09c84cd6e4933e8232895a872e67059ac37", 798 | "sha256:e3e9742bad925c421d39e699daa8d396c57535582cba90017d17f926b61c1552", 799 | "sha256:e547f84cd52343ac2d56df0ab08d3e9cc202338e7d09fafe286d6c069ddacb31", 800 | "sha256:e89091e6a8e211269e23f049473b2fde0c0e5ae0dd5bd276c3fc91b97da83480", 801 | "sha256:e9da33e21c9bc1b92c20b5328adb13e5f193b924c9b969cd700c8908f315aa59", 802 | "sha256:ffdfb09315896c6e9ac739bb6e13a19255b698c24e6b28314426fd40a1180822" 803 | ], 804 | "markers": "python_version < '3.10' and python_version >= '3.7'", 805 | "version": "==1.6.2" 806 | }, 807 | "six": { 808 | "hashes": [ 809 | "sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259", 810 | "sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced" 811 | ], 812 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'", 813 | "version": "==1.15.0" 814 | }, 815 | "tensorboard": { 816 | "hashes": [ 817 | "sha256:e167460085b6528956b33bab1c970c989cdce47a6616273880733f5e7bde452e" 818 | ], 819 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1'", 820 | "version": "==2.5.0" 821 | }, 822 | "tensorboard-data-server": { 823 | "hashes": [ 824 | "sha256:2d723d73e3a3b0a4498f56c64c39e2e26ac192414891df22c9f152b7058fd6bc", 825 | "sha256:a4b8e1c3fc85237b3afeef450db06c9a9b25f5854ad27c21667a90808acd1822", 826 | "sha256:b620e520d3d535ceb896557acca0029fd7fd2f9f408af35abc2d2dad91f0345d" 827 | ], 828 | "markers": "python_version >= '3.6'", 829 | "version": "==0.6.0" 830 | }, 831 | "tensorboard-plugin-wit": { 832 | "hashes": [ 833 | "sha256:2a80d1c551d741e99b2f197bb915d8a133e24adb8da1732b840041860f91183a" 834 | ], 835 | "version": "==1.8.0" 836 | }, 837 | "tensorflow": { 838 | "hashes": [ 839 | "sha256:0e427b1350be6dbe572f971947c5596fdbb152081f227808d8becd894bf40282", 840 | "sha256:22723b8e1fa83b34f56c349b16a57aaff913b404451fcf70981f2b1d6e0c64fc", 841 | "sha256:2357112319303da1b5459a621fd0503c2b2cd97b6c33c4903abd46b3c3e380e2", 842 | "sha256:36d5acd60aac48e34bd545d0ce1fb8b3fceebff6b8782436defd0f71c12203bd", 843 | "sha256:4a04081647b89a8fb602895b29ffc559e3c20aac8bde1d4c5ecd2a65adce5d35", 844 | "sha256:55368ba0bedb513ba0e36a2543a588b5276e9b2ca99fa3232a9a176601a7bab5", 845 | "sha256:e1f2799cc86861680d8515167f103e2207a8cab92a4afe5471e4839330591f08", 846 | "sha256:eedcf578afde5e6e69c75d796bed41093451cd1ab54afb438760e40fb74a09de", 847 | "sha256:efa9daa4b3701a4e439b24b74c1e4b66844aee8ae5263fb3cc12281ac9cc9f67" 848 | ], 849 | "index": "pypi", 850 | "version": "==2.4.1" 851 | }, 852 | "tensorflow-estimator": { 853 | "hashes": [ 854 | "sha256:5b7b7bf2debe19a8794adacc43e8ba6459daa4efaf54d3302623994a359b17f0" 855 | ], 856 | "version": "==2.4.0" 857 | }, 858 | "termcolor": { 859 | "hashes": [ 860 | "sha256:1d6d69ce66211143803fbc56652b41d73b4a400a2891d7bf7a1cdf4c02de613b" 861 | ], 862 | "version": "==1.1.0" 863 | }, 864 | "threadpoolctl": { 865 | "hashes": [ 866 | "sha256:38b74ca20ff3bb42caca8b00055111d74159ee95c4370882bbff2b93d24da725", 867 | "sha256:ddc57c96a38beb63db45d6c159b5ab07b6bced12c45a1f07b2b92f272aebfa6b" 868 | ], 869 | "markers": "python_version >= '3.5'", 870 | "version": "==2.1.0" 871 | }, 872 | "tokenizers": { 873 | "hashes": [ 874 | "sha256:03056431783e72df80de68648573f97a70701d17fa22336c6d761b5d4b7be9ff", 875 | "sha256:05c90ade1b9cc41aaee6056c5e460dc5150f12b602bdc6bfa3758fb965ca7788", 876 | "sha256:15d9b959fd3b9e9e7c6d6d7d909bca5d7397a170a50d99ac8ce4e2ab590b137a", 877 | "sha256:1a22bf899728eeb74ee2bb1ba9eff61898ec02e623a690ed28002762d19ab9b4", 878 | "sha256:1ab8c467e4fe16bba33022feefcd6322642a58e4c8c123fd692c20e17f339964", 879 | "sha256:2094eb8e3608858eb4bd29c32c39969ae63ad9749d8aca9b34e82cba852acaf1", 880 | "sha256:2b592146caff20c283dadf2da99520b1dfde4af8ce964a8adcb4e990923fa423", 881 | "sha256:39e24555d5a2d9df87fd75303e1fd9ba3f995ac8aeb543c511d601d26a54726a", 882 | "sha256:3fb22df976701452db3ba652bd647518a043e58d4209d18273163fbc53252a3b", 883 | "sha256:419bb33bb3690239b93b76b06eba1eb822aa72f4e63293d2f15c60505f6ee0d0", 884 | "sha256:474883e8e0be431394e0ccfb70e97c1856e8c5bc80536f7b2faa3b0785d59afd", 885 | "sha256:4865d34d4897eed4ca4a758971fb14911cf5022e270b53c028fa9312fe440e2b", 886 | "sha256:52c2479975fd5025d399493403c7aedce853da20cec04a32a829c1c12c28e2f1", 887 | "sha256:6229fcc8473fd225e8e09742c354dacacd57dbdc73075e4c9d71f925cd171090", 888 | "sha256:77c4c41f2147c930c66014ca43b6935133781ae1923d62e70c797e71b0ee2598", 889 | "sha256:79119578bcd1d8ec836ddd3dbb305f32084d60e9f67e93a10ca33c67eeaa89fc", 890 | "sha256:7ba26369bc30f9d28d9ff42dcb1b57d9995157a9bb2975b95acda4220195d7aa", 891 | "sha256:86077426c615a814f7456569eade33c12c93131d02fdf548994dcedf41bdbbf1", 892 | "sha256:8a575022e066878bede82bb5d5244b17c6ebda15dbb50229d86f9e8267ddd40e", 893 | "sha256:8b4ae84fb410b5f5abb3a604b3274e2d6994b21f07c379b1c1659561e026bad8", 894 | "sha256:9124a1f77e176cb2a2571bae4c3bf8d4c40975c1681e2ba346fdca5d6a3aa843", 895 | "sha256:953a4e483524fd37fd66208e21dce85d4829bfe294d8b6224d2f00f61aa9950c", 896 | "sha256:9619846026b16967465e5221206f86bdc58cf65b0f92548d048e97925361121e", 897 | "sha256:9e8f32a2ef1902f769da6215ae8beabd632676a1551fb171b5aa6d4c11fd3a02", 898 | "sha256:a323d93fd5e57060428fecb6d73ab13223822f8ffa1ede282070b47a4bda2cea", 899 | "sha256:a3de6ecfbd739ee3d59280c0c930c0c5a716df1cf0cdf68beb379066931866bd", 900 | "sha256:aadcf38b97114d035e389f5aee4edf59e81666ad65de26c06592d76f184bb66c", 901 | "sha256:bac17cceebb2a6947d380e1b7bce8fc33098a979071a1291adc456fb25434924", 902 | "sha256:bb58ad982f8f72052362a5384145e87559899dcc0b06264e71dd137869037e6e", 903 | "sha256:be25827c0506d92927dc0ef4d2ce0c4653a351735546f8b22548535c3d2f7a6c", 904 | "sha256:bed7c5c2c786a2e9b3265006f15a13d8e04dcdfcf9ba13add0d7194a50346393", 905 | "sha256:c0f5bbc2e614468bcb605f2aa4a6dbcdf21629c6ff6ae81f1d9fa9683934ce8e", 906 | "sha256:c429c25c3dfe1ea9ad6e21a49d648910335ef4188c5e8226e5aa2ba2bd13921c", 907 | "sha256:cd408266f13856dc648ed2dcc889ac17feffc28da2ebb03f1977b88935e86c9a", 908 | "sha256:cf7f1aad957fed36e4a90fc094e3adc03fdd45fbb058c1cde25721e3e66235f8", 909 | "sha256:e3e74a9fa40b92a9817fe05ea91bf20075f45ad8cf7c0d3eb738170a27059508", 910 | "sha256:ea54eb0071f13fa7c6c3b88997a843d01c067158b994115759c27827e683fb82", 911 | "sha256:f1553e029f326eb74f36d67a38ef77a7f03068a494a0faa4e16d0d832f25b760", 912 | "sha256:f7b62497dce161babdb9197fa4b26e401bac9541b62fe0d0957134fefeb1b01c", 913 | "sha256:f9fe9c5556ccab03c9d42ed299bd8901c95d22373676437bfeb4656c2b5e42bc", 914 | "sha256:fb1dae213c8531d6af071dd021c7225be73803a0cbe609aed5074be04118aa6c" 915 | ], 916 | "version": "==0.10.2" 917 | }, 918 | "torch": { 919 | "hashes": [ 920 | "sha256:1388b30fbd262c1a053d6c9ace73bb0bd8f5871b4892b6f3e02d1d7bc9768563", 921 | "sha256:16f2630d9604c4ee28ea7d6e388e2264cd7bc6031c6ecd796bae3f56b5efa9a3", 922 | "sha256:225ee4238c019b28369c71977327deeeb2bd1c6b8557e6fcf631b8866bdc5447", 923 | "sha256:3e4190c04dfd89c59bad06d5fe451446643a65e6d2607cc989eb1001ee76e12f", 924 | "sha256:4ace9c5bb94d5a7b9582cd089993201658466e9c59ff88bd4e9e08f6f072d1cf", 925 | "sha256:55137feb2f5a0dc7aced5bba690dcdb7652054ad3452b09a2bbb59f02a11e9ff", 926 | "sha256:5c2e9a33d44cdb93ebd739b127ffd7da786bf5f740539539195195b186a05f6c", 927 | "sha256:6ffa1e7ae079c7cb828712cb0cdaae5cc4fb87c16a607e6d14526b62c20bcc17", 928 | "sha256:8ad2252bf09833dcf46a536a78544e349b8256a370e03a98627ebfb118d9555b", 929 | "sha256:95b7bbbacc3f28fe438f418392ceeae146a01adc03b29d44917d55214ac234c9", 930 | "sha256:a50ea8ed900927fb30cadb63aa7a32fdd59c7d7abe5012348dfbe35a8355c083", 931 | "sha256:c6ede2ae4dcd8214b63e047efabafa92493605205a947574cf358216ca4e440a", 932 | "sha256:ce7d435426f3dd14f95710d779aa46e9cd5e077d512488e813f7589fdc024f78", 933 | "sha256:dac4d10494e74f7e553c92d7263e19ea501742c4825ddd26c4decfa27be95981", 934 | "sha256:e7ad1649adb7dc2a450e70a3e51240b84fa4746c69c8f98989ce0c254f9fba3a", 935 | "sha256:f23eeb1a48cc39209d986c418ad7e02227eee973da45c0c42d36b1aec72f4940" 936 | ], 937 | "index": "pypi", 938 | "version": "==1.8.1" 939 | }, 940 | "tqdm": { 941 | "hashes": [ 942 | "sha256:daec693491c52e9498632dfbe9ccfc4882a557f5fa08982db1b4d3adbe0887c3", 943 | "sha256:ebdebdb95e3477ceea267decfc0784859aa3df3e27e22d23b83e9b272bf157ae" 944 | ], 945 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", 946 | "version": "==4.60.0" 947 | }, 948 | "transformers": { 949 | "hashes": [ 950 | "sha256:0a57d1cd9301a617c7015d7184228984abdfb1ae2158c29cfb32582219756d23", 951 | "sha256:3508e3b032cf0f5342c67836de4b121aa5c435c959472a28054ba895ea59cca7" 952 | ], 953 | "index": "pypi", 954 | "version": "==4.5.1" 955 | }, 956 | "tweepy": { 957 | "hashes": [ 958 | "sha256:5e22003441a11f6f4c2ea4d05ec5532f541e9f5d874c3908270f0c28e649b53a", 959 | "sha256:76e6954b806ca470dda877f57db8792fff06a0beba0ed43efc3805771e39f06a" 960 | ], 961 | "index": "pypi", 962 | "version": "==3.10.0" 963 | }, 964 | "typing-extensions": { 965 | "hashes": [ 966 | "sha256:7cb407020f00f7bfc3cb3e7881628838e69d8f3fcab2f64742a5e76b2f841918", 967 | "sha256:99d4073b617d30288f569d3f13d2bd7548c3a7e4c8de87db09a9d29bb3a4a60c", 968 | "sha256:dafc7639cde7f1b6e1acc0f457842a83e722ccca8eef5270af2d74792619a89f" 969 | ], 970 | "markers": "python_version < '3.8'", 971 | "version": "==3.7.4.3" 972 | }, 973 | "urllib3": { 974 | "hashes": [ 975 | "sha256:2f4da4594db7e1e110a944bb1b551fdf4e6c136ad42e4234131391e21eb5b0df", 976 | "sha256:e7b021f7241115872f92f43c6508082facffbd1c048e3c6e2bb9c2a157e28937" 977 | ], 978 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4' and python_version < '4'", 979 | "version": "==1.26.4" 980 | }, 981 | "werkzeug": { 982 | "hashes": [ 983 | "sha256:2de2a5db0baeae7b2d2664949077c2ac63fbd16d98da0ff71837f7d1dea3fd43", 984 | "sha256:6c80b1e5ad3665290ea39320b91e1be1e0d5f60652b964a3070216de83d2e47c" 985 | ], 986 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", 987 | "version": "==1.0.1" 988 | }, 989 | "wheel": { 990 | "hashes": [ 991 | "sha256:78b5b185f0e5763c26ca1e324373aadd49182ca90e825f7853f4b2509215dc0e", 992 | "sha256:e11eefd162658ea59a60a0f6c7d493a7190ea4b9a85e335b33489d9f17e0245e" 993 | ], 994 | "markers": "python_version >= '3'", 995 | "version": "==0.36.2" 996 | }, 997 | "wrapt": { 998 | "hashes": [ 999 | "sha256:b62ffa81fb85f4332a4f609cab4ac40709470da05643a082ec1eb88e6d9b97d7" 1000 | ], 1001 | "version": "==1.12.1" 1002 | }, 1003 | "zipp": { 1004 | "hashes": [ 1005 | "sha256:3607921face881ba3e026887d8150cca609d517579abe052ac81fc5aeffdbd76", 1006 | "sha256:51cb66cc54621609dd593d1787f286ee42a5c0adbb4b29abea5a63edc3e03098" 1007 | ], 1008 | "markers": "python_version >= '3.6'", 1009 | "version": "==3.4.1" 1010 | } 1011 | }, 1012 | "develop": {} 1013 | } 1014 | -------------------------------------------------------------------------------- /backend/README.md: -------------------------------------------------------------------------------- 1 | # cse6242-project 2 | All the data will now be at this link: https://drive.google.com/drive/folders/1RQlCXTDjg-_fbt9_nhTIWSsGnP4_sh4K?usp=sharing 3 | 4 | To get the data: 5 | 1. Download the twitter(`company_tweets_2015_to_2019.csv`, e.g. `TSLA_tweets_2015_to_2019.csv`) and reddit data(`company_reddit_2015_to_2019.csv`, e.g. `TSLA_reddit_2015_to_2019.csv`) from google drive and put it in the same path as starter_code. 6 | 2. Run `starter_code.py company` or any other company (should enter the stock name). It creates two directories (`twitter/company/` and `reddit/company/`) and adds the data for each year to those directories. 7 | 3. Run `load_data.py` with the following arguments: `load_data(stock_name, stock_path, reddit_path, twitter_path, year_list, return_columns_list)` 8 | - For example, for getting data of TSLA for years 2015 to 1018 and the desired outputs listed below, run ```load_data("TSLA", "stock_data/tesla/", "reddit/TSLA/", "twitter/TSLA/", [2015, 2016, 2017, 2018], ['Close','positive_reddit','negative_reddit', 'neutral_reddit', 'count_reddit','comment_num_reddit', 'positive_twitter', 'negative_twitter','neutral_twitter', 'count_twitter', 'comment_num_twitter','retweet_num_twitter', 'like_num_twitter'])``` 9 | - If you want all the columns in the returned array, pass an empty list `[]` as `return_columns_list` in `load_data()`. 10 | -------------------------------------------------------------------------------- /backend/Tesla_30_5weeks.csv: -------------------------------------------------------------------------------- 1 | Week,Reddit Sentiment,Number of Reddit Posts,Twitter Sentiment,Number of Tweets,Tweet Activity 2 | 1,0.5,0.5,0.5,0.5,0.5 3 | 2,0.96,0.5,0.5,0.5,0.5 4 | 3,0.5,0.84,0.5,0.5,0.5 5 | 4,0.5,0.5,0.93,0.5,0.98 6 | 5,0.81,0.87,0.5,0.96,0.5 7 | -------------------------------------------------------------------------------- /backend/Tesla_90_5weeks.csv: -------------------------------------------------------------------------------- 1 | Week,Reddit Sentiment,Number of Reddit Posts,Twitter Sentiment,Number of Tweets,Tweet Activity 2 | 1,0.5,0.5,0.5,0.5,0.5 3 | 2,1,0.5,0.5,1,0.5 4 | 3,0.5,1,0.5,0.5,1 5 | 4,1,0.5,1,0.5,0.5 6 | 5,0.5,1,0.5,0.5,1 7 | -------------------------------------------------------------------------------- /backend/best_weights/AAPL_wsize30_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/AAPL_wsize30_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/AAPL_wsize60_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/AAPL_wsize60_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/AAPL_wsize90_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/AAPL_wsize90_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/AMZN_wsize30_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/AMZN_wsize30_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/AMZN_wsize60_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/AMZN_wsize60_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/AMZN_wsize90_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/AMZN_wsize90_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/GOOGL_wsize30_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/GOOGL_wsize30_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/GOOGL_wsize60_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/GOOGL_wsize60_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/GOOGL_wsize90_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/GOOGL_wsize90_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/MSFT_wsize30_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/MSFT_wsize30_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/MSFT_wsize60_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/MSFT_wsize60_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/MSFT_wsize90_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/MSFT_wsize90_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/TSLA_wsize30_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/TSLA_wsize30_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/TSLA_wsize60_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/TSLA_wsize60_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/TSLA_wsize90_sc_dict.p: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/TSLA_wsize90_sc_dict.p -------------------------------------------------------------------------------- /backend/best_weights/best_weights_AAPL_wsize30.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_AAPL_wsize30.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_AAPL_wsize60.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_AAPL_wsize60.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_AAPL_wsize90.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_AAPL_wsize90.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_AMZN_wsize30.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_AMZN_wsize30.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_AMZN_wsize60.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_AMZN_wsize60.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_AMZN_wsize90.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_AMZN_wsize90.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_GOOGL_wsize30.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_GOOGL_wsize30.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_GOOGL_wsize60.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_GOOGL_wsize60.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_GOOGL_wsize90.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_GOOGL_wsize90.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_MSFT_wsize30.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_MSFT_wsize30.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_MSFT_wsize60.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_MSFT_wsize60.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_MSFT_wsize90.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_MSFT_wsize90.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_TSLA_wsize30.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_TSLA_wsize30.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_TSLA_wsize60.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_TSLA_wsize60.hdf5 -------------------------------------------------------------------------------- /backend/best_weights/best_weights_TSLA_wsize90.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/backend/best_weights/best_weights_TSLA_wsize90.hdf5 -------------------------------------------------------------------------------- /backend/data_proc.py: -------------------------------------------------------------------------------- 1 | import pandas as pd 2 | import datetime as dt 3 | import sys 4 | 5 | def get_grouped_df(source, df): 6 | # print('\n-- get_grouped_df -- ') 7 | 8 | df = df.loc[:, ~df.columns.str.contains('^Unnamed')] 9 | df.loc[:, 'post_date'] = pd.to_datetime(df['post_date'], unit='s').dt.date 10 | 11 | cols = ['positive_'+source, 'negative_'+source, 'neutral_'+source, 'count_'+source] 12 | 13 | grouped_df = pd.DataFrame([], columns=cols) 14 | 15 | grouped_df['count_'+source] = df['prediction'].groupby(df['post_date']).count() 16 | 17 | grouped_df[['negative_'+source,'neutral_'+source,'positive_'+source]] = \ 18 | df.groupby(['post_date','prediction'], as_index=False)\ 19 | .size()\ 20 | .pivot(index='post_date', columns='prediction', values='size')\ 21 | [['negative','neutral','positive']]\ 22 | .fillna(0) 23 | 24 | if source == 'twitter': 25 | grouped_df['comment_num_'+source] = df.groupby(['post_date']).agg({'comment_num':'sum'})['comment_num'] 26 | grouped_df['retweet_num_'+source] = df.groupby(['post_date']).agg({'retweet_num':'sum'})['retweet_num'] 27 | grouped_df['like_num_'+source] = df.groupby(['post_date']).agg({'like_num':'sum'})['like_num'] 28 | 29 | if source == 'reddit': 30 | grouped_df['comment_num_'+source] = df.groupby(['post_date']).agg({'comment_num':'sum'})['comment_num'] 31 | 32 | grouped_df[['positive_'+source, 'negative_'+source, 'neutral_'+source]] = \ 33 | grouped_df[['positive_'+source, 'negative_'+source, 'neutral_'+source]].div(grouped_df['count_'+source],0) 34 | # grouped_df.loc[:,grouped_df.columns!='count_'+source] = grouped_df.loc[:, grouped_df.columns!='count_'+source].div(grouped_df['count_'+source],0) 35 | 36 | return grouped_df 37 | 38 | 39 | def load_data(company, stock_path, reddit_path, twitter_path, year_list, return_cols): 40 | # print('\n-- load_data -- ') 41 | 42 | fin_df = pd.DataFrame([]) 43 | 44 | if stock_path != False: 45 | for y in year_list: 46 | df = pd.read_csv(stock_path + company+"_"+str(y)+'.csv') 47 | fin_df = fin_df.append(df) 48 | 49 | fin_df = fin_df.drop('Adj Close', 1) 50 | fin_df['Date'] = pd.to_datetime(fin_df['Date']) 51 | fin_df.set_index('Date', inplace=True) 52 | 53 | # adding financial dataframe to final_df, then we're gonna add twitter and reddit data to 54 | # final_df (if they exist) 55 | final_df = fin_df.copy() 56 | 57 | 58 | # adding reddit data 59 | if reddit_path != False: 60 | reddit_df = pd.DataFrame([]) 61 | cols = ['post_date','comment_num','prediction'] 62 | for y in year_list: 63 | df = pd.read_csv(reddit_path + str(company) + "_reddit_" + str(y)+'.csv', usecols=cols) 64 | reddit_df = reddit_df.append(df) 65 | 66 | grouped_reddit_df = get_grouped_df('reddit', reddit_df) 67 | final_df = final_df.merge(grouped_reddit_df, 68 | left_index=True, 69 | right_index=True, 70 | how='left') 71 | 72 | # adding twitter data 73 | if twitter_path != False: 74 | twitter_df = pd.DataFrame([]) 75 | cols = [ 'post_date','comment_num', 'retweet_num', 'like_num', 'prediction'] 76 | for y in year_list: 77 | df = pd.read_csv(twitter_path + str(company) +"_tweets_" + str(y)+'.csv', usecols=cols) 78 | twitter_df = twitter_df.append(df) 79 | 80 | grouped_twitter_df = get_grouped_df('twitter', twitter_df) 81 | final_df = final_df.merge(grouped_twitter_df, 82 | left_index=True, 83 | right_index=True, 84 | how='left') 85 | 86 | # fillna with 0 87 | final_df.fillna(0, inplace=True) 88 | if len(return_cols) == 0: 89 | return_cols = final_df.columns 90 | 91 | 92 | return final_df.index.tolist(), final_df[return_cols].to_numpy() 93 | 94 | # if __name__ == '__main__': 95 | # print(load_data("TSLA", "stock_data/tesla/", "reddit/TSLA/","twitter/TSLA/", [2015,2016,2017,2018], [])) 96 | 97 | 98 | 99 | -------------------------------------------------------------------------------- /backend/extrapolate_backend.py: -------------------------------------------------------------------------------- 1 | from models_keras import get_LSTM_model 2 | from data_proc import load_data 3 | from utils import init_scalers, fit_scalers, transform_scalers, viz_test_vs_pred 4 | 5 | import math 6 | import matplotlib.pyplot as plt 7 | import keras as keras 8 | import pandas as pd 9 | import numpy as np 10 | import pickle 11 | import random 12 | 13 | from sklearn.preprocessing import MinMaxScaler 14 | import argparse 15 | 16 | 17 | def extrapolate_func(window_size, company, modelpath, scpath, fdpath, stockdir, redditdir, twitterdir, years): 18 | 19 | ############### Model loading #################### 20 | 21 | model = get_LSTM_model(window_size, 11) 22 | model.load_weights(modelpath) 23 | 24 | sc_dict = pickle.load(open(scpath, "rb")) 25 | 26 | #### Get last entry from train data to kick start test data 27 | dates_pre_test, X_pre_test_indi = load_data(company, stockdir, redditdir, twitterdir, [years[0] - 1], ['Close','positive_reddit','negative_reddit', 'neutral_reddit', 'count_reddit', 'positive_twitter', 'negative_twitter','neutral_twitter', 'count_twitter', 'retweet_num_twitter', 'like_num_twitter']) 28 | 29 | ############### Evaluate on test set (remove this later) #################### 30 | dates_test, X_test_indi = load_data(company, stockdir, redditdir, twitterdir, years, ['Close','positive_reddit','negative_reddit', 'neutral_reddit', 'count_reddit', 'positive_twitter', 'negative_twitter','neutral_twitter', 'count_twitter', 'retweet_num_twitter', 'like_num_twitter']) 31 | 32 | dates_pre_test_win = dates_pre_test[-window_size:] 33 | X_pre_test_indi_win = X_pre_test_indi[-window_size:] 34 | 35 | dates_test = dates_pre_test_win + dates_test 36 | X_test_indi = np.concatenate((X_pre_test_indi_win, X_test_indi)) 37 | 38 | 39 | 40 | # print(dates_test) 41 | print(len(dates_test)) 42 | print(X_test_indi.shape) 43 | # exit() 44 | 45 | ## test: Scale data, combine data into sequences then send into the model (remove this later) 46 | X_test_indi = transform_scalers(X_test_indi, sc_dict) 47 | 48 | X_test_orig = [] 49 | y_test_orig = [] 50 | for i in range(window_size, X_test_indi.shape[0]): 51 | X_test_orig.append(X_test_indi[i-window_size:i, :]) 52 | y_test_orig.append(X_test_indi[i, :][0]) # just closing price 53 | 54 | ### All test elements 55 | X_test_prev, y_test_prev = np.array(X_test_orig), np.array(y_test_orig) 56 | predicted_stock_price_prev = model.predict(X_test_prev) 57 | predicted_stock_price_prev = sc_dict["Close"].inverse_transform(predicted_stock_price_prev.reshape(-1, 1)).reshape(-1) 58 | y_test_prev = sc_dict["Close"].inverse_transform(y_test_prev.reshape(-1, 1)).reshape(-1) 59 | 60 | ## Get the last element for extrapolating 61 | X_test_elem, y_test_elem = np.array(X_test_orig[-1]), np.array(y_test_orig[-1]) 62 | X_test_elem = np.expand_dims(X_test_elem, axis=0) 63 | y_test_elem = np.expand_dims(y_test_elem, axis=0) 64 | print(X_test_elem.shape, y_test_elem.shape) 65 | print(X_test_elem, y_test_elem) 66 | 67 | X_test_curr = X_test_elem 68 | curr_date = dates_test[-1] 69 | 70 | last_entry = X_test_curr[0][-1].reshape(1, 1, -1) 71 | 72 | future_dates = [] 73 | y_future_preds = [] 74 | 75 | future_df = pd.read_csv(fdpath, sep=',').values 76 | num_weeks_future = future_df.shape[0] 77 | 78 | def calc_ratios_from_score(sent_score): 79 | 80 | pos_score, neutral_score, neg_score = 0.05, 0.05, 0.05 81 | sent_score = max(sent_score, 0) 82 | 83 | if sent_score > 0.75: 84 | pos_score = (sent_score - 0.75) / 0.25 85 | neutral_score = (3/4) * (1 - pos_score) 86 | neg_score = (1/4) * (1 - pos_score) 87 | 88 | elif sent_score < 0.25: 89 | neg_score = (0.25 - sent_score) / 0.25 90 | neutral_score = (3/4) * (1 - neg_score) 91 | pos_score = (1/4) * (1 - neg_score) 92 | 93 | else: 94 | if sent_score < 0.5: 95 | neutral_score = (sent_score - 0.25) / 0.25 96 | neg_score = (3/4) * (1 - neutral_score) 97 | pos_score = (1/4) * (1 - neutral_score) 98 | else: 99 | neutral_score = (0.75 - sent_score) / 0.25 100 | pos_score = (3/4) * (1 - neutral_score) 101 | neg_score = (1/4) * (1 - neutral_score) 102 | 103 | pos_score += (np.random.rand(1)[0] / 20) 104 | neutral_score += (np.random.rand(1)[0] / 20) 105 | neg_score += (np.random.rand(1)[0] / 20) 106 | 107 | sum_total = pos_score + neutral_score + neg_score 108 | final_ratios = [pos_score / sum_total, neg_score / sum_total, neutral_score / sum_total] 109 | 110 | return final_ratios 111 | 112 | 113 | for i in range(num_weeks_future - 1): 114 | 115 | curr_date = curr_date + pd.to_timedelta(2, unit='d') 116 | curr_week_future = future_df[i] 117 | next_week_future = future_df[i+1] 118 | 119 | reddit_sents = np.linspace(curr_week_future[1], next_week_future[1], 5) 120 | reddit_counts = np.linspace(curr_week_future[2], next_week_future[2], 5) 121 | twitter_sents = np.linspace(curr_week_future[3], next_week_future[3], 5) 122 | twitter_counts = np.linspace(curr_week_future[4], next_week_future[4], 5) 123 | twitter_activity = np.linspace(curr_week_future[5], next_week_future[5], 5) 124 | 125 | for day in range(5): 126 | # get random noise 127 | rand_val = random.uniform(-1, 1) / 10 128 | rand_val_2 = random.uniform(-1, 1) / 10 129 | rand_val_3 = random.uniform(-1, 1) / 10 130 | rand_val_4 = random.uniform(-1, 1) / 10 131 | 132 | # create new entry 133 | curr_date = curr_date + pd.to_timedelta(1, unit='d') 134 | 135 | predicted_stock_price_norm = model.predict(X_test_curr) 136 | predicted_stock_price_curr = sc_dict["Close"].inverse_transform(predicted_stock_price_norm.reshape(-1, 1)).reshape(-1) 137 | 138 | future_dates.append(curr_date) 139 | y_future_preds.append(predicted_stock_price_curr) 140 | 141 | print(curr_date, predicted_stock_price_curr) 142 | 143 | # Add predicted close price at 0th position 144 | new_entry = last_entry.copy() 145 | new_entry[0][0][0] = predicted_stock_price_norm[0][0] 146 | 147 | # Add reddit pos, neg, neutral 148 | reddit_sent_ratios_day = calc_ratios_from_score(reddit_sents[day]) 149 | # pos 150 | new_entry[0][0][1] = reddit_sent_ratios_day[0] 151 | # neg 152 | new_entry[0][0][2] = reddit_sent_ratios_day[1] 153 | # neutral 154 | new_entry[0][0][3] = reddit_sent_ratios_day[2] 155 | 156 | # Add count reddit 157 | reddit_ct_day = reddit_counts[day] 158 | new_entry[0][0][4] = reddit_ct_day + rand_val 159 | 160 | # Add twitter pos, neg, neutral 161 | twitter_sent_ratios_day = calc_ratios_from_score(twitter_sents[day]) 162 | # pos 163 | new_entry[0][0][5] = twitter_sent_ratios_day[0] 164 | # neg 165 | new_entry[0][0][6] = twitter_sent_ratios_day[1] 166 | # neutral 167 | new_entry[0][0][7] = twitter_sent_ratios_day[2] 168 | 169 | # Add count twitter 170 | twitter_ct_day = twitter_counts[day] 171 | new_entry[0][0][8] = twitter_ct_day + rand_val_2 172 | 173 | # Add retweet twitter 174 | twitter_retweet_day = twitter_activity[day] 175 | new_entry[0][0][9] = twitter_retweet_day + rand_val_3 176 | 177 | # Add likes twitter 178 | twitter_likes_day = twitter_activity[day] 179 | new_entry[0][0][10] = twitter_likes_day + rand_val_4 180 | 181 | 182 | print(new_entry) 183 | 184 | 185 | X_test_curr = np.concatenate((X_test_curr, new_entry), axis=1)[:, 1:, :] 186 | 187 | 188 | # Plotting 189 | all_dates = dates_test[window_size:] + future_dates 190 | # print(all_dates) 191 | all_preds = np.concatenate((predicted_stock_price_prev, np.array(y_future_preds).reshape(-1))) 192 | all_labels = np.concatenate((y_test_prev, np.array(y_future_preds).reshape(-1))) 193 | print(len(all_dates)) 194 | print(len(all_preds)) 195 | print(len(all_labels)) 196 | result = {"dates": all_dates, "actual": y_test_prev.tolist(), "pred": all_preds.tolist()} 197 | # for i in range(len(all_dates)): 198 | # result(all_dates[i], all_preds[i], all_labels[i])) 199 | return result 200 | # plt.plot(all_dates, all_labels, color = "red", label = "Real Stock Price") 201 | # plt.plot(all_dates, all_preds, color = "blue", label = "Predicted Stock Price") 202 | 203 | # plt.xlabel('Time') 204 | # plt.ylabel('Stock Price') 205 | # plt.legend() 206 | # plt.show() 207 | 208 | # print(extrapolate_func(90, "TSLA", "weights/best_weights_TSLA_wsize90.hdf5", 209 | # "weights/TSLA_wsize90_sc_dict.p", "Tesla_5_weeks.csv", "stock_data/tesla/", 210 | # "reddit/TSLA/", "twitter/TSLA/", [2019])) 211 | # Parser 212 | # parser = argparse.ArgumentParser() 213 | # parser.add_argument('--window_size', default=100, type=int, help='window size') 214 | 215 | # parser.add_argument('--company', type=str, help='company', required=True) 216 | # parser.add_argument('--modelpath', type=str, help='model path', required=True) 217 | # parser.add_argument('--scpath', type=str, help='scaler path', required=True) 218 | # parser.add_argument('--fdpath', type=str, help='future data csv path', required=True) 219 | 220 | # # Remove this later and just use one entry in a csv 221 | # parser.add_argument('--stockdir', type=str, help='stock') 222 | # parser.add_argument('--redditdir', type=str, help='reddit') 223 | # parser.add_argument('--twitterdir', type=str, help='twitter') 224 | # parser.add_argument('--years', nargs='+', help='Years') 225 | 226 | # args = parser.parse_args() 227 | # args.years = [int(year) for year in args.years] 228 | 229 | # extrapolate_func(args.window_size, args.company, args.modelpath, args.scpath, args.fdpath, args.stockdir, args.redditdir, args.twitterdir, args.years) 230 | 231 | 232 | 233 | -------------------------------------------------------------------------------- /backend/extrapolate_predict_keras_old.py: -------------------------------------------------------------------------------- 1 | from models_keras import get_LSTM_model 2 | from data_proc import load_data 3 | from utils import init_scalers, fit_scalers, transform_scalers, viz_test_vs_pred 4 | 5 | import math 6 | import matplotlib.pyplot as plt 7 | import keras as keras 8 | import pandas as pd 9 | import numpy as np 10 | import pickle 11 | 12 | from sklearn.preprocessing import MinMaxScaler 13 | import argparse 14 | 15 | # Parser 16 | parser = argparse.ArgumentParser() 17 | parser.add_argument('--window_size', default=100, type=int, help='window size') 18 | parser.add_argument('--futuredays', default=100, type=int, help='window size') 19 | parser.add_argument('--futuresentiment', type=str, help='positive | negative | neutral', required=True) 20 | 21 | parser.add_argument('--company', type=str, help='company', required=True) 22 | parser.add_argument('--modelpath', type=str, help='model path', required=True) 23 | parser.add_argument('--scpath', type=str, help='scaler path', required=True) 24 | 25 | # Remove this later and just use one entry in a csv 26 | parser.add_argument('--stockdir', type=str, help='stock') 27 | parser.add_argument('--redditdir', type=str, help='reddit') 28 | parser.add_argument('--twitterdir', type=str, help='twitter') 29 | parser.add_argument('--years', nargs='+', help='Years') 30 | 31 | args = parser.parse_args() 32 | args.years = [int(year) for year in args.years] 33 | 34 | ############### Model loading #################### 35 | 36 | model = get_LSTM_model(args.window_size, 11) 37 | model.load_weights(args.modelpath) 38 | 39 | sc_dict = pickle.load(open(args.scpath, "rb")) 40 | 41 | ############### Evaluate on test set (remove this later) #################### 42 | dates_test, X_test_indi = load_data(args.company, args.stockdir, args.redditdir, args.twitterdir, args.years, ['Close','positive_reddit','negative_reddit', 'neutral_reddit', 'count_reddit', 'positive_twitter', 'negative_twitter','neutral_twitter', 'count_twitter', 'retweet_num_twitter', 'like_num_twitter']) 43 | 44 | ## test: Scale data, combine data into sequences then send into the model (remove this later) 45 | X_test_indi = transform_scalers(X_test_indi, sc_dict) 46 | 47 | X_test_orig = [] 48 | y_test_orig = [] 49 | for i in range(args.window_size, X_test_indi.shape[0]): 50 | X_test_orig.append(X_test_indi[i-args.window_size:i, :]) 51 | y_test_orig.append(X_test_indi[i, :][0]) # just closing price 52 | 53 | ### All test elements 54 | X_test_prev, y_test_prev = np.array(X_test_orig), np.array(y_test_orig) 55 | predicted_stock_price_prev = model.predict(X_test_prev) 56 | predicted_stock_price_prev = sc_dict["Close"].inverse_transform(predicted_stock_price_prev.reshape(-1, 1)).reshape(-1) 57 | y_test_prev = sc_dict["Close"].inverse_transform(y_test_prev.reshape(-1, 1)).reshape(-1) 58 | 59 | 60 | # viz_test_vs_pred("trial", dates_test[args.window_size:], y_test_prev, predicted_stock_price_prev, True) 61 | # print(dates_test) 62 | # import pdb; pdb.set_trace() 63 | 64 | ## Get the last element for extrapolating 65 | X_test_elem, y_test_elem = np.array(X_test_orig[-1]), np.array(y_test_orig[-1]) 66 | X_test_elem = np.expand_dims(X_test_elem, axis=0) 67 | y_test_elem = np.expand_dims(y_test_elem, axis=0) 68 | print(X_test_elem.shape, y_test_elem.shape) 69 | print(X_test_elem, y_test_elem) 70 | 71 | X_test_curr = X_test_elem 72 | y_test_curr = y_test_elem 73 | curr_date = dates_test[-1] 74 | 75 | last_entry = X_test_curr[0][-1].reshape(1, 1, -1) 76 | 77 | future_dates = [] 78 | y_future_preds = [] 79 | 80 | trial = args.futuresentiment 81 | 82 | for i in range(args.futuredays): 83 | 84 | # ignoring weekends at the moment 85 | curr_date = curr_date + pd.to_timedelta(1, unit='d') 86 | 87 | predicted_stock_price_norm = model.predict(X_test_curr) 88 | predicted_stock_price_curr = sc_dict["Close"].inverse_transform(predicted_stock_price_norm.reshape(-1, 1)).reshape(-1) 89 | 90 | future_dates.append(curr_date) 91 | y_future_preds.append(predicted_stock_price_curr) 92 | 93 | print(curr_date, predicted_stock_price_curr) 94 | 95 | new_entry = last_entry.copy() 96 | new_entry[0][0][0] = predicted_stock_price_norm[0][0] 97 | 98 | if trial == "positive": 99 | rand_val = np.random.rand(1)[0] / 5 100 | rand_val2 = np.random.rand(1)[0] / 5 101 | 102 | new_entry[0][0][1] = 0.7 + rand_val 103 | new_entry[0][0][2] = 0.15 - (rand_val / 2) 104 | new_entry[0][0][3] = 0.15 - (rand_val / 2) 105 | new_entry[0][0][5] = 0.7 + rand_val2 106 | new_entry[0][0][6] = 0.15 - (rand_val2 / 2) 107 | new_entry[0][0][7] = 0.15 - (rand_val2 / 2) 108 | 109 | elif trial == "negative": 110 | 111 | rand_val = np.random.rand(1)[0] / 5 112 | rand_val2 = np.random.rand(1)[0] / 5 113 | 114 | new_entry[0][0][1] = 0.15 + (rand_val / 2) 115 | new_entry[0][0][2] = 0.7 - rand_val 116 | new_entry[0][0][3] = 0.15 - (rand_val / 2) 117 | new_entry[0][0][5] = 0.15 + (rand_val2 / 2) 118 | new_entry[0][0][6] = 0.7 - rand_val2 119 | new_entry[0][0][7] = 0.15 - (rand_val2 / 2) 120 | 121 | elif trial == "neutral": 122 | 123 | rand_val = np.random.rand(1)[0] / 5 124 | rand_val2 = np.random.rand(1)[0] / 5 125 | 126 | new_entry[0][0][1] = 0.15 + (rand_val / 2) 127 | new_entry[0][0][2] = 0.15 - (rand_val / 2) 128 | new_entry[0][0][3] = 0.7 - rand_val 129 | 130 | new_entry[0][0][5] = 0.15 + (rand_val2 / 2) 131 | new_entry[0][0][6] = 0.15 - (rand_val2 / 2) 132 | new_entry[0][0][7] = 0.7 - rand_val2 133 | 134 | 135 | X_test_curr = np.concatenate((X_test_curr, new_entry), axis=1)[:, 1:, :] 136 | 137 | 138 | # Plotting 139 | # import pdb; pdb.set_trace() 140 | 141 | all_dates = dates_test[args.window_size:] + future_dates 142 | all_preds = np.concatenate((predicted_stock_price_prev, np.array(y_future_preds).reshape(-1))) 143 | all_labels = np.concatenate((y_test_prev, np.array(y_future_preds).reshape(-1))) 144 | 145 | plt.plot(all_dates, all_labels, color = "red", label = "Real Stock Price") 146 | plt.plot(all_dates, all_preds, color = "blue", label = "Predicted Stock Price") 147 | 148 | # plt.title(title) 149 | plt.xlabel('Time') 150 | plt.ylabel('Stock Price') 151 | plt.legend() 152 | plt.show() 153 | 154 | 155 | 156 | 157 | 158 | -------------------------------------------------------------------------------- /backend/finBert/model/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "_name_or_path": "/home/ubuntu/finbert/models/language_model/finbertTRC2", 3 | "architectures": [ 4 | "BertForSequenceClassification" 5 | ], 6 | "attention_probs_dropout_prob": 0.1, 7 | "gradient_checkpointing": false, 8 | "hidden_act": "gelu", 9 | "hidden_dropout_prob": 0.1, 10 | "hidden_size": 768, 11 | "id2label": { 12 | "0": "positive", 13 | "1": "negative", 14 | "2": "neutral" 15 | }, 16 | "initializer_range": 0.02, 17 | "intermediate_size": 3072, 18 | "label2id": { 19 | "positive": 0, 20 | "negative": 1, 21 | "neutral": 2 22 | }, 23 | "layer_norm_eps": 1e-12, 24 | "max_position_embeddings": 512, 25 | "model_type": "bert", 26 | "num_attention_heads": 12, 27 | "num_hidden_layers": 12, 28 | "pad_token_id": 0, 29 | "position_embedding_type": "absolute", 30 | "type_vocab_size": 2, 31 | "vocab_size": 30522 32 | } -------------------------------------------------------------------------------- /backend/finBert/sentiment.py: -------------------------------------------------------------------------------- 1 | from nltk.tokenize import sent_tokenize 2 | from transformers import AutoTokenizer, AutoModelForSequenceClassification 3 | import pandas as pd 4 | import numpy as np 5 | import torch 6 | import re 7 | 8 | 9 | # UTILITY FUNCTIONS 10 | class InputExample(object): 11 | """A single training/test example for simple sequence classification.""" 12 | 13 | def __init__(self, guid, text, label=None, agree=None): 14 | """ 15 | Constructs an InputExample 16 | Parameters 17 | ---------- 18 | guid: str 19 | Unique id for the examples 20 | text: str 21 | Text for the first sequence. 22 | label: str, optional 23 | Label for the example. 24 | agree: str, optional 25 | For FinBERT , inter-annotator agreement level. 26 | """ 27 | self.guid = guid 28 | self.text = text 29 | self.label = label 30 | self.agree = agree 31 | 32 | class InputFeatures(object): 33 | """ 34 | A single set of features for the data. 35 | """ 36 | 37 | def __init__(self, input_ids, attention_mask, token_type_ids, label_id, agree=None): 38 | self.input_ids = input_ids 39 | self.attention_mask = attention_mask 40 | self.token_type_ids = token_type_ids 41 | self.label_id = label_id 42 | self.agree = agree 43 | 44 | def chunks(l, n): 45 | """ 46 | Simple utility function to split a list into fixed-length chunks. 47 | Parameters 48 | ---------- 49 | l: list 50 | given list 51 | n: int 52 | length of the sequence 53 | """ 54 | for i in range(0, len(l), n): 55 | # Create an index range for l of n items: 56 | yield l[i:i + n] 57 | 58 | def softmax(x): 59 | """Compute softmax values for each sets of scores in x.""" 60 | e_x = np.exp(x - np.max(x, axis=1)[:, None]) 61 | return e_x / np.sum(e_x, axis=1)[:, None] 62 | 63 | def convert_examples_to_features(examples, label_list, max_seq_length, tokenizer, mode='classification'): 64 | """ 65 | Loads a data file into a list of InputBatch's. With this function, the InputExample's are converted to features 66 | that can be used for the model. Text is tokenized, converted to ids and zero-padded. Labels are mapped to integers. 67 | 68 | Parameters 69 | ---------- 70 | examples: list 71 | A list of InputExample's. 72 | label_list: list 73 | The list of labels. 74 | max_seq_length: int 75 | The maximum sequence length. 76 | tokenizer: BertTokenizer 77 | The tokenizer to be used. 78 | mode: str, optional 79 | The task type: 'classification' or 'regression'. Default is 'classification' 80 | 81 | Returns 82 | ------- 83 | features: list 84 | A list of InputFeature's, which is an InputBatch. 85 | """ 86 | 87 | if mode == 'classification': 88 | label_map = {label: i for i, label in enumerate(label_list)} 89 | label_map[None] = 9090 90 | 91 | features = [] 92 | for (ex_index, example) in enumerate(examples): 93 | tokens = tokenizer.tokenize(example.text) 94 | 95 | if len(tokens) > max_seq_length - 2: 96 | tokens = tokens[:(max_seq_length // 4) - 1] + tokens[ 97 | len(tokens) - (3 * max_seq_length // 4) + 1:] 98 | 99 | tokens = ["[CLS]"] + tokens + ["[SEP]"] 100 | 101 | token_type_ids = [0] * len(tokens) 102 | 103 | input_ids = tokenizer.convert_tokens_to_ids(tokens) 104 | 105 | attention_mask = [1] * len(input_ids) 106 | 107 | padding = [0] * (max_seq_length - len(input_ids)) 108 | input_ids += padding 109 | attention_mask += padding 110 | 111 | 112 | token_type_ids += padding 113 | 114 | assert len(input_ids) == max_seq_length 115 | assert len(attention_mask) == max_seq_length 116 | assert len(token_type_ids) == max_seq_length 117 | 118 | if mode == 'classification': 119 | label_id = label_map[example.label] 120 | elif mode == 'regression': 121 | label_id = float(example.label) 122 | else: 123 | raise ValueError("The mode should either be classification or regression. You entered: " + mode) 124 | 125 | agree = example.agree 126 | mapagree = {'0.5': 1, '0.66': 2, '0.75': 3, '1.0': 4} 127 | try: 128 | agree = mapagree[agree] 129 | except: 130 | agree = 0 131 | 132 | features.append( 133 | InputFeatures(input_ids=input_ids, 134 | attention_mask=attention_mask, 135 | token_type_ids=token_type_ids, 136 | label_id=label_id, 137 | agree=agree)) 138 | return features 139 | 140 | def get_prediction(text_list): 141 | # Load model 142 | tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased") 143 | model = AutoModelForSequenceClassification.from_pretrained('finBert/model/',num_labels=3,cache_dir=None) 144 | 145 | 146 | # Get prediction 147 | model.eval() 148 | label_list = ['positive', 'negative', 'neutral'] 149 | label_dict = {0: 'positive', 1: 'negative', 2: 'neutral'} 150 | sentiment = [] 151 | 152 | for text in text_list: 153 | # Preprocess text 154 | text = re.sub(r'[^\w\s]', '', text) 155 | text += '.' 156 | sentences = sent_tokenize(text) 157 | examples = [InputExample(str(i), sentence) for i, sentence in enumerate(sentences)] 158 | features = convert_examples_to_features(examples, label_list, 64, tokenizer) 159 | all_input_ids = torch.tensor([f.input_ids for f in features], dtype=torch.long) 160 | all_attention_mask = torch.tensor([f.attention_mask for f in features], dtype=torch.long) 161 | all_token_type_ids = torch.tensor([f.token_type_ids for f in features], dtype=torch.long) 162 | with torch.no_grad(): 163 | logits = model(all_input_ids, all_attention_mask, all_token_type_ids)[0] 164 | logits = softmax(np.array(logits)) 165 | predictions = np.squeeze(np.argmax(logits, axis=1)) 166 | sentiment.append(label_dict[int(predictions)]) 167 | return sentiment 168 | 169 | #text = ["Dogecoin Breaks Past The 20 Cents Mark In Dizzying Rally With 'Earnings Report' Achievement Unlocked $DOGE $BTC… https://t.co/DUQt8r2m70", "hi"] 170 | #print(get_prediction(text)) 171 | 172 | -------------------------------------------------------------------------------- /backend/main_keras.py: -------------------------------------------------------------------------------- 1 | from models_keras import get_LSTM_model 2 | from data_proc import load_data 3 | from utils import init_scalers, fit_scalers, transform_scalers, viz_test_vs_pred 4 | 5 | import math 6 | import matplotlib.pyplot as plt 7 | import keras as keras 8 | import pandas as pd 9 | import numpy as np 10 | import pickle 11 | 12 | from sklearn.preprocessing import MinMaxScaler 13 | import argparse 14 | 15 | # Parser 16 | parser = argparse.ArgumentParser() 17 | parser.add_argument('--window_size', default=100, type=int, help='window size') 18 | parser.add_argument('--num_epochs', default=100, type=int, help='num epochs') 19 | 20 | parser.add_argument('--company', type=str, help='company', required=True) 21 | parser.add_argument('--stockdir', type=str, help='stock', required=True) 22 | parser.add_argument('--redditdir', type=str, help='reddit', required=True) 23 | parser.add_argument('--twitterdir', type=str, help='twitter', required=True) 24 | parser.add_argument('--trainyears', nargs='+', help='Years', required=True) 25 | parser.add_argument('--testyears', nargs='+', help='Years', required=True) 26 | 27 | args = parser.parse_args() 28 | 29 | ### Default parameters ### 30 | window_size = int(args.window_size) 31 | title = "TSLA_wsize" + str(window_size) 32 | args.trainyears = [int(year) for year in args.trainyears] 33 | args.testyears = [int(year) for year in args.testyears] 34 | 35 | print(title) 36 | print(args.trainyears) 37 | print(args.testyears) 38 | 39 | ############### Dataloading #################### 40 | ## Implement load_data function 41 | ## If you are changing fields, you need to change the indices in utils.py fit scalers as well as the label index when making label sets 42 | 43 | dates_train, X_train_indi = load_data(args.company, args.stockdir, args.redditdir, args.twitterdir, args.trainyears, ['Close','positive_reddit','negative_reddit', 'neutral_reddit', 'count_reddit', 'positive_twitter', 'negative_twitter','neutral_twitter', 'count_twitter', 'retweet_num_twitter', 'like_num_twitter']) 44 | 45 | dates_test, X_test_indi = load_data(args.company, args.stockdir, args.redditdir, args.twitterdir, args.testyears, ['Close','positive_reddit','negative_reddit', 'neutral_reddit', 'count_reddit', 'positive_twitter', 'negative_twitter','neutral_twitter', 'count_twitter', 'retweet_num_twitter', 'like_num_twitter']) 46 | 47 | # dates_train, X_train_indi = load_data("TSLA", 'stock_data/tesla/', "reddit/TSLA/","twitter/TSLA/", [2015, 2016, 2017, 2018], ['Close','positive_reddit','negative_reddit', 'neutral_reddit', 'count_reddit', 'positive_twitter', 'negative_twitter','neutral_twitter', 'count_twitter', 'retweet_num_twitter', 'like_num_twitter']) 48 | 49 | # dates_test, X_test_indi = load_data("TSLA", 'stock_data/tesla/', "reddit/TSLA/","twitter/TSLA/", [2019], ['Close','positive_reddit','negative_reddit', 'neutral_reddit', 'count_reddit', 'positive_twitter', 'negative_twitter','neutral_twitter', 'count_twitter', 'retweet_num_twitter', 'like_num_twitter']) 50 | 51 | 52 | ## train: Scale data, combine data into sequences then send into the model 53 | sc_dict = init_scalers() 54 | X_train_indi, sc_dict = fit_scalers(X_train_indi, sc_dict) 55 | 56 | X_train = [] 57 | y_train = [] 58 | for i in range(window_size, X_train_indi.shape[0]): 59 | X_train.append(X_train_indi[i-window_size:i, :]) 60 | y_train.append(X_train_indi[i, :][0]) # just closing price 61 | 62 | X_train, y_train = np.array(X_train), np.array(y_train) 63 | print(X_train.shape, y_train.shape) 64 | 65 | ## test: Scale data, combine data into sequences then send into the model 66 | X_test_indi = transform_scalers(X_test_indi, sc_dict) 67 | 68 | X_test = [] 69 | y_test = [] 70 | for i in range(window_size, X_test_indi.shape[0]): 71 | X_test.append(X_test_indi[i-window_size:i, :]) 72 | y_test.append(X_test_indi[i, :][0]) # just closing price 73 | 74 | X_test, y_test = np.array(X_test), np.array(y_test) 75 | print(X_test.shape, y_test.shape) 76 | 77 | ############### Model setup and train #################### 78 | 79 | model = get_LSTM_model(X_train.shape[1], X_train.shape[2]) 80 | 81 | # Callbacks 82 | best_weights_filepath = './weights/best_weights_' + str(title) + '.hdf5' 83 | earlyStopping = keras.callbacks.EarlyStopping(monitor='val_loss', patience=10, verbose=1, mode='auto') 84 | saveBestModel = keras.callbacks.ModelCheckpoint(best_weights_filepath, monitor='val_loss', verbose=1, save_best_only=True, mode='auto') 85 | 86 | # Fitting the model to the Training set 87 | hist_train = model.fit(X_train, y_train, epochs = args.num_epochs, batch_size = 16, shuffle=True, validation_data=(X_test, y_test), callbacks=[earlyStopping, saveBestModel]) 88 | 89 | pickle.dump(sc_dict, open('./weights/' + title + "_sc_dict.p", "wb")) 90 | 91 | ############### Testing and Viz #################### 92 | #reload best weights 93 | model.load_weights(best_weights_filepath) 94 | sc_dict = pickle.load(open('./weights/' + title + "_sc_dict.p", "rb" ) ) 95 | 96 | predicted_stock_price = model.predict(X_test) 97 | hist_test = model.evaluate(X_test, y_test) 98 | 99 | #Rescale stock prices 100 | predicted_stock_price = sc_dict["Close"].inverse_transform(predicted_stock_price.reshape(-1, 1)).reshape(-1) 101 | y_test = sc_dict["Close"].inverse_transform(y_test.reshape(-1, 1)).reshape(-1) 102 | 103 | RMSE_score = (np.sum(np.power(y_test - predicted_stock_price, 2))) / y_test.shape[0] 104 | RMSE_score = np.sqrt(RMSE_score) 105 | 106 | print("RMSE is {}".format(RMSE_score)) 107 | 108 | viz_test_vs_pred(title + " -- RMSE: {:.2f}".format(RMSE_score), dates_test[window_size:], y_test, predicted_stock_price) 109 | -------------------------------------------------------------------------------- /backend/models_keras.py: -------------------------------------------------------------------------------- 1 | import keras 2 | from keras.models import Sequential 3 | from keras.layers import Dense 4 | from keras.layers import LSTM 5 | from keras.layers import Dropout 6 | from keras.layers import * 7 | 8 | def get_LSTM_model(window_size, dim_size): 9 | 10 | model = Sequential() 11 | 12 | model.add(LSTM(units = 50, return_sequences = True, input_shape = (window_size, dim_size))) # X_train.shape[1] 13 | model.add(Dropout(0.2)) 14 | 15 | model.add(LSTM(units = 50, return_sequences = True)) 16 | model.add(Dropout(0.2)) 17 | 18 | model.add(LSTM(units = 50, return_sequences = True)) 19 | model.add(Dropout(0.2)) 20 | 21 | model.add(LSTM(units = 50)) 22 | model.add(Dropout(0.2)) 23 | 24 | model.add(Dense(units = 1)) 25 | 26 | model.compile(optimizer = 'adam', loss = 'mean_squared_error', metrics=[keras.metrics.MeanSquaredError()]) 27 | 28 | return model -------------------------------------------------------------------------------- /backend/predict_keras.py: -------------------------------------------------------------------------------- 1 | from models_keras import get_LSTM_model 2 | from data_proc import load_data 3 | from utils import init_scalers, fit_scalers, transform_scalers, viz_test_vs_pred 4 | 5 | import math 6 | import matplotlib.pyplot as plt 7 | import keras as keras 8 | import pandas as pd 9 | import numpy as np 10 | import pickle 11 | 12 | from sklearn.preprocessing import MinMaxScaler 13 | import argparse 14 | 15 | # Parser 16 | parser = argparse.ArgumentParser() 17 | parser.add_argument('--window_size', default=100, type=int, help='window size') 18 | parser.add_argument('--company', type=str, help='company', required=True) 19 | parser.add_argument('--modelpath', type=str, help='model path', required=True) 20 | parser.add_argument('--scpath', type=str, help='scaler path', required=True) 21 | 22 | parser.add_argument('--stockdir', type=str, help='stock') 23 | parser.add_argument('--redditdir', type=str, help='reddit') 24 | parser.add_argument('--twitterdir', type=str, help='twitter') 25 | parser.add_argument('--years', nargs='+', help='Years') 26 | 27 | args = parser.parse_args() 28 | args.years = [int(year) for year in args.years] 29 | 30 | ############### Model loading #################### 31 | 32 | model = get_LSTM_model(args.window_size, 11) 33 | model.load_weights(args.modelpath) 34 | 35 | sc_dict = pickle.load(open(args.scpath, "rb")) 36 | 37 | #### Get last entry from train data to kick start test data 38 | dates_pre_test, X_pre_test_indi = load_data(args.company, args.stockdir, args.redditdir, args.twitterdir, [args.years[0] - 1], ['Close','positive_reddit','negative_reddit', 'neutral_reddit', 'count_reddit', 'positive_twitter', 'negative_twitter','neutral_twitter', 'count_twitter', 'retweet_num_twitter', 'like_num_twitter']) 39 | 40 | ############### Evaluate on test set #################### 41 | dates_test, X_test_indi = load_data(args.company, args.stockdir, args.redditdir, args.twitterdir, args.years, ['Close','positive_reddit','negative_reddit', 'neutral_reddit', 'count_reddit', 'positive_twitter', 'negative_twitter','neutral_twitter', 'count_twitter', 'retweet_num_twitter', 'like_num_twitter']) 42 | 43 | dates_pre_test_win = dates_pre_test[-args.window_size:] 44 | X_pre_test_indi_win = X_pre_test_indi[-args.window_size:] 45 | 46 | dates_test = dates_pre_test_win + dates_test 47 | X_test_indi = np.concatenate((X_pre_test_indi_win, X_test_indi)) 48 | 49 | ## test: Scale data, combine data into sequences then send into the model 50 | X_test_indi = transform_scalers(X_test_indi, sc_dict) 51 | 52 | X_test = [] 53 | y_test = [] 54 | for i in range(args.window_size, X_test_indi.shape[0]): 55 | X_test.append(X_test_indi[i-args.window_size:i, :]) 56 | y_test.append(X_test_indi[i, :][0]) # just closing price 57 | 58 | X_test, y_test = np.array(X_test), np.array(y_test) 59 | print(X_test.shape, y_test.shape) 60 | 61 | predicted_stock_price = model.predict(X_test) 62 | hist_test = model.evaluate(X_test, y_test) 63 | 64 | predicted_stock_price = sc_dict["Close"].inverse_transform(predicted_stock_price.reshape(-1, 1)).reshape(-1) 65 | y_test = sc_dict["Close"].inverse_transform(y_test.reshape(-1, 1)).reshape(-1) 66 | 67 | RMSE_score = (np.sum(np.power(y_test - predicted_stock_price, 2))) / y_test.shape[0] 68 | RMSE_score = np.sqrt(RMSE_score) 69 | 70 | print("RMSE is {}".format(RMSE_score)) 71 | 72 | viz_test_vs_pred("RMSE: {:.2f}".format(RMSE_score), dates_test[args.window_size:], y_test, predicted_stock_price) 73 | 74 | 75 | -------------------------------------------------------------------------------- /backend/runs.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | python3 main_keras.py --window_size 90 --company TSLA --stockdir stock_data/tesla/ --redditdir reddit/TSLA/ --twitterdir twitter/TSLA/ --trainyears 2015 2016 2017 2018 --testyears 2019 4 | 5 | python3 predict_keras.py --window_size 90 --company TSLA --modelpath weights/best_weights_TSLA_wsize90.hdf5 --scpath weights/TSLA_wsize90_sc_dict.p --stockdir stock_data/tesla/ --redditdir reddit/TSLA/ --twitterdir twitter/TSLA/ --years 2019 6 | 7 | python3 extrapolate_backend.py --window_size 90 --company TSLA --modelpath weights/best_weights_TSLA_wsize90.hdf5 --scpath weights/TSLA_wsize90_sc_dict.p --fdpath Tesla_5_weeks.csv --stockdir stock_data/tesla/ --redditdir reddit/TSLA/ --twitterdir twitter/TSLA/ --years 2019 -------------------------------------------------------------------------------- /backend/server.py: -------------------------------------------------------------------------------- 1 | # Imports 2 | import os 3 | import tweepy as tw 4 | import pandas as pd 5 | from extrapolate_backend import extrapolate_func 6 | import csv 7 | from flask import Flask, request, jsonify 8 | from flask_cors import CORS, cross_origin 9 | from datetime import datetime, timedelta 10 | from dotenv import load_dotenv 11 | from finBert.sentiment import get_prediction 12 | 13 | # Need to do one time 14 | #import nltk 15 | #import ssl 16 | #try: 17 | #_create_unverified_https_context = ssl._create_unverified_context 18 | #except AttributeError: 19 | #pass 20 | #else: 21 | #ssl._create_default_https_context = _create_unverified_https_context 22 | #nltk.download('punkt') 23 | 24 | # Load env variables 25 | load_dotenv() 26 | 27 | # Flask configuration 28 | app = Flask(__name__) 29 | cors = CORS(app) 30 | app.config['CORS_HEADERS'] = 'Content-Type' 31 | 32 | # API for getting sentiment of popular tweets 33 | # in the last 7 days with finBert 34 | @app.route('/get_sentiment', methods=['POST']) 35 | @cross_origin() 36 | def get_sentiment(): 37 | date_since = str(datetime.now() - timedelta(days=7)).split(' ')[0] 38 | curr_date = str(datetime.now()).split(' ')[0] 39 | search_words = request.json['query'] + ' -filter:retweets until:'+curr_date 40 | 41 | # Initialize tweepy isntance 42 | auth = tw.OAuthHandler(os.getenv('consumer_key'), os.getenv('consumer_secret')) 43 | auth.set_access_token(os.getenv('access_token'), os.getenv('access_token_secret')) 44 | api = tw.API(auth, wait_on_rate_limit=True) 45 | 46 | # Get tweets 47 | tweets = tw.Cursor(api.search, 48 | q=search_words, 49 | lang="en", 50 | since=date_since, 51 | result_type="popular").items(100) 52 | 53 | tweet_list = [tweet for tweet in tweets] 54 | data = {'created_at': [], 'text': [], 'favorite_count': [], 'sentiment': []} 55 | created_at, text, favorite_count = [], [], [] 56 | for tweet in tweet_list: 57 | data['created_at'].append(str(tweet.created_at).split(' ')[0]) 58 | data['text'].append(tweet.text) 59 | data['favorite_count'].append(tweet.favorite_count) 60 | 61 | sentiment_list = get_prediction(data['text']) 62 | for sentiment in sentiment_list: 63 | data['sentiment'].append(sentiment) 64 | 65 | # Create dataframe 66 | df = pd.DataFrame(data) 67 | if (len(df.index) == 0) : 68 | return jsonify({'error': 'Could not find any tweets or input had invalid character such as "$"'}) 69 | keys1, keys2 = [], [] 70 | 71 | # Get data for graphs 72 | positive_df = df[df['sentiment']=='positive'] 73 | negative_df = df[df['sentiment']=='negative'] 74 | neutral_df = df[df['sentiment']=='neutral'] 75 | if len(positive_df.index) > 0: 76 | keys1.append('positive_count') 77 | keys2.append('positive') 78 | if len(negative_df.index) > 0: 79 | keys1.append('negative_count') 80 | keys2.append('negative') 81 | if len(neutral_df.index) > 0: 82 | keys1.append('neutral_count') 83 | keys2.append('neutral') 84 | 85 | 86 | cols = ['count', 'negative_count', 'neutral_count', 'positive_count'] 87 | grouped_df = pd.DataFrame([], columns=cols) 88 | grouped_df['count'] = df['sentiment'].groupby(df['created_at']).count() 89 | grouped_df[keys1] = \ 90 | df.reset_index().groupby(['created_at','sentiment'], as_index=False)\ 91 | .size()\ 92 | .pivot(index='created_at', columns='sentiment', values='size')\ 93 | [keys2]\ 94 | .fillna(0) 95 | idx_negative = negative_df.groupby(['created_at'])['favorite_count'].transform(max) == negative_df['favorite_count'] 96 | top_negative = negative_df[idx_negative] 97 | idx_positive = positive_df.groupby(['created_at'])['favorite_count'].transform(max) == positive_df['favorite_count'] 98 | top_positive = positive_df[idx_positive] 99 | idx_neutral = neutral_df.groupby(['created_at'])['favorite_count'].transform(max) == neutral_df['favorite_count'] 100 | top_neutral = neutral_df[idx_neutral] 101 | x = pd.merge(grouped_df, top_negative, on='created_at', how='outer').fillna('N/A') 102 | x = x.drop(['sentiment'], axis=1) 103 | x = x.rename(columns={"text": "negative_text", "favorite_count": "negative_favorites"}) 104 | x = pd.merge(x, top_positive, on='created_at', how='outer').fillna('N/A') 105 | x = x.drop(['sentiment'], axis=1) 106 | x = x.rename(columns={"text": "positive_text", "favorite_count": "positive_favorites"}) 107 | x = pd.merge(x, top_neutral, on='created_at', how='outer').fillna('N/A') 108 | x = x.drop(['sentiment'], axis=1) 109 | x = x.rename(columns={"text": "neutral_text", "favorite_count": "neutral_favorites"}) 110 | 111 | output = {} 112 | for index, row in x.iterrows(): 113 | row_data = {} 114 | row_data['count'] = row['count'] 115 | row_data['negative_count'] = row['negative_count'] 116 | row_data['neutral_count'] = row['neutral_count'] 117 | row_data['positive_count'] = row['positive_count'] 118 | row_data['negative_text'] = row['negative_text'] 119 | row_data['negative_favorites'] = row['negative_favorites'] 120 | row_data['positive_text'] = row['positive_text'] 121 | row_data['positive_favorites'] = row['positive_favorites'] 122 | row_data['neutral_text'] = row['neutral_text'] 123 | row_data['neutral_favorites'] = row['neutral_favorites'] 124 | output[row['created_at']] = row_data 125 | 126 | return jsonify(output) 127 | 128 | @app.route('/', methods=['POST']) 129 | @cross_origin() 130 | def get_plots(): 131 | with open(f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', 'w', newline='') as csvfile: 132 | spamwriter = csv.writer(csvfile, delimiter=',', quotechar='|', quoting=csv.QUOTE_MINIMAL) 133 | for row in request.json["data"]: 134 | spamwriter.writerow(row) 135 | if request.json["company"] == "Tesla": 136 | if request.json["window"] == "30": 137 | result = extrapolate_func(int(request.json["window"]), "TSLA", "best_weights/best_weights_TSLA_wsize30.hdf5", 138 | "best_weights/TSLA_wsize30_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/tesla/", 139 | "reddit/TSLA/", "twitter/TSLA/", [2019]) 140 | elif request.json["window"] == "60": 141 | result = extrapolate_func(int(request.json["window"]), "TSLA", "best_weights/best_weights_TSLA_wsize60.hdf5", 142 | "best_weights/TSLA_wsize60_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/tesla/", 143 | "reddit/TSLA/", "twitter/TSLA/", [2019]) 144 | elif request.json["window"] == "90": 145 | result = extrapolate_func(int(request.json["window"]), "TSLA", "best_weights/best_weights_TSLA_wsize90.hdf5", 146 | "best_weights/TSLA_wsize90_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/tesla/", 147 | "reddit/TSLA/", "twitter/TSLA/", [2019]) 148 | elif request.json["company"] == "Amazon": 149 | if request.json["window"] == "30": 150 | result = extrapolate_func(int(request.json["window"]), "AMZN", "best_weights/best_weights_AMZN_wsize30.hdf5", 151 | "best_weights/AMZN_wsize30_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/amazon/", 152 | "reddit/AMZN/", "twitter/AMZN/", [2019]) 153 | elif request.json["window"] == "60": 154 | result = extrapolate_func(int(request.json["window"]), "AMZN", "best_weights/best_weights_AMZN_wsize60.hdf5", 155 | "best_weights/AMZN_wsize60_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/amazon/", 156 | "reddit/AMZN/", "twitter/AMZN/", [2019]) 157 | elif request.json["window"] == "90": 158 | result = extrapolate_func(int(request.json["window"]), "AMZN", "best_weights/best_weights_AMZN_wsize90.hdf5", 159 | "best_weights/AMZN_wsize90_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/amazon/", 160 | "reddit/AMZN/", "twitter/AMZN/", [2019]) 161 | elif request.json["company"] == "Apple": 162 | if request.json["window"] == "30": 163 | result = extrapolate_func(int(request.json["window"]), "AAPL", "best_weights/best_weights_AAPL_wsize30.hdf5", 164 | "best_weights/AAPL_wsize30_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/apple/", 165 | "reddit/AAPL/", "twitter/AAPL/", [2019]) 166 | elif request.json["window"] == "60": 167 | result = extrapolate_func(int(request.json["window"]), "AAPL", "best_weights/best_weights_AAPL_wsize60.hdf5", 168 | "best_weights/AAPL_wsize60_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/apple/", 169 | "reddit/AAPL/", "twitter/AAPL/", [2019]) 170 | elif request.json["window"] == "90": 171 | result = extrapolate_func(int(request.json["window"]), "AAPL", "best_weights/best_weights_AAPL_wsize90.hdf5", 172 | "best_weights/AAPL_wsize90_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/apple/", 173 | "reddit/AAPL/", "twitter/AAPL/", [2019]) 174 | elif request.json["company"] == "Microsoft": 175 | if request.json["window"] == "30": 176 | result = extrapolate_func(int(request.json["window"]), "MSFT", "best_weights/best_weights_MSFT_wsize30.hdf5", 177 | "best_weights/MSFT_wsize30_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/microsoft/", 178 | "reddit/MSFT/", "twitter/MSFT/", [2019]) 179 | elif request.json["window"] == "60": 180 | result = extrapolate_func(int(request.json["window"]), "MSFT", "best_weights/best_weights_MSFT_wsize60.hdf5", 181 | "best_weights/MSFT_wsize60_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/microsoft/", 182 | "reddit/MSFT/", "twitter/MSFT/", [2019]) 183 | elif request.json["window"] == "90": 184 | result = extrapolate_func(int(request.json["window"]), "MSFT", "best_weights/best_weights_MSFT_wsize90.hdf5", 185 | "best_weights/MSFT_wsize90_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/microsoft/", 186 | "reddit/MSFT/", "twitter/MSFT/", [2019]) 187 | elif request.json["company"] == "Google": 188 | if request.json["window"] == "30": 189 | result = extrapolate_func(int(request.json["window"]), "GOOGL", "best_weights/best_weights_GOOGL_wsize30.hdf5", 190 | "best_weights/GOOGL_wsize30_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/google/", 191 | "reddit/GOOGL/", "twitter/GOOGL/", [2019]) 192 | elif request.json["window"] == "60": 193 | result = extrapolate_func(int(request.json["window"]), "GOOGL", "best_weights/best_weights_GOOGL_wsize60.hdf5", 194 | "best_weights/GOOGL_wsize60_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/google/", 195 | "reddit/GOOGL/", "twitter/GOOGL/", [2019]) 196 | elif request.json["window"] == "90": 197 | result = extrapolate_func(int(request.json["window"]), "GOOGL", "best_weights/best_weights_GOOGL_wsize90.hdf5", 198 | "best_weights/GOOGL_wsize90_sc_dict.p", f'{request.json["company"]}_{request.json["window"]}_5weeks.csv', "stock_data/google/", 199 | "reddit/GOOGL/", "twitter/GOOGL/", [2019]) 200 | return jsonify(result) 201 | 202 | if __name__ == "__main__": 203 | app.run(host="0.0.0.0", port=5000, debug=True) -------------------------------------------------------------------------------- /backend/starter_code.py: -------------------------------------------------------------------------------- 1 | import pandas as pd 2 | import numpy as np 3 | import datetime 4 | import os, sys 5 | 6 | 7 | def starter_code(): 8 | company = sys.argv[1].upper() 9 | 10 | reddit_path = company + '_reddit_2015_to_2019.csv' 11 | twitter_path = company + '_tweets_2015_to_2019.csv' 12 | 13 | reddit_df = pd.read_csv(reddit_path) 14 | twitter_df = pd.read_csv(twitter_path) 15 | 16 | reddit_df = reddit_df.loc[:, ~reddit_df.columns.str.contains('^Unnamed')] 17 | twitter_df = twitter_df.loc[:, ~twitter_df.columns.str.contains('^Unnamed')] 18 | 19 | print('Creating the data for each year ...') 20 | 21 | for year in pd.to_datetime(twitter_df['post_date'], unit='s').dt.year.unique(): 22 | 23 | tw_df = twitter_df[pd.to_datetime(twitter_df['post_date'], unit='s').dt.year==year] 24 | red_df = reddit_df[pd.to_datetime(reddit_df['post_date'], unit='s').dt.year==year] 25 | 26 | tw_out_name = str(company) + '_tweets_' + str(year) + '.csv' 27 | outdir = './twitter/' + company 28 | if not os.path.exists(outdir): 29 | os.mkdir(outdir) 30 | fullname = os.path.join(outdir, tw_out_name) 31 | tw_df.to_csv(fullname) 32 | 33 | red_out_name = str(company) + '_reddit_' + str(year) + '.csv' 34 | outdir = './reddit/'+ company 35 | if not os.path.exists(outdir): 36 | os.mkdir(outdir) 37 | fullname = os.path.join(outdir, red_out_name) 38 | red_df.to_csv(fullname) 39 | 40 | print('year ' + str(year) + ' done') 41 | 42 | if __name__ == '__main__': 43 | starter_code() -------------------------------------------------------------------------------- /backend/utils.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | import numpy as np 3 | from sklearn.preprocessing import MinMaxScaler 4 | 5 | def init_scalers(): 6 | 7 | return { 8 | "Close": MinMaxScaler(feature_range = (0, 1)), 9 | "count_reddit": MinMaxScaler(feature_range = (0, 1)), 10 | "count_twitter": MinMaxScaler(feature_range = (0, 1)), 11 | "retweet_num_twitter": MinMaxScaler(feature_range = (0, 1)), 12 | "like_num_twitter": MinMaxScaler(feature_range = (0, 1)), 13 | } 14 | 15 | 16 | def fit_scalers(X_train_indi, sc_dict): 17 | 18 | X_train_indi[:, 0] = sc_dict["Close"].fit_transform(X_train_indi[:, 0].reshape(-1, 1)).reshape(-1) 19 | X_train_indi[:, 4] = sc_dict["count_reddit"].fit_transform(X_train_indi[:, 4].reshape(-1, 1)).reshape(-1) 20 | X_train_indi[:, 8] = sc_dict["count_twitter"].fit_transform(X_train_indi[:, 8].reshape(-1, 1)).reshape(-1) 21 | X_train_indi[:, 9] = sc_dict["retweet_num_twitter"].fit_transform(X_train_indi[:, 9].reshape(-1, 1)).reshape(-1) 22 | X_train_indi[:, 10] = sc_dict["like_num_twitter"].fit_transform(X_train_indi[:, 10].reshape(-1, 1)).reshape(-1) 23 | 24 | return X_train_indi, sc_dict 25 | 26 | def transform_scalers(X_train_indi, sc_dict): 27 | 28 | X_train_indi[:, 0] = sc_dict["Close"].transform(X_train_indi[:, 0].reshape(-1, 1)).reshape(-1) 29 | X_train_indi[:, 4] = sc_dict["count_reddit"].transform(X_train_indi[:, 4].reshape(-1, 1)).reshape(-1) 30 | X_train_indi[:, 8] = sc_dict["count_twitter"].transform(X_train_indi[:, 8].reshape(-1, 1)).reshape(-1) 31 | X_train_indi[:, 9] = sc_dict["retweet_num_twitter"].transform(X_train_indi[:, 9].reshape(-1, 1)).reshape(-1) 32 | X_train_indi[:, 10] = sc_dict["like_num_twitter"].transform(X_train_indi[:, 10].reshape(-1, 1)).reshape(-1) 33 | 34 | return X_train_indi 35 | 36 | def viz_test_vs_pred(title, dates, dataset_test, predicted_stock_price, show=True): 37 | 38 | # Visualising the results 39 | plt.plot(dates, dataset_test, color = "red", label = "Real Stock Price") 40 | plt.plot(dates, predicted_stock_price, color = "blue", label = "Predicted Stock Price") 41 | 42 | plt.title(title) 43 | plt.xlabel('Time') 44 | plt.ylabel('Stock Price') 45 | plt.legend() 46 | 47 | if show: 48 | plt.show() 49 | else: 50 | plt.savefig('./plots/' + title + '.png') -------------------------------------------------------------------------------- /frontend/README.md: -------------------------------------------------------------------------------- 1 | # Getting Started with Create React App 2 | 3 | This project was bootstrapped with [Create React App](https://github.com/facebook/create-react-app). 4 | 5 | ## Available Scripts 6 | 7 | In the project directory, you can run: 8 | 9 | ### `yarn start` 10 | 11 | Runs the app in the development mode.\ 12 | Open [http://localhost:3000](http://localhost:3000) to view it in the browser. 13 | 14 | The page will reload if you make edits.\ 15 | You will also see any lint errors in the console. 16 | 17 | ### `yarn test` 18 | 19 | Launches the test runner in the interactive watch mode.\ 20 | See the section about [running tests](https://facebook.github.io/create-react-app/docs/running-tests) for more information. 21 | 22 | ### `yarn build` 23 | 24 | Builds the app for production to the `build` folder.\ 25 | It correctly bundles React in production mode and optimizes the build for the best performance. 26 | 27 | The build is minified and the filenames include the hashes.\ 28 | Your app is ready to be deployed! 29 | 30 | See the section about [deployment](https://facebook.github.io/create-react-app/docs/deployment) for more information. 31 | 32 | ### `yarn eject` 33 | 34 | **Note: this is a one-way operation. Once you `eject`, you can’t go back!** 35 | 36 | If you aren’t satisfied with the build tool and configuration choices, you can `eject` at any time. This command will remove the single build dependency from your project. 37 | 38 | Instead, it will copy all the configuration files and the transitive dependencies (webpack, Babel, ESLint, etc) right into your project so you have full control over them. All of the commands except `eject` will still work, but they will point to the copied scripts so you can tweak them. At this point you’re on your own. 39 | 40 | You don’t have to ever use `eject`. The curated feature set is suitable for small and middle deployments, and you shouldn’t feel obligated to use this feature. However we understand that this tool wouldn’t be useful if you couldn’t customize it when you are ready for it. 41 | 42 | ## Learn More 43 | 44 | You can learn more in the [Create React App documentation](https://facebook.github.io/create-react-app/docs/getting-started). 45 | 46 | To learn React, check out the [React documentation](https://reactjs.org/). 47 | 48 | ### Code Splitting 49 | 50 | This section has moved here: [https://facebook.github.io/create-react-app/docs/code-splitting](https://facebook.github.io/create-react-app/docs/code-splitting) 51 | 52 | ### Analyzing the Bundle Size 53 | 54 | This section has moved here: [https://facebook.github.io/create-react-app/docs/analyzing-the-bundle-size](https://facebook.github.io/create-react-app/docs/analyzing-the-bundle-size) 55 | 56 | ### Making a Progressive Web App 57 | 58 | This section has moved here: [https://facebook.github.io/create-react-app/docs/making-a-progressive-web-app](https://facebook.github.io/create-react-app/docs/making-a-progressive-web-app) 59 | 60 | ### Advanced Configuration 61 | 62 | This section has moved here: [https://facebook.github.io/create-react-app/docs/advanced-configuration](https://facebook.github.io/create-react-app/docs/advanced-configuration) 63 | 64 | ### Deployment 65 | 66 | This section has moved here: [https://facebook.github.io/create-react-app/docs/deployment](https://facebook.github.io/create-react-app/docs/deployment) 67 | 68 | ### `yarn build` fails to minify 69 | 70 | This section has moved here: [https://facebook.github.io/create-react-app/docs/troubleshooting#npm-run-build-fails-to-minify](https://facebook.github.io/create-react-app/docs/troubleshooting#npm-run-build-fails-to-minify) 71 | -------------------------------------------------------------------------------- /frontend/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "client", 3 | "version": "0.1.0", 4 | "private": true, 5 | "dependencies": { 6 | "@fortawesome/fontawesome-svg-core": "^1.2.35", 7 | "@fortawesome/free-solid-svg-icons": "^5.15.3", 8 | "@fortawesome/react-fontawesome": "^0.1.14", 9 | "@testing-library/jest-dom": "^5.11.4", 10 | "@testing-library/react": "^11.1.0", 11 | "@testing-library/user-event": "^12.1.10", 12 | "d3": "^6.7.0", 13 | "d32": "npm:d3@3.5.17", 14 | "html-loader": "^2.1.2", 15 | "react": "^17.0.2", 16 | "react-bootstrap": "^1.5.2", 17 | "react-d3-library": "^1.1.8", 18 | "react-dom": "^17.0.2", 19 | "react-pro-sidebar": "^0.6.0", 20 | "react-router-bootstrap": "^0.25.0", 21 | "react-router-dom": "^5.2.0", 22 | "react-scripts": "4.0.3", 23 | "react-spinners": "^0.10.6", 24 | "web-vitals": "^1.0.1" 25 | }, 26 | "scripts": { 27 | "start": "react-scripts start", 28 | "build": "react-scripts build", 29 | "test": "react-scripts test", 30 | "eject": "react-scripts eject" 31 | }, 32 | "eslintConfig": { 33 | "extends": [ 34 | "react-app", 35 | "react-app/jest" 36 | ] 37 | }, 38 | "browserslist": { 39 | "production": [ 40 | ">0.2%", 41 | "not dead", 42 | "not op_mini all" 43 | ], 44 | "development": [ 45 | "last 1 chrome version", 46 | "last 1 firefox version", 47 | "last 1 safari version" 48 | ] 49 | } 50 | } 51 | -------------------------------------------------------------------------------- /frontend/public/favicon.ico: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rohitgajawada/Stock-Price-Prediction-Visualization/a5bc88e6a6a0718a27ac185df688e7a72472d026/frontend/public/favicon.ico -------------------------------------------------------------------------------- /frontend/public/index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 13 | 14 | 23 | 29 | Stock Visualization 30 | 31 | 32 | 33 |
34 | 44 | 45 | 46 | -------------------------------------------------------------------------------- /frontend/public/manifest.json: -------------------------------------------------------------------------------- 1 | { 2 | "short_name": "React App", 3 | "name": "Create React App Sample", 4 | "icons": [ 5 | { 6 | "src": "favicon.ico", 7 | "sizes": "64x64 32x32 24x24 16x16", 8 | "type": "image/x-icon" 9 | } 10 | ], 11 | "start_url": ".", 12 | "display": "standalone", 13 | "theme_color": "#000000", 14 | "background_color": "#ffffff" 15 | } 16 | -------------------------------------------------------------------------------- /frontend/src/App.css: -------------------------------------------------------------------------------- 1 | .App { 2 | text-align: center; 3 | } 4 | 5 | .title-container { 6 | display: flex; 7 | justify-content: center; 8 | background-color: #3f4d67; 9 | } 10 | 11 | .title { 12 | color: white; 13 | font-size: calc(10px + 2vmin); 14 | } 15 | 16 | .sidebar-header { 17 | margin-top: 15px; 18 | } 19 | 20 | .left-half { 21 | text-align: center; 22 | float: left; 23 | width: 15%; 24 | color: black; 25 | overflow: hidden; 26 | box-shadow: 10px 0 5px -2px #888; 27 | } 28 | 29 | .right-half { 30 | text-align: left; 31 | float: right; 32 | height: 100vh; 33 | width: 85%; 34 | color: black; 35 | overflow: auto; 36 | } 37 | 38 | .pro-sidebar-layout { 39 | background-color: #025955; 40 | } 41 | 42 | .pro-sidebar-inner { 43 | color: white; 44 | } 45 | 46 | .pro-sidebar-header { 47 | font-size: 40px; 48 | margin-left: 10px; 49 | } 50 | 51 | .pro-sidebar > .pro-sidebar-inner > .pro-sidebar-layout .pro-sidebar-header { 52 | border-bottom: 0px solid rgba(173, 173, 173, 0.2); 53 | font-weight: 700; 54 | } 55 | 56 | .pro-item-content { 57 | font-size: 14px; 58 | margin-top: 10px; 59 | } 60 | -------------------------------------------------------------------------------- /frontend/src/App.js: -------------------------------------------------------------------------------- 1 | import { useState } from 'react' 2 | import { ProSidebar, SidebarHeader, Menu, MenuItem } from 'react-pro-sidebar' 3 | import 'react-pro-sidebar/dist/css/styles.css' 4 | import './App.css' 5 | import Landing from './components/Misc/Landing' 6 | import Sentiment from './components/Sentiment/Sentiment' 7 | import Prediction from './components/Prediction/Prediction' 8 | import Financial from './components/Financial/Financial' 9 | import { FontAwesomeIcon } from '@fortawesome/react-fontawesome' 10 | import { 11 | faChartLine, 12 | faMoneyCheck, 13 | faComment, 14 | } from '@fortawesome/free-solid-svg-icons' 15 | 16 | export default function App() { 17 | const [graph, setGraph] = useState('Landing') 18 | 19 | var currGraph 20 | if (graph === 'graph1') { 21 | currGraph = 22 | } else if (graph === 'graph2') { 23 | currGraph = 24 | } else if (graph === 'graph3') { 25 | currGraph = 26 | } else { 27 | currGraph = 28 | } 29 | 30 | return ( 31 |
32 |
33 |
34 | 35 | Stock Viz 36 | 37 | setGraph('graph1')} className="menuItem"> 38 | 42 | Financial Dashboard 43 | 44 | setGraph('graph2')} className="menuItem"> 45 | 49 | Predict Stock Price 50 | 51 | setGraph('graph3')} className="menuItem"> 52 | 56 | Analyze Recent Sentiment 57 | 58 | 59 | 60 |
61 |
{currGraph}
62 |
63 |
64 | ) 65 | } 66 | -------------------------------------------------------------------------------- /frontend/src/components/Financial/Financial.css: -------------------------------------------------------------------------------- 1 | .chart__grid { 2 | stroke: #e7eef4; 3 | opacity: 0.3; 4 | } 5 | svg { 6 | font: 10px sans-serif; 7 | } 8 | .area { 9 | fill: rgba(255, 100, 100, 0.5); 10 | clip-path: url('#clip'); 11 | } 12 | .axis path, 13 | .axis line { 14 | fill: none; 15 | stroke: #e7eef4; 16 | shape-rendering: crispEdges; 17 | } 18 | .brush .extent { 19 | stroke: #fff; 20 | fill-opacity: 0.125; 21 | shape-rendering: crispEdges; 22 | } 23 | /*New*/ 24 | .chart text { 25 | fill: #7f8faf; 26 | } 27 | .chart__line { 28 | fill: none; 29 | stroke-width: 1px; 30 | clip-path: url('#clip'); 31 | } 32 | .chart__price--focus { 33 | stroke-width: 1px; 34 | stroke: #2980b9; 35 | fill: #2980b9; 36 | } 37 | .chart__average--focus { 38 | stroke-width: 1px; 39 | stroke: #21f30e; 40 | opacity: 0.3; 41 | } 42 | .chart__overlay { 43 | opacity: 0; 44 | pointer-events: all; 45 | } 46 | .chart__tooltip--price { 47 | fill: none; 48 | stroke: #f00; 49 | } 50 | .chart__tooltip--average { 51 | fill: none; 52 | stroke: #f00; 53 | } 54 | .chart__bars { 55 | fill: #7161db; 56 | opacity: 0.7; 57 | } 58 | .chart__range-selection text { 59 | cursor: pointer; 60 | text-decoration: underline; 61 | fill: #2980b9; 62 | } 63 | .chart__axis--context text { 64 | fill: #7f8faf; 65 | } 66 | .chart__axis--context .tick:nth-child(odd) { 67 | display: none; 68 | } 69 | .chart__axis--context path, 70 | .chart__axis--context line { 71 | display: none; 72 | } 73 | .chart__area { 74 | fill: #e6f6fe; 75 | stroke: #3587bc; 76 | } 77 | .extent { 78 | fill: #e2f0ff; 79 | fill: #3587bc; 80 | fill-opacity: 0.3; 81 | } 82 | .y.axis .tick text { 83 | text-anchor: start !important; 84 | fill: #7f8faf; 85 | } 86 | .y.axis .tick line { 87 | display: none; 88 | } 89 | .y.axis path { 90 | display: none; 91 | } 92 | 93 | path.line { 94 | fill: none; 95 | stroke: #2980b9; 96 | } 97 | 98 | .dropdown-container, 99 | .page-desc-container, 100 | .title-container { 101 | display: flex; 102 | justify-content: center; 103 | } 104 | 105 | .title-container { 106 | position: relative; 107 | z-index: -1; 108 | } 109 | -------------------------------------------------------------------------------- /frontend/src/components/Financial/Financial.js: -------------------------------------------------------------------------------- 1 | import { useEffect } from 'react' 2 | import * as d3 from 'd32' 3 | import combined1 from './combined1.csv' 4 | import './Financial.css' 5 | 6 | export default function Financial() { 7 | useEffect(() => { 8 | componentDidMount() 9 | }) 10 | 11 | const componentDidMount = () => { 12 | var margin = { top: 30, right: 20, bottom: 100, left: 50 }, 13 | margin2 = { top: 210, right: 20, bottom: 20, left: 50 }, 14 | width = 800 - margin.left - margin.right, 15 | height = 310 - margin.top - margin.bottom, 16 | height2 = 310 - margin2.top - margin2.bottom 17 | 18 | var parseDate = d3.time.format('%m/%d/%Y').parse, 19 | bisectDate = d3.bisector(function (d) { 20 | return d.date 21 | }).left, 22 | legendFormat = d3.time.format('%b %d, %Y') 23 | 24 | var x = d3.time.scale().range([0, width]), 25 | x2 = d3.time.scale().range([0, width]), 26 | y = d3.scale.linear().range([height, 0]), 27 | y2 = d3.scale.linear().range([height2, 0]), 28 | y3 = d3.scale.linear().range([60, 0]) 29 | 30 | var xAxis = d3.svg.axis().scale(x).orient('bottom'), 31 | xAxis2 = d3.svg.axis().scale(x2).orient('bottom'), 32 | yAxis = d3.svg.axis().scale(y).orient('left') 33 | 34 | var priceLine = d3.svg 35 | .line() 36 | .interpolate('monotone') 37 | .x(function (d) { 38 | return x(d.date) 39 | }) 40 | .y(function (d) { 41 | return y(d.price) 42 | }) 43 | 44 | var area2 = d3.svg 45 | .area() 46 | .interpolate('monotone') 47 | .x(function (d) { 48 | return x2(d.date) 49 | }) 50 | .y0(height2) 51 | .y1(function (d) { 52 | return y2(d.price) 53 | }) 54 | 55 | var svg = d3 56 | .select('#graph') 57 | .append('svg') 58 | .attr('id', 'svg2') 59 | .attr('class', 'chart') 60 | .attr('width', 750 + margin.left + margin.right) 61 | .attr('height', height + margin.top + margin.bottom + 120) 62 | var svg33 = d3 63 | .select('#graph') 64 | .append('svg') 65 | .attr('id', 'svg33') 66 | .attr('class', 'chart33') 67 | .attr('width', 220 + margin.left + margin.right) 68 | .attr('height', height + margin.top + margin.bottom + 120) 69 | 70 | svg 71 | .append('defs') 72 | .append('clipPath') 73 | .attr('id', 'clip') 74 | .append('rect') 75 | .attr('width', width) 76 | .attr('height', height) 77 | 78 | var make_y_axis = function () { 79 | return d3.svg.axis().scale(y).orient('left').ticks(10) 80 | } 81 | 82 | var focus = svg 83 | .append('g') 84 | .attr('class', 'focus') 85 | .attr('transform', 'translate(' + margin.left + ',' + margin.top + ')') 86 | 87 | var barsGroup = svg 88 | .append('g') 89 | .attr('class', 'volume') 90 | .attr('clip-path', 'url(#clip)') 91 | .attr( 92 | 'transform', 93 | 'translate(' + margin.left + ',' + (margin.top + 90 + 20) + ')' 94 | ) 95 | 96 | var context = svg 97 | .append('g') 98 | .attr('class', 'context') 99 | .attr( 100 | 'transform', 101 | 'translate(' + margin2.left + ',' + (margin2.top + 90) + ')' 102 | ) 103 | 104 | var legend = svg 105 | .append('g') 106 | .attr('class', 'chart__legend') 107 | .attr('width', width) 108 | .attr('height', 30) 109 | .attr('transform', 'translate(' + margin2.left + ', 10)') 110 | var legend1 = svg 111 | .append('g') 112 | .attr('class', 'chart__legend') 113 | .attr('width', width) 114 | .attr('height', 30) 115 | .attr('transform', 'translate(' + 5 + ',' + (margin.top + 60 + 205) + ')') 116 | 117 | legend1 118 | .append('text') 119 | .attr('class', 'chart__symbol') 120 | .attr('transform', 'rotate(-90)') 121 | .attr('y', 35) 122 | .attr('x', -30) 123 | .style('fill', 'darkOrange') 124 | .text('Stock Volume ') 125 | 126 | var legend2 = svg 127 | .append('g') 128 | .attr('class', 'chart__legend') 129 | .attr('width', width) 130 | .attr('height', 30) 131 | .attr( 132 | 'transform', 133 | 'translate(' + margin.left + ',' + (margin.top + 60 + 225) + ')' 134 | ) 135 | 136 | legend2 137 | .append('text') 138 | .attr('class', 'chart__symbol') 139 | .text('Time period selection:Select an area on bottom graph') 140 | .attr('font-size', '14px') 141 | 142 | legend.append('text').attr('class', 'chart__symbol').text('NASDAQ') 143 | 144 | var rangeSelection = legend 145 | .append('g') 146 | .attr('class', 'chart__range-selection') 147 | .attr('transform', 'translate(110, 0)') 148 | 149 | d3.csv(combined1, type, function (err, data) { 150 | var nest = d3 151 | .nest() 152 | .key(function (d) { 153 | return d.comp 154 | }) 155 | .entries(data) 156 | var fruitMenu = d3.select('#fruitDropdown') 157 | 158 | fruitMenu 159 | .append('select') 160 | .selectAll('option') 161 | .data(nest) 162 | .enter() 163 | .append('option') 164 | .attr('value', function (d) { 165 | return d.key 166 | }) 167 | .text(function (d) { 168 | return d.key 169 | }) 170 | var selectFruit = nest.filter(function (d) { 171 | return d.key === 'Microsoft' 172 | }) 173 | console.log(selectFruit[0].values) 174 | var brush = d3.svg.brush().x(x2).on('brush', brushed) 175 | // ----------------------------------------------------------------------------------- 176 | var xRange = d3.extent( 177 | selectFruit[0].values.map(function (d) { 178 | return d.date 179 | }) 180 | ) 181 | 182 | x.domain(xRange) 183 | y.domain( 184 | d3.extent( 185 | selectFruit[0].values.map(function (d) { 186 | return d.price 187 | }) 188 | ) 189 | ) 190 | y3.domain( 191 | d3.extent( 192 | selectFruit[0].values.map(function (d) { 193 | return d.price 194 | }) 195 | ) 196 | ) 197 | x2.domain(x.domain()) 198 | y2.domain(y.domain()) 199 | 200 | var min = d3.min( 201 | selectFruit[0].values.map(function (d) { 202 | return d.price 203 | }) 204 | ) 205 | var max = d3.max( 206 | selectFruit[0].values.map(function (d) { 207 | return d.price 208 | }) 209 | ) 210 | 211 | var range = legend 212 | .append('text') 213 | .text( 214 | legendFormat(new Date(xRange[0])) + 215 | ' - ' + 216 | legendFormat(new Date(xRange[1])) 217 | ) 218 | .style('text-anchor', 'end') 219 | .attr('transform', 'translate(' + width + ', 0)') 220 | 221 | focus 222 | .append('g') 223 | .attr('class', 'y chart__grid') 224 | .call(make_y_axis().tickSize(-width, 0, 0).tickFormat('')) 225 | 226 | // var averageChart = focus.append('path') 227 | // .datum(selectFruit[0].values) 228 | // .attr('class', 'chart__line chart__average--focus line') 229 | // .style("stroke-width","0.5px") 230 | // .attr('d', avgLine); 231 | 232 | var priceChart = focus 233 | .append('path') 234 | .datum(selectFruit[0].values) 235 | .attr('class', 'chart__line chart__price--focus line') 236 | .style('stroke-width', '1.7px') 237 | .style('stroke', ' #2980b9') 238 | .attr('d', priceLine) 239 | 240 | focus 241 | .append('g') 242 | .attr('class', 'x axis') 243 | .attr('transform', 'translate(0 ,' + height + ')') 244 | .call(xAxis) 245 | .append('text') 246 | .attr('class', 'label') 247 | .attr('x', width) 248 | .attr('y', -6) 249 | .style('text-anchor', 'end') 250 | .style('fill', 'darkOrange') 251 | .text('Date') 252 | 253 | focus 254 | .append('g') 255 | .attr('class', 'y axis') 256 | .attr('transform', 'translate(12, 0)') 257 | .call(yAxis) 258 | .append('text') 259 | .attr('class', 'label') 260 | .attr('x', -20) 261 | .attr('y', -30) 262 | .attr('dy', '.71em') 263 | .attr('transform', 'rotate(-90)') 264 | .style('text-anchor', 'end') 265 | .style('fill', 'darkOrange') 266 | .text('Stock price in USD') 267 | 268 | var focusGraph = barsGroup 269 | .selectAll('rect') 270 | .data(selectFruit[0].values) 271 | .enter() 272 | .append('rect') 273 | .attr('class', 'chart__bars') 274 | .attr('x', function (d, i) { 275 | return x(d.date) 276 | }) 277 | .attr('y', function (d) { 278 | return 155 - y3(d.price) 279 | }) 280 | .attr('width', 1) 281 | .style('fill', '#7161db') 282 | .attr('height', function (d) { 283 | return y3(d.price) 284 | }) 285 | 286 | var helper = svg33 287 | .append('g') 288 | .attr('class', 'chart__helper') 289 | .attr('transform', 'translate(' + 40 + ',' + 210 + ')') 290 | 291 | var helperText = helper 292 | .append('text') 293 | .style('fill', 'blue') 294 | .style('font-size', '12px') 295 | 296 | var helper1 = svg33 297 | .append('g') 298 | .attr('class', 'chart__helper1') 299 | .attr('transform', 'translate(' + 40 + ',' + 230 + ')') 300 | 301 | var helperText1 = helper1 302 | .append('text') 303 | .style('fill', 'blue') 304 | .style('font-size', '12px') 305 | 306 | var helper2 = svg33 307 | .append('g') 308 | .attr('class', 'chart__helper2') 309 | .attr('transform', 'translate(' + 40 + ',' + 250 + ')') 310 | 311 | var helperText2 = helper2 312 | .append('text') 313 | .style('fill', 'blue') 314 | .style('font-size', '12px') 315 | var priceTooltip = focus 316 | .append('g') 317 | .attr('class', 'chart__tooltip--price') 318 | .append('circle') 319 | .style('display', 'none') 320 | .attr('r', 2.5) 321 | 322 | // eslint-disable-next-line 323 | var mouseArea = svg 324 | .append('g') 325 | .attr('class', 'chart__mouse') 326 | .append('rect') 327 | .attr('class', 'chart__overlay') 328 | .attr('width', width) 329 | .attr('height', height) 330 | .attr('transform', 'translate(' + margin.left + ',' + margin.top + ')') 331 | .on('mouseover', function () { 332 | priceTooltip.style('display', null) 333 | svg33.style('display', null) 334 | }) 335 | .on('mouseout', function () { 336 | priceTooltip.style('display', 'none') 337 | svg33.style('display', 'none') 338 | }) 339 | .on('mousemove', mousemove) 340 | 341 | context 342 | .append('path') 343 | .datum(selectFruit[0].values) 344 | .attr('class', 'chart__area area') 345 | .attr('d', area2) 346 | .style('fill', '#67c0ed') 347 | 348 | context 349 | .append('g') 350 | .attr('class', 'x axis chart__axis--context') 351 | .attr('y', 0) 352 | .attr('transform', 'translate(0,' + (height2 - 22) + ')') 353 | .call(xAxis2) 354 | 355 | context 356 | .append('g') 357 | .attr('class', 'x brush') 358 | .call(brush) 359 | .selectAll('rect') 360 | .attr('y', -6) 361 | .attr('height', height2 + 7) 362 | 363 | function mousemove() { 364 | var x0 = x.invert(d3.mouse(this)[0]) 365 | var i = bisectDate(selectFruit[0].values, x0, 1) 366 | var d0 = selectFruit[0].values[i - 1] 367 | var d1 = selectFruit[0].values[i] 368 | var d = x0 - d0.date > d1.date - x0 ? d1 : d0 369 | var aColor = ['rgb(0, 255, 0)', 'rgb(255, 0, 0)', 'rgb(200,200,200)'] 370 | 371 | var r = 50 372 | var data1 = [{ value: d.tp }, { value: d.tn }, { value: d.tnn }] 373 | 374 | var p1 = svg33 375 | .append('svg') 376 | .attr('id', 'svg3') 377 | .attr('class', 'chart__pie') 378 | .data([data1]) 379 | .attr('width', width) 380 | .attr('height', height) 381 | .append('svg:g') 382 | svg33 383 | .append('text') 384 | .attr('x', 40) 385 | .attr('y', 8) 386 | .text('Pie chart of stock sentiment ratios') 387 | .style('font-size', '12px') 388 | .attr('alignment-baseline', 'middle') 389 | svg33 390 | .append('text') 391 | .attr('x', 40) 392 | .attr('y', 25) 393 | .text('calculated using FinBERT model') 394 | .style('font-size', '12px') 395 | .attr('alignment-baseline', 'middle') 396 | svg33 397 | .append('circle') 398 | .attr('cx', 170) 399 | .attr('cy', 70) 400 | .attr('r', 7) 401 | .style('fill', 'rgb(0, 255, 0)') 402 | svg33 403 | .append('circle') 404 | .attr('cx', 170) 405 | .attr('cy', 100) 406 | .attr('r', 7) 407 | .style('fill', 'rgb(255, 0, 0)') 408 | svg33 409 | .append('circle') 410 | .attr('cx', 170) 411 | .attr('cy', 130) 412 | .attr('r', 7) 413 | .style('fill', 'rgb(200,200,200)') 414 | svg33 415 | .append('text') 416 | .attr('x', 180) 417 | .attr('y', 70) 418 | .text('positive sentiment ratio') 419 | .style('font-size', '10px') 420 | .attr('alignment-baseline', 'middle') 421 | svg33 422 | .append('text') 423 | .attr('x', 180) 424 | .attr('y', 100) 425 | .text('negative sentiment ratio') 426 | .style('font-size', '10x') 427 | .attr('alignment-baseline', 'middle') 428 | svg33 429 | .append('text') 430 | .attr('x', 180) 431 | .attr('y', 130) 432 | .text('neutral sentiment ratio') 433 | .style('font-size', '10x') 434 | .attr('alignment-baseline', 'middle') 435 | //svg.append("circle").attr("cx",200).attr("cy",160).attr("r", 6).style("fill", "#404080") 436 | //svg.append("text").attr("x", 220).attr("y", 130).text("variable A").style("font-size", "15px").attr("alignment-baseline","middle") 437 | //svg.append("text").attr("x", 220).attr("y", 160).text("variable B").style("font-size", "15px").attr("alignment-baseline","middle") 438 | 439 | var pie = d3.layout.pie().value(function (d) { 440 | return d.value 441 | }) 442 | 443 | // Declare an arc generator function 444 | var arc = d3.svg.arc().outerRadius(r) 445 | 446 | // Select paths, use arc generator to draw 447 | var arcs = p1 448 | .selectAll('g.slice') 449 | .data(pie) 450 | .enter() 451 | .append('svg:g') 452 | .attr('class', 'slice') 453 | arcs 454 | .append('svg:path') 455 | .attr('fill', function (d, i) { 456 | return aColor[i] 457 | }) 458 | .attr('d', function (d) { 459 | return arc(d) 460 | }) 461 | p1.attr('transform', 'translate(' + 70 + ',' + 100 + ')') 462 | /* svg33.append("text").attr("x", 40).attr("y", 240).text("Date").style("font-size", "10px").attr("alignment-baseline","middle") 463 | svg33.append("text").attr("x", 40).attr("y", 270).text("Price").style("font-size", "10x").attr("alignment-baseline","middle") 464 | svg33.append("text").attr("x", 40).attr("y", 290).text("# likes").style("font-size", "10x").attr("alignment-baseline","middle") 465 | svg33.append("text").attr("x", 95).attr("y", 240).text(legendFormat(new Date(d.date))).style("font-size", "10px").attr("alignment-baseline","middle") 466 | svg33.append("text").attr("x", 95).attr("y", 270).text(d.price).style("font-size", "10x").attr("alignment-baseline","middle")*/ 467 | // svg33.append("text").attr("x", 95).attr("y", 290).text(d.lk).style("font-size", "10x").attr("alignment-baseline","middle") 468 | helperText.text('Date: ' + legendFormat(new Date(d.date))) 469 | helperText1.text('Stock Price: ' + d.price) 470 | helperText2.text('Total #Tweet Favorites: ' + d.lk) 471 | priceTooltip.attr( 472 | 'transform', 473 | 'translate(' + x(d.date) + ',' + y(d.price) + ')' 474 | ) 475 | //averageTooltip.attr('transform', 'translate(' + x(d.date) + ',' + y(d.price+5*d.score) + ')'); 476 | } 477 | 478 | function brushed() { 479 | var ext = brush.extent() 480 | if (!brush.empty()) { 481 | x.domain(brush.empty() ? x2.domain() : brush.extent()) 482 | y.domain([ 483 | d3.min( 484 | selectFruit[0].values.map(function (d) { 485 | return d.date >= ext[0] && d.date <= ext[1] ? d.price : max 486 | }) 487 | ), 488 | d3.max( 489 | selectFruit[0].values.map(function (d) { 490 | return d.date >= ext[0] && d.date <= ext[1] ? d.price : min 491 | }) 492 | ), 493 | ]) 494 | range.text( 495 | legendFormat(new Date(ext[0])) + 496 | ' - ' + 497 | legendFormat(new Date(ext[1])) 498 | ) 499 | focusGraph.attr('x', function (d, i) { 500 | return x(d.date) 501 | }) 502 | 503 | var days = Math.ceil((ext[1] - ext[0]) / (24 * 3600 * 1000)) 504 | focusGraph.attr('width', 40 > days ? ((40 - days) * 5) / 6 : 5) 505 | } 506 | 507 | priceChart.attr('d', priceLine) 508 | // averageChart.attr('d', avgLine); 509 | focus.select('.x.axis').call(xAxis) 510 | focus.select('.y.axis').call(yAxis) 511 | } 512 | 513 | var dateRange = ['1w', '1m', '3m', '6m', '1y', '5y'] 514 | for (var i = 0, l = dateRange.length; i < l; i++) { 515 | var v = dateRange[i] 516 | rangeSelection 517 | .append('text') 518 | .attr('class', 'chart__range-selection') 519 | .text(v) 520 | .attr('transform', 'translate(' + 18 * i + ', 0)') 521 | .on('click', function (d) { 522 | focusOnRange(this.textContent) 523 | }) 524 | } 525 | 526 | function focusOnRange(range) { 527 | var today = new Date( 528 | selectFruit[0].values[selectFruit[0].values.length - 1].date 529 | ) 530 | var ext = new Date( 531 | selectFruit[0].values[selectFruit[0].values.length - 1].date 532 | ) 533 | 534 | if (range === '1m') ext.setMonth(ext.getMonth() - 1) 535 | 536 | if (range === '1w') ext.setDate(ext.getDate() - 7) 537 | 538 | if (range === '3m') ext.setMonth(ext.getMonth() - 3) 539 | 540 | if (range === '6m') ext.setMonth(ext.getMonth() - 6) 541 | 542 | if (range === '1y') ext.setFullYear(ext.getFullYear() - 1) 543 | 544 | if (range === '5y') ext.setFullYear(ext.getFullYear() - 5) 545 | 546 | brush.extent([ext, today]) 547 | brushed() 548 | context.select('g.x.brush').call(brush.extent([ext, today])) 549 | } 550 | 551 | fruitMenu.on('change', function () { 552 | // Find which fruit was selected from the dropdown 553 | var selectedFruit = d3.select(this).select('select').property('value') 554 | console.log('s', selectedFruit) 555 | 556 | // Run update function with the selected fruit 557 | updateGraph(selectedFruit) 558 | }) 559 | 560 | var updateGraph = function (fruit) { 561 | d3.select('#svg2').remove() 562 | d3.select('#svg33').remove() 563 | 564 | //----------------------- 565 | var margin = { top: 30, right: 20, bottom: 100, left: 50 }, 566 | margin2 = { top: 210, right: 20, bottom: 20, left: 50 }, 567 | width = 800 - margin.left - margin.right, 568 | height = 310 - margin.top - margin.bottom, 569 | height2 = 310 - margin2.top - margin2.bottom 570 | 571 | var x = d3.time.scale().range([0, width]), 572 | x2 = d3.time.scale().range([0, width]), 573 | y = d3.scale.linear().range([height, 0]), 574 | y2 = d3.scale.linear().range([height2, 0]), 575 | y3 = d3.scale.linear().range([60, 0]) 576 | 577 | var xAxis = d3.svg.axis().scale(x).orient('bottom'), 578 | xAxis2 = d3.svg.axis().scale(x2).orient('bottom'), 579 | yAxis = d3.svg.axis().scale(y).orient('left') 580 | 581 | var priceLine = d3.svg 582 | .line() 583 | .interpolate('monotone') 584 | .x(function (d) { 585 | return x(d.date) 586 | }) 587 | .y(function (d) { 588 | return y(d.price) 589 | }) 590 | 591 | var area2 = d3.svg 592 | .area() 593 | .interpolate('monotone') 594 | .x(function (d) { 595 | return x2(d.date) 596 | }) 597 | .y0(height2) 598 | .y1(function (d) { 599 | return y2(d.price) 600 | }) 601 | 602 | var svg = d3 603 | .select('#graph') 604 | .append('svg') 605 | .attr('id', 'svg2') 606 | .attr('class', 'chart') 607 | .attr('width', 750 + margin.left + margin.right) 608 | .attr('height', height + margin.top + margin.bottom + 180) 609 | var svg33 = d3 610 | .select('#graph') 611 | .append('svg') 612 | .attr('id', 'svg33') 613 | .attr('class', 'chart33') 614 | .attr('width', 220 + margin.left + margin.right) 615 | .attr('height', height + margin.top + margin.bottom + 120) 616 | 617 | svg 618 | .append('defs') 619 | .append('clipPath') 620 | .attr('id', 'clip') 621 | .append('rect') 622 | .attr('width', width) 623 | .attr('height', height) 624 | 625 | var make_y_axis = function () { 626 | return d3.svg.axis().scale(y).orient('left').ticks(10) 627 | } 628 | 629 | var focus = svg 630 | .append('g') 631 | .attr('class', 'focus') 632 | .attr( 633 | 'transform', 634 | 'translate(' + margin.left + ',' + margin.top + ')' 635 | ) 636 | 637 | var barsGroup = svg 638 | .append('g') 639 | .attr('class', 'volume') 640 | .attr('clip-path', 'url(#clip)') 641 | .attr( 642 | 'transform', 643 | 'translate(' + margin.left + ',' + (margin.top + 90 + 20) + ')' 644 | ) 645 | 646 | var context = svg 647 | .append('g') 648 | .attr('class', 'context') 649 | .attr( 650 | 'transform', 651 | 'translate(' + margin2.left + ',' + (margin2.top + 90) + ')' 652 | ) 653 | 654 | var legend = svg 655 | .append('g') 656 | .attr('class', 'chart__legend') 657 | .attr('width', width) 658 | .attr('height', 30) 659 | .attr('transform', 'translate(' + margin2.left + ', 10)') 660 | var legend1 = svg 661 | .append('g') 662 | .attr('class', 'chart__legend') 663 | .attr('width', width) 664 | .attr('height', 30) 665 | .attr( 666 | 'transform', 667 | 'translate(' + 5 + ',' + (margin.top + 60 + 205) + ')' 668 | ) 669 | 670 | legend1 671 | .append('text') 672 | .attr('class', 'chart__symbol') 673 | .attr('transform', 'rotate(-90)') 674 | .attr('y', 35) 675 | .attr('x', -30) 676 | .style('fill', 'darkOrange') 677 | .text('Stock Volume ') 678 | 679 | var legend2 = svg 680 | .append('g') 681 | .attr('class', 'chart__legend') 682 | .attr('width', width) 683 | .attr('height', 30) 684 | .attr( 685 | 'transform', 686 | 'translate(' + margin.left + ',' + (margin.top + 60 + 225) + ')' 687 | ) 688 | 689 | legend2 690 | .append('text') 691 | .attr('class', 'chart__symbol') 692 | .text('Time period selection:Select an area on bottom graph') 693 | .attr('font-size', '14px') 694 | 695 | legend.append('text').attr('class', 'chart__symbol').text('NASDAQ') 696 | 697 | var rangeSelection = legend 698 | .append('g') 699 | .attr('class', 'chart__range-selection') 700 | .attr('transform', 'translate(110, 0)') 701 | 702 | //================= 703 | 704 | // Filter the data to include only fruit of interest 705 | var selectFruit = nest.filter(function (d) { 706 | return d.key === fruit 707 | }) 708 | // console.log(data) 709 | console.log(selectFruit[0].values) 710 | 711 | //----------------------------- 712 | var brush = d3.svg.brush().x(x2).on('brush', brushed) 713 | // ----------------------------------------------------------------------------------- 714 | var xRange = d3.extent( 715 | selectFruit[0].values.map(function (d) { 716 | return d.date 717 | }) 718 | ) 719 | 720 | x.domain(xRange) 721 | y.domain( 722 | d3.extent( 723 | selectFruit[0].values.map(function (d) { 724 | return d.price 725 | }) 726 | ) 727 | ) 728 | y3.domain( 729 | d3.extent( 730 | selectFruit[0].values.map(function (d) { 731 | return d.price 732 | }) 733 | ) 734 | ) 735 | x2.domain(x.domain()) 736 | y2.domain(y.domain()) 737 | 738 | var min = d3.min( 739 | selectFruit[0].values.map(function (d) { 740 | return d.price 741 | }) 742 | ) 743 | var max = d3.max( 744 | selectFruit[0].values.map(function (d) { 745 | return d.price 746 | }) 747 | ) 748 | 749 | var range = legend 750 | .append('text') 751 | .text( 752 | legendFormat(new Date(xRange[0])) + 753 | ' - ' + 754 | legendFormat(new Date(xRange[1])) 755 | ) 756 | .style('text-anchor', 'end') 757 | .attr('transform', 'translate(' + width + ', 0)') 758 | 759 | focus 760 | .append('g') 761 | .attr('class', 'y chart__grid') 762 | .call(make_y_axis().tickSize(-width, 0, 0).tickFormat('')) 763 | 764 | // var averageChart = focus.append('path') 765 | // .datum(selectFruit[0].values) 766 | // .attr('class', 'chart__line chart__average--focus line') 767 | // .style("stroke-width","0.5px") 768 | // .attr('d', avgLine); 769 | 770 | var priceChart = focus 771 | .append('path') 772 | .datum(selectFruit[0].values) 773 | .attr('class', 'chart__line chart__price--focus line') 774 | .style('stroke-width', '1.7px') 775 | .style('stroke', ' #2980b9') 776 | .attr('d', priceLine) 777 | 778 | focus 779 | .append('g') 780 | .attr('class', 'x axis') 781 | .attr('transform', 'translate(0 ,' + height + ')') 782 | .call(xAxis) 783 | .append('text') 784 | .attr('class', 'label') 785 | .attr('x', width) 786 | .attr('y', -6) 787 | .style('text-anchor', 'end') 788 | .style('fill', 'darkOrange') 789 | .text('Date') 790 | 791 | focus 792 | .append('g') 793 | .attr('class', 'y axis') 794 | .attr('transform', 'translate(12, 0)') 795 | .call(yAxis) 796 | .append('text') 797 | .attr('class', 'label') 798 | .attr('x', -20) 799 | .attr('y', -30) 800 | .attr('dy', '.71em') 801 | .attr('transform', 'rotate(-90)') 802 | .style('text-anchor', 'end') 803 | .style('fill', 'darkOrange') 804 | .text('Stock price in USD') 805 | 806 | var focusGraph = barsGroup 807 | .selectAll('rect') 808 | .data(selectFruit[0].values) 809 | .enter() 810 | .append('rect') 811 | .attr('class', 'chart__bars') 812 | .attr('x', function (d, i) { 813 | return x(d.date) 814 | }) 815 | .attr('y', function (d) { 816 | return 155 - y3(d.price) 817 | }) 818 | .attr('width', 1) 819 | .style('fill', '#7161db') 820 | .attr('height', function (d) { 821 | return y3(d.price) 822 | }) 823 | 824 | var helper = svg33 825 | .append('g') 826 | .attr('class', 'chart__helper') 827 | .attr('transform', 'translate(' + 40 + ',' + 210 + ')') 828 | 829 | var helperText = helper 830 | .append('text') 831 | .style('fill', 'blue') 832 | .style('font-size', '12px') 833 | 834 | var helper1 = svg33 835 | .append('g') 836 | .attr('class', 'chart__helper1') 837 | .attr('transform', 'translate(' + 40 + ',' + 230 + ')') 838 | 839 | var helperText1 = helper1 840 | .append('text') 841 | .style('fill', 'blue') 842 | .style('font-size', '12px') 843 | 844 | var helper2 = svg33 845 | .append('g') 846 | .attr('class', 'chart__helper2') 847 | .attr('transform', 'translate(' + 40 + ',' + 250 + ')') 848 | 849 | var helperText2 = helper2 850 | .append('text') 851 | .style('fill', 'blue') 852 | .style('font-size', '12px') 853 | var priceTooltip = focus 854 | .append('g') 855 | .attr('class', 'chart__tooltip--price') 856 | .append('circle') 857 | .style('display', 'none') 858 | .attr('r', 2.5) 859 | 860 | // var averageTooltip = focus.append('g') 861 | // .attr('class', 'chart__tooltip--average') 862 | // .append('circle') 863 | // .style('display', 'none') 864 | // .attr('r', 2.5); 865 | 866 | // eslint-disable-next-line 867 | var mouseArea = svg 868 | .append('g') 869 | .attr('class', 'chart__mouse') 870 | .append('rect') 871 | .attr('class', 'chart__overlay') 872 | .attr('width', width) 873 | .attr('height', height) 874 | .attr( 875 | 'transform', 876 | 'translate(' + margin.left + ',' + margin.top + ')' 877 | ) 878 | .on('mouseover', function () { 879 | helper.style('display', null) 880 | helper1.style('display', null) 881 | helper2.style('display', null) 882 | priceTooltip.style('display', null) 883 | //averageTooltip.style('display', null); 884 | svg33.style('display', null) 885 | //d3.select("#svg3").remove(); 886 | }) 887 | .on('mouseout', function () { 888 | helper.style('display', 'none') 889 | helper1.style('display', 'none') 890 | helper2.style('display', 'none') 891 | priceTooltip.style('display', 'none') 892 | // averageTooltip.style('display', 'none'); 893 | //d3.select("#svg3").remove(); 894 | svg33.style('display', 'none') 895 | //d3.select("#svg3").remove(); 896 | }) 897 | .on('mousemove', mousemove) 898 | 899 | context 900 | .append('path') 901 | .datum(selectFruit[0].values) 902 | .attr('class', 'chart__area area') 903 | .attr('d', area2) 904 | .style('fill', '#67c0ed') 905 | 906 | context 907 | .append('g') 908 | .attr('class', 'x axis chart__axis--context') 909 | .attr('y', 0) 910 | .attr('transform', 'translate(0,' + (height2 - 22) + ')') 911 | .call(xAxis2) 912 | 913 | context 914 | .append('g') 915 | .attr('class', 'x brush') 916 | .call(brush) 917 | .selectAll('rect') 918 | .attr('y', -6) 919 | .attr('height', height2 + 7) 920 | 921 | function mousemove() { 922 | var x0 = x.invert(d3.mouse(this)[0]) 923 | var i = bisectDate(selectFruit[0].values, x0, 1) 924 | var d0 = selectFruit[0].values[i - 1] 925 | var d1 = selectFruit[0].values[i] 926 | var d = x0 - d0.date > d1.date - x0 ? d1 : d0 927 | var aColor = ['rgb(0, 255, 0)', 'rgb(255, 0, 0)', 'rgb(200,200,200)'] 928 | 929 | var r = 50 930 | var data1 = [{ value: d.tp }, { value: d.tn }, { value: d.tnn }] 931 | 932 | var p1 = svg33 933 | .append('svg') 934 | .attr('id', 'svg3') 935 | .attr('class', 'chart__pie') 936 | .data([data1]) 937 | .attr('width', width) 938 | .attr('height', height) 939 | .append('svg:g') 940 | 941 | svg33 942 | .append('text') 943 | .attr('x', 40) 944 | .attr('y', 8) 945 | .text('Pie chart of stock sentiment ratios') 946 | .style('font-size', '12px') 947 | .attr('alignment-baseline', 'middle') 948 | svg33 949 | .append('text') 950 | .attr('x', 40) 951 | .attr('y', 25) 952 | .text('calculated using FinBERT model') 953 | .style('font-size', '12px') 954 | .attr('alignment-baseline', 'middle') 955 | svg33 956 | .append('circle') 957 | .attr('cx', 170) 958 | .attr('cy', 70) 959 | .attr('r', 7) 960 | .style('fill', 'rgb(0, 255, 0)') 961 | svg33 962 | .append('circle') 963 | .attr('cx', 170) 964 | .attr('cy', 100) 965 | .attr('r', 7) 966 | .style('fill', 'rgb(255, 0, 0)') 967 | svg33 968 | .append('circle') 969 | .attr('cx', 170) 970 | .attr('cy', 130) 971 | .attr('r', 7) 972 | .style('fill', 'rgb(200,200,200)') 973 | svg33 974 | .append('text') 975 | .attr('x', 180) 976 | .attr('y', 70) 977 | .text('positive sentiment ratio') 978 | .style('font-size', '10px') 979 | .attr('alignment-baseline', 'middle') 980 | svg33 981 | .append('text') 982 | .attr('x', 180) 983 | .attr('y', 100) 984 | .text('negative sentiment ratio') 985 | .style('font-size', '10x') 986 | .attr('alignment-baseline', 'middle') 987 | svg33 988 | .append('text') 989 | .attr('x', 180) 990 | .attr('y', 130) 991 | .text('neutral sentiment ratio') 992 | .style('font-size', '10x') 993 | .attr('alignment-baseline', 'middle') 994 | //svg.append("circle").attr("cx",200).attr("cy",160).attr("r", 6).style("fill", "#404080") 995 | //svg.append("text").attr("x", 220).attr("y", 130).text("variable A").style("font-size", "15px").attr("alignment-baseline","middle") 996 | //svg.append("text").attr("x", 220).attr("y", 160).text("variable B").style("font-size", "15px").attr("alignment-baseline","middle") 997 | 998 | //var vis = d3.select('#chart').append("svg:svg").data([data]).attr("width", w).attr("height", h).append("svg:g").attr("transform", "translate(" + r + "," + r + ")"); 999 | 1000 | var pie = d3.layout.pie().value(function (d) { 1001 | return d.value 1002 | }) 1003 | 1004 | // Declare an arc generator function 1005 | var arc = d3.svg.arc().outerRadius(r) 1006 | 1007 | // Select paths, use arc generator to draw 1008 | var arcs = p1 1009 | .selectAll('g.slice') 1010 | .data(pie) 1011 | .enter() 1012 | .append('svg:g') 1013 | .attr('class', 'slice') 1014 | arcs 1015 | .append('svg:path') 1016 | .attr('fill', function (d, i) { 1017 | return aColor[i] 1018 | }) 1019 | .attr('d', function (d) { 1020 | return arc(d) 1021 | }) 1022 | p1.attr('transform', 'translate(' + 70 + ',' + 100 + ')') 1023 | /* svg33.append("text").attr("x", 40).attr("y", 240).text("Date").style("font-size", "10px").attr("alignment-baseline","middle") 1024 | svg33.append("text").attr("x", 40).attr("y", 270).text("Price").style("font-size", "10x").attr("alignment-baseline","middle") 1025 | svg33.append("text").attr("x", 40).attr("y", 290).text("# likes").style("font-size", "10x").attr("alignment-baseline","middle") 1026 | svg33.append("text").attr("x", 95).attr("y", 240).text(legendFormat(new Date(d.date))).style("font-size", "10px").attr("alignment-baseline","middle") 1027 | svg33.append("text").attr("x", 95).attr("y", 270).text(d.price).style("font-size", "10x").attr("alignment-baseline","middle")*/ 1028 | // svg33.append("text").attr("x", 95).attr("y", 290).text(d.lk).style("font-size", "10x").attr("alignment-baseline","middle") 1029 | helperText.text('Date: ' + legendFormat(new Date(d.date))) 1030 | helperText1.text('Stock Price: ' + d.price) 1031 | helperText2.text('#Twitter Likes: ' + d.lk) 1032 | priceTooltip.attr( 1033 | 'transform', 1034 | 'translate(' + x(d.date) + ',' + y(d.price) + ')' 1035 | ) 1036 | // averageTooltip.attr('transform', 'translate(' + x(d.date) + ',' + y(d.price+5*d.score) + ')'); 1037 | } 1038 | 1039 | function brushed() { 1040 | var ext = brush.extent() 1041 | if (!brush.empty()) { 1042 | x.domain(brush.empty() ? x2.domain() : brush.extent()) 1043 | y.domain([ 1044 | d3.min( 1045 | selectFruit[0].values.map(function (d) { 1046 | return d.date >= ext[0] && d.date <= ext[1] ? d.price : max 1047 | }) 1048 | ), 1049 | d3.max( 1050 | selectFruit[0].values.map(function (d) { 1051 | return d.date >= ext[0] && d.date <= ext[1] ? d.price : min 1052 | }) 1053 | ), 1054 | ]) 1055 | range.text( 1056 | legendFormat(new Date(ext[0])) + 1057 | ' - ' + 1058 | legendFormat(new Date(ext[1])) 1059 | ) 1060 | focusGraph.attr('x', function (d, i) { 1061 | return x(d.date) 1062 | }) 1063 | 1064 | var days = Math.ceil((ext[1] - ext[0]) / (24 * 3600 * 1000)) 1065 | focusGraph.attr('width', 40 > days ? ((40 - days) * 5) / 6 : 5) 1066 | } 1067 | 1068 | priceChart.attr('d', priceLine) 1069 | //averageChart.attr('d', avgLine); 1070 | focus.select('.x.axis').call(xAxis) 1071 | focus.select('.y.axis').call(yAxis) 1072 | } 1073 | 1074 | var dateRange = ['1w', '1m', '3m', '6m', '1y', '5y'] 1075 | for (var i = 0, l = dateRange.length; i < l; i++) { 1076 | var v = dateRange[i] 1077 | rangeSelection 1078 | .append('text') 1079 | .attr('class', 'chart__range-selection') 1080 | .text(v) 1081 | .attr('transform', 'translate(' + 18 * i + ', 0)') 1082 | .on('click', function (d) { 1083 | focusOnRange(this.textContent) 1084 | }) 1085 | } 1086 | 1087 | function focusOnRange(range) { 1088 | var today = new Date( 1089 | selectFruit[0].values[selectFruit[0].values.length - 1].date 1090 | ) 1091 | var ext = new Date( 1092 | selectFruit[0].values[selectFruit[0].values.length - 1].date 1093 | ) 1094 | 1095 | if (range === '1m') ext.setMonth(ext.getMonth() - 1) 1096 | 1097 | if (range === '1w') ext.setDate(ext.getDate() - 7) 1098 | 1099 | if (range === '3m') ext.setMonth(ext.getMonth() - 3) 1100 | 1101 | if (range === '6m') ext.setMonth(ext.getMonth() - 6) 1102 | 1103 | if (range === '1y') ext.setFullYear(ext.getFullYear() - 1) 1104 | 1105 | if (range === '5y') ext.setFullYear(ext.getFullYear() - 5) 1106 | 1107 | brush.extent([ext, today]) 1108 | brushed() 1109 | context.select('g.x.brush').call(brush.extent([ext, today])) 1110 | } 1111 | 1112 | //------------------------ 1113 | 1114 | // Select all of the grouped elements and update the data 1115 | var selectFruitGroups = svg.selectAll('.fruitGroups').data(selectFruit) 1116 | 1117 | // Select all the lines and transition to new positions 1118 | selectFruitGroups 1119 | .selectAll('path.line') 1120 | .data(function (d) { 1121 | return d.values 1122 | }) 1123 | .transition() 1124 | .duration(1000) 1125 | .attr('d', function (d) { 1126 | //return valueLine(d.values) 1127 | }) 1128 | 1129 | selectFruitGroups.exit().remove() 1130 | } 1131 | }) // end Data 1132 | 1133 | function type(d) { 1134 | return { 1135 | date: parseDate(d.Date), 1136 | price: +d.Close, 1137 | score: +d.positive_twitter, 1138 | tp: +d.positive_twitter, 1139 | tn: +d.negative_twitter, 1140 | tnn: +d.neutral_twitter, 1141 | comp: d.company, 1142 | volume: +d.volume, 1143 | lk: d.like_num_twitter, 1144 | } 1145 | } 1146 | } 1147 | 1148 | return ( 1149 |
1150 |
1151 |

Historical Stock Price Data by Company

1152 |
1153 |
1154 |

1155 | This graph allows you to view stock and social media sentiment data 1156 | for 5 companies between 2015-2020. 1157 |

1158 |
1159 |
1160 |
Select Company:
1161 |
1162 |

1163 |
1164 |
1165 | ) 1166 | } 1167 | -------------------------------------------------------------------------------- /frontend/src/components/Misc/Landing.js: -------------------------------------------------------------------------------- 1 | export default function Landing() { 2 | return ( 3 |
4 |
5 |

6 | Welcome to Team 6's Data Visualization Tool{' '} 7 |

8 |
9 |
16 |

17 | This tool using React + D3 + Flask to generate visualizations involing 18 | sentiment analysis of social media posts about Stocks, financial data 19 | about Stocks, and more! Click on a graph on the left to get started. 20 |

21 |
22 |
23 | ) 24 | } 25 | -------------------------------------------------------------------------------- /frontend/src/components/Prediction/Prediction.css: -------------------------------------------------------------------------------- 1 | path.line { 2 | fill: none; 3 | stroke: #000; 4 | } 5 | 6 | pre { 7 | font-family: Arial, sans-serif; 8 | font-size: 12px; 9 | } 10 | 11 | .tg { 12 | border-collapse: collapse; 13 | border-color: #bbb; 14 | border-spacing: 0; 15 | margin-top: 20px; 16 | } 17 | .tg td { 18 | background-color: #e0ffeb; 19 | border-color: #bbb; 20 | border-style: solid; 21 | border-width: 1px; 22 | color: #594f4f; 23 | font-family: Arial, sans-serif; 24 | font-size: 14px; 25 | overflow: hidden; 26 | padding: 10px 5px; 27 | word-break: normal; 28 | } 29 | .tg th { 30 | background-color: #9de0ad; 31 | border-color: #bbb; 32 | border-style: solid; 33 | border-width: 1px; 34 | color: #493f3f; 35 | font-family: Arial, sans-serif; 36 | font-size: 14px; 37 | font-weight: normal; 38 | overflow: hidden; 39 | padding: 10px 5px; 40 | word-break: normal; 41 | } 42 | .tg .tg-0lax { 43 | text-align: center; 44 | vertical-align: top; 45 | } 46 | .tg .tg-sjuo { 47 | background-color: #c2ffd6; 48 | text-align: center; 49 | vertical-align: top; 50 | } 51 | 52 | .y-axis { 53 | z-index: 100; 54 | } 55 | 56 | .dropdown-container, 57 | .desc-container, 58 | .title-container, 59 | .page-desc-container, 60 | .loading-container, 61 | .desc2-container { 62 | display: flex; 63 | justify-content: center; 64 | } 65 | 66 | .desc2, 67 | .desc { 68 | width: 60%; 69 | text-align: center; 70 | } 71 | 72 | .loading-container { 73 | margin-top: 50px; 74 | } 75 | -------------------------------------------------------------------------------- /frontend/src/components/Prediction/Prediction.js: -------------------------------------------------------------------------------- 1 | import * as d3 from 'd3' 2 | import './Prediction.css' 3 | import { Button } from 'react-bootstrap' 4 | import { useState } from 'react' 5 | import RingLoader from 'react-spinners/RingLoader' 6 | 7 | export default function Prediction() { 8 | const [loading, setLoading] = useState(false) 9 | 10 | const onSubmit = () => { 11 | let parseDate = d3.timeParse('%d %b %Y') 12 | var svg = d3.select('#graph2') 13 | svg.selectAll('.linegraph').remove() 14 | let xhr = new XMLHttpRequest() 15 | const url = 'http://localhost:5000/' 16 | xhr.open('POST', url, true) 17 | xhr.setRequestHeader('Content-Type', 'application/json') 18 | let weekData = [ 19 | [ 20 | 'Week', 21 | 'Reddit Sentiment', 22 | 'Number of Reddit Posts', 23 | 'Twitter Sentiment', 24 | 'Number of Tweets', 25 | 'Tweet Activity', 26 | ], 27 | [ 28 | '1', 29 | d3.select('#one_red_sent')['_groups'][0][0].value / 100, 30 | d3.select('#one_red')['_groups'][0][0].value / 100, 31 | d3.select('#one_tw_sent')['_groups'][0][0].value / 100, 32 | d3.select('#one_tw')['_groups'][0][0].value / 100, 33 | d3.select('#one_tw_ac')['_groups'][0][0].value / 100, 34 | ], 35 | [ 36 | '2', 37 | d3.select('#two_red_sent')['_groups'][0][0].value / 100, 38 | d3.select('#two_red')['_groups'][0][0].value / 100, 39 | d3.select('#two_tw_sent')['_groups'][0][0].value / 100, 40 | d3.select('#two_tw')['_groups'][0][0].value / 100, 41 | d3.select('#two_tw_ac')['_groups'][0][0].value / 100, 42 | ], 43 | [ 44 | '3', 45 | d3.select('#three_red_sent')['_groups'][0][0].value / 100, 46 | d3.select('#three_red')['_groups'][0][0].value / 100, 47 | d3.select('#three_tw_sent')['_groups'][0][0].value / 100, 48 | d3.select('#three_tw')['_groups'][0][0].value / 100, 49 | d3.select('#three_tw_ac')['_groups'][0][0].value / 100, 50 | ], 51 | [ 52 | '4', 53 | d3.select('#four_red_sent')['_groups'][0][0].value / 100, 54 | d3.select('#four_red')['_groups'][0][0].value / 100, 55 | d3.select('#four_tw_sent')['_groups'][0][0].value / 100, 56 | d3.select('#four_tw')['_groups'][0][0].value / 100, 57 | d3.select('#four_tw_ac')['_groups'][0][0].value / 100, 58 | ], 59 | [ 60 | '5', 61 | d3.select('#five_red_sent')['_groups'][0][0].value / 100, 62 | d3.select('#five_red')['_groups'][0][0].value / 100, 63 | d3.select('#five_tw_sent')['_groups'][0][0].value / 100, 64 | d3.select('#five_tw')['_groups'][0][0].value / 100, 65 | d3.select('#five_tw_ac')['_groups'][0][0].value / 100, 66 | ], 67 | ] 68 | let data2 = JSON.stringify({ 69 | company: d3.select('#select')['_groups'][0][0].value, 70 | window: d3.select('#select2')['_groups'][0][0].value, 71 | data: weekData, 72 | }) 73 | setLoading(true) 74 | xhr.send(data2) 75 | xhr.onreadystatechange = (e) => { 76 | console.log(xhr.responseText) 77 | if (xhr.readyState === XMLHttpRequest.DONE) { 78 | let x = JSON.parse(xhr.responseText) 79 | 80 | let preds = [] 81 | let actual = [] 82 | for (let i = 0; i < x['dates'].length; i++) { 83 | let date = x['dates'][i].split(' ') 84 | preds.push({ 85 | date: parseDate(date[1] + ' ' + date[2] + ' ' + date[3]), 86 | price: x['pred'][i], 87 | }) 88 | if (x['actual'].length > i) { 89 | actual.push({ 90 | date: parseDate(date[1] + ' ' + date[2] + ' ' + date[3]), 91 | price: x['actual'][i], 92 | }) 93 | } 94 | } 95 | 96 | generate_graph(preds, actual) 97 | setLoading(false) 98 | } 99 | } 100 | } 101 | 102 | const generate_graph = (preds, actual) => { 103 | var timeFormat = d3.timeFormat('%d %b %y') 104 | 105 | var margin = { top: 50, right: 200, bottom: 50, left: 75 } 106 | var width = window.innerWidth - margin.left - margin.right - 400 107 | var height = window.innerHeight - margin.top - margin.bottom - 20 108 | 109 | var n = preds.length 110 | 111 | var xScale = d3 112 | .scaleTime() 113 | .domain([preds[0].date, preds[n - 1].date]) 114 | .range([0, width]) 115 | 116 | let maxPrice = 0 117 | for (let i = 0; i < preds.length; i++) { 118 | if (actual.length > i) { 119 | if (actual[i].price > maxPrice) { 120 | maxPrice = actual[i].price 121 | } 122 | } 123 | if (preds[i].price > maxPrice) { 124 | maxPrice = preds[i].price 125 | } 126 | } 127 | 128 | var yScale = d3.scaleLinear().domain([0, maxPrice]).range([height, 0]) 129 | 130 | var line = d3 131 | .line() 132 | .x(function (d) { 133 | return xScale(d.date) 134 | }) 135 | .y(function (d) { 136 | return yScale(d.price) 137 | }) 138 | 139 | var svg = d3 140 | .select('#graph2') 141 | .append('svg') 142 | .attr('width', width + margin.left + margin.right) 143 | .attr('height', height + margin.top + margin.bottom) 144 | .attr('class', 'linegraph') 145 | .append('g') 146 | .attr('transform', 'translate(' + margin.left + ',' + margin.top + ')') 147 | 148 | svg 149 | .append('g') 150 | .attr('class', 'x-axis') 151 | .attr('transform', 'translate(0,' + height + ')') 152 | .call(d3.axisBottom(xScale).tickFormat(timeFormat)) 153 | 154 | svg 155 | .append('text') 156 | .attr('transform', 'translate(400,' + (height + 38) + ')') 157 | .text('Date') 158 | .style('font-size', '14px') 159 | 160 | svg.append('g').attr('class', 'y-axis').call(d3.axisLeft(yScale)) 161 | 162 | svg 163 | .append('text') 164 | .attr('transform', 'rotate(-90)') 165 | .attr('y', -50) 166 | .attr('x', -400) 167 | .text('Stock Price in Dollars') 168 | .style('font-size', '14px') 169 | 170 | svg 171 | .append('path') 172 | .datum(preds) 173 | .attr('class', 'line') 174 | .attr('d', line) 175 | .style('stroke', 'blue') 176 | 177 | svg 178 | .append('path') 179 | .datum(actual) 180 | .attr('class', 'line') 181 | .attr('d', line) 182 | .style('stroke', 'red') 183 | 184 | var legend = svg.append('g') 185 | 186 | svg 187 | .append('text') 188 | .attr('x', width / 2) 189 | .attr('y', 0 - margin.top / 2) 190 | .attr('text-anchor', 'middle') 191 | .style('font-size', '20px') 192 | .style('font-weight', 'bold') 193 | .text( 194 | 'Stock Price for ' + 195 | d3.select('#select')['_groups'][0][0].value + 196 | ' ' + 197 | 'at Window Size ' + 198 | d3.select('#select2')['_groups'][0][0].value 199 | ) 200 | 201 | legend 202 | .append('text') 203 | .attr('dx', width + 55) 204 | .attr('dy', 50) 205 | .text('Predicted Stock Price') 206 | .style('fill', 'blue') 207 | .style('font-size', '14px') 208 | 209 | legend 210 | .append('text') 211 | .attr('dx', width + 55) 212 | .attr('dy', 70) 213 | .text('Actual Stock Price') 214 | .style('fill', 'red') 215 | .style('font-size', '14px') 216 | 217 | let future_length = preds.length - actual.length 218 | let future_week = [0, 0, 0, 0, 0] 219 | let future_week_lengths = [0, 0, 0, 0, 0] 220 | for (let i = 0; i < future_length; i++) { 221 | future_week[i % 5] += preds[actual.length + i].price 222 | future_week_lengths[i % 5] += 1 223 | } 224 | console.log(future_week) 225 | console.log(future_week_lengths) 226 | for (let i = 0; i < future_week.length; i++) { 227 | future_week[i] = future_week[i] / future_week_lengths[i] 228 | } 229 | console.log(future_week) 230 | legend 231 | .append('text') 232 | .attr('dx', width + 55) 233 | .attr('dy', 280) 234 | .text('Prediction Average Price') 235 | .style('fill', 'black') 236 | .style('font-size', '13px') 237 | 238 | legend 239 | .append('text') 240 | .attr('dx', width + 55) 241 | .attr('dy', 300) 242 | .text('Week 1: $' + Math.round(future_week[0] * 100) / 100) 243 | .style('fill', 'black') 244 | .style('font-size', '13px') 245 | 246 | legend 247 | .append('text') 248 | .attr('dx', width + 55) 249 | .attr('dy', 320) 250 | .text('Week 2: $' + Math.round(future_week[1] * 100) / 100) 251 | .style('fill', 'black') 252 | .style('font-size', '13px') 253 | 254 | legend 255 | .append('text') 256 | .attr('dx', width + 55) 257 | .attr('dy', 340) 258 | .text('Week 3: $' + Math.round(future_week[2] * 100) / 100) 259 | .style('fill', 'black') 260 | .style('font-size', '13px') 261 | 262 | legend 263 | .append('text') 264 | .attr('dx', width + 55) 265 | .attr('dy', 360) 266 | .text('Week 4: $' + Math.round(future_week[3] * 100) / 100) 267 | .style('fill', 'black') 268 | .style('font-size', '13px') 269 | 270 | legend 271 | .append('text') 272 | .attr('dx', width + 55) 273 | .attr('dy', 380) 274 | .text('Week 5: $' + Math.round(future_week[4] * 100) / 100) 275 | .style('fill', 'black') 276 | .style('font-size', '13px') 277 | } 278 | 279 | return ( 280 |
281 |
282 |

Use Sentiment to Predict Future Stock Price

283 |
284 |
285 |

286 | This graph allows you to adjust different social media sentiment 287 | values and see the impact on a company's closing stock price. 288 |

289 |
290 | 291 | 292 | 293 | 294 | 295 | 296 | 297 | 298 | 299 | 300 | 301 | 302 | 303 | 304 | 311 | 318 | 325 | 332 | 339 | 340 | 341 | 342 | 349 | 356 | 363 | 370 | 377 | 378 | 379 | 380 | 387 | 394 | 401 | 408 | 415 | 416 | 417 | 418 | 425 | 432 | 439 | 446 | 453 | 454 | 455 | 456 | 463 | 470 | 477 | 484 | 491 | 492 | 493 |
Reddit SentimentNumber of Reddit PostsTwitter SentimentNumber of TweetsTweet Activity (Retweets and Likes)
Week 1 305 | 306 | 310 | 312 | 313 | 317 | 319 | 320 | 324 | 326 | 327 | 331 | 333 | 334 | 338 |
Week 2 343 | 344 | 348 | 350 | 351 | 355 | 357 | 358 | 362 | 364 | 365 | 369 | 371 | 372 | 376 |
Week 3 381 | 382 | 386 | 388 | 389 | 393 | 395 | 396 | 400 | 402 | 403 | 407 | 409 | 410 | 414 |
Week 4 419 | 420 | 424 | 426 | 427 | 431 | 433 | 434 | 438 | 440 | 441 | 445 | 447 | 448 | 452 |
Week 5 457 | 458 | 462 | 464 | 465 | 469 | 471 | 472 | 476 | 478 | 479 | 483 | 485 | 486 | 490 |
494 |
495 |

496 | Please select which company and window size you want to predict the 497 | next month for. A smaller window size means you are okay with more 498 | volatility in price. 499 |

500 |
501 |
502 |

503 | We are only predicting for January 2020, due to data limitations. We 504 | are confident in our model's ability to predict real time stock prices 505 | given real time access to stock + social media data. 506 |

507 |
508 | 509 |
510 | 517 | 522 | 531 |
532 |
533 | 539 |
540 |
541 | ) 542 | } 543 | -------------------------------------------------------------------------------- /frontend/src/components/Sentiment/Sentiment.css: -------------------------------------------------------------------------------- 1 | .title-container, 2 | .button-container, 3 | .graph-container, 4 | .desc-container, 5 | .loading-container { 6 | display: flex; 7 | justify-content: center; 8 | } 9 | div.tooltip { 10 | position: absolute; 11 | text-align: left; 12 | padding: 1.5rem; 13 | padding-left: 1rem; 14 | background: rgb(29, 161, 242); 15 | color: white; 16 | border-radius: 10px; 17 | pointer-events: none; 18 | font-size: 12px; 19 | width: 20%; 20 | } 21 | .title-container { 22 | margin-top: 20px; 23 | background-color: white; 24 | } 25 | .loading-container { 26 | margin-top: 50px; 27 | } 28 | .text-form-control { 29 | width: 50%; 30 | } 31 | 32 | .popover-container { 33 | display: flex; 34 | justify-content: flex-end; 35 | margin-right: 20px; 36 | } 37 | -------------------------------------------------------------------------------- /frontend/src/components/Sentiment/Sentiment.js: -------------------------------------------------------------------------------- 1 | import { useState } from 'react' 2 | import './Sentiment.css' 3 | import { Button, InputGroup, FormControl } from 'react-bootstrap' 4 | import * as d3 from 'd3' 5 | import RingLoader from 'react-spinners/RingLoader' 6 | 7 | export default function Sentiment() { 8 | const [loading, setLoading] = useState(false) 9 | 10 | const onSubmit = () => { 11 | if (loading) { 12 | return 13 | } 14 | var inputVal = document.getElementById('input').value 15 | if (inputVal !== inputVal.toUpperCase() || inputVal.length > 5) { 16 | alert('please enter a valid uppercase stock ticker') 17 | return 18 | } 19 | 20 | let xhr = new XMLHttpRequest() 21 | const url = 'http://localhost:5000/get_sentiment' 22 | xhr.open('POST', url, true) 23 | xhr.setRequestHeader('Content-Type', 'application/json') 24 | let data = JSON.stringify({ query: inputVal }) 25 | setLoading(true) 26 | xhr.send(data) 27 | xhr.onreadystatechange = (e) => { 28 | let jsonResponse = JSON.parse(xhr.responseText) 29 | if (jsonResponse.error) { 30 | alert(jsonResponse.error) 31 | } 32 | createGraph(jsonResponse) 33 | setLoading(false) 34 | } 35 | } 36 | 37 | const createGraph = (jsonData) => { 38 | const graph = document.getElementById('graph') 39 | graph.innerHTML = '' 40 | var maxCount = 0 41 | var data = [] 42 | for (let key in jsonData) { 43 | if (jsonData.hasOwnProperty(key)) { 44 | if (jsonData[key]['count'] > maxCount) { 45 | maxCount = jsonData[key]['count'] 46 | } 47 | data.push({ 48 | date: key, 49 | count: jsonData[key]['count'], 50 | negative_count: jsonData[key]['negative_count'], 51 | negative_favorites: jsonData[key]['negative_favorites'], 52 | negative_text: jsonData[key]['negative_text'], 53 | neutral_count: jsonData[key]['neutral_count'], 54 | neutral_favorites: jsonData[key]['neutral_favorites'], 55 | neutral_text: jsonData[key]['neutral_text'], 56 | positive_count: jsonData[key]['positive_count'], 57 | positive_favorites: jsonData[key]['positive_favorites'], 58 | positive_text: jsonData[key]['positive_text'], 59 | }) 60 | } 61 | } 62 | let margin = { top: 40, right: 30, bottom: 40, left: 50 }, 63 | width = 460 - margin.left - margin.right, 64 | height = 600 - margin.top - margin.bottom 65 | 66 | var svg = d3 67 | .select('#graph') 68 | .append('svg') 69 | .attr('width', width + margin.left + margin.right) 70 | .attr('height', height + margin.top + margin.bottom) 71 | .append('g') 72 | .attr('transform', 'translate(' + margin.left + ',' + margin.top + ')') 73 | 74 | var hover = d3 75 | .select('#popover') 76 | .append('div') 77 | .attr('class', 'tooltip') 78 | .style('opacity', 0) 79 | 80 | let subgroups = ['negative_count', 'neutral_count', 'positive_count'] 81 | 82 | let groups = data.map((d) => { 83 | return d.date 84 | }) 85 | 86 | // Add X axis 87 | const x = d3.scaleBand().domain(groups).range([0, width]).padding([0.2]) 88 | 89 | svg 90 | .append('g') 91 | .attr('transform', 'translate(0,' + height + ')') 92 | .call(d3.axisBottom(x).tickSizeOuter(0)) 93 | 94 | svg 95 | .append('text') 96 | .attr('transform', 'translate(0,' + (height + 40) + ')') 97 | .attr('x', width / 2 - margin.left / 2) 98 | .text('Date') 99 | .style('font-size', '14px') 100 | 101 | // Add Y axis 102 | var y = d3.scaleLinear().domain([0, maxCount]).range([height, 0]) 103 | svg.append('g').call(d3.axisLeft(y)) 104 | 105 | svg 106 | .append('text') 107 | .attr('transform', 'rotate(-90)') 108 | .attr('y', -40) 109 | .attr('x', -height / 2 - 50) 110 | .text('Number of Popular Tweets') 111 | .style('font-size', '14px') 112 | .style('font-color', 'black') 113 | 114 | const ticker = document.getElementById('input').value 115 | svg 116 | .append('text') 117 | .attr('x', width / 2) 118 | .attr('y', 0 - margin.top / 2) 119 | .attr('text-anchor', 'middle') 120 | .style('font-size', '20px') 121 | .style('font-weight', 'bold') 122 | .text(`${ticker} Sentiment Last 7 days`) 123 | 124 | // color palette = one color per subgroup 125 | var color = d3 126 | .scaleOrdinal() 127 | .domain(subgroups) 128 | .range(['#e41a1c', '#377eb8', '#4daf4a']) 129 | 130 | //stack the data? --> stack per subgroup 131 | var stackedData = d3.stack().keys(subgroups)(data) 132 | // Show the bars 133 | svg 134 | .append('g') 135 | .selectAll('g') 136 | // Enter in the stack data = loop key per key = group per group 137 | .data(stackedData) 138 | .enter() 139 | .append('g') 140 | .attr('fill', function (d) { 141 | return color(d.key) 142 | }) 143 | .selectAll('rect') 144 | // enter a second time = loop subgroup per subgroup to add all rectangles 145 | .data(function (d) { 146 | return d 147 | }) 148 | .enter() 149 | .append('rect') 150 | .attr('x', function (d) { 151 | return x(d.data.date) 152 | }) 153 | .attr('y', function (d) { 154 | return y(d[1]) 155 | }) 156 | .attr('height', function (d) { 157 | return y(d[0]) - y(d[1]) 158 | }) 159 | .attr('width', x.bandwidth()) 160 | .on('mouseover', function (_, d) { 161 | d3.select(this).transition().duration('50').attr('opacity', '.7') 162 | var tweet = 'No data available' 163 | var favorites = 'No data available' 164 | var type = 'Positive' 165 | 166 | if (d3.select(this).style('fill') === 'rgb(228, 26, 28)') { 167 | // Negative 168 | tweet = d.data.negative_text 169 | favorites = d.data.negative_favorites 170 | type = 'Negative' 171 | } else if (d3.select(this).style('fill') === 'rgb(55, 126, 184)') { 172 | // Neutral 173 | tweet = d.data.neutral_text 174 | favorites = d.data.neutral_favorites 175 | type = 'Neutral' 176 | } else { 177 | // Positive 178 | tweet = d.data.positive_text 179 | favorites = d.data.positive_favorites 180 | } 181 | 182 | let output = `
183 |

Top ${type} Tweet

184 |

${tweet}

185 |

Number of Favorites: ${favorites}

186 |

Date Posted: ${d.data.date}

187 |
` 188 | 189 | hover.transition().duration(50).style('opacity', 1) 190 | hover.html(output) 191 | }) 192 | .on('mouseout', function (d) { 193 | d3.select(this).transition().duration('50').attr('opacity', '1') 194 | hover.transition().duration('50').style('opacity', 0) 195 | }) 196 | } 197 | 198 | if (loading) { 199 | const graph = document.getElementById('graph') 200 | graph.innerHTML = '' 201 | } 202 | 203 | return ( 204 |
205 |
206 |

Get Sentiment of Recent Tweets

207 |
208 |
209 |

210 | This graph pulls the latest "popular" tweets from the past 7 days 211 | according to the Twitter API and analyzes the sentiment of these 212 | tweets with finBert 213 |

214 |
215 |
216 | 217 | 223 | 224 | 225 | 236 |
237 |
238 | 244 |
245 |
246 |
247 |
248 | ) 249 | } 250 | -------------------------------------------------------------------------------- /frontend/src/index.css: -------------------------------------------------------------------------------- 1 | body { 2 | margin: 0; 3 | font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen', 4 | 'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue', 5 | sans-serif; 6 | -webkit-font-smoothing: antialiased; 7 | -moz-osx-font-smoothing: grayscale; 8 | } 9 | 10 | html, 11 | body { 12 | height: 100%; 13 | overflow: hidden; 14 | } 15 | 16 | code { 17 | font-family: source-code-pro, Menlo, Monaco, Consolas, 'Courier New', 18 | monospace; 19 | } 20 | 21 | * { 22 | font-family: Roboto; 23 | } 24 | -------------------------------------------------------------------------------- /frontend/src/index.js: -------------------------------------------------------------------------------- 1 | import React from 'react' 2 | import ReactDOM from 'react-dom' 3 | import './index.css' 4 | import App from './App' 5 | 6 | ReactDOM.render( 7 | 8 | 9 | , 10 | document.getElementById('root') 11 | ) 12 | --------------------------------------------------------------------------------