├── .gitattributes ├── CONTRIBUTING.md ├── Docs ├── Alphavantage.rst ├── AlternativeData.rst ├── Commodities.rst ├── Crypto.rst ├── Equities.rst ├── FRED.rst ├── FX.rst ├── FXCM.rst ├── FinancialDatasets.rst ├── FixedIncome.rst ├── FundamentalAnalysis.rst ├── Fundamentals.rst ├── IEX.rst ├── JupyterNotebooks │ ├── Alphavantage.ipynb │ ├── EOD.ipynb │ ├── FRED.ipynb │ ├── FXCM.ipynb │ ├── FundamentalAnalysis.ipynb │ ├── IEX.ipynb │ ├── Oanda.ipynb │ ├── financial-data-webscraping.ipynb │ ├── finviz.ipynb │ ├── quandl.ipynb │ ├── stooq.ipynb │ ├── yahoofinancials.ipynb │ └── yfinance.ipynb ├── Macroeconomic.rst ├── News.rst ├── Oanda.rst ├── OptionFuture.rst ├── Realtime.rst ├── Sources.rst ├── Stooq.rst ├── YahooFinance.rst ├── finviz.rst ├── oanda_example.cfg └── quandl.rst ├── README.md ├── conf.py ├── feature_tracker_table.csv ├── homepage_tabs.rst ├── index.rst ├── requirements.txt └── spelling_wordlist.txt /.gitattributes: -------------------------------------------------------------------------------- 1 | # Auto detect text files and perform LF normalization 2 | * text=auto 3 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | ## Contributing to FinAILAb-Datasets 2 | 3 | If you are interested in contributing to FinAILAb-Datasets, your contributions will fall 4 | into two categories: 5 | 1. You want to propose a new Data source and include it in the documentation 6 | - Create an python code about your intended data set and include the details under the "Data sources" and link the items to whichever section is good. 7 | 2. You want to implement a new item: 8 | - Look at the roadmap here: https://github.com/tatsath/FinAILabDatasets/projects/2 9 | - Pick a to do list or in progress list and comment on the task that you want to work on this feature. 10 | 11 | Once you finish implementing a additional items, please send a Pull Request to 12 | https://github.com/tatsath/FinAILabDatasets/ 13 | 14 | 15 | If you are not familiar with creating a Pull Request, here are some guides: 16 | - http://stackoverflow.com/questions/14680711/how-to-do-a-github-pull-request 17 | - https://help.github.com/articles/creating-a-pull-request/ 18 | 19 | 20 | ## Developing FinAILAb-Datasets 21 | 22 | To develop FinAILAb-Datasets on your machine, here are some tips: 23 | 24 | 1. Clone a copy of FinAILAb-Datasets from source: 25 | 26 | ```bash 27 | git clone https://github.com/tatsath/FinAILabDatasets 28 | cd https://github.com/tatsath/FinAILabDatasets 29 | ``` 30 | 31 | 2. Install Stable-Baselines in develop mode, with support for building the docs and running tests: 32 | 33 | ```bash 34 | pip install -e .[docs,tests] 35 | ``` 36 | 37 | ## Codestyle 38 | 39 | Please document each function/method and [type](https://google.github.io/pytype/user_guide.html) them using the following template: 40 | 41 | ```python 42 | 43 | def my_function(arg1: type1, arg2: type2) -> returntype: 44 | """ 45 | Short description of the function. 46 | 47 | :param arg1: (type1) describe what is arg1 48 | :param arg2: (type2) describe what is arg2 49 | :return: (returntype) describe what is returned 50 | """ 51 | ... 52 | return my_variable 53 | ``` 54 | 55 | 56 | -------------------------------------------------------------------------------- /Docs/Alphavantage.rst: -------------------------------------------------------------------------------- 1 | .. 
_Alphavantage: 2 | 3 | Alphavantage 4 | ============ 5 | 6 | Alpha Vantage provides enterprise-grade financial market data through a set of powerful and developer-friendly APIs. To set up this environment you will need to have an API key, it can be straightly taken from the documentation here. 7 | 8 | 9 | .. note:: 10 | Refer to `Alphavantage Jupyter Notebook `_ for more details. 11 | 12 | Table of Contents 13 | ----------------- 14 | 15 | - `Installation`_ 16 | - `Usage`_ 17 | - `Symbol Search`_ 18 | - `Historical Price and Volume for 1 Stock`_ 19 | - `Adding Time Periods`_ 20 | - `Frequency Setting`_ 21 | - `Stock Splits and Dividends`_ 22 | - `Foreign Exchange`_ 23 | - `Cryptocurrencies`_ 24 | - `Mutual Funds`_ 25 | - `Treasury Rates`_ 26 | - `Stock Fundamentals`_ 27 | - `Financials`_ 28 | - `Stream Realtime Data`_ 29 | - `Economic Indicators`_ 30 | - `Technical Indicators`_ 31 | - `Sector Performance`_ 32 | 33 | .. _Jupyter Notebook: JupyterNotebooks/Alphavantage.ipynb 34 | 35 | Installation 36 | ------------ 37 | 38 | .. note:: 39 | Before working with this API, you will need to obtain 40 | a key from `AlphaVantage's Website `_ 41 | 42 | To install the package use: 43 | 44 | .. code:: ipython3 45 | 46 | pip install alpha_vantage 47 | 48 | Or install with pandas support 49 | 50 | .. code:: ipython3 51 | 52 | pip install alpha_vantage pandas 53 | 54 | Or install from the source 55 | 56 | .. code:: ipython3 57 | 58 | git clone https://github.com/RomelTorres/alpha_vantage.git 59 | pip install -e alpha_vantage 60 | 61 | Usage 62 | ----- 63 | 64 | Import all necessary libraries: 65 | 66 | .. code:: ipython3 67 | 68 | from alpha_vantage.timeseries import TimeSeries 69 | import pandas as pd 70 | import time 71 | import requests 72 | from io import BytesIO 73 | 74 | .. code:: ipython3 75 | 76 | key = 'insert your unique key here' 77 | 78 | Symbol Search 79 | ------------- 80 | 81 | For checking to see if the equity, commodity, mutual fund, etc. you want is available on Alphavantage: 82 | 83 | .. note:: 84 | This example, and the following, also demonstrate how to convert an Alphavantage dictionary 85 | into a Pandas DataFrame for easier data analysis. 86 | 87 | .. code:: ipython3 88 | 89 | symbol_to_search = 'TSLA' 90 | url = 'https://www.alphavantage.co/query?function=SYMBOL_SEARCH&keywords='+symbol_to_search+'&apikey={key}' 91 | r = requests.get(url) 92 | data = r.json() 93 | data = pd.DataFrame(data['bestMatches']) 94 | 95 | Historical Price and Volume for 1 Stock 96 | --------------------------------------- 97 | 98 | 99 | 100 | 101 | .. note:: 102 | See the data dictionary for adjustments to time frame. Daily, weekly, and monthly time frames are available for equities. 103 | 104 | .. code:: ipython3 105 | 106 | data = { 107 | "function": "TIME_SERIES_DAILY", # WEEKLY, MONTHLY possible 108 | "symbol": "TSLA", 109 | "apikey": key 110 | } 111 | r = requests.get(url, params=data) 112 | data = r.json() 113 | data = pd.DataFrame(data['Time Series (Daily)']).T 114 | 115 | Adding Time Periods 116 | ^^^^^^^^^^^^^^^^^^^ 117 | 118 | Shown below are the adjusted dictionaries for weekly and monthly time frames. 119 | 120 | .. 
code:: ipython3 121 | 122 | weekly = { 123 | "function": "DIGITAL_CURRENCY_WEEKLY", 124 | "symbol": "ETH", 125 | "market": 'CNY', 126 | "apikey": key 127 | } 128 | 129 | monthly = { 130 | "function": "DIGITAL_CURRENCY_MONTHLY", 131 | "symbol": "ETH", 132 | "market": 'CNY', 133 | "apikey": key 134 | } 135 | 136 | Frequency Setting 137 | ----------------- 138 | 139 | Outputs a similar Pandas DataFrame that breaks the OHLCV down into 1 minute intervals. 140 | 141 | .. code:: ipython3 142 | 143 | ticker = 'TSLA' 144 | interval = '1min' 145 | api_key = key 146 | 147 | api_url = f'https://www.alphavantage.co/query?function=TIME_SERIES_INTRADAY&symbol={ticker}&interval={interval}&apikey={api_key}' 148 | raw_df = requests.get(api_url).json() 149 | df = pd.DataFrame(raw_df[f'Time Series ({interval})']).T 150 | df = df.rename(columns = {'1. open': 'open', '2. high': 'high', '3. low': 'low', '4. close': 'close', '5. volume': 'volume'}) 151 | for i in df.columns: 152 | df[i] = df[i].astype(float) 153 | df.index = pd.to_datetime(df.index) 154 | df = df.iloc[::-1] 155 | df.tail() 156 | 157 | Stock Splits and Dividends 158 | ------------------------- 159 | 160 | Outputs a Pandas DataFrame with the DPS, Yield, Dividend Date and ExDate for the given ticker. 161 | 162 | .. code:: ipython3 163 | 164 | ticker = "IBM" 165 | url = 'https://www.alphavantage.co/query?function=OVERVIEW&symbol='+ticker+'&apikey={key}' 166 | r = requests.get(url) 167 | data = r.json() 168 | dividends = pd.DataFrame(data, index = ['Values']) 169 | dividends = dividends[['DividendPerShare', 'DividendYield', 'DividendDate', 'ExDividendDate']].T 170 | 171 | Financial Indices 172 | ----------------- 173 | 174 | .. note:: 175 | This feature requires a premium subscription. 176 | 177 | .. code:: ipython3 178 | 179 | index = "DJI" # FCHI, IXIC, ... 180 | url = 'https://www.alphavantage.co/query?function=TIME_SERIES_DAILY_ADJUSTED&symbol='+index+'&outputsize=full&apikey={key}' 181 | r = requests.get(url) 182 | data = r.json() 183 | 184 | 185 | Foreign Exchange 186 | ---------------- 187 | 188 | Outputs a dictionary with the exchange rate's OHLC values on the given time interval. 189 | 190 | .. code:: ipython3 191 | 192 | # Currency list: https://www.alphavantage.co/physical_currency_list/ 193 | currency_a = 'EUR' 194 | currency_b = 'USD' 195 | interval = '5min' # 1min, 5min, 15min, 30min, 60min 196 | url = 'https://www.alphavantage.co/query?function=FX_INTRADAY&from_symbol=EUR&to_symbol=USD&interval=5min&apikey=demo' 197 | r = requests.get(url) 198 | data = r.json() 199 | 200 | Alternatively, you can use the ``ForeignExchange`` library. 201 | 202 | .. code:: ipython3 203 | 204 | from alpha_vantage.foreignexchange import ForeignExchange 205 | from pprint import pprint 206 | cc = ForeignExchange(key='YOUR_API_KEY') 207 | # There is no metadata in this call 208 | data, _ = cc.get_currency_exchange_rate(from_currency='BTC',to_currency='USD') 209 | pprint(data) 210 | 211 | Cryptocurrencies 212 | ---------------- 213 | 214 | There are multiple ways to view data on cryptocurrencies. 215 | 216 | The first is using Alphavantage's API request which returns the OHLCV for the given crypto: 217 | 218 | .. code:: ipython3 219 | 220 | ticker = 'ETH' 221 | url = 'https://www.alphavantage.co/query?function=CRYPTO_INTRADAY&symbol='+ticker+'&market=USD&interval=5min&apikey={key}' 222 | r = requests.get(url) 223 | data = r.json() 224 | 225 | Another way is to import the ``CryptoCurrencies`` library, which allows for easy plotting: 226 | 227 | .. 
code:: ipython3 228 | 229 | from alpha_vantage.cryptocurrencies import CryptoCurrencies 230 | import matplotlib.pyplot as plt 231 | 232 | cc = CryptoCurrencies(key='YOUR_API_KEY', output_format='pandas') 233 | data, meta_data = cc.get_digital_currency_daily(symbol='BTC', market='CNY') 234 | data['4b. close (USD)'].plot() 235 | plt.tight_layout() 236 | plt.title('Daily close value for bitcoin (BTC)') 237 | plt.grid() 238 | plt.show() 239 | 240 | Lastly, we can view the excahnge rates for cryptos: 241 | 242 | .. code:: ipython3 243 | 244 | data = { 245 | "function": "CURRENCY_EXCHANGE_RATE", # WEEKLY, MONTHLY possible 246 | "from_currency": "ETH", 247 | "to_currency": 'USD', 248 | "apikey": key 249 | } 250 | r = requests.get(url, params=data) 251 | data = r.json() 252 | 253 | Mutual Funds 254 | --------------- 255 | 256 | Outputs a dictionary of the OHLCV values for the given mutual fund. 257 | 258 | .. code:: ipython3 259 | 260 | ticker = 'OMOIX' 261 | url = 'https://www.alphavantage.co/query?function=TIME_SERIES_DAILY&symbol='+ticker+'&apikey={key}' 262 | r = requests.get(url) 263 | data = r.json() 264 | 265 | Treasury Rates 266 | --------------- 267 | 268 | Outputs a dictionary of the daily, weekly, or monthly treasury rate. 269 | 270 | .. code:: ipython3 271 | 272 | treasury_yield = { 273 | "function": "TREASURY_YIELD", 274 | "interval": "weekly", # daily, monthly 275 | "maturity": "3month", # OPTIONAL 5year, 10year, 30year 276 | "apikey": key 277 | } 278 | r = requests.get(url, params=treasury_yield) 279 | data = r.json() 280 | 281 | Stock Fundamentals 282 | ------------------ 283 | 284 | Outputs a dictionary of various stock data, including: AssetType, Description, 285 | Sector, Address, Market Cap, EBITDA, PE, EPS, RPS, Profit Margin, Moving Averages, 286 | Revenue, and Beta. 287 | 288 | .. code:: ipython3 289 | 290 | ticker = "IBM" 291 | url = 'https://www.alphavantage.co/query?function=OVERVIEW&symbol='+ticker+'&apikey={key}' 292 | r = requests.get(url) 293 | data = r.json() 294 | 295 | Financials 296 | ---------- 297 | 298 | Outputs a dictionary containing the information for a company's balance sheet, cash flows, or income statement. 299 | 300 | .. code:: ipython3 301 | 302 | document = 'INCOME_STATEMENT' # BALANCE_SHEET, CASH_FLOW 303 | url = 'https://www.alphavantage.co/query?function='+document+'&symbol=IBM&apikey=demo' 304 | r = requests.get(url) 305 | data = r.json() 306 | 307 | Stream Realtime Data 308 | -------------------- 309 | 310 | Each invocation of the below function will produce the most up-to-date data on the given symbol. 311 | 312 | .. 
code:: ipython3 313 | 314 | def get_live_updates(symbol): 315 | api_key = key 316 | api_url = f'https://www.alphavantage.co/query?function=GLOBAL_QUOTE&symbol={symbol}&apikey={api_key}' 317 | raw_df = requests.get(api_url).json() 318 | attributes = {'attributes':['symbol', 'open', 'high', 'low', 'price', 'volume', 'latest trading day', 'previous close', 'change', 'change percent']} 319 | attributes_df = pd.DataFrame(attributes) 320 | values = [] 321 | for i in list(raw_df['Global Quote']): 322 | values.append(raw_df['Global Quote'][i]) 323 | values_dict = {'values':values} 324 | values_df = pd.DataFrame(values).rename(columns = {0:'values'}) 325 | frames = [attributes_df, values_df] 326 | df = pd.concat(frames, axis = 1, join = 'inner').set_index('attributes') 327 | return df 328 | 329 | ibm_updates = get_live_updates('IBM') 330 | ibm_updates 331 | 332 | Economic Indicators 333 | ------------------- 334 | 335 | Below are a few dictionaries that contain different economic indicators that can be plugged 336 | into the JSON request at the very bottom. 337 | 338 | .. code:: ipython3 339 | 340 | gdp = { 341 | "function": "REAL_GDP", 342 | "interval": "annual", # quarterly 343 | "apikey": key 344 | } 345 | treasury_yield = { 346 | "function": "TREASURY_YIELD", 347 | "interval": "weekly", # daily, monthly 348 | "maturity": "3month", # OPTIONAL 5year, 10year, 30year 349 | "apikey": key 350 | } 351 | federal_funds_rate = { 352 | "function": "FEDERAL_FUNDS_RATE", 353 | "interval": "weekly", # daily, monthly 354 | "apikey": key 355 | } 356 | cpi = { 357 | "function": "CPI", 358 | "interval": "weekly", # daily, monthly 359 | "apikey": key 360 | } 361 | inflation = { 362 | "function": "INFLATION", 363 | "interval": "weekly", # daily, monthly 364 | "apikey": key 365 | } 366 | consumer_sentiment = { 367 | "function": "CONSUMER_SENTIMENT", 368 | "apikey": key 369 | } 370 | unemployment = { 371 | "function": "UNEMPLOYMENT", 372 | "apikey": key 373 | } 374 | 375 | Below is the aforementioned JSON request, where you will replace the ``params`` variable. 376 | 377 | .. code:: ipython3 378 | 379 | r = requests.get(url, params=unemployment) # REPLACE 'params' with desired dict 380 | data = r.json() 381 | df = pd.DataFrame(data['data']) 382 | df = crypto_df.set_index("date") 383 | 384 | Technical Indicators 385 | -------------------- 386 | 387 | Below is the JSON request approach to getting data on various technical indicators. 388 | 389 | .. code:: ipython3 390 | 391 | popular_ti = { 392 | "function": "ADX", # REPLACE: EMA, RSI, ADX, SMA 393 | "symbol": "IBM", 394 | "interval": "weekly", 395 | "time_period": "10", 396 | "series_type": "open", 397 | "apikey": key 398 | } 399 | 400 | r = requests.get(url, params=popular_ti) 401 | data = r.json() 402 | 403 | Alternatively, you can use the ``TechIndicators`` library to achieve similar results. 404 | 405 | .. code:: ipython3 406 | 407 | from alpha_vantage.techindicators import TechIndicators 408 | import matplotlib.pyplot as plt 409 | 410 | ti = TechIndicators(key='YOUR_API_KEY', output_format='pandas') 411 | data, meta_data = ti.get_bbands(symbol='MSFT', interval='60min', time_period=60) 412 | data.plot() 413 | plt.title('BBbands indicator for MSFT stock (60 min)') 414 | plt.show() 415 | 416 | Sector Performance 417 | ------------------ 418 | 419 | Lastly, Alphavantage allows you to use the ``SectorPerformances`` library to 420 | view the realtime performance, by sector: 421 | 422 | .. 
code:: ipython3 423 | 424 | from alpha_vantage.sectorperformance import SectorPerformances 425 | import matplotlib.pyplot as plt 426 | 427 | sp = SectorPerformances(key='YOUR_API_KEY', output_format='pandas') 428 | data, meta_data = sp.get_sector() 429 | data['Rank A: Real-Time Performance'].plot(kind='bar') 430 | plt.title('Real Time Performance (%) per Sector') 431 | plt.tight_layout() 432 | plt.grid() 433 | plt.show() 434 | -------------------------------------------------------------------------------- /Docs/AlternativeData.rst: -------------------------------------------------------------------------------- 1 | .. _AlternativeData: 2 | 3 | 4 | Alternative Data 5 | ====================================== 6 | 7 | 8 | 9 | Volume and Price data: 10 | ================================================ 11 | - `Yahoo Finance `_ 12 | - `Alphavantage `_ 13 | 14 | 15 | 16 | https://aroussi.com/post/download-options-data 17 | https://towardsdatascience.com/a-comprehensive-guide-to-downloading-stock-prices-in-python-2cd93ff821d4 18 | 19 | Importing Many Stocks: 20 | ================================================ 21 | - `Yahoo Finance `_ 22 | 23 | Indices 24 | ================================================ 25 | - `Yahoo Finance `_ 26 | 27 | Stock Splits and Dividends 28 | ================================================ 29 | - `Yahoo Finance `_ 30 | - `Alphavantage `_ 31 | 32 | Technical Indicators 33 | ================================================ 34 | - `Alphavantage `_ 35 | 36 | 37 | Which algorithm should I use? 38 | ============================= 39 | 40 | There is no silver bullet in RL, depending on your needs and problem, you may choose one or the other. 41 | The first distinction comes from your action space, i.e., do you have discrete (e.g. LEFT, RIGHT, ...) 42 | or continuous actions (ex: go to a certain speed)? 43 | 44 | Some algorithms are only tailored for one or the other domain: ``DQN`` only supports discrete actions, where ``SAC`` is restricted to continuous actions. 45 | 46 | The second difference that will help you choose is whether you can parallelize your training or not, and how you can do it (with or without MPI?). 47 | If what matters is the wall clock training time, then you should lean towards ``A2C`` and its derivatives (PPO, ACER, ACKTR, ...). 48 | Take a look at the `Vectorized Environments `_ to learn more about training with multiple workers. 49 | 50 | To sum it up: 51 | 52 | Discrete Actions 53 | ---------------- 54 | 55 | .. note:: 56 | 57 | This covers ``Discrete``, ``MultiDiscrete``, ``Binary`` and ``MultiBinary`` spaces 58 | 59 | 60 | Discrete Actions - Single Process 61 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 62 | 63 | DQN with extensions (double DQN, prioritized replay, ...) and ACER are the recommended algorithms. 64 | DQN is usually slower to train (regarding wall clock time) but is the most sample efficient (because of its replay buffer). 65 | 66 | Discrete Actions - Multiprocessed 67 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 68 | 69 | You should give a try to PPO2, A2C and its successors (ACKTR, ACER). 70 | 71 | If you can multiprocess the training using MPI, then you should checkout PPO1 and TRPO. 72 | 73 | 74 | Continuous Actions 75 | ------------------ 76 | 77 | Continuous Actions - Single Process 78 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 79 | 80 | Current State Of The Art (SOTA) algorithms are ``SAC`` and ``TD3``. 81 | Please use the hyperparameters in the `RL zoo `_ for best results. 
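For illustration, a minimal sketch of training ``SAC`` (assuming ``stable_baselines`` is installed and using a simple Gym continuous-action task such as ``Pendulum-v0``; this mirrors the one-liner pattern shown elsewhere in this documentation and is not specific to any data source described here):

.. code-block:: python

    from stable_baselines import SAC

    # Train SAC with the default MlpPolicy on a continuous-action toy problem.
    # For serious use, take the tuned hyperparameters from the RL zoo.
    model = SAC('MlpPolicy', 'Pendulum-v0', verbose=1)
    model.learn(total_timesteps=50000)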
82 | 83 | 84 | Continuous Actions - Multiprocessed 85 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 86 | 87 | Take a look at PPO2, TRPO or A2C. Again, don't forget to take the hyperparameters from the `RL zoo `_ 88 | for continuous actions problems (cf *Bullet* envs). 89 | 90 | .. note:: 91 | 92 | Normalization is critical for those algorithms 93 | 94 | If you can use MPI, then you can choose between PPO1, TRPO and DDPG. 95 | 96 | 97 | Goal Environment 98 | ----------------- 99 | 100 | If your environment follows the ``GoalEnv`` interface (cf `HER <../modules/her.html>`_), then you should use 101 | HER + (SAC/TD3/DDPG/DQN) depending on the action space. 102 | 103 | 104 | .. note:: 105 | 106 | The number of workers is an important hyperparameters for experiments with HER. Currently, only HER+DDPG supports multiprocessing using MPI. 107 | 108 | 109 | 110 | Tips and Tricks when creating a custom environment 111 | ================================================== 112 | 113 | If you want to learn about how to create a custom environment, we recommend you read this `page `_. 114 | We also provide a `colab notebook `_ for 115 | a concrete example of creating a custom gym environment. 116 | 117 | Some basic advice: 118 | 119 | - always normalize your observation space when you can, i.e., when you know the boundaries 120 | - normalize your action space and make it symmetric when continuous (cf potential issue below) A good practice is to rescale your actions to lie in [-1, 1]. This does not limit you as you can easily rescale the action inside the environment 121 | - start with shaped reward (i.e. informative reward) and simplified version of your problem 122 | - debug with random actions to check that your environment works and follows the gym interface: 123 | 124 | 125 | We provide a helper to check that your environment runs without error: 126 | 127 | .. code-block:: python 128 | 129 | from stable_baselines.common.env_checker import check_env 130 | 131 | env = CustomEnv(arg1, ...) 132 | # It will check your custom environment and output additional warnings if needed 133 | check_env(env) 134 | 135 | 136 | If you want to quickly try a random agent on your environment, you can also do: 137 | 138 | .. code-block:: python 139 | 140 | env = YourEnv() 141 | obs = env.reset() 142 | n_steps = 10 143 | for _ in range(n_steps): 144 | # Random action 145 | action = env.action_space.sample() 146 | obs, reward, done, info = env.step(action) 147 | 148 | 149 | **Why should I normalize the action space?** 150 | 151 | 152 | Most reinforcement learning algorithms rely on a Gaussian distribution (initially centered at 0 with std 1) for continuous actions. 153 | So, if you forget to normalize the action space when using a custom environment, 154 | this can harm learning and be difficult to debug (cf attached image and `issue #473 `_). 155 | 156 | .. figure:: ../_static/img/mistake.png 157 | 158 | 159 | Another consequence of using a Gaussian is that the action range is not bounded. 160 | That's why clipping is usually used as a bandage to stay in a valid interval. 161 | A better solution would be to use a squashing function (cf ``SAC``) or a Beta distribution (cf `issue #112 `_). 162 | 163 | .. note:: 164 | 165 | This statement is not true for ``DDPG`` or ``TD3`` because they don't rely on any probability distribution. 
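As a concrete illustration of the earlier advice to rescale actions to lie in [-1, 1], a minimal sketch of a hypothetical Gym action wrapper (not part of any library referenced here) could look like this:

.. code-block:: python

    import gym
    import numpy as np

    class RescaleAction(gym.ActionWrapper):
        """Expose a [-1, 1] action space and map it back to the env's true bounds."""

        def __init__(self, env):
            super(RescaleAction, self).__init__(env)
            self.low = env.action_space.low
            self.high = env.action_space.high
            self.action_space = gym.spaces.Box(
                low=-1.0, high=1.0, shape=env.action_space.shape, dtype=np.float32
            )

        def action(self, action):
            # Linearly map the normalized action in [-1, 1] back to [low, high]
            return self.low + 0.5 * (action + 1.0) * (self.high - self.low)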
166 | 167 | 168 | 169 | Tips and Tricks when implementing an RL algorithm 170 | ================================================= 171 | 172 | When you try to reproduce a RL paper by implementing the algorithm, the `nuts and bolts of RL research `_ 173 | by John Schulman are quite useful (`video `_). 174 | 175 | We *recommend following those steps to have a working RL algorithm*: 176 | 177 | 1. Read the original paper several times 178 | 2. Read existing implementations (if available) 179 | 3. Try to have some "sign of life" on toy problems 180 | 4. Validate the implementation by making it run on harder and harder envs (you can compare results against the RL zoo) 181 | You usually need to run hyperparameter optimization for that step. 182 | 183 | You need to be particularly careful on the shape of the different objects you are manipulating (a broadcast mistake will fail silently cf `issue #75 `_) 184 | and when to stop the gradient propagation. 185 | 186 | A personal pick (by @araffin) for environments with gradual difficulty in RL with continuous actions: 187 | 188 | 1. Pendulum (easy to solve) 189 | 2. HalfCheetahBullet (medium difficulty with local minima and shaped reward) 190 | 3. BipedalWalkerHardcore (if it works on that one, then you can have a cookie) 191 | 192 | in RL with discrete actions: 193 | 194 | 1. CartPole-v1 (easy to be better than random agent, harder to achieve maximal performance) 195 | 2. LunarLander 196 | 3. Pong (one of the easiest Atari game) 197 | 4. other Atari games (e.g. Breakout) 198 | -------------------------------------------------------------------------------- /Docs/Commodities.rst: -------------------------------------------------------------------------------- 1 | .. _Commodities: 2 | 3 | Commodities 4 | =========== 5 | 6 | 7 | 8 | The Data of Commodities (including Crude Oil, Gold, Silver, etc.) from several indices for thousands of tickers across several frenquncies can be downloaded freely. 9 | There are many alternatives out there (YahooFinance, AlphaVantage, Quandl etc.). 10 | 11 | This page describes how to download the data from different sources. 12 | 13 | Single Commodity - Volume and Price data 14 | ----------------------------- 15 | 16 | The data can be obtained from the following sources. Click to view the code to retrieve it 17 | 18 | - `Yahoo Finance `_ 19 | 20 | - `Alphavantage `_ 21 | 22 | - `Quandl `_ 23 | 24 | - `FRED `_ 25 | 26 | 27 | Changing Time Period 28 | ----------------------------- 29 | 30 | - `Yahoo Finance `_ 31 | 32 | - `Alphavantage `_ 33 | 34 | - `Quandl `_ 35 | 36 | 37 | Realtime Data 38 | ----------------------------- 39 | 40 | - `Yahoo Finance `_ 41 | 42 | - `Alphavantage `_ 43 | -------------------------------------------------------------------------------- /Docs/Crypto.rst: -------------------------------------------------------------------------------- 1 | .. _Crypto: 2 | 3 | Cryptocurrencies 4 | ================ 5 | 6 | The Data of Cryptocurrencies (including Bitcoin, Etherium etc.) from several indices for thousands of tickers across several frenquncies can be downloaded freely. 7 | There are many alternatives out there (YahooFinance, AlphaVantage, Quandl etc.). 8 | 9 | This page describes how to download the data from different sources. 10 | 11 | Single Cryptocurrency - Volume and Price Data 12 | ----------------------------- 13 | 14 | The data can be obtained from the following sources. 
Click to view the code to retrieve it 15 | 16 | - `Yahoo Finance `_ 17 | 18 | - `Alphavantage `_ 19 | 20 | - `Quandl `_ 21 | 22 | - `FRED `_ 23 | 24 | 25 | Changing Time period 26 | ----------------------------- 27 | 28 | - `Yahoo Finance `_ 29 | 30 | - `Alphavantage `_ 31 | 32 | - `Quandl `_ 33 | 34 | 35 | Realtime Data 36 | ----------------------------- 37 | 38 | - `Yahoo Finance `_ 39 | 40 | - `Alphavantage `_ 41 | -------------------------------------------------------------------------------- /Docs/Equities.rst: -------------------------------------------------------------------------------- 1 | .. _Equities: 2 | 3 | Equities 4 | ======== 5 | 6 | The Data of Equities from 100+ indices for thousands of tickers across several frenquncies can be downloaded freely. 7 | There are many alternatives out there (yfinance, AlphaVantage, Quandl etc.). 8 | 9 | This page describes how to download the data from different sources. 10 | 11 | Single Stock- Volume and Price data 12 | ----------------------------------- 13 | 14 | The data can be obtained from the following sources. Click to view the code to retrieve it. 15 | 16 | - `Yahoo Finance `_ 17 | 18 | - `Alphavantage `_ 19 | 20 | - `Fundamental Analysis `_ 21 | 22 | 23 | - `Quandl `_ 24 | 25 | - `FRED `_ 26 | 27 | - `Stooq `_ 28 | 29 | - `Oanda `_ 30 | 31 | 32 | Indices 33 | ----------------------------- 34 | 35 | - `Yahoo Finance `_ 36 | 37 | - `Alphavantage `_ 38 | 39 | 40 | Stock Splits and Dividends 41 | ----------------------------- 42 | - `Yahoo Finance `_ 43 | 44 | - `Alphavantage `_ 45 | 46 | - `Quandl `_ 47 | 48 | 49 | 50 | Technical Indicators 51 | ----------------------------- 52 | - `Yahoo Finance `_ 53 | 54 | - `Alphavantage `_ 55 | 56 | 57 | Stock Fundamentals 58 | ----------------------------- 59 | - `Yahoo Finance `_ 60 | 61 | - `Alphavantage `_ 62 | 63 | Financials 64 | ----------------------------- 65 | 66 | - `Yahoo Finance `_ 67 | 68 | - `Alphavantage `_ 69 | 70 | - `Fundamental Analysis `_ 71 | 72 | Frequency Setting 73 | ----------------------------- 74 | 75 | - `Yahoo Finance `_ 76 | 77 | - `Alphavantage `_ 78 | 79 | Changing Time Period 80 | ----------------------------- 81 | 82 | - `Yahoo Finance `_ 83 | 84 | - `Alphavantage `_ 85 | 86 | - `Quandl `_ 87 | 88 | 89 | Realtime Data 90 | ----------------------------- 91 | 92 | - `Yahoo Finance `_ 93 | 94 | - `Alphavantage `_ 95 | -------------------------------------------------------------------------------- /Docs/FRED.rst: -------------------------------------------------------------------------------- 1 | .. _FRED: 2 | 3 | FRED 4 | ========= 5 | 6 | `FRED `_ is one of the richest source of economic data containing 816,000 US and international time series from 107 sources. 7 | 8 | .. note:: 9 | Refer to `FRED Jupyter Notebook `_ for more details. 10 | 11 | Table of Contents 12 | ----------------- 13 | 14 | - `Installation`_ 15 | - `Usage`_ 16 | - `Historical Price for 1 Stock`_ 17 | - `Many Stocks`_ 18 | - `Currencies`_ 19 | - `Cryptocurrencies`_ 20 | - `Mutual Funds`_ 21 | - `Treasury Rates`_ 22 | - `Sentiment`_ 23 | 24 | Installation 25 | ------------------ 26 | 27 | Install with pip: 28 | 29 | .. code:: ipython3 30 | 31 | pip install oandapyV20 32 | 33 | Or from Github: 34 | 35 | .. code:: ipython3 36 | 37 | pip install git+https://github.com/hootnot/oanda-api-v20.git 38 | 39 | Usage 40 | ----- 41 | 42 | Below are examples of how to get and plot data from datasets found in the 43 | Federal Reserve Economic Data database found `here `_. 
44 | 45 | To obtain the code needed for the API call, search the database, then locate the 46 | unique ID code next to the title. From there, the process follows the examples below. 47 | 48 | Import all necessary libraries: 49 | 50 | .. code:: ipython3 51 | 52 | import pandas_datareader as web 53 | import pandas as pd 54 | from matplotlib import pyplot as plt 55 | import seaborn as sns 56 | from datetime import datetime 57 | 58 | Historical Price for 1 Stock 59 | ---------------------------- 60 | 61 | Gets the S&P price data from the ``start`` to the ``end`` dates specified, 62 | and plots them. 63 | 64 | .. code:: ipython3 65 | 66 | # Specify time periods 67 | start = datetime(2010,1,1) 68 | end = datetime(2030,1,1) 69 | 70 | # create your DataReader object for the S&P 71 | SP500 = web.DataReader('SP500','fred',start,end) 72 | 73 | .. code:: ipython3 74 | 75 | sns.set() #run this to overide matplotlib 76 | SP500['SP500'].plot(title='S&P 500 Price',figsize=(20, 6)) 77 | 78 | # Use the below to save the chart as an image file 79 | plt.savefig('s&p500.png') 80 | 81 | Many Stocks 82 | ----------- 83 | 84 | Plots multiple market cap indices against each other. 85 | 86 | .. code:: ipython3 87 | 88 | mkt_cap = web.DataReader(['WILLLRGCAPGR', 'WILLSMLCAP'], 'fred',start,end) 89 | mkt_cap.plot(title = 'Wilshire Large-Cap compared to Small-Cap', secondary_y = "DGS10", figsize=(20, 6)) 90 | plt.tight_layout() 91 | 92 | Currencies 93 | --------------- 94 | 95 | Plots the exchange rate between the Yuan and the Dollar. 96 | 97 | .. code:: ipython3 98 | 99 | er = web.DataReader('AEXCHUS', 'fred',start,end) 100 | er.plot(title = 'Chinese Yuan Renminbi to U.S. Dollar Spot Exchange Rate', secondary_y = "DGS10", figsize=(20, 6)) 101 | plt.tight_layout() 102 | 103 | Cryptocurrencies 104 | --------------- 105 | 106 | Plots the price of bitcoin. 107 | 108 | .. code:: ipython3 109 | 110 | btc = web.DataReader('CBBTCUSD', 'fred',start,end) 111 | btc.plot(title = 'Bitcoin Price', secondary_y = "DGS10", figsize=(20, 6)) 112 | plt.tight_layout() 113 | 114 | 115 | 116 | 117 | Mutual Funds 118 | --------------- 119 | 120 | Plots the mutual fund assets. 121 | 122 | .. code:: ipython3 123 | 124 | mf = web.DataReader('BOGZ1LM193064005Q', 'fred',start,end) 125 | mf.plot(title = 'Households; Corporate Equities and Mutual Fund Shares; Asset, Market Value Levels', secondary_y = "DGS10", figsize=(20, 6)) 126 | plt.tight_layout() 127 | 128 | 129 | 130 | 131 | Treasury Rates 132 | --------------- 133 | 134 | Plots the treasury rate. 135 | 136 | .. code:: ipython3 137 | 138 | treasury = web.DataReader('TB3MS', 'fred',start,end) 139 | treasury.plot(title = '3-Month Treasury Bill Secondary Market Rate', secondary_y = "DGS10", figsize=(20, 6)) 140 | plt.tight_layout() 141 | 142 | Sentiment 143 | --------- 144 | 145 | Plots the U Michigan consumer sentiment. 146 | 147 | .. code:: ipython3 148 | 149 | sentiment = web.DataReader('UMCSENT', 'fred',start,end) 150 | sentiment.plot(title = 'U Michigan Consumer Sentiment', secondary_y = "DGS10", figsize=(20, 6)) 151 | plt.tight_layout() 152 | -------------------------------------------------------------------------------- /Docs/FX.rst: -------------------------------------------------------------------------------- 1 | .. _FX: 2 | 3 | FX 4 | == 5 | 6 | The Data of FX Spot from several indices for thousands of tickers across several frenquncies can be downloaded freely. 7 | There are many alternatives out there (YahooFinance, AlphaVantage, Quandl etc.). 
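For instance, a minimal sketch using the ``yfinance`` package (covered in the Yahoo Finance section) to pull EUR/USD spot prices; ticker symbols such as ``"EURUSD=X"`` follow Yahoo Finance's FX naming convention:

.. code:: ipython3

    import yfinance as yf

    # Daily EUR/USD spot quotes for the last five years
    eurusd = yf.download("EURUSD=X", period="5y")
    eurusd.tail()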
8 | 9 | This page describes how to download the data from different sources. 10 | 11 | FX Spot - Volume and Price Data: 12 | ------------------------------ 13 | 14 | The data can be obtained from the following sources. Click to view the code to retrieve it 15 | 16 | - `Yahoo Finance `_ 17 | 18 | - `FRED `_ 19 | 20 | 21 | Changing Time Period 22 | ----------------------------- 23 | 24 | - `Yahoo Finance `_ 25 | 26 | 27 | Realtime Data 28 | ----------------------------- 29 | 30 | - `Yahoo Finance `_ 31 | -------------------------------------------------------------------------------- /Docs/FXCM.rst: -------------------------------------------------------------------------------- 1 | .. _FXCM: 2 | 3 | FXCM 4 | ========= 5 | FXCM provides a API to interact with its trading platform. Among others, it allows the retrieval of historical data as well as of streaming data. In addition, it allows to place different types of orders and to read out account information. The overall goal is to allow the implementation automated, algortithmic trading programs. fxcmpy.py is a Python wrapper package for that API. 6 | 7 | .. note:: 8 | Refer to `documentation `_ for more details. 9 | 10 | Fetching the data 11 | ----------------- 12 | 13 | .. code:: ipython3 14 | 15 | import numpy as np 16 | import yfinance as yf 17 | 18 | Historical Price and Volume for 1 Stock 19 | --------------------------------------- 20 | 21 | .. code:: ipython3 22 | 23 | import numpy as np 24 | import yfinance as yf 25 | ticker = 'GE' 26 | yf.download(ticker) 27 | 28 | Adding Time Periods 29 | ------------------- 30 | 31 | .. code:: ipython3 32 | 33 | yf.download(ticker, start = "2014-01-01", end = "2018-12-31") 34 | GE = yf.download(ticker, start = "2014-01-01", end = "2018-12-31") 35 | GE.info() 36 | 37 | 38 | .. parsed-literal:: 39 | 40 | 41 | DatetimeIndex: 1257 entries, 2014-01-02 to 2018-12-28 42 | Data columns (total 6 columns): 43 | Open 1257 non-null float64 44 | High 1257 non-null float64 45 | Low 1257 non-null float64 46 | Close 1257 non-null float64 47 | Adj Close 1257 non-null float64 48 | Volume 1257 non-null int64 49 | dtypes: float64(5), int64(1) 50 | memory usage: 68.7 KB 51 | 52 | 53 | 54 | .. code:: ipython3 55 | 56 | yf.download(ticker, period = "ytd") 57 | yf.download(ticker, period = "1mo") 58 | yf.download(ticker, period = "5d") 59 | yf.download(ticker, period = "10y") 60 | 61 | 62 | Frequency Setting 63 | ----------------- 64 | 65 | .. code:: ipython3 66 | 67 | yf.download('GE',period='1mo',interval='1h') 68 | yf.download('GE',period='1mo',interval='5m') 69 | GE = yf.download('GE',period='5d',interval='5m') 70 | #Pre or post market data 71 | GE=yf.download('GE',prepost=True,period='5d',interval='5m') 72 | 73 | Stock Split and dividends 74 | ------------------------- 75 | 76 | .. code:: ipython3 77 | 78 | ticker = "AAPL" 79 | # action = True for dividend and Stock Split 80 | AAPL = yf.download(ticker, period="10y", actions = True) 81 | AAPL.head() 82 | 83 | .. code:: ipython3 84 | 85 | AAPL[AAPL["Dividends"]>0] 86 | AAPL.loc["2019-08-05":"2019-08-15"].diff() 87 | AAPL[AAPL["Stock Splits"] > 0] 88 | ticker = ['GE', 'AAPL','FB'] 89 | yf.download(ticker, period="5y") 90 | .. code:: ipython3 91 | 92 | stock=yf.download(ticker, period="5y").Close 93 | 94 | 95 | FInancial Indices 96 | ----------------- 97 | 98 | .. code:: ipython3 99 | 100 | index = ['^DJI', '^GSPC'] 101 | 102 | .. code:: ipython3 103 | 104 | stock = yf.download(index,period='10y').Close 105 | 106 | 107 | .. 
code:: ipython3 108 | 109 | #Total Return 110 | index = ['^DJITR', '^SP500TR'] 111 | 112 | .. code:: ipython3 113 | 114 | indexes = yf.download(index,period='10y').Close 115 | 116 | 117 | 118 | Currencies 119 | ---------- 120 | 121 | .. code:: ipython3 122 | 123 | #Tickers 124 | ticker1 = "EURUSD=X" 125 | ticker2 = "USDEUR=X" 126 | 127 | .. code:: ipython3 128 | 129 | yf.download(ticker1,period='5y') 130 | 131 | .. code:: ipython3 132 | 133 | yf.download(ticker2,period='5y') 134 | 135 | 136 | 137 | 138 | 139 | 140 | Crypto 141 | ------ 142 | 143 | .. code:: ipython3 144 | 145 | #Tickers 146 | ticker1 = ["BTC-USD", "ETH-USD"] 147 | 148 | .. code:: ipython3 149 | 150 | data = yf.download(ticker1,start='2019-08-01',end='2020-05-01') 151 | 152 | 153 | 154 | 155 | Mutual Funds 156 | ------------ 157 | 158 | .. code:: ipython3 159 | 160 | #Tickers 161 | #20+Y Treasury Bobd ETF and Vivoldi Multi-Strategy Fund Class 162 | ticker1 = ["TLT", "OMOIX"] 163 | 164 | .. code:: ipython3 165 | 166 | data = yf.download(ticker1,start='2019-08-01',end='2020-05-01') 167 | 168 | 169 | 170 | 171 | Treasury Rates 172 | --------------- 173 | 174 | .. code:: ipython3 175 | 176 | #10Y and 5Y Treasury Rates 177 | ticker1 = ["^TNX", "^FVX"] 178 | 179 | .. code:: ipython3 180 | 181 | data = yf.download(ticker1,period="5y") 182 | 183 | 184 | Stock Fundamentals 185 | ------------------ 186 | 187 | .. code:: ipython3 188 | 189 | ticker ="DIS" 190 | dis = yf.Ticker(ticker) 191 | 192 | .. code:: ipython3 193 | 194 | dis.ticker 195 | 196 | 197 | .. parsed-literal:: 198 | 199 | 'DIS' 200 | 201 | .. code:: ipython3 202 | 203 | data=dis.history() 204 | 205 | .. code:: ipython3 206 | 207 | ticker = ["MSFT","FB"] 208 | 209 | .. code:: ipython3 210 | 211 | for i in ticker: 212 | df.loc["{}".format(i)] = pd.Series(yf.Ticker(i).info) 213 | 214 | .. code:: ipython3 215 | 216 | df.info() 217 | 218 | Import Financials 219 | ----------------- 220 | 221 | .. code:: ipython3 222 | 223 | ticker ="DIS" 224 | dis = yf.Ticker(ticker) 225 | 226 | .. code:: ipython3 227 | 228 | dis.balance_sheet 229 | 230 | .. code:: ipython3 231 | 232 | dis.financials 233 | 234 | .. code:: ipython3 235 | 236 | dis.cashflow 237 | 238 | Put Call Option 239 | --------------- 240 | 241 | .. code:: ipython3 242 | 243 | ticker ="DIS" 244 | dis = yf.Ticker(ticker) 245 | 246 | .. code:: ipython3 247 | 248 | dis.option_chain() 249 | 250 | .. code:: ipython3 251 | 252 | calls = dis.option_chain()[0] 253 | calls 254 | 255 | .. code:: ipython3 256 | 257 | puts = dis.option_chain()[1] 258 | puts 259 | 260 | ### Stream Realtime Data 261 | 262 | .. code:: ipython3 263 | 264 | import time 265 | 266 | .. code:: ipython3 267 | 268 | ticker1 ="EURUSD=X" 269 | data = yf.download(ticker1,interval = '1m', period='1d') 270 | print(data.index[-1], data.iloc[-1,3]) 271 | #Every 5 second data corresponding to 5 seconds 272 | while True: 273 | time.sleep(5) 274 | data = yf.download(ticker1,interval = '1m', period='1d') 275 | print(data.index[-1], data.iloc[-1,3]) 276 | -------------------------------------------------------------------------------- /Docs/FinancialDatasets.rst: -------------------------------------------------------------------------------- 1 | .. _FinancialDatasets: 2 | 3 | =============== 4 | Financial Datasets 5 | =============== 6 | 7 | This page contains the details of the financial datasets 8 | 9 | Most of the library tries to follow a sklearn-like syntax for the Reinforcement Learning algorithms. 
10 | 11 | Here is a quick example of how to train and run PPO2 on a cartpole environment: 12 | 13 | .. code-block:: python 14 | 15 | import gym 16 | 17 | from stable_baselines.common.policies import MlpPolicy 18 | from stable_baselines.common.vec_env import DummyVecEnv 19 | from stable_baselines import PPO2 20 | 21 | env = gym.make('CartPole-v1') 22 | # Optional: PPO2 requires a vectorized environment to run 23 | # the env is now wrapped automatically when passing it to the constructor 24 | # env = DummyVecEnv([lambda: env]) 25 | 26 | model = PPO2(MlpPolicy, env, verbose=1) 27 | model.learn(total_timesteps=10000) 28 | 29 | obs = env.reset() 30 | for i in range(1000): 31 | action, _states = model.predict(obs) 32 | obs, rewards, dones, info = env.step(action) 33 | env.render() 34 | 35 | 36 | Or just train a model with a one liner if 37 | `the environment is registered in Gym `_ and if 38 | `the policy is registered `_: 39 | 40 | .. code-block:: python 41 | 42 | from stable_baselines import PPO2 43 | 44 | model = PPO2('MlpPolicy', 'CartPole-v1').learn(10000) 45 | 46 | 47 | .. figure:: https://cdn-images-1.medium.com/max/960/1*R_VMmdgKAY0EDhEjHVelzw.gif 48 | 49 | Define and train a RL agent in one line of code! 50 | -------------------------------------------------------------------------------- /Docs/FixedIncome.rst: -------------------------------------------------------------------------------- 1 | .. _FixedIncome: 2 | 3 | 4 | Fixed Income 5 | ============= 6 | 7 | The Data of Fixed Income Instruments including Treasury bills, corporate bonds, mutual funde etc. can be retrieved from several indices for thousands of tickers across several frenquncies can be downloaded freely. 8 | There are many alternatives out there (YahooFinance, AlphaVantage, Quandl etc.). 9 | 10 | This page describes how to download the data from different sources. 11 | 12 | Treasury Rates 13 | ----------------------------- 14 | 15 | The data can be obtained from the following sources. Click to view the code to retrieve it 16 | 17 | - `Yahoo Finance `_ 18 | 19 | - `Alphavantage `_ 20 | 21 | - `Quandl `_ 22 | 23 | - `FRED `_ 24 | 25 | Mutual Funds 26 | ----------------------------- 27 | 28 | The data can be obtained from the following sources. Click to view the code to retrieve it 29 | 30 | - `Yahoo Finance `_ 31 | 32 | - `Stooq `_ 33 | 34 | 35 | 36 | Changing Time period 37 | ----------------------------- 38 | 39 | - `Yahoo Finance `_ 40 | 41 | 42 | Realtime Data 43 | ----------------------------- 44 | 45 | - `Yahoo Finance `_ 46 | -------------------------------------------------------------------------------- /Docs/FundamentalAnalysis.rst: -------------------------------------------------------------------------------- 1 | .. _FundamentalAnalysis: 2 | 3 | .. note:: 4 | Refer to `FundamentalAnalysis Jupyter Notebook `_ for more details. 5 | 6 | FundamentalAnalysis 7 | =================== 8 | 9 | - `Installation`_ 10 | - `Usage`_ 11 | - `List all Commands`_ 12 | - `Historical Price and Volume for 1 Stock`_ 13 | - `Financial Ratios`_ 14 | - `Stock Fundamentals`_ 15 | - `Financials`_ 16 | - `Key Metrics`_ 17 | 18 | Installation 19 | ------------ 20 | 21 | .. note:: 22 | Before working with this API, you will need to obtain 23 | a key from `FinancialModellingPrep's API `_ 24 | 25 | Install with pip: 26 | 27 | .. code:: ipython3 28 | 29 | pip install FundamentalAnalysis 30 | 31 | Usage 32 | ----- 33 | 34 | .. note:: 35 | FundamentalAnalysis automatically uses Pandas DataFrames. 
36 | 37 | Import all necessary libraries: 38 | 39 | .. code:: ipython3 40 | 41 | import FundamentalAnalysis as fa 42 | import financedatabase as fd 43 | import pandas as pd 44 | 45 | ticker = "TSLA" 46 | api_key = "your api key" 47 | 48 | List all Commands 49 | ----------------- 50 | 51 | Gets all of the options from ``cryptocurrencies``, ``currencies``, ``equities``, ``etfs`` or ``funds`` 52 | that are available to be queried. 53 | 54 | .. code:: ipython3 55 | 56 | # Options: 'cryptocurrencies', 'currencies', 'equities', 'etfs' or 'funds' 57 | options = fd.show_options('cryptocurrencies', equities_selection=None, country=None, sector=None, industry=None) 58 | options = pd.DataFrame(options) 59 | options 60 | 61 | Lists all of the companies that the API offers to be queried. 62 | 63 | .. code:: ipython3 64 | 65 | # Show the available companies 66 | companies = fa.available_companies(api_key) 67 | companies.sort_values('symbol') 68 | 69 | Lists all of the exchanges available for access. 70 | 71 | .. code:: ipython3 72 | 73 | companies.exchange.unique() 74 | 75 | Historical Price and Volume for 1 Stock 76 | --------------------------------------- 77 | 78 | Gets the OHLCV for one stock, given the ``period`` and ``interval``. 79 | 80 | .. code:: ipython3 81 | 82 | # General stock data 83 | stock_data = fa.stock_data(ticker, period="ytd", interval="1d") 84 | 85 | # Detailed stock data 86 | stock_data_detailed = fa.stock_data_detailed(ticker, api_key, begin="2000-01-01", end="2020-01-01") 87 | stock_data_detailed 88 | 89 | Financial Ratios 90 | ---------------- 91 | 92 | .. warning:: 93 | This feature requires a premium subscription. 94 | 95 | .. code:: ipython3 96 | 97 | 98 | # Large set of in-depth ratios 99 | financial_ratios_annually = fa.financial_ratios(ticker, api_key, period="annual") 100 | financial_ratios_quarterly = fa.financial_ratios(ticker, api_key, period="quarter") 101 | 102 | Stock Fundamentals 103 | ------------------ 104 | 105 | .. code:: ipython3 106 | 107 | profile = fa.profile(ticker, api_key) 108 | profile 109 | 110 | Financials 111 | ---------- 112 | 113 | .. warning:: 114 | This feature requires a premium subscription. 115 | 116 | .. code:: ipython3 117 | 118 | ticker ="DIS" 119 | 120 | .. code:: ipython3 121 | 122 | # Balance Sheet statements 123 | balance_sheet_annually = fa.balance_sheet_statement(ticker, api_key, period="annual") 124 | balance_sheet_quarterly = fa.balance_sheet_statement(ticker, api_key, period="quarter") 125 | 126 | .. code:: ipython3 127 | 128 | # Income Statements 129 | income_statement_annually = fa.income_statement(ticker, api_key, period="annual") 130 | income_statement_quarterly = fa.income_statement(ticker, api_key, period="quarter") 131 | 132 | .. code:: ipython3 133 | 134 | # Cash Flow Statements 135 | cash_flow_statement_annually = fa.cash_flow_statement(ticker, api_key, period="annual") 136 | cash_flow_statement_quarterly = fa.cash_flow_statement(ticker, api_key, period="quarter") 137 | 138 | Key Metrics 139 | ----------- 140 | 141 | .. warning:: 142 | This feature requires a premium subscription. 143 | 144 | .. code:: ipython3 145 | 146 | # Key Metrics 147 | key_metrics_annually = fa.key_metrics(ticker, api_key, period="annual") 148 | key_metrics_quarterly = fa.key_metrics(ticker, api_key, period="quarter") 149 | 150 | Sentiment 151 | --------- 152 | 153 | Gets various ratings and scores for the given ``ticker``. 154 | 155 | .. 
code:: ipython3 156 | 157 | ratings = fa.rating(ticker, api_key) 158 | 159 | -------------------------------------------------------------------------------- /Docs/Fundamentals.rst: -------------------------------------------------------------------------------- 1 | .. _Fundamentals: 2 | 3 | 4 | Stock Fundamentals 5 | ================== 6 | 7 | The Data of fundamentals for several financial instruments across several frenquncies can be downloaded freely. 8 | There are many alternatives out there (YahooFinance, AlphaVantage, Quandl etc.). 9 | 10 | This page describes how to download the data from different sources. 11 | 12 | 13 | 14 | Stock Fundamentals 15 | ----------------------------- 16 | - `Yahoo Finance `_ 17 | 18 | - `Alphavantage `_ 19 | 20 | - `FinancialAnalysis `_ 21 | 22 | - `Quandl `_ 23 | 24 | - `FinViz `_ 25 | 26 | 27 | Financial Ratios 28 | ----------------------------- 29 | - `FinancialAnalysis `_ 30 | 31 | 32 | 33 | Financials 34 | ----------------------------- 35 | 36 | - `Yahoo Finance `_ 37 | 38 | - `Alphavantage `_ 39 | 40 | Frequency Setting(including High Frequncy) 41 | ----------------------------- 42 | 43 | - `Yahoo Finance `_ 44 | 45 | - `Alphavantage `_ 46 | 47 | Changing Time period 48 | ----------------------------- 49 | 50 | - `Yahoo Finance `_ 51 | 52 | - `Alphavantage `_ 53 | 54 | - `Quandl `_ 55 | 56 | 57 | Realtime Data 58 | ----------------------------- 59 | 60 | - `Yahoo Finance `_ 61 | 62 | - `Alphavantage `_ 63 | -------------------------------------------------------------------------------- /Docs/IEX.rst: -------------------------------------------------------------------------------- 1 | .. _IEX: 2 | 3 | .. note:: 4 | Refer to `IEX Jupyter Notebook `_ for more details. 5 | 6 | IEX 7 | === 8 | 9 | - `Installation`_ 10 | - `Usage`_ 11 | - `Show all Functions`_ 12 | - `Historical Price and Volume for 1 Stock`_ 13 | - `Adding Time Periods or Frequency`_ 14 | - `Stock Split and Dividends`_ 15 | - `Sentiment and News`_ 16 | - `Insider Trades`_` 17 | - `Stream Realtime Data`_ 18 | 19 | 20 | Installation 21 | ------------ 22 | 23 | .. note:: 24 | Before working with this API, you will need to obtain 25 | a key from `IEX Cloud `_ 26 | 27 | Install pyEX with pip: 28 | 29 | .. code:: ipython3 30 | 31 | pip install pyEX 32 | 33 | Usage 34 | ----- 35 | 36 | .. note:: 37 | This library will output a Pandas DataFrame when the function ends with "DF". 38 | Otherwise, they can easily be converted to a dataframe, as show in 39 | the `Stream Realtime Data`_ section. 40 | 41 | Import all necessary libraries: 42 | 43 | .. code:: ipython3 44 | 45 | import pandas as pd 46 | import pyEX as p 47 | import requests 48 | 49 | Show all Functions 50 | ------------------ 51 | 52 | The following command shows all functions available, 53 | all of which follow the same structure as the examples below. 54 | 55 | .. code:: ipython3 56 | 57 | [_ for _ in dir(p) if _.endswith('DF')] 58 | 59 | Historical Price and Volume for 1 Stock 60 | --------------------------------------- 61 | 62 | Outputs the OHLCV for the given ``ticker``. 63 | 64 | .. code:: ipython3 65 | 66 | history = conn.chartDF(ticker) 67 | 68 | Adding Time Periods or Frequency 69 | -------------------------------- 70 | 71 | Changing the ``timeframe`` variable adjusts the time frame 72 | and frequency of the OHLCV data. 73 | 74 | .. 
code:: ipython3 75 | 76 | timeframe = '5d' # up to 15 years, or minute-by-minute for the last 30 days 77 | history = conn.chartDF(ticker, timeframe=timeframe) 78 | 79 | Stock Split and Dividends 80 | ------------------------- 81 | 82 | .. warning:: 83 | This feature requires a premium subscription 84 | 85 | .. code:: ipython3 86 | 87 | timeframe = '6m' 88 | dividends = conn.dividendsDF(ticker) 89 | 90 | Sentiment and News 91 | ------------------ 92 | 93 | Outputs the headline, source, summary, URL and image of the given ``ticker``. 94 | 95 | .. code:: ipython3 96 | 97 | news = conn.newsDF(ticker, count=10) 98 | 99 | Insider Trades 100 | -------------- 101 | 102 | .. warning:: 103 | This feature requires a premium subscription 104 | 105 | trades = conn.insiderTransactionsDF(ticker) 106 | 107 | Stream Realtime Data 108 | -------------------- 109 | 110 | Each invocation of this function outputs all current data available for the 111 | ``ticker``. 112 | 113 | .. code:: ipython3 114 | 115 | ticker = 'GE' 116 | 117 | real_time = conn.quote(ticker) 118 | 119 | # convert to Pandas DataFrame 120 | real_time = pd.DataFrame(real_time, index = ['value']).T 121 | 122 | -------------------------------------------------------------------------------- /Docs/JupyterNotebooks/EOD.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Analyze Fundamental Stock Data" 8 | ] 9 | }, 10 | { 11 | "cell_type": "code", 12 | "execution_count": 3, 13 | "metadata": {}, 14 | "outputs": [ 15 | { 16 | "ename": "ModuleNotFoundError", 17 | "evalue": "No module named 'pandas._libs.join'", 18 | "output_type": "error", 19 | "traceback": [ 20 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", 21 | "\u001b[0;31mModuleNotFoundError\u001b[0m Traceback (most recent call last)", 22 | "\u001b[0;32m/var/folders/pk/ntxqk62j0_jb9d_f4plbpzv80000gn/T/ipykernel_19169/63620644.py\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# !pip install eod\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 2\u001b[0m \u001b[0;31m# Libraries\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 3\u001b[0;31m \u001b[0;32mimport\u001b[0m \u001b[0mpandas\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0mpd\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 4\u001b[0m \u001b[0;32mfrom\u001b[0m \u001b[0meod\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mEodHistoricalData\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 5\u001b[0m \u001b[0;32mfrom\u001b[0m \u001b[0mfunctools\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mreduce\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", 23 | "\u001b[0;32m~/Desktop/FinAILabDatasets/venv/lib/python3.8/site-packages/pandas/__init__.py\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[1;32m 48\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mpandas\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcore\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mconfig_init\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 49\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 50\u001b[0;31m from pandas.core.api import (\n\u001b[0m\u001b[1;32m 51\u001b[0m \u001b[0;31m# 
dtype\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 52\u001b[0m \u001b[0mInt8Dtype\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", 24 | "\u001b[0;32m~/Desktop/FinAILabDatasets/venv/lib/python3.8/site-packages/pandas/core/api.py\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[1;32m 27\u001b[0m \u001b[0mvalue_counts\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 28\u001b[0m )\n\u001b[0;32m---> 29\u001b[0;31m \u001b[0;32mfrom\u001b[0m \u001b[0mpandas\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcore\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0marrays\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mCategorical\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 30\u001b[0m \u001b[0;32mfrom\u001b[0m \u001b[0mpandas\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcore\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0marrays\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mboolean\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mBooleanDtype\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 31\u001b[0m from pandas.core.arrays.floating import (\n", 25 | "\u001b[0;32m~/Desktop/FinAILabDatasets/venv/lib/python3.8/site-packages/pandas/core/arrays/__init__.py\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[1;32m 9\u001b[0m \u001b[0;32mfrom\u001b[0m \u001b[0mpandas\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcore\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0marrays\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mfloating\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mFloatingArray\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 10\u001b[0m \u001b[0;32mfrom\u001b[0m \u001b[0mpandas\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcore\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0marrays\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minteger\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mIntegerArray\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 11\u001b[0;31m \u001b[0;32mfrom\u001b[0m \u001b[0mpandas\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcore\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0marrays\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minterval\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mIntervalArray\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 12\u001b[0m \u001b[0;32mfrom\u001b[0m \u001b[0mpandas\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcore\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0marrays\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmasked\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mBaseMaskedArray\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 13\u001b[0m \u001b[0;32mfrom\u001b[0m \u001b[0mpandas\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcore\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0marrays\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnumpy_\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mPandasArray\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", 26 | "\u001b[0;32m~/Desktop/FinAILabDatasets/venv/lib/python3.8/site-packages/pandas/core/arrays/interval.py\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[1;32m 80\u001b[0m )\n\u001b[1;32m 81\u001b[0m \u001b[0;32mfrom\u001b[0m \u001b[0mpandas\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcore\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mindexers\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mcheck_array_indexer\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 82\u001b[0;31m \u001b[0;32mfrom\u001b[0m 
\u001b[0mpandas\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcore\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mindexes\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mbase\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mensure_index\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 83\u001b[0m from pandas.core.ops import (\n\u001b[1;32m 84\u001b[0m \u001b[0minvalid_comparison\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", 27 | "\u001b[0;32m~/Desktop/FinAILabDatasets/venv/lib/python3.8/site-packages/pandas/core/indexes/base.py\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[1;32m 24\u001b[0m \u001b[0mlib\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 25\u001b[0m )\n\u001b[0;32m---> 26\u001b[0;31m \u001b[0;32mimport\u001b[0m \u001b[0mpandas\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_libs\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mjoin\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0mlibjoin\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 27\u001b[0m from pandas._libs.lib import (\n\u001b[1;32m 28\u001b[0m \u001b[0mis_datetime_array\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", 28 | "\u001b[0;31mModuleNotFoundError\u001b[0m: No module named 'pandas._libs.join'" 29 | ] 30 | } 31 | ], 32 | "source": [ 33 | "# !pip install eod\n", 34 | "# Libraries\n", 35 | "import pandas as pd\n", 36 | "from eod import EodHistoricalData\n", 37 | "from functools import reduce\n", 38 | "from datetime import datetime, timedelta" 39 | ] 40 | }, 41 | { 42 | "cell_type": "code", 43 | "execution_count": null, 44 | "metadata": {}, 45 | "outputs": [], 46 | "source": [ 47 | "# Importing and assigning the api key\n", 48 | "with open(\"../eodHistoricalData-API.txt\", \"r\") as f:\n", 49 | " api_key = f.read()" 50 | ] 51 | }, 52 | { 53 | "cell_type": "code", 54 | "execution_count": 5, 55 | "metadata": {}, 56 | "outputs": [], 57 | "source": [ 58 | " \n", 59 | "# EOD Historical Data client\n", 60 | "client = EodHistoricalData(\"6190a6350b56a7.50134497\")" 61 | ] 62 | }, 63 | { 64 | "cell_type": "markdown", 65 | "metadata": {}, 66 | "source": [ 67 | "# Formatting Fundamental Data" 68 | ] 69 | }, 70 | { 71 | "cell_type": "code", 72 | "execution_count": 7, 73 | "metadata": {}, 74 | "outputs": [], 75 | "source": [ 76 | "def getFundamentals(ticker):\n", 77 | " \"\"\"\n", 78 | " Returns the fundamental data from the financial data API. 
Combines the quarterly balance \n", 79 | " sheet, cash flow, income statement, and earnings for a specific stock ticker.\n", 80 | " \"\"\"\n", 81 | " \n", 82 | " # Getting data\n", 83 | " fund_data = client.get_fundamental_equity(ticker)\n", 84 | " \n", 85 | " # Financials\n", 86 | " bal = pd.DataFrame(fund_data['Financials']['Balance_Sheet']['quarterly']).T\n", 87 | " \n", 88 | " cf = pd.DataFrame(fund_data['Financials']['Cash_Flow']['quarterly']).T\n", 89 | " \n", 90 | " inc = pd.DataFrame(fund_data['Financials']['Income_Statement']['quarterly']).T\n", 91 | " \n", 92 | " # Earnings\n", 93 | " earn = pd.DataFrame(fund_data['Earnings']['History']).T\n", 94 | " \n", 95 | " # Merging them together\n", 96 | " df = reduce(\n", 97 | " lambda left,right: pd.merge(\n", 98 | " left,\n", 99 | " right,\n", 100 | " left_index=True, \n", 101 | " right_index=True, \n", 102 | " how='outer',\n", 103 | " suffixes=('', '_drop')\n", 104 | " ), \n", 105 | " [bal, cf, inc, earn]\n", 106 | " )\n", 107 | " \n", 108 | " # Dropping redundant date and duplicate columns\n", 109 | " dup_cols = [i for i in df.columns if \"date\" in i or \"Date\" in i or \"_drop\" in i]\n", 110 | " \n", 111 | " df = df.drop(dup_cols, axis=1)\n", 112 | " \n", 113 | " return df" 114 | ] 115 | }, 116 | { 117 | "cell_type": "code", 118 | "execution_count": 8, 119 | "metadata": {}, 120 | "outputs": [], 121 | "source": [ 122 | "def getPrices(df, ticker):\n", 123 | " \"\"\"\n", 124 | " Gets the stock price at the time for each date in the financial statements for\n", 125 | " the given ticker and dataframe of financial information.\n", 126 | " \"\"\"\n", 127 | " # Getting stock price at the time\n", 128 | " prices = client.get_prices_eod(ticker, period='d')\n", 129 | " \n", 130 | " prices = pd.DataFrame(prices).set_index('date')[['adjusted_close', 'close', 'volume']]\n", 131 | "\n", 132 | " # Converting to date time\n", 133 | " prices.index = pd.to_datetime(prices.index)\n", 134 | "\n", 135 | " # Filling in missing price data\n", 136 | " prices = prices.reindex(\n", 137 | " pd.date_range(prices.index[0], prices.index[-1]),\n", 138 | " method='ffill'\n", 139 | " )\n", 140 | " \n", 141 | " # Converting back to string for merging later\n", 142 | " prices.index = prices.index.strftime(\"%Y-%m-%d\")\n", 143 | " \n", 144 | " price_dates = [i for i in prices.index if i in df.index]\n", 145 | " \n", 146 | " prices = prices.loc[price_dates]\n", 147 | "\n", 148 | " # Joining together\n", 149 | " df = df.join(prices, how='outer')\n", 150 | " \n", 151 | " return df" 152 | ] 153 | }, 154 | { 155 | "cell_type": "code", 156 | "execution_count": 9, 157 | "metadata": {}, 158 | "outputs": [], 159 | "source": [ 160 | "def formatFundamentals(ticker, dropna=False):\n", 161 | " \"\"\"\n", 162 | " Formats the given ticker's fundamental and price data. 
Cleans up the data by dropping\n", 163 | " any empty/nan values if requested.\n", 164 | " \"\"\"\n", 165 | " \n", 166 | " # Getting fundamental data\n", 167 | " fund_data = getFundamentals(ticker)\n", 168 | " \n", 169 | " # Getting accompanying price data\n", 170 | " df = getPrices(fund_data, ticker)\n", 171 | " \n", 172 | " # Dropping if all items are na in respective row\n", 173 | " df = df.dropna(how='all')\n", 174 | " \n", 175 | " if dropna:\n", 176 | " # Dropping mostly nan columns and rows if requested\n", 177 | " df = df.dropna(\n", 178 | " axis=0,\n", 179 | " thresh=round(df.shape[0]*.3) # If 50% of the values in the row are Nans, drop the whole row\n", 180 | " ).dropna(\n", 181 | " axis=1,\n", 182 | " thresh=round(df.shape[1]*.3) # If 50% of the values in the columns are Nans, drop the whole column\n", 183 | " )\n", 184 | " \n", 185 | " return df" 186 | ] 187 | }, 188 | { 189 | "cell_type": "code", 190 | "execution_count": 10, 191 | "metadata": {}, 192 | "outputs": [ 193 | { 194 | "ename": "HTTPError", 195 | "evalue": "403 Client Error: Forbidden for url: https://eodhistoricaldata.com/api/fundamentals/TSLA?fmt=json&api_token=6190a6350b56a7.50134497", 196 | "output_type": "error", 197 | "traceback": [ 198 | "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m", 199 | "\u001b[1;31mHTTPError\u001b[0m Traceback (most recent call last)", 200 | "\u001b[1;32m\u001b[0m in \u001b[0;36m\u001b[1;34m\u001b[0m\n\u001b[1;32m----> 1\u001b[1;33m \u001b[0mdf\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mformatFundamentals\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;34m\"TSLA\"\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mdropna\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mTrue\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m", 201 | "\u001b[1;32m\u001b[0m in \u001b[0;36mformatFundamentals\u001b[1;34m(ticker, dropna)\u001b[0m\n\u001b[0;32m 6\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 7\u001b[0m \u001b[1;31m# Getting fundamental data\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m----> 8\u001b[1;33m \u001b[0mfund_data\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mgetFundamentals\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mticker\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 9\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 10\u001b[0m \u001b[1;31m# Getting accompanying price data\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", 202 | "\u001b[1;32m\u001b[0m in \u001b[0;36mgetFundamentals\u001b[1;34m(ticker)\u001b[0m\n\u001b[0;32m 6\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 7\u001b[0m \u001b[1;31m# Getting data\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m----> 8\u001b[1;33m \u001b[0mfund_data\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mclient\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mget_fundamental_equity\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mticker\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 9\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 10\u001b[0m \u001b[1;31m# Financials\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", 203 | "\u001b[1;32mD:\\Anaconda\\lib\\site-packages\\eod\\fundamental_economic_data\\fundamental_api\\fundamental_data.py\u001b[0m in \u001b[0;36mget_fundamental_equity\u001b[1;34m(self, symbol, 
**query_params)\u001b[0m\n\u001b[0;32m 34\u001b[0m \"\"\"\n\u001b[0;32m 35\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mendpoint\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mURL_FUNDAMENTAL\u001b[0m \u001b[1;33m+\u001b[0m \u001b[0msymbol\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mupper\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m---> 36\u001b[1;33m \u001b[1;32mreturn\u001b[0m \u001b[0msuper\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mhandle_request\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mendpoint\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mquery_params\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 37\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 38\u001b[0m \u001b[1;32mdef\u001b[0m \u001b[0mget_fundamentals_bulk\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mexchange\u001b[0m\u001b[1;33m:\u001b[0m\u001b[0mstr\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;33m**\u001b[0m\u001b[0mquery_params\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", 204 | "\u001b[1;32mD:\\Anaconda\\lib\\site-packages\\eod\\request_handler_class\\request_handler.py\u001b[0m in \u001b[0;36mhandle_request\u001b[1;34m(self, endpoint_url, query_params)\u001b[0m\n\u001b[0;32m 38\u001b[0m \u001b[1;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mresp\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mjson\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 39\u001b[0m \u001b[1;32melse\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m---> 40\u001b[1;33m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mresp\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mraise_for_status\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 41\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 42\u001b[0m \u001b[1;32mdef\u001b[0m \u001b[0m__append_fmt\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mdict_to_append\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", 205 | "\u001b[1;32mD:\\Anaconda\\lib\\site-packages\\requests\\models.py\u001b[0m in \u001b[0;36mraise_for_status\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 938\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 939\u001b[0m \u001b[1;32mif\u001b[0m \u001b[0mhttp_error_msg\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m--> 940\u001b[1;33m \u001b[1;32mraise\u001b[0m \u001b[0mHTTPError\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mhttp_error_msg\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mresponse\u001b[0m\u001b[1;33m=\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 941\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 942\u001b[0m \u001b[1;32mdef\u001b[0m \u001b[0mclose\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", 206 | "\u001b[1;31mHTTPError\u001b[0m: 403 Client Error: Forbidden for url: 
https://eodhistoricaldata.com/api/fundamentals/TSLA?fmt=json&api_token=6190a6350b56a7.50134497" 207 | ] 208 | } 209 | ], 210 | "source": [ 211 | "df = formatFundamentals(\"TSLA\", dropna=True)" 212 | ] 213 | }, 214 | { 215 | "cell_type": "code", 216 | "execution_count": null, 217 | "metadata": { 218 | "jupyter": { 219 | "outputs_hidden": true 220 | } 221 | }, 222 | "outputs": [], 223 | "source": [ 224 | "df" 225 | ] 226 | }, 227 | { 228 | "cell_type": "markdown", 229 | "metadata": {}, 230 | "source": [ 231 | "# Getting Fundamentals from Multiple Companies" 232 | ] 233 | }, 234 | { 235 | "cell_type": "code", 236 | "execution_count": null, 237 | "metadata": {}, 238 | "outputs": [], 239 | "source": [ 240 | "def getMultipleFunds(tickers, api_token):\n", 241 | " \"\"\"\n", 242 | " Gets fundamental data from multiple stock tickers given as a list. Returns\n", 243 | " a large dataframe containing the concatenated information for all the given\n", 244 | " tickers.\n", 245 | " \"\"\"\n", 246 | " \n", 247 | " # Verifying if the list of tickers is compatible\n", 248 | " available = client.get_exchange_symbols(\"US\")\n", 249 | "\n", 250 | " available = set(i['Code'] for i in available)\n", 251 | " \n", 252 | " tickers = [i for i in tickers if i in available]\n", 253 | " \n", 254 | " if len(tickers)==0:\n", 255 | " return \"No valid tickers found.\"\n", 256 | " \n", 257 | " # Iterating through the tickers\n", 258 | " dfs = {}\n", 259 | " \n", 260 | " for ticker in tickers:\n", 261 | " \n", 262 | " dfs[ticker] = formatFundamentals(ticker)\n", 263 | " \n", 264 | " \n", 265 | " return pd.concat(dfs, axis=0)\n", 266 | " " 267 | ] 268 | }, 269 | { 270 | "cell_type": "code", 271 | "execution_count": null, 272 | "metadata": {}, 273 | "outputs": [], 274 | "source": [ 275 | "df = getMultipleFunds([\"asdfase\"], api_key)" 276 | ] 277 | }, 278 | { 279 | "cell_type": "code", 280 | "execution_count": null, 281 | "metadata": {}, 282 | "outputs": [], 283 | "source": [ 284 | "df#.loc['TSLA']" 285 | ] 286 | }, 287 | { 288 | "cell_type": "code", 289 | "execution_count": null, 290 | "metadata": {}, 291 | "outputs": [], 292 | "source": [] 293 | }, 294 | { 295 | "cell_type": "code", 296 | "execution_count": null, 297 | "metadata": {}, 298 | "outputs": [], 299 | "source": [] 300 | } 301 | ], 302 | "metadata": { 303 | "kernelspec": { 304 | "display_name": "Python 3", 305 | "language": "python", 306 | "name": "python3" 307 | }, 308 | "language_info": { 309 | "codemirror_mode": { 310 | "name": "ipython", 311 | "version": 3 312 | }, 313 | "file_extension": ".py", 314 | "mimetype": "text/x-python", 315 | "name": "python", 316 | "nbconvert_exporter": "python", 317 | "pygments_lexer": "ipython3", 318 | "version": "3.8.9" 319 | } 320 | }, 321 | "nbformat": 4, 322 | "nbformat_minor": 4 323 | } 324 | -------------------------------------------------------------------------------- /Docs/JupyterNotebooks/FXCM.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": { 6 | "colab": { 7 | "base_uri": "https://localhost:8080/", 8 | "height": 476 9 | }, 10 | "id": "im0Iqppeu-iY", 11 | "outputId": "17a6e67f-ed03-4316-d7d8-d2ee425a6951" 12 | }, 13 | "source": [ 14 | "* [1. Problem Statement](#0)\n", 15 | "* [2. Getting Started - Load Libraries and Dataset](#1)\n", 16 | " * [2.1. Load Libraries](#1.1) \n", 17 | " * [2.2. 
Load Dataset](#1.2)" 18 | ] 19 | }, 20 | { 21 | "cell_type": "markdown", 22 | "metadata": {}, 23 | "source": [ 24 | "\n", 25 | "# 1. Problem Statement" 26 | ] 27 | }, 28 | { 29 | "cell_type": "markdown", 30 | "metadata": {}, 31 | "source": [ 32 | "Content:\n", 33 | " Stock\n", 34 | " Many stocks/Indices\n", 35 | " Crypto\n", 36 | " FX\n", 37 | " Finacial\n", 38 | " Real Time Data\n", 39 | " List of tickers available " 40 | ] 41 | }, 42 | { 43 | "cell_type": "code", 44 | "execution_count": 1, 45 | "metadata": {}, 46 | "outputs": [], 47 | "source": [ 48 | "import numpy as np\n", 49 | "import pandas as pd\n", 50 | "# !pip install fxcmpy\n" 51 | ] 52 | }, 53 | { 54 | "cell_type": "code", 55 | "execution_count": 1, 56 | "metadata": {}, 57 | "outputs": [ 58 | { 59 | "ename": "SyntaxError", 60 | "evalue": "invalid syntax (635105306.py, line 1)", 61 | "output_type": "error", 62 | "traceback": [ 63 | "\u001b[0;36m File \u001b[0;32m\"/var/folders/pk/ntxqk62j0_jb9d_f4plbpzv80000gn/T/ipykernel_17618/635105306.py\"\u001b[0;36m, line \u001b[0;32m1\u001b[0m\n\u001b[0;31m Content:\u001b[0m\n\u001b[0m ^\u001b[0m\n\u001b[0;31mSyntaxError\u001b[0m\u001b[0;31m:\u001b[0m invalid syntax\n" 64 | ] 65 | } 66 | ], 67 | "source": [ 68 | "Content:\n", 69 | " Crypto\n", 70 | " FXD\n", 71 | " 291167618\n", 72 | " sx1Ri\n", 73 | " 75f648c9c497b9b9498aad5dd5e631fe5efb3d70\n", 74 | " High Frequency Data\n", 75 | " List of tickers available " 76 | ] 77 | }, 78 | { 79 | "cell_type": "code", 80 | "execution_count": 1, 81 | "metadata": { 82 | "colab": { 83 | "base_uri": "https://localhost:8080/", 84 | "height": 861 85 | }, 86 | "id": "6a0GoVM_xpF0", 87 | "outputId": "a0864b8b-d934-4d72-fc61-38f7d9bc08eb" 88 | }, 89 | "outputs": [ 90 | { 91 | "name": "stderr", 92 | "output_type": "stream", 93 | "text": [ 94 | "|ERROR|2021-12-29 12:50:34,948|Socket returns unknown error.\n" 95 | ] 96 | }, 97 | { 98 | "ename": "ServerError", 99 | "evalue": "Can not connect to FXCM Server.", 100 | "output_type": "error", 101 | "traceback": [ 102 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", 103 | "\u001b[0;31mServerError\u001b[0m Traceback (most recent call last)", 104 | "\u001b[0;32m/var/folders/pk/ntxqk62j0_jb9d_f4plbpzv80000gn/T/ipykernel_18611/3007257472.py\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[1;32m 4\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mdatetime\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0mdt\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 5\u001b[0m \u001b[0mTOKEN\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m'63747df64b67d1dd9fb495a227354d35162a5faa'\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 6\u001b[0;31m \u001b[0mcon\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mfxcmpy\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mfxcmpy\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0maccess_token\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mTOKEN\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mlog_level\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'error'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mlog_file\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 7\u001b[0m \u001b[0mprint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcon\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget_instruments\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 8\u001b[0m \u001b[0mstart\u001b[0m 
\u001b[0;34m=\u001b[0m \u001b[0mdt\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdatetime\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;36m2019\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m12\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", 105 | "\u001b[0;32m~/Desktop/FinAILabDatasets/venv/lib/python3.8/site-packages/fxcmpy/fxcmpy.py\u001b[0m in \u001b[0;36m__init__\u001b[0;34m(self, access_token, config_file, log_file, log_level, server, proxy_url, proxy_port, proxy_type)\u001b[0m\n\u001b[1;32m 215\u001b[0m \u001b[0;32mraise\u001b[0m \u001b[0mServerError\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Can not find FXCM Server.'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 216\u001b[0m \u001b[0;32melif\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mconnection_status\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;34m'aborted'\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 217\u001b[0;31m \u001b[0;32mraise\u001b[0m \u001b[0mServerError\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Can not connect to FXCM Server.'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 218\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 219\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__collect_account_ids__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", 106 | "\u001b[0;31mServerError\u001b[0m: Can not connect to FXCM Server." 107 | ] 108 | } 109 | ], 110 | "source": [ 111 | "# 28ff5032b04279ad8ec9b4a52733e5451faa0cd7\n", 112 | "# 28ff5032b04279ad8ec9b4a52733e5451faa0cd7\n", 113 | "import fxcmpy \n", 114 | "import datetime as dt\n", 115 | "TOKEN = '63747df64b67d1dd9fb495a227354d35162a5faa'\n", 116 | "con = fxcmpy.fxcmpy(access_token=TOKEN, log_level='error', log_file=None)\n", 117 | "print(con.get_instruments())\n", 118 | "start = dt.datetime(2019, 12, 1)\n", 119 | "end = dt.datetime(2020, 1, 4)\n", 120 | "data = con.get_candles('SPX500', period='D1', start= start, end= end)\n", 121 | "data" 122 | ] 123 | }, 124 | { 125 | "cell_type": "code", 126 | "execution_count": null, 127 | "metadata": {}, 128 | "outputs": [], 129 | "source": [] 130 | }, 131 | { 132 | "cell_type": "code", 133 | "execution_count": null, 134 | "metadata": { 135 | "colab": { 136 | "base_uri": "https://localhost:8080/", 137 | "height": 170 138 | }, 139 | "id": "a24eFaCl8S0y", 140 | "outputId": "f0b9408d-bf34-4972-fdad-97d5e6f63617" 141 | }, 142 | "outputs": [ 143 | { 144 | "name": "stdout", 145 | "output_type": "stream", 146 | "text": [ 147 | "Collecting pyti\n", 148 | " Downloading https://files.pythonhosted.org/packages/0f/9a/913e5bc3c3e812b490338fc9096b608cf0e19d1c3cd5b1c2b58b77b69b85/pyti-0.2.2-py2.py3-none-any.whl\n", 149 | "Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from pyti) (1.17.4)\n", 150 | "Requirement already satisfied: pandas in /usr/local/lib/python3.6/dist-packages (from pyti) (0.25.3)\n", 151 | "Requirement already satisfied: pytz>=2017.2 in /usr/local/lib/python3.6/dist-packages (from pandas->pyti) (2018.9)\n", 152 | "Requirement already satisfied: python-dateutil>=2.6.1 in /usr/local/lib/python3.6/dist-packages (from pandas->pyti) (2.6.1)\n", 153 | "Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.6/dist-packages (from python-dateutil>=2.6.1->pandas->pyti) (1.12.0)\n", 
154 | "Installing collected packages: pyti\n", 155 | "Successfully installed pyti-0.2.2\n" 156 | ] 157 | } 158 | ], 159 | "source": [ 160 | "# https://fxcmpy.tpq.io/index.html\n", 161 | "! pip install pyti" 162 | ] 163 | }, 164 | { 165 | "cell_type": "markdown", 166 | "metadata": {}, 167 | "source": [ 168 | "### Streaming Data" 169 | ] 170 | }, 171 | { 172 | "cell_type": "code", 173 | "execution_count": 5, 174 | "metadata": {}, 175 | "outputs": [ 176 | { 177 | "ename": "ServerError", 178 | "evalue": "Can not connect to FXCM Server.", 179 | "output_type": "error", 180 | "traceback": [ 181 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", 182 | "\u001b[0;31mServerError\u001b[0m Traceback (most recent call last)", 183 | "\u001b[0;32m/var/folders/pk/ntxqk62j0_jb9d_f4plbpzv80000gn/T/ipykernel_15431/2711754635.py\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[0;32m----> 1\u001b[0;31m \u001b[0mcon\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mfxcmpy\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mfxcmpy\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mconfig_file\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'fxcm.cfg'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", 184 | "\u001b[0;32m~/Desktop/FinAILabDatasets/venv/lib/python3.8/site-packages/fxcmpy/fxcmpy.py\u001b[0m in \u001b[0;36m__init__\u001b[0;34m(self, access_token, config_file, log_file, log_level, server, proxy_url, proxy_port, proxy_type)\u001b[0m\n\u001b[1;32m 216\u001b[0m \u001b[0;32mraise\u001b[0m \u001b[0mServerError\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Can not find FXCM Server.'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 217\u001b[0m \u001b[0;32melif\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mconnection_status\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;34m'aborted'\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 218\u001b[0;31m \u001b[0;32mraise\u001b[0m \u001b[0mServerError\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Can not connect to FXCM Server.'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 219\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 220\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__collect_account_ids__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", 185 | "\u001b[0;31mServerError\u001b[0m: Can not connect to FXCM Server." 
186 | ] 187 | } 188 | ], 189 | "source": [] 190 | }, 191 | { 192 | "cell_type": "code", 193 | "execution_count": 1, 194 | "metadata": {}, 195 | "outputs": [ 196 | { 197 | "ename": "NameError", 198 | "evalue": "name 'con' is not defined", 199 | "output_type": "error", 200 | "traceback": [ 201 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", 202 | "\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)", 203 | "\u001b[0;32m/var/folders/pk/ntxqk62j0_jb9d_f4plbpzv80000gn/T/ipykernel_14669/3909221652.py\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# crypto example\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mcon\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msubscribe_market_data\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m'BTC/USD'\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m'XRP/USD'\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", 204 | "\u001b[0;31mNameError\u001b[0m: name 'con' is not defined" 205 | ] 206 | } 207 | ], 208 | "source": [ 209 | "# crypto example\n", 210 | "con.subscribe_market_data(['BTC/USD', 'XRP/USD'])" 211 | ] 212 | }, 213 | { 214 | "cell_type": "code", 215 | "execution_count": null, 216 | "metadata": { 217 | "id": "2-vq03aGNjOe" 218 | }, 219 | "outputs": [], 220 | "source": [ 221 | "from pyti.bollinger_bands import upper_bollinger_band as " 222 | ] 223 | } 224 | ], 225 | "metadata": { 226 | "colab": { 227 | "collapsed_sections": [], 228 | "name": "Copy of FXCM.ipynb", 229 | "provenance": [] 230 | }, 231 | "kernelspec": { 232 | "display_name": "Python 3", 233 | "language": "python", 234 | "name": "python3" 235 | }, 236 | "language_info": { 237 | "codemirror_mode": { 238 | "name": "ipython", 239 | "version": 3 240 | }, 241 | "file_extension": ".py", 242 | "mimetype": "text/x-python", 243 | "name": "python", 244 | "nbconvert_exporter": "python", 245 | "pygments_lexer": "ipython3", 246 | "version": "3.8.9" 247 | } 248 | }, 249 | "nbformat": 4, 250 | "nbformat_minor": 1 251 | } 252 | -------------------------------------------------------------------------------- /Docs/JupyterNotebooks/FundamentalAnalysis.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Fundamental Analysis" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "## Table of Contents\n", 15 | "\n", 16 | "* [Getting Started - Load Libraries and Dataset](#0) \n", 17 | "* [List Options](#1)\n", 18 | "* [List Companies](#2)\n", 19 | "* [List Exchanges](#3)\n", 20 | "* [Historical Data for One Stock](#4)\n", 21 | "* [Stock Fundamentals](#5)\n", 22 | "* [Sentiment](#6)\n", 23 | "* [Financial Statements](#7)\n", 24 | "* [Key Metrics](#8)\n", 25 | "* [Financial Ratios](#9)\n", 26 | "* [Growth](#10)" 27 | ] 28 | }, 29 | { 30 | "cell_type": "markdown", 31 | "metadata": {}, 32 | "source": [ 33 | "\n", 34 | "## Getting Started - Load Libraries and Dataset" 35 | ] 36 | }, 37 | { 38 | "cell_type": "code", 39 | "execution_count": null, 40 | "metadata": {}, 41 | "outputs": [], 42 | "source": [ 43 | "import FundamentalAnalysis as fa\n", 44 | "import financedatabase as fd\n", 45 | "import pandas as pd\n", 46 | "ticker = \"TSLA\"\n", 47 | "api_key = \"7b95034d43c6ccb530b841102ac7bc8f\"\n" 48 | ] 49 | 
}, 50 | { 51 | "cell_type": "markdown", 52 | "metadata": {}, 53 | "source": [ 54 | "The below command lists all options available to be queried" 55 | ] 56 | }, 57 | { 58 | "cell_type": "markdown", 59 | "metadata": {}, 60 | "source": [ 61 | "\n", 62 | "## List Options" 63 | ] 64 | }, 65 | { 66 | "cell_type": "code", 67 | "execution_count": null, 68 | "metadata": {}, 69 | "outputs": [], 70 | "source": [ 71 | "# Options: 'cryptocurrencies', 'currencies', 'equities', 'etfs' or 'funds'\n", 72 | "options = fd.show_options('cryptocurrencies', equities_selection=None, country=None, sector=None, industry=None)\n", 73 | "options = pd.DataFrame(options)\n", 74 | "options" 75 | ] 76 | }, 77 | { 78 | "cell_type": "markdown", 79 | "metadata": {}, 80 | "source": [ 81 | "\n", 82 | "## List Companies" 83 | ] 84 | }, 85 | { 86 | "cell_type": "code", 87 | "execution_count": null, 88 | "metadata": {}, 89 | "outputs": [], 90 | "source": [ 91 | "# Show the available companies\n", 92 | "companies = fa.available_companies(api_key)\n", 93 | "companies.sort_values('symbol')\n" 94 | ] 95 | }, 96 | { 97 | "cell_type": "markdown", 98 | "metadata": {}, 99 | "source": [ 100 | "\n", 101 | "## List Exchanges" 102 | ] 103 | }, 104 | { 105 | "cell_type": "code", 106 | "execution_count": null, 107 | "metadata": {}, 108 | "outputs": [], 109 | "source": [ 110 | "# Supported exchanges\n", 111 | "companies.exchange.unique()" 112 | ] 113 | }, 114 | { 115 | "cell_type": "markdown", 116 | "metadata": {}, 117 | "source": [ 118 | "\n", 119 | "## Historical Data for 1 Stock\n" 120 | ] 121 | }, 122 | { 123 | "cell_type": "code", 124 | "execution_count": null, 125 | "metadata": {}, 126 | "outputs": [], 127 | "source": [ 128 | "# General stock data\n", 129 | "stock_data = fa.stock_data(ticker, period=\"ytd\", interval=\"1d\")\n", 130 | "\n", 131 | "# Detailed stock data\n", 132 | "stock_data_detailed = fa.stock_data_detailed(ticker, api_key, begin=\"2000-01-01\", end=\"2020-01-01\")\n", 133 | "stock_data_detailed" 134 | ] 135 | }, 136 | { 137 | "cell_type": "markdown", 138 | "metadata": {}, 139 | "source": [ 140 | "\n", 141 | "## Stock Fundamentals\n" 142 | ] 143 | }, 144 | { 145 | "cell_type": "code", 146 | "execution_count": null, 147 | "metadata": {}, 148 | "outputs": [], 149 | "source": [ 150 | "# Company Profile\n", 151 | "profile = fa.profile(ticker, api_key)\n", 152 | "profile" 153 | ] 154 | }, 155 | { 156 | "cell_type": "code", 157 | "execution_count": null, 158 | "metadata": {}, 159 | "outputs": [], 160 | "source": [ 161 | "# Latest Quote\n", 162 | "quotes = fa.quote(ticker, api_key)\n", 163 | "quotes" 164 | ] 165 | }, 166 | { 167 | "cell_type": "markdown", 168 | "metadata": {}, 169 | "source": [ 170 | "\n", 171 | "## Sentiment" 172 | ] 173 | }, 174 | { 175 | "cell_type": "code", 176 | "execution_count": null, 177 | "metadata": {}, 178 | "outputs": [], 179 | "source": [ 180 | "# Analysts Recommendation/Ratings\n", 181 | "ratings = fa.rating(ticker, api_key)\n", 182 | "ratings" 183 | ] 184 | }, 185 | { 186 | "cell_type": "markdown", 187 | "metadata": {}, 188 | "source": [ 189 | "\n", 190 | "## Financial Statements\n", 191 | "- Premium Subscription Required" 192 | ] 193 | }, 194 | { 195 | "cell_type": "code", 196 | "execution_count": null, 197 | "metadata": {}, 198 | "outputs": [], 199 | "source": [ 200 | "# Obtain DCFs over time\n", 201 | "dcf_annually = fa.discounted_cash_flow(ticker, api_key, period=\"annual\")\n", 202 | "dcf_quarterly = fa.discounted_cash_flow(ticker, api_key, period=\"quarter\")" 203 | ] 204 | }, 205 | { 206 
| "cell_type": "code", 207 | "execution_count": null, 208 | "metadata": {}, 209 | "outputs": [], 210 | "source": [ 211 | "dcf_annually" 212 | ] 213 | }, 214 | { 215 | "cell_type": "code", 216 | "execution_count": null, 217 | "metadata": {}, 218 | "outputs": [], 219 | "source": [ 220 | "dcf_quarterly" 221 | ] 222 | }, 223 | { 224 | "cell_type": "code", 225 | "execution_count": null, 226 | "metadata": {}, 227 | "outputs": [], 228 | "source": [ 229 | "# Balance Sheet statements\n", 230 | "balance_sheet_annually = fa.balance_sheet_statement(ticker, api_key, period=\"annual\")\n", 231 | "balance_sheet_quarterly = fa.balance_sheet_statement(ticker, api_key, period=\"quarter\")" 232 | ] 233 | }, 234 | { 235 | "cell_type": "code", 236 | "execution_count": null, 237 | "metadata": {}, 238 | "outputs": [], 239 | "source": [ 240 | "balance_sheet_annually" 241 | ] 242 | }, 243 | { 244 | "cell_type": "code", 245 | "execution_count": null, 246 | "metadata": {}, 247 | "outputs": [], 248 | "source": [ 249 | "# Income Statements\n", 250 | "income_statement_annually = fa.income_statement(ticker, api_key, period=\"annual\")\n", 251 | "income_statement_quarterly = fa.income_statement(ticker, api_key, period=\"quarter\")" 252 | ] 253 | }, 254 | { 255 | "cell_type": "code", 256 | "execution_count": null, 257 | "metadata": {}, 258 | "outputs": [], 259 | "source": [ 260 | "income_statement_quarterly" 261 | ] 262 | }, 263 | { 264 | "cell_type": "code", 265 | "execution_count": null, 266 | "metadata": {}, 267 | "outputs": [], 268 | "source": [ 269 | "# Cash Flow Statements\n", 270 | "cash_flow_statement_annually = fa.cash_flow_statement(ticker, api_key, period=\"annual\")\n", 271 | "cash_flow_statement_quarterly = fa.cash_flow_statement(ticker, api_key, period=\"quarter\")" 272 | ] 273 | }, 274 | { 275 | "cell_type": "code", 276 | "execution_count": null, 277 | "metadata": {}, 278 | "outputs": [], 279 | "source": [ 280 | "cash_flow_statement_annually" 281 | ] 282 | }, 283 | { 284 | "cell_type": "markdown", 285 | "metadata": {}, 286 | "source": [ 287 | "\n", 288 | "## Key Metrics" 289 | ] 290 | }, 291 | { 292 | "cell_type": "code", 293 | "execution_count": null, 294 | "metadata": {}, 295 | "outputs": [], 296 | "source": [ 297 | "# Key Metrics\n", 298 | "key_metrics_annually = fa.key_metrics(ticker, api_key, period=\"annual\")\n", 299 | "key_metrics_quarterly = fa.key_metrics(ticker, api_key, period=\"quarter\")" 300 | ] 301 | }, 302 | { 303 | "cell_type": "code", 304 | "execution_count": null, 305 | "metadata": {}, 306 | "outputs": [], 307 | "source": [ 308 | "key_metrics_quarterly" 309 | ] 310 | }, 311 | { 312 | "cell_type": "markdown", 313 | "metadata": {}, 314 | "source": [ 315 | "\n", 316 | "## Financial Ratios" 317 | ] 318 | }, 319 | { 320 | "cell_type": "code", 321 | "execution_count": null, 322 | "metadata": {}, 323 | "outputs": [], 324 | "source": [ 325 | "# Large set of in-depth ratios\n", 326 | "financial_ratios_annually = fa.financial_ratios(ticker, api_key, period=\"annual\")\n", 327 | "financial_ratios_quarterly = fa.financial_ratios(ticker, api_key, period=\"quarter\")\n", 328 | "financial_ratios_quarterly" 329 | ] 330 | }, 331 | { 332 | "cell_type": "markdown", 333 | "metadata": {}, 334 | "source": [ 335 | "\n", 336 | "## Growth" 337 | ] 338 | }, 339 | { 340 | "cell_type": "code", 341 | "execution_count": null, 342 | "metadata": {}, 343 | "outputs": [], 344 | "source": [ 345 | "# Growth of the company\n", 346 | "growth_annually = fa.financial_statement_growth(ticker, api_key, period=\"annual\")\n", 347 
| "growth_quarterly = fa.financial_statement_growth(ticker, api_key, period=\"quarter\")\n", 348 | "growth_quarterly" 349 | ] 350 | } 351 | ], 352 | "metadata": { 353 | "kernelspec": { 354 | "display_name": "Python 3", 355 | "language": "python", 356 | "name": "python3" 357 | }, 358 | "language_info": { 359 | "codemirror_mode": { 360 | "name": "ipython", 361 | "version": 3 362 | }, 363 | "file_extension": ".py", 364 | "mimetype": "text/x-python", 365 | "name": "python", 366 | "nbconvert_exporter": "python", 367 | "pygments_lexer": "ipython3", 368 | "version": "3.8.9" 369 | } 370 | }, 371 | "nbformat": 4, 372 | "nbformat_minor": 4 373 | } 374 | -------------------------------------------------------------------------------- /Docs/JupyterNotebooks/Oanda.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Oanda" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "## Table of Contents\n", 15 | "\n", 16 | "* [Getting Started - Load Libraries and Dataset](#0)\n", 17 | "* [Getting Historical Data - Currency](#1)\n", 18 | "* [Setting the Frequency (High-frequency Intraday)](#2)\n", 19 | "* [Streaming High-frequency Realtime Data](#3)\n" 20 | ] 21 | }, 22 | { 23 | "cell_type": "markdown", 24 | "metadata": {}, 25 | "source": [ 26 | "\n", 27 | "## Getting Started - Load Libraries and Dataset" 28 | ] 29 | }, 30 | { 31 | "cell_type": "code", 32 | "execution_count": null, 33 | "metadata": {}, 34 | "outputs": [], 35 | "source": [ 36 | "import pandas as pd\n", 37 | "import tpqoa" 38 | ] 39 | }, 40 | { 41 | "cell_type": "code", 42 | "execution_count": null, 43 | "metadata": {}, 44 | "outputs": [], 45 | "source": [ 46 | "api = tpqoa.tpqoa(\"oanda.cfg\")" 47 | ] 48 | }, 49 | { 50 | "cell_type": "markdown", 51 | "metadata": {}, 52 | "source": [ 53 | "\n", 54 | "## Getting Historical Data - Currency\n" 55 | ] 56 | }, 57 | { 58 | "cell_type": "code", 59 | "execution_count": null, 60 | "metadata": { 61 | "scrolled": true 62 | }, 63 | "outputs": [], 64 | "source": [ 65 | "ticker = \"US30_USD\"\n", 66 | "start = \"2018-09-01\"\n", 67 | "end = \"2019-09-01\"\n", 68 | "\n", 69 | "api.get_history(ticker, start, end, \"D\", \"B\")" 70 | ] 71 | }, 72 | { 73 | "cell_type": "code", 74 | "execution_count": null, 75 | "metadata": { 76 | "scrolled": true 77 | }, 78 | "outputs": [], 79 | "source": [ 80 | "api.get_history(\"EUR_USD\", \"2018-09-01\", \"2019-09-01\", \"D\", \"B\")" 81 | ] 82 | }, 83 | { 84 | "cell_type": "code", 85 | "execution_count": null, 86 | "metadata": { 87 | "scrolled": true 88 | }, 89 | "outputs": [], 90 | "source": [ 91 | "api.get_history(\"EUR_USD\", \"2018-09-01\", \"2019-09-01\", \"D\", \"A\")" 92 | ] 93 | }, 94 | { 95 | "cell_type": "markdown", 96 | "metadata": {}, 97 | "source": [ 98 | "\n", 99 | "## Setting the Frequency (High-frequency Intraday)" 100 | ] 101 | }, 102 | { 103 | "cell_type": "code", 104 | "execution_count": null, 105 | "metadata": { 106 | "scrolled": true 107 | }, 108 | "outputs": [], 109 | "source": [ 110 | "api.get_history(\"EUR_USD\", \"2019-08-01\", \"2019-09-01\", \"M1\", \"B\")" 111 | ] 112 | }, 113 | { 114 | "cell_type": "code", 115 | "execution_count": null, 116 | "metadata": { 117 | "scrolled": true 118 | }, 119 | "outputs": [], 120 | "source": [ 121 | "api.get_history(\"EUR_USD\", \"2019-09-01\", \"2019-09-04\", \"S5\", \"B\")" 122 | ] 123 | }, 124 | { 125 | "cell_type": "code", 126 | 
"execution_count": null, 127 | "metadata": { 128 | "scrolled": true 129 | }, 130 | "outputs": [], 131 | "source": [ 132 | "api.get_history(\"EUR_USD\", \"2019-09-01\", \"2019-09-04\", \"S30\", \"B\")" 133 | ] 134 | }, 135 | { 136 | "cell_type": "markdown", 137 | "metadata": {}, 138 | "source": [ 139 | "\n", 140 | "## Streaming High-frequency Realtime Data" 141 | ] 142 | }, 143 | { 144 | "cell_type": "code", 145 | "execution_count": null, 146 | "metadata": { 147 | "scrolled": true 148 | }, 149 | "outputs": [], 150 | "source": [ 151 | "api.stream_data('EUR_USD', stop=10) " 152 | ] 153 | }, 154 | { 155 | "cell_type": "code", 156 | "execution_count": null, 157 | "metadata": {}, 158 | "outputs": [], 159 | "source": [ 160 | "api.stop_stream()" 161 | ] 162 | } 163 | ], 164 | "metadata": { 165 | "kernelspec": { 166 | "display_name": "Python 3", 167 | "language": "python", 168 | "name": "python3" 169 | }, 170 | "language_info": { 171 | "codemirror_mode": { 172 | "name": "ipython", 173 | "version": 3 174 | }, 175 | "file_extension": ".py", 176 | "mimetype": "text/x-python", 177 | "name": "python", 178 | "nbconvert_exporter": "python", 179 | "pygments_lexer": "ipython3", 180 | "version": "3.8.9" 181 | } 182 | }, 183 | "nbformat": 4, 184 | "nbformat_minor": 2 185 | } 186 | -------------------------------------------------------------------------------- /Docs/JupyterNotebooks/financial-data-webscraping.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "from datetime import datetime\n", 10 | "import lxml\n", 11 | "from lxml import html\n", 12 | "import requests\n", 13 | "import numpy as np\n", 14 | "import pandas as pd" 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": 2, 20 | "metadata": {}, 21 | "outputs": [], 22 | "source": [ 23 | "def get_page(url):\n", 24 | " return requests.get(url)" 25 | ] 26 | }, 27 | { 28 | "cell_type": "code", 29 | "execution_count": 3, 30 | "metadata": {}, 31 | "outputs": [], 32 | "source": [ 33 | "def parse_rows(table_rows):\n", 34 | " parsed_rows = []\n", 35 | "\n", 36 | " for table_row in table_rows:\n", 37 | " parsed_row = []\n", 38 | " el = table_row.xpath(\"./div\")\n", 39 | "\n", 40 | " none_count = 0\n", 41 | "\n", 42 | " for rs in el:\n", 43 | " try:\n", 44 | " (text,) = rs.xpath('.//span/text()[1]')\n", 45 | " parsed_row.append(text)\n", 46 | " except ValueError:\n", 47 | " parsed_row.append(np.NaN)\n", 48 | " none_count += 1\n", 49 | "\n", 50 | " if (none_count < 4):\n", 51 | " parsed_rows.append(parsed_row)\n", 52 | " \n", 53 | " return pd.DataFrame(parsed_rows)" 54 | ] 55 | }, 56 | { 57 | "cell_type": "code", 58 | "execution_count": 4, 59 | "metadata": {}, 60 | "outputs": [], 61 | "source": [ 62 | "def clean_data(df):\n", 63 | " df = df.set_index(0) \n", 64 | " df = df.transpose()\n", 65 | " \n", 66 | " cols = list(df.columns)\n", 67 | " cols[0] = 'Date'\n", 68 | " df = df.set_axis(cols, axis='columns', inplace=False)\n", 69 | " \n", 70 | " numeric_columns = list(df.columns)[1::] \n", 71 | "\n", 72 | " for column_index in range(1, len(df.columns)): \n", 73 | " df.iloc[:,column_index] = df.iloc[:,column_index].str.replace(',', '') \n", 74 | " df.iloc[:,column_index] = df.iloc[:,column_index].astype(np.float64)\n", 75 | " \n", 76 | " return df" 77 | ] 78 | }, 79 | { 80 | "cell_type": "code", 81 | "execution_count": 5, 82 | "metadata": {}, 83 | "outputs": [], 84 | 
"source": [ 85 | "def scrape_table(url):\n", 86 | " page = get_page(url);\n", 87 | " tree = html.fromstring(page.content)\n", 88 | " table_rows = tree.xpath(\"//div[contains(@class, 'D(tbr)')]\") \n", 89 | " df = parse_rows(table_rows)\n", 90 | " df = clean_data(df)\n", 91 | " return df" 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "execution_count": 6, 97 | "metadata": {}, 98 | "outputs": [ 99 | { 100 | "ename": "KeyError", 101 | "evalue": "0", 102 | "output_type": "error", 103 | "traceback": [ 104 | "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m", 105 | "\u001b[1;31mKeyError\u001b[0m Traceback (most recent call last)", 106 | "\u001b[1;32mD:\\Anaconda\\lib\\site-packages\\pandas\\core\\indexes\\base.py\u001b[0m in \u001b[0;36mget_loc\u001b[1;34m(self, key, method, tolerance)\u001b[0m\n\u001b[0;32m 2656\u001b[0m \u001b[1;32mtry\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m-> 2657\u001b[1;33m \u001b[1;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0m_engine\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mget_loc\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mkey\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 2658\u001b[0m \u001b[1;32mexcept\u001b[0m \u001b[0mKeyError\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", 107 | "\u001b[1;32mpandas/_libs/index.pyx\u001b[0m in \u001b[0;36mpandas._libs.index.IndexEngine.get_loc\u001b[1;34m()\u001b[0m\n", 108 | "\u001b[1;32mpandas/_libs/index.pyx\u001b[0m in \u001b[0;36mpandas._libs.index.IndexEngine.get_loc\u001b[1;34m()\u001b[0m\n", 109 | "\u001b[1;32mpandas/_libs/hashtable_class_helper.pxi\u001b[0m in \u001b[0;36mpandas._libs.hashtable.PyObjectHashTable.get_item\u001b[1;34m()\u001b[0m\n", 110 | "\u001b[1;32mpandas/_libs/hashtable_class_helper.pxi\u001b[0m in \u001b[0;36mpandas._libs.hashtable.PyObjectHashTable.get_item\u001b[1;34m()\u001b[0m\n", 111 | "\u001b[1;31mKeyError\u001b[0m: 0", 112 | "\nDuring handling of the above exception, another exception occurred:\n", 113 | "\u001b[1;31mKeyError\u001b[0m Traceback (most recent call last)", 114 | "\u001b[1;32m\u001b[0m in \u001b[0;36m\u001b[1;34m\u001b[0m\n\u001b[0;32m 1\u001b[0m \u001b[0msymbol\u001b[0m \u001b[1;33m=\u001b[0m \u001b[1;34m'AAPL'\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m----> 2\u001b[1;33m \u001b[0mdf_balance_sheet\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mscrape_table\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;34m'https://finance.yahoo.com/quote/'\u001b[0m \u001b[1;33m+\u001b[0m \u001b[0msymbol\u001b[0m \u001b[1;33m+\u001b[0m \u001b[1;34m'/balance-sheet?p='\u001b[0m \u001b[1;33m+\u001b[0m \u001b[0msymbol\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m", 115 | "\u001b[1;32m\u001b[0m in \u001b[0;36mscrape_table\u001b[1;34m(url)\u001b[0m\n\u001b[0;32m 4\u001b[0m \u001b[0mtable_rows\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mtree\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mxpath\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;34m\"//div[contains(@class, 'D(tbr)')]\"\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 5\u001b[0m \u001b[0mdf\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mparse_rows\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mtable_rows\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m----> 6\u001b[1;33m \u001b[0mdf\u001b[0m 
\u001b[1;33m=\u001b[0m \u001b[0mclean_data\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mdf\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 7\u001b[0m \u001b[1;32mreturn\u001b[0m \u001b[0mdf\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", 116 | "\u001b[1;32m\u001b[0m in \u001b[0;36mclean_data\u001b[1;34m(df)\u001b[0m\n\u001b[0;32m 1\u001b[0m \u001b[1;32mdef\u001b[0m \u001b[0mclean_data\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mdf\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m----> 2\u001b[1;33m \u001b[0mdf\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mdf\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mset_index\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;36m0\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 3\u001b[0m \u001b[0mdf\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mdf\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mtranspose\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 4\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 5\u001b[0m \u001b[0mcols\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mlist\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mdf\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mcolumns\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", 117 | "\u001b[1;32mD:\\Anaconda\\lib\\site-packages\\pandas\\core\\frame.py\u001b[0m in \u001b[0;36mset_index\u001b[1;34m(self, keys, drop, append, inplace, verify_integrity)\u001b[0m\n\u001b[0;32m 4176\u001b[0m \u001b[0mnames\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;32mNone\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 4177\u001b[0m \u001b[1;32melse\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m-> 4178\u001b[1;33m \u001b[0mlevel\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mframe\u001b[0m\u001b[1;33m[\u001b[0m\u001b[0mcol\u001b[0m\u001b[1;33m]\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0m_values\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 4179\u001b[0m \u001b[0mnames\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mcol\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 4180\u001b[0m \u001b[1;32mif\u001b[0m \u001b[0mdrop\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", 118 | "\u001b[1;32mD:\\Anaconda\\lib\\site-packages\\pandas\\core\\frame.py\u001b[0m in \u001b[0;36m__getitem__\u001b[1;34m(self, key)\u001b[0m\n\u001b[0;32m 2925\u001b[0m \u001b[1;32mif\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mcolumns\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mnlevels\u001b[0m \u001b[1;33m>\u001b[0m \u001b[1;36m1\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 2926\u001b[0m \u001b[1;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0m_getitem_multilevel\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mkey\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m-> 2927\u001b[1;33m \u001b[0mindexer\u001b[0m \u001b[1;33m=\u001b[0m 
\u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mcolumns\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mget_loc\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mkey\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 2928\u001b[0m \u001b[1;32mif\u001b[0m \u001b[0mis_integer\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mindexer\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 2929\u001b[0m \u001b[0mindexer\u001b[0m \u001b[1;33m=\u001b[0m \u001b[1;33m[\u001b[0m\u001b[0mindexer\u001b[0m\u001b[1;33m]\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", 119 | "\u001b[1;32mD:\\Anaconda\\lib\\site-packages\\pandas\\core\\indexes\\base.py\u001b[0m in \u001b[0;36mget_loc\u001b[1;34m(self, key, method, tolerance)\u001b[0m\n\u001b[0;32m 2657\u001b[0m \u001b[1;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0m_engine\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mget_loc\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mkey\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 2658\u001b[0m \u001b[1;32mexcept\u001b[0m \u001b[0mKeyError\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m-> 2659\u001b[1;33m \u001b[1;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0m_engine\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mget_loc\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0m_maybe_cast_indexer\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mkey\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 2660\u001b[0m \u001b[0mindexer\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mself\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mget_indexer\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m[\u001b[0m\u001b[0mkey\u001b[0m\u001b[1;33m]\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mmethod\u001b[0m\u001b[1;33m=\u001b[0m\u001b[0mmethod\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mtolerance\u001b[0m\u001b[1;33m=\u001b[0m\u001b[0mtolerance\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 2661\u001b[0m \u001b[1;32mif\u001b[0m \u001b[0mindexer\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mndim\u001b[0m \u001b[1;33m>\u001b[0m \u001b[1;36m1\u001b[0m \u001b[1;32mor\u001b[0m \u001b[0mindexer\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0msize\u001b[0m \u001b[1;33m>\u001b[0m \u001b[1;36m1\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", 120 | "\u001b[1;32mpandas/_libs/index.pyx\u001b[0m in \u001b[0;36mpandas._libs.index.IndexEngine.get_loc\u001b[1;34m()\u001b[0m\n", 121 | "\u001b[1;32mpandas/_libs/index.pyx\u001b[0m in \u001b[0;36mpandas._libs.index.IndexEngine.get_loc\u001b[1;34m()\u001b[0m\n", 122 | "\u001b[1;32mpandas/_libs/hashtable_class_helper.pxi\u001b[0m in \u001b[0;36mpandas._libs.hashtable.PyObjectHashTable.get_item\u001b[1;34m()\u001b[0m\n", 123 | "\u001b[1;32mpandas/_libs/hashtable_class_helper.pxi\u001b[0m in \u001b[0;36mpandas._libs.hashtable.PyObjectHashTable.get_item\u001b[1;34m()\u001b[0m\n", 124 | "\u001b[1;31mKeyError\u001b[0m: 0" 125 | ] 126 | } 127 | ], 128 | "source": [ 129 | "symbol = 'AAPL'\n", 130 | "df_balance_sheet = scrape_table('https://finance.yahoo.com/quote/' + symbol + '/balance-sheet?p=' + symbol)" 131 | ] 132 | }, 133 | { 134 | "cell_type": "code", 135 | "execution_count": null, 136 | "metadata": {}, 137 | "outputs": [], 138 | "source": [ 139 
| "df_balance_sheet" 140 | ] 141 | }, 142 | { 143 | "cell_type": "code", 144 | "execution_count": null, 145 | "metadata": {}, 146 | "outputs": [], 147 | "source": [ 148 | "df_income_statement = scrape_table('https://finance.yahoo.com/quote/' + symbol + '/financials?p=' + symbol)\n" 149 | ] 150 | }, 151 | { 152 | "cell_type": "code", 153 | "execution_count": null, 154 | "metadata": {}, 155 | "outputs": [], 156 | "source": [ 157 | "df_income_statement" 158 | ] 159 | }, 160 | { 161 | "cell_type": "code", 162 | "execution_count": null, 163 | "metadata": {}, 164 | "outputs": [], 165 | "source": [ 166 | "df_cashflow_statement = scrape_table('https://finance.yahoo.com/quote/' + symbol + '/cash-flow?p=' + symbol)\n" 167 | ] 168 | }, 169 | { 170 | "cell_type": "code", 171 | "execution_count": null, 172 | "metadata": {}, 173 | "outputs": [], 174 | "source": [ 175 | "df_cashflow_statement" 176 | ] 177 | }, 178 | { 179 | "cell_type": "markdown", 180 | "metadata": {}, 181 | "source": [ 182 | "### Define one function that scraps everything and puts in a Single Dataframe for a given ticker" 183 | ] 184 | }, 185 | { 186 | "cell_type": "code", 187 | "execution_count": null, 188 | "metadata": {}, 189 | "outputs": [], 190 | "source": [ 191 | "def scrape(symbol):\n", 192 | " print('Attempting to scrape data for ' + symbol)\n", 193 | "\n", 194 | " df_balance_sheet = scrape_table('https://finance.yahoo.com/quote/' + symbol + '/balance-sheet?p=' + symbol)\n", 195 | " df_balance_sheet = df_balance_sheet.set_index('Date')\n", 196 | "\n", 197 | " df_income_statement = scrape_table('https://finance.yahoo.com/quote/' + symbol + '/financials?p=' + symbol)\n", 198 | " df_income_statement = df_income_statement.set_index('Date')\n", 199 | " \n", 200 | " df_cash_flow = scrape_table('https://finance.yahoo.com/quote/' + symbol + '/cash-flow?p=' + symbol)\n", 201 | " df_cash_flow = df_cash_flow.set_index('Date')\n", 202 | " \n", 203 | " df_joined = df_balance_sheet \\\n", 204 | " .join(df_income_statement, on='Date', how='outer', rsuffix=' - Income Statement') \\\n", 205 | " .join(df_cash_flow, on='Date', how='outer', rsuffix=' - Cash Flow') \\\n", 206 | " .dropna(axis=1, how='all') \\\n", 207 | " .reset_index()\n", 208 | " \n", 209 | " df_joined.insert(1, 'Symbol', symbol)\n", 210 | " print('Successfully scraped data for ' + symbol)\n", 211 | " return df_joined\n", 212 | " " 213 | ] 214 | }, 215 | { 216 | "cell_type": "code", 217 | "execution_count": null, 218 | "metadata": {}, 219 | "outputs": [], 220 | "source": [ 221 | "financial_data_reliance = scrape('RELIANCE.NS')" 222 | ] 223 | }, 224 | { 225 | "cell_type": "code", 226 | "execution_count": null, 227 | "metadata": {}, 228 | "outputs": [], 229 | "source": [ 230 | "financial_data_aapl" 231 | ] 232 | }, 233 | { 234 | "cell_type": "markdown", 235 | "metadata": {}, 236 | "source": [ 237 | "### Scrape for a list of symbols" 238 | ] 239 | }, 240 | { 241 | "cell_type": "code", 242 | "execution_count": null, 243 | "metadata": {}, 244 | "outputs": [], 245 | "source": [ 246 | "def scrape_multi_symbols(symbols):\n", 247 | " return pd.concat([scrape(symbol) for symbol in symbols], sort=False)" 248 | ] 249 | }, 250 | { 251 | "cell_type": "code", 252 | "execution_count": null, 253 | "metadata": {}, 254 | "outputs": [], 255 | "source": [ 256 | "scrape_multi_symbols(['MSFT','TSLA'])" 257 | ] 258 | } 259 | ], 260 | "metadata": { 261 | "kernelspec": { 262 | "display_name": "Python 3", 263 | "language": "python", 264 | "name": "python3" 265 | }, 266 | "language_info": { 267 | 
"codemirror_mode": { 268 | "name": "ipython", 269 | "version": 3 270 | }, 271 | "file_extension": ".py", 272 | "mimetype": "text/x-python", 273 | "name": "python", 274 | "nbconvert_exporter": "python", 275 | "pygments_lexer": "ipython3", 276 | "version": "3.7.3" 277 | } 278 | }, 279 | "nbformat": 4, 280 | "nbformat_minor": 4 281 | } 282 | -------------------------------------------------------------------------------- /Docs/JupyterNotebooks/stooq.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Stooq" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "## Table of Contents\n", 15 | "\n", 16 | "* [Getting Started - Load Libraries and Dataset](#0)\n", 17 | "* [Historical Price and Volume for 1 Stock](#1)\n" 18 | ] 19 | }, 20 | { 21 | "cell_type": "markdown", 22 | "metadata": {}, 23 | "source": [ 24 | "\n", 25 | "## Getting Started - Load Libraries and Dataset" 26 | ] 27 | }, 28 | { 29 | "cell_type": "code", 30 | "execution_count": 2, 31 | "metadata": {}, 32 | "outputs": [], 33 | "source": [ 34 | "import pandas as pd\n", 35 | "import numpy as np\n", 36 | "import pandas_datareader.data as web\n", 37 | "from datetime import datetime" 38 | ] 39 | }, 40 | { 41 | "cell_type": "markdown", 42 | "metadata": {}, 43 | "source": [ 44 | "\n", 45 | "## Historical Price and Volume for 1 Stock" 46 | ] 47 | }, 48 | { 49 | "cell_type": "code", 50 | "execution_count": 3, 51 | "metadata": {}, 52 | "outputs": [ 53 | { 54 | "data": { 55 | "text/html": [ 56 | "
\n", 57 | "\n", 70 | "\n", 71 | " \n", 72 | " \n", 73 | " \n", 74 | " \n", 75 | " \n", 76 | " \n", 77 | " \n", 78 | " \n", 79 | " \n", 80 | " \n", 81 | " \n", 82 | " \n", 83 | " \n", 84 | " \n", 85 | " \n", 86 | " \n", 87 | " \n", 88 | " \n", 89 | " \n", 90 | " \n", 91 | " \n", 92 | " \n", 93 | " \n", 94 | " \n", 95 | " \n", 96 | " \n", 97 | " \n", 98 | " \n", 99 | " \n", 100 | " \n", 101 | " \n", 102 | " \n", 103 | " \n", 104 | " \n", 105 | " \n", 106 | " \n", 107 | " \n", 108 | " \n", 109 | " \n", 110 | " \n", 111 | " \n", 112 | " \n", 113 | " \n", 114 | " \n", 115 | " \n", 116 | " \n", 117 | " \n", 118 | " \n", 119 | " \n", 120 | " \n", 121 | " \n", 122 | " \n", 123 | " \n", 124 | " \n", 125 | " \n", 126 | " \n", 127 | " \n", 128 | " \n", 129 | " \n", 130 | " \n", 131 | " \n", 132 | " \n", 133 | " \n", 134 | " \n", 135 | " \n", 136 | " \n", 137 | " \n", 138 | " \n", 139 | " \n", 140 | " \n", 141 | " \n", 142 | " \n", 143 | " \n", 144 | " \n", 145 | " \n", 146 | " \n", 147 | " \n", 148 | " \n", 149 | " \n", 150 | " \n", 151 | " \n", 152 | " \n", 153 | " \n", 154 | " \n", 155 | " \n", 156 | " \n", 157 | " \n", 158 | " \n", 159 | " \n", 160 | " \n", 161 | " \n", 162 | " \n", 163 | " \n", 164 | " \n", 165 | " \n", 166 | " \n", 167 | " \n", 168 | " \n", 169 | " \n", 170 | " \n", 171 | " \n", 172 | " \n", 173 | " \n", 174 | " \n", 175 | " \n", 176 | " \n", 177 | " \n", 178 | " \n", 179 | "
OpenHighLowCloseVolume
Date
2019-12-31311.580313.150311.220312.88058745562
2019-12-30313.940314.090311.600312.13051211140
2019-12-27314.710314.770313.290313.85043775840
2019-12-26312.670313.940312.660313.93031914361
2019-12-24312.510312.560311.950312.28020851613
..................
2005-03-0398.73298.92797.94198.37276529410
2005-03-0297.99998.94797.89298.33379155581
2005-03-0198.00998.62398.00998.38258623358
2005-02-2898.31398.44197.40297.87285556312
2005-02-2597.59998.73297.51098.54575386071
\n", 180 | "

3737 rows × 5 columns

\n", 181 | "
" 182 | ], 183 | "text/plain": [ 184 | " Open High Low Close Volume\n", 185 | "Date \n", 186 | "2019-12-31 311.580 313.150 311.220 312.880 58745562\n", 187 | "2019-12-30 313.940 314.090 311.600 312.130 51211140\n", 188 | "2019-12-27 314.710 314.770 313.290 313.850 43775840\n", 189 | "2019-12-26 312.670 313.940 312.660 313.930 31914361\n", 190 | "2019-12-24 312.510 312.560 311.950 312.280 20851613\n", 191 | "... ... ... ... ... ...\n", 192 | "2005-03-03 98.732 98.927 97.941 98.372 76529410\n", 193 | "2005-03-02 97.999 98.947 97.892 98.333 79155581\n", 194 | "2005-03-01 98.009 98.623 98.009 98.382 58623358\n", 195 | "2005-02-28 98.313 98.441 97.402 97.872 85556312\n", 196 | "2005-02-25 97.599 98.732 97.510 98.545 75386071\n", 197 | "\n", 198 | "[3737 rows x 5 columns]" 199 | ] 200 | }, 201 | "execution_count": 3, 202 | "metadata": {}, 203 | "output_type": "execute_result" 204 | } 205 | ], 206 | "source": [ 207 | "# adjust the variables below\n", 208 | "ticker = 'SPY'\n", 209 | "start = datetime(1990,1,1)\n", 210 | "end = datetime(2020,1,1)\n", 211 | "\n", 212 | "df = web.DataReader(ticker, 'stooq', start, end)\n", 213 | "df" 214 | ] 215 | } 216 | ], 217 | "metadata": { 218 | "interpreter": { 219 | "hash": "366707ea24bdd07af2745ddffc3e5dd8201944abddf6c7f0911e7ecd4d105ee5" 220 | }, 221 | "kernelspec": { 222 | "display_name": "Python 3.8.9 64-bit ('venv': venv)", 223 | "language": "python", 224 | "name": "python3" 225 | }, 226 | "language_info": { 227 | "codemirror_mode": { 228 | "name": "ipython", 229 | "version": 3 230 | }, 231 | "file_extension": ".py", 232 | "mimetype": "text/x-python", 233 | "name": "python", 234 | "nbconvert_exporter": "python", 235 | "pygments_lexer": "ipython3", 236 | "version": "3.8.9" 237 | }, 238 | "orig_nbformat": 4 239 | }, 240 | "nbformat": 4, 241 | "nbformat_minor": 2 242 | } 243 | -------------------------------------------------------------------------------- /Docs/JupyterNotebooks/yfinance.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Yahoo Finance" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "## Table of Contents\n", 15 | "\n", 16 | "* [Getting Started - Load Libraries and Dataset](#0)\n", 17 | "* [Historical Price and Volume for 1 Stock](#1)\n", 18 | "* [Adding Time Periods](#2)\n", 19 | "* [Frequency Setting](#3)\n", 20 | "* [Stock Splits and Dividends](#4)\n", 21 | "* [Multiple Tickers](#5)\n", 22 | "* [Finanical Indices](#6)\n", 23 | "* [Currencies](#7)\n", 24 | "* [Crypto](#8)\n", 25 | "* [Mutual Funds](#9)\n", 26 | "* [Treasury Rates](#10)\n", 27 | "* [Stock Fundamentals](#11)\n", 28 | "* [Financials](#12)\n", 29 | "* [Put Call Options](#13)\n", 30 | "* [Stream Real Time Data](#14)\n" 31 | ] 32 | }, 33 | { 34 | "cell_type": "markdown", 35 | "metadata": {}, 36 | "source": [ 37 | "\n", 38 | "## Getting Started - Load Libraries and Dataset" 39 | ] 40 | }, 41 | { 42 | "cell_type": "code", 43 | "execution_count": null, 44 | "metadata": {}, 45 | "outputs": [], 46 | "source": [ 47 | "import numpy as np\n", 48 | "import yfinance as yf" 49 | ] 50 | }, 51 | { 52 | "cell_type": "markdown", 53 | "metadata": {}, 54 | "source": [ 55 | "\n", 56 | "## Historical Price and Volume for 1 Stock" 57 | ] 58 | }, 59 | { 60 | "cell_type": "code", 61 | "execution_count": null, 62 | "metadata": {}, 63 | "outputs": [], 64 | "source": [ 65 | "ticker = 'GE'" 66 | ] 67 | }, 68 | { 69 | 
"cell_type": "code", 70 | "execution_count": null, 71 | "metadata": {}, 72 | "outputs": [], 73 | "source": [ 74 | "yf.download(ticker)" 75 | ] 76 | }, 77 | { 78 | "cell_type": "code", 79 | "execution_count": null, 80 | "metadata": {}, 81 | "outputs": [], 82 | "source": [ 83 | "GE = yf.download(ticker)" 84 | ] 85 | }, 86 | { 87 | "cell_type": "code", 88 | "execution_count": null, 89 | "metadata": {}, 90 | "outputs": [], 91 | "source": [ 92 | "GE.head()" 93 | ] 94 | }, 95 | { 96 | "cell_type": "code", 97 | "execution_count": null, 98 | "metadata": {}, 99 | "outputs": [], 100 | "source": [ 101 | "GE.tail()" 102 | ] 103 | }, 104 | { 105 | "cell_type": "markdown", 106 | "metadata": {}, 107 | "source": [ 108 | "\n", 109 | "## Adding Time Periods" 110 | ] 111 | }, 112 | { 113 | "cell_type": "code", 114 | "execution_count": null, 115 | "metadata": {}, 116 | "outputs": [], 117 | "source": [ 118 | "yf.download(ticker, start = \"2014-01-01\", end = \"2018-12-31\")" 119 | ] 120 | }, 121 | { 122 | "cell_type": "code", 123 | "execution_count": null, 124 | "metadata": {}, 125 | "outputs": [], 126 | "source": [ 127 | "GE = yf.download(ticker, start = \"2014-01-01\", end = \"2018-12-31\")" 128 | ] 129 | }, 130 | { 131 | "cell_type": "code", 132 | "execution_count": null, 133 | "metadata": {}, 134 | "outputs": [], 135 | "source": [ 136 | "GE.info()" 137 | ] 138 | }, 139 | { 140 | "cell_type": "code", 141 | "execution_count": null, 142 | "metadata": {}, 143 | "outputs": [], 144 | "source": [ 145 | "yf.download(ticker, period = \"ytd\")" 146 | ] 147 | }, 148 | { 149 | "cell_type": "code", 150 | "execution_count": null, 151 | "metadata": {}, 152 | "outputs": [], 153 | "source": [ 154 | "yf.download(ticker, period = \"1mo\")" 155 | ] 156 | }, 157 | { 158 | "cell_type": "code", 159 | "execution_count": null, 160 | "metadata": {}, 161 | "outputs": [], 162 | "source": [ 163 | "yf.download(ticker, period = \"5d\")" 164 | ] 165 | }, 166 | { 167 | "cell_type": "code", 168 | "execution_count": null, 169 | "metadata": {}, 170 | "outputs": [], 171 | "source": [ 172 | "yf.download(ticker, period = \"10y\")" 173 | ] 174 | }, 175 | { 176 | "cell_type": "markdown", 177 | "metadata": {}, 178 | "source": [ 179 | "\n", 180 | "## Frequency Setting" 181 | ] 182 | }, 183 | { 184 | "cell_type": "code", 185 | "execution_count": null, 186 | "metadata": {}, 187 | "outputs": [], 188 | "source": [ 189 | "yf.download('GE',period='1mo',interval='1h')" 190 | ] 191 | }, 192 | { 193 | "cell_type": "code", 194 | "execution_count": null, 195 | "metadata": {}, 196 | "outputs": [], 197 | "source": [ 198 | "yf.download('GE',period='1mo',interval='5m')" 199 | ] 200 | }, 201 | { 202 | "cell_type": "code", 203 | "execution_count": null, 204 | "metadata": {}, 205 | "outputs": [], 206 | "source": [ 207 | "GE = yf.download('GE',period='5d',interval='5m')" 208 | ] 209 | }, 210 | { 211 | "cell_type": "code", 212 | "execution_count": null, 213 | "metadata": {}, 214 | "outputs": [], 215 | "source": [ 216 | "GE.head()" 217 | ] 218 | }, 219 | { 220 | "cell_type": "code", 221 | "execution_count": null, 222 | "metadata": {}, 223 | "outputs": [], 224 | "source": [ 225 | "#Pre or post market data\n", 226 | "GE=yf.download('GE',prepost=True,period='5d',interval='5m')" 227 | ] 228 | }, 229 | { 230 | "cell_type": "markdown", 231 | "metadata": {}, 232 | "source": [ 233 | "\n", 234 | "## Stock Splits and Dividends" 235 | ] 236 | }, 237 | { 238 | "cell_type": "code", 239 | "execution_count": null, 240 | "metadata": {}, 241 | "outputs": [], 242 | "source": [ 243 | 
"ticker = \"AAPL\"" 244 | ] 245 | }, 246 | { 247 | "cell_type": "code", 248 | "execution_count": null, 249 | "metadata": {}, 250 | "outputs": [], 251 | "source": [ 252 | "# action = True for dividend and Stock Split\n", 253 | "AAPL = yf.download(ticker, period=\"10y\", actions = True)" 254 | ] 255 | }, 256 | { 257 | "cell_type": "code", 258 | "execution_count": null, 259 | "metadata": {}, 260 | "outputs": [], 261 | "source": [ 262 | "AAPL.head()" 263 | ] 264 | }, 265 | { 266 | "cell_type": "code", 267 | "execution_count": null, 268 | "metadata": {}, 269 | "outputs": [], 270 | "source": [ 271 | "AAPL[AAPL[\"Dividends\"]>0]" 272 | ] 273 | }, 274 | { 275 | "cell_type": "code", 276 | "execution_count": null, 277 | "metadata": {}, 278 | "outputs": [], 279 | "source": [ 280 | "AAPL.loc[\"2019-08-05\":\"2019-08-15\"]" 281 | ] 282 | }, 283 | { 284 | "cell_type": "code", 285 | "execution_count": null, 286 | "metadata": {}, 287 | "outputs": [], 288 | "source": [ 289 | "AAPL.loc[\"2019-08-05\":\"2019-08-15\"].diff()" 290 | ] 291 | }, 292 | { 293 | "cell_type": "code", 294 | "execution_count": null, 295 | "metadata": {}, 296 | "outputs": [], 297 | "source": [ 298 | "AAPL[AAPL[\"Stock Splits\"] > 0]" 299 | ] 300 | }, 301 | { 302 | "cell_type": "markdown", 303 | "metadata": {}, 304 | "source": [ 305 | "\n", 306 | "## Multiple Tickers" 307 | ] 308 | }, 309 | { 310 | "cell_type": "code", 311 | "execution_count": null, 312 | "metadata": {}, 313 | "outputs": [], 314 | "source": [ 315 | "ticker = ['GE', 'AAPL','FB']" 316 | ] 317 | }, 318 | { 319 | "cell_type": "code", 320 | "execution_count": null, 321 | "metadata": {}, 322 | "outputs": [], 323 | "source": [ 324 | " yf.download(ticker, period=\"5y\")" 325 | ] 326 | }, 327 | { 328 | "cell_type": "code", 329 | "execution_count": null, 330 | "metadata": {}, 331 | "outputs": [], 332 | "source": [ 333 | " stock=yf.download(ticker, period=\"5y\").Close" 334 | ] 335 | }, 336 | { 337 | "cell_type": "code", 338 | "execution_count": null, 339 | "metadata": {}, 340 | "outputs": [], 341 | "source": [ 342 | "import matplotlib.pyplot as plt" 343 | ] 344 | }, 345 | { 346 | "cell_type": "code", 347 | "execution_count": null, 348 | "metadata": {}, 349 | "outputs": [], 350 | "source": [ 351 | "stock.plot()\n", 352 | "plt.show()" 353 | ] 354 | }, 355 | { 356 | "cell_type": "markdown", 357 | "metadata": {}, 358 | "source": [ 359 | "\n", 360 | "## Financial Indices" 361 | ] 362 | }, 363 | { 364 | "cell_type": "code", 365 | "execution_count": null, 366 | "metadata": {}, 367 | "outputs": [], 368 | "source": [ 369 | "index = ['^DJI', '^GSPC']" 370 | ] 371 | }, 372 | { 373 | "cell_type": "code", 374 | "execution_count": null, 375 | "metadata": {}, 376 | "outputs": [], 377 | "source": [ 378 | "stock = yf.download(index,period='10y').Close" 379 | ] 380 | }, 381 | { 382 | "cell_type": "code", 383 | "execution_count": null, 384 | "metadata": {}, 385 | "outputs": [], 386 | "source": [ 387 | "stock.plot()\n", 388 | "plt.show()" 389 | ] 390 | }, 391 | { 392 | "cell_type": "code", 393 | "execution_count": null, 394 | "metadata": {}, 395 | "outputs": [], 396 | "source": [ 397 | "#Total Return\n", 398 | "index = ['^DJITR', '^SP500TR']" 399 | ] 400 | }, 401 | { 402 | "cell_type": "code", 403 | "execution_count": null, 404 | "metadata": {}, 405 | "outputs": [], 406 | "source": [ 407 | "indexes = yf.download(index,period='10y').Close" 408 | ] 409 | }, 410 | { 411 | "cell_type": "code", 412 | "execution_count": null, 413 | "metadata": {}, 414 | "outputs": [], 415 | "source": [ 416 | "indexes" 417 
| ] 418 | }, 419 | { 420 | "cell_type": "markdown", 421 | "metadata": {}, 422 | "source": [ 423 | "\n", 424 | "## Currencies" 425 | ] 426 | }, 427 | { 428 | "cell_type": "code", 429 | "execution_count": null, 430 | "metadata": {}, 431 | "outputs": [], 432 | "source": [ 433 | "#Tickers\n", 434 | "ticker1 = \"EURUSD=X\"\n", 435 | "ticker2 = \"USDEUR=X\"" 436 | ] 437 | }, 438 | { 439 | "cell_type": "code", 440 | "execution_count": null, 441 | "metadata": {}, 442 | "outputs": [], 443 | "source": [ 444 | "yf.download(ticker1,period='5y')" 445 | ] 446 | }, 447 | { 448 | "cell_type": "code", 449 | "execution_count": null, 450 | "metadata": {}, 451 | "outputs": [], 452 | "source": [ 453 | "yf.download(ticker2,period='5y')" 454 | ] 455 | }, 456 | { 457 | "cell_type": "markdown", 458 | "metadata": {}, 459 | "source": [ 460 | "\n", 461 | "## Crypto" 462 | ] 463 | }, 464 | { 465 | "cell_type": "code", 466 | "execution_count": null, 467 | "metadata": {}, 468 | "outputs": [], 469 | "source": [ 470 | "#Tickers\n", 471 | "ticker1 = [\"BTC-USD\", \"ETH-USD\"]" 472 | ] 473 | }, 474 | { 475 | "cell_type": "code", 476 | "execution_count": null, 477 | "metadata": {}, 478 | "outputs": [], 479 | "source": [ 480 | "data = yf.download(ticker1,start='2019-08-01',end='2020-05-01')" 481 | ] 482 | }, 483 | { 484 | "cell_type": "code", 485 | "execution_count": null, 486 | "metadata": {}, 487 | "outputs": [], 488 | "source": [ 489 | "data.head()" 490 | ] 491 | }, 492 | { 493 | "cell_type": "markdown", 494 | "metadata": {}, 495 | "source": [ 496 | "\n", 497 | "## Mutual Funds" 498 | ] 499 | }, 500 | { 501 | "cell_type": "code", 502 | "execution_count": null, 503 | "metadata": {}, 504 | "outputs": [], 505 | "source": [ 506 | "#Tickers\n", 507 | "#20+Y Treasury Bobd ETF and Vivoldi Multi-Strategy Fund Class\n", 508 | "ticker1 = [\"TLT\", \"OMOIX\"]" 509 | ] 510 | }, 511 | { 512 | "cell_type": "code", 513 | "execution_count": null, 514 | "metadata": {}, 515 | "outputs": [], 516 | "source": [ 517 | "data = yf.download(ticker1,start='2019-08-01',end='2020-05-01')" 518 | ] 519 | }, 520 | { 521 | "cell_type": "code", 522 | "execution_count": null, 523 | "metadata": {}, 524 | "outputs": [], 525 | "source": [ 526 | "data.head()" 527 | ] 528 | }, 529 | { 530 | "cell_type": "markdown", 531 | "metadata": {}, 532 | "source": [ 533 | "\n", 534 | "## Treasury Rates" 535 | ] 536 | }, 537 | { 538 | "cell_type": "code", 539 | "execution_count": null, 540 | "metadata": {}, 541 | "outputs": [], 542 | "source": [ 543 | "#10Y and 5Y Treasury Rates\n", 544 | "ticker1 = [\"^TNX\", \"^FVX\"]" 545 | ] 546 | }, 547 | { 548 | "cell_type": "code", 549 | "execution_count": null, 550 | "metadata": {}, 551 | "outputs": [], 552 | "source": [ 553 | "data = yf.download(ticker1,period=\"5y\")" 554 | ] 555 | }, 556 | { 557 | "cell_type": "code", 558 | "execution_count": null, 559 | "metadata": {}, 560 | "outputs": [], 561 | "source": [ 562 | "data.head()" 563 | ] 564 | }, 565 | { 566 | "cell_type": "markdown", 567 | "metadata": {}, 568 | "source": [ 569 | "\n", 570 | "## Stock Fundamentals" 571 | ] 572 | }, 573 | { 574 | "cell_type": "code", 575 | "execution_count": null, 576 | "metadata": {}, 577 | "outputs": [], 578 | "source": [ 579 | "ticker =\"DIS\"\n", 580 | "dis = yf.Ticker(ticker)" 581 | ] 582 | }, 583 | { 584 | "cell_type": "code", 585 | "execution_count": null, 586 | "metadata": {}, 587 | "outputs": [], 588 | "source": [ 589 | "dis.ticker" 590 | ] 591 | }, 592 | { 593 | "cell_type": "code", 594 | "execution_count": null, 595 | "metadata": {}, 596 
| "outputs": [], 597 | "source": [ 598 | "data=dis.history()" 599 | ] 600 | }, 601 | { 602 | "cell_type": "code", 603 | "execution_count": null, 604 | "metadata": {}, 605 | "outputs": [], 606 | "source": [ 607 | "dis.info" 608 | ] 609 | }, 610 | { 611 | "cell_type": "code", 612 | "execution_count": null, 613 | "metadata": {}, 614 | "outputs": [], 615 | "source": [ 616 | "import pandas as pd\n", 617 | "df = pd.Series(dis.info,name=\"DIS\").to_frame().T\n", 618 | "df" 619 | ] 620 | }, 621 | { 622 | "cell_type": "code", 623 | "execution_count": null, 624 | "metadata": {}, 625 | "outputs": [], 626 | "source": [ 627 | "ticker = [\"MSFT\",\"FB\"]" 628 | ] 629 | }, 630 | { 631 | "cell_type": "code", 632 | "execution_count": null, 633 | "metadata": {}, 634 | "outputs": [], 635 | "source": [ 636 | "for i in ticker:\n", 637 | " df.loc[\"{}\".format(i)] = pd.Series(yf.Ticker(i).info)" 638 | ] 639 | }, 640 | { 641 | "cell_type": "code", 642 | "execution_count": null, 643 | "metadata": {}, 644 | "outputs": [], 645 | "source": [ 646 | "df.info()" 647 | ] 648 | }, 649 | { 650 | "cell_type": "markdown", 651 | "metadata": {}, 652 | "source": [ 653 | "\n", 654 | "## Financials" 655 | ] 656 | }, 657 | { 658 | "cell_type": "code", 659 | "execution_count": null, 660 | "metadata": {}, 661 | "outputs": [], 662 | "source": [ 663 | "dis" 664 | ] 665 | }, 666 | { 667 | "cell_type": "code", 668 | "execution_count": null, 669 | "metadata": {}, 670 | "outputs": [], 671 | "source": [ 672 | "dis.balance_sheet" 673 | ] 674 | }, 675 | { 676 | "cell_type": "code", 677 | "execution_count": null, 678 | "metadata": {}, 679 | "outputs": [], 680 | "source": [ 681 | "dis.financials" 682 | ] 683 | }, 684 | { 685 | "cell_type": "code", 686 | "execution_count": null, 687 | "metadata": { 688 | "scrolled": true 689 | }, 690 | "outputs": [], 691 | "source": [ 692 | "dis.cashflow" 693 | ] 694 | }, 695 | { 696 | "cell_type": "markdown", 697 | "metadata": {}, 698 | "source": [ 699 | "\n", 700 | "## Put Call Options" 701 | ] 702 | }, 703 | { 704 | "cell_type": "code", 705 | "execution_count": null, 706 | "metadata": {}, 707 | "outputs": [], 708 | "source": [ 709 | "dis" 710 | ] 711 | }, 712 | { 713 | "cell_type": "code", 714 | "execution_count": null, 715 | "metadata": {}, 716 | "outputs": [], 717 | "source": [ 718 | "dis.option_chain()" 719 | ] 720 | }, 721 | { 722 | "cell_type": "code", 723 | "execution_count": null, 724 | "metadata": {}, 725 | "outputs": [], 726 | "source": [ 727 | "calls = dis.option_chain()[0]\n", 728 | "calls" 729 | ] 730 | }, 731 | { 732 | "cell_type": "code", 733 | "execution_count": null, 734 | "metadata": {}, 735 | "outputs": [], 736 | "source": [ 737 | "puts = dis.option_chain()[1]\n", 738 | "puts" 739 | ] 740 | }, 741 | { 742 | "cell_type": "markdown", 743 | "metadata": {}, 744 | "source": [ 745 | "\n", 746 | "## Stream Realtime Data" 747 | ] 748 | }, 749 | { 750 | "cell_type": "code", 751 | "execution_count": null, 752 | "metadata": {}, 753 | "outputs": [], 754 | "source": [ 755 | "import time" 756 | ] 757 | }, 758 | { 759 | "cell_type": "code", 760 | "execution_count": null, 761 | "metadata": {}, 762 | "outputs": [], 763 | "source": [ 764 | "ticker1 =\"EURUSD=X\"" 765 | ] 766 | }, 767 | { 768 | "cell_type": "code", 769 | "execution_count": null, 770 | "metadata": {}, 771 | "outputs": [], 772 | "source": [ 773 | "data = yf.download(ticker1,interval = '1m', period='1d')" 774 | ] 775 | }, 776 | { 777 | "cell_type": "code", 778 | "execution_count": null, 779 | "metadata": {}, 780 | "outputs": [], 781 | "source": 
[ 782 | "print(data.index[-1], data.iloc[-1,3])" 783 | ] 784 | }, 785 | { 786 | "cell_type": "code", 787 | "execution_count": null, 788 | "metadata": {}, 789 | "outputs": [], 790 | "source": [ 791 | "# #Every 5 second data corresponding to 5 seconds\n", 792 | "# while True:\n", 793 | "# time.sleep(5)\n", 794 | "# data = yf.download(ticker1,interval = '1m', period='1d')\n", 795 | "# print(data.index[-1], data.iloc[-1,3])" 796 | ] 797 | } 798 | ], 799 | "metadata": { 800 | "kernelspec": { 801 | "display_name": "Python 3", 802 | "language": "python", 803 | "name": "python3" 804 | }, 805 | "language_info": { 806 | "codemirror_mode": { 807 | "name": "ipython", 808 | "version": 3 809 | }, 810 | "file_extension": ".py", 811 | "mimetype": "text/x-python", 812 | "name": "python", 813 | "nbconvert_exporter": "python", 814 | "pygments_lexer": "ipython3", 815 | "version": "3.8.9" 816 | } 817 | }, 818 | "nbformat": 4, 819 | "nbformat_minor": 4 820 | } 821 | -------------------------------------------------------------------------------- /Docs/Macroeconomic.rst: -------------------------------------------------------------------------------- 1 | .. _Macroeconomic: 2 | 3 | 4 | Macroeconomic Data 5 | ================ 6 | 7 | The Data of fundamentals for several financial instruments across several frenquncies can be downloaded freely. 8 | There are many alternatives out there (FRED, AlphaVantage, Quandl etc.). 9 | 10 | This page describes how to download the data from following sources. 11 | 12 | Economic Indicators 13 | ------------------- 14 | - `Alphavantage `_ 15 | 16 | Treasury Rates 17 | -------------- 18 | - `FRED `_ 19 | -------------------------------------------------------------------------------- /Docs/News.rst: -------------------------------------------------------------------------------- 1 | .. _Sentiments: 2 | 3 | News and Sentiments 4 | ================ 5 | 6 | The sentiment data for several instruments and market from thousands of sources can be downloaded freely and realtime 7 | 8 | This page describes how to download the data from different sources. 9 | 10 | Sentiment data: 11 | ----------------------------- 12 | 13 | The data can be obtained from the following sources. Click to view the code to retrieve it 14 | 15 | - `Finviz `_ 16 | 17 | - `FRED `_ 18 | 19 | - `IEX `_ 20 | 21 | 22 | Insider Trades 23 | ----------------------------- 24 | 25 | - `Finviz `_ 26 | 27 | - `IEX `_ -------------------------------------------------------------------------------- /Docs/Oanda.rst: -------------------------------------------------------------------------------- 1 | .. _Oanda: 2 | 3 | Oanda 4 | ===== 5 | 6 | .. note:: 7 | Refer to `Oanda Jupyter Notebook `_ for more details. 8 | 9 | Table of Contents 10 | ----------------- 11 | 12 | - `Installation`_ 13 | - `Usage`_ 14 | - `Historical OHLA and Volume for 1 Currency`_ 15 | - `Setting the Frequency`_ 16 | 17 | Installation 18 | ------------ 19 | 20 | Install with pip: 21 | 22 | .. code:: ipython3 23 | 24 | pip install oandapyV20 25 | 26 | Or install with Github: 27 | 28 | .. code:: ipython3 29 | 30 | pip install git+https://github.com/hootnot/oanda-api-v20.git 31 | 32 | Usage 33 | ----- 34 | 35 | .. note:: 36 | This library requires a config file for accessing the API. 37 | An example config file can be found `here `_. 38 | 39 | You also need to set up an account on `Oanda's Website `_ 40 | to receive an access token and username. 41 | 42 | Import all necessary libraries: 43 | 44 | .. 
code:: ipython3 45 | 46 | import pandas as pd 47 | import tpqoa 48 | 49 | api = tpqoa.tpqoa("oanda.cfg") 50 | 51 | Historical OHLC and Volume for 1 Currency 52 | ----------------------------------------- 53 | 54 | Outputs the OHLCV for the given ``ticker``, within the given ``start`` and ``end`` dates. 55 | 56 | .. code:: ipython3 57 | 58 | ticker = "US30_USD" 59 | start = "2018-09-01" 60 | end = "2019-09-01" 61 | 62 | api.get_history(ticker, start, end, "D", "B") 63 | 64 | Setting the Frequency 65 | --------------------- 66 | 67 | Sets the frequency to every 1 minute, denoted by ``M1``. 68 | 69 | .. code:: ipython3 70 | 71 | api.get_history("EUR_USD", "2019-08-01", "2019-09-01", "M1", "B") 72 | 73 | Sets the frequency to about every 5 seconds, using ``S5``. 74 | 75 | .. code:: ipython3 76 | 77 | api.get_history("EUR_USD", "2019-09-01", "2019-09-04", "S5", "B") -------------------------------------------------------------------------------- /Docs/OptionFuture.rst: -------------------------------------------------------------------------------- 1 | .. _OptionFuture: 2 | 3 | =================== 4 | Options and Futures 5 | =================== 6 | 7 | Options and futures data for several indices and thousands of tickers across several frequencies can be downloaded freely. 8 | There are many alternatives out there (YahooFinance, AlphaVantage, Quandl etc.). 9 | 10 | This page describes how to download the data from different sources. 11 | 12 | Options and Future - Volume and Price data: 13 | --------------------------------------------- 14 | 15 | The data can be obtained from the following sources. Click a source to view the code used to retrieve it. 16 | 17 | - `Yahoo Finance `_ 18 | 19 | - `Quandl `_ 20 | 21 | 22 | 23 | Changing Time period 24 | ----------------------------- 25 | 26 | - `Yahoo Finance `_ 27 | 28 | 29 | - `Quandl `_ 30 | 31 | 32 | Realtime Data 33 | ----------------------------- 34 | 35 | - `Yahoo Finance `_ 36 | -------------------------------------------------------------------------------- /Docs/Realtime.rst: -------------------------------------------------------------------------------- 1 | .. _ML_Sup: 2 | 3 | Real Time Data 4 | ========================== 5 | 6 | To use the rl baselines with custom environments, they just need to follow the *gym* interface. 7 | That is to say, your environment must implement the following methods (and inherit from the OpenAI Gym class): 8 | 9 | 10 | .. note:: 11 | If you are using images as input, the input values must be in [0, 255] as the observation 12 | is normalized (dividing by 255 to have values in [0, 1]) when using CNN policies. 13 | 14 | 15 | 16 | .. code-block:: python 17 | 18 | import gym 19 | from gym import spaces 20 | 21 | class CustomEnv(gym.Env): 22 | """Custom Environment that follows gym interface""" 23 | metadata = {'render.modes': ['human']} 24 | 25 | def __init__(self, arg1, arg2, ...): 26 | super(CustomEnv, self).__init__() 27 | # Define action and observation space 28 | # They must be gym.spaces objects 29 | # Example when using discrete actions: 30 | self.action_space = spaces.Discrete(N_DISCRETE_ACTIONS) 31 | # Example for using image as input: 32 | self.observation_space = spaces.Box(low=0, high=255, 33 | shape=(HEIGHT, WIDTH, N_CHANNELS), dtype=np.uint8) 34 | 35 | def step(self, action): 36 | ... 37 | return observation, reward, done, info 38 | def reset(self): 39 | ... 40 | return observation # reward, done, info can't be included 41 | def render(self, mode='human'): 42 | ... 43 | def close (self): 44 | ... 
45 | 46 | 47 | Then you can define and train a RL agent with: 48 | 49 | .. code-block:: python 50 | 51 | # Instantiate the env 52 | env = CustomEnv(arg1, ...) 53 | # Define and Train the agent 54 | model = A2C('CnnPolicy', env).learn(total_timesteps=1000) 55 | 56 | 57 | To check that your environment follows the gym interface, please use: 58 | 59 | .. code-block:: python 60 | 61 | from stable_baselines.common.env_checker import check_env 62 | 63 | env = CustomEnv(arg1, ...) 64 | # It will check your custom environment and output additional warnings if needed 65 | check_env(env) 66 | 67 | 68 | 69 | We have created a `colab notebook `_ for 70 | a concrete example of creating a custom environment. 71 | 72 | You can also find a `complete guide online `_ 73 | on creating a custom Gym environment. 74 | 75 | 76 | Optionally, you can also register the environment with gym, 77 | that will allow you to create the RL agent in one line (and use ``gym.make()`` to instantiate the env). 78 | 79 | 80 | In the project, for testing purposes, we use a custom environment named ``IdentityEnv`` 81 | defined `in this file `_. 82 | An example of how to use it can be found `here `_. 83 | -------------------------------------------------------------------------------- /Docs/Sources.rst: -------------------------------------------------------------------------------- 1 | .. _Sources: 2 | 3 | Data Sources 4 | ============ 5 | 6 | There are following Data sources for Finance: 7 | 8 | - `Yahoo Finance `_ 9 | - `Alphavantage `_ 10 | - `FRED `_ 11 | - `Oanda `_ 12 | - `FXCM `_ 13 | - `EOD Historical Data `_ 14 | - Stooq 15 | - Tiingo 16 | - Marketstack 17 | - IEX 18 | 19 | 20 | * :ref:`Details on Yahoo Finance ` 21 | 22 | 23 | Details on Yahoo Finance `YahooFinance page `_ 24 | ------------------------------------------------------------------ 25 | -------------------------------------------------------------------------------- /Docs/Stooq.rst: -------------------------------------------------------------------------------- 1 | .. _Stooq: 2 | 3 | Stooq 4 | ========= 5 | 6 | Stooq is an odd one. This website looks about 20 years old but it is a real hidden gem. 7 | By searching a ticker and going to ‘historical data’, you can get historical data going back over 20 years. 8 | You can also download a .csv. Unfortunately, there is no API access but its a great resource nonetheless. 9 | 10 | .. note:: 11 | Refer to `Stooq Jupyter Notebook `_ for more details. 12 | 13 | Table of Contents 14 | ----------------- 15 | 16 | - `Installation`_ 17 | - `Usage`_ 18 | - `Historical Price and Volume for 1 Stock`_ 19 | - `Adding Time Periods`_ 20 | - `Mutual Funds`_ 21 | 22 | Installation 23 | ------------ 24 | 25 | Install with pip: 26 | 27 | .. code:: ipython3 28 | 29 | pip install pandas-datareader 30 | 31 | Usage 32 | ----- 33 | 34 | Import all necessary libraries: 35 | 36 | .. code:: ipython3 37 | 38 | import pandas as pd 39 | import numpy as np 40 | import pandas_datareader.data as web 41 | from datetime import datetime 42 | 43 | .. note:: 44 | Replace the ticker variable to whatever you would like from the `Stooq Website `_ 45 | 46 | Historical Price and Volume for 1 Stock 47 | --------------------------------------- 48 | 49 | Gets the OHLCV for the given ``ticker``. 50 | 51 | .. 
code:: ipython3 52 | 53 | # adjust the variables below 54 | ticker = 'AAPL' 55 | 56 | df = web.DataReader(ticker, 'stooq') 57 | df 58 | 59 | Adding Time Periods 60 | ------------------- 61 | 62 | Extends the previous call by using ``start`` and ``end`` to denote a timeframe. 63 | 64 | .. code:: ipython3 65 | 66 | # adjust the variables below 67 | ticker = 'AAPL' 68 | start = datetime(1990,1,1) 69 | end = datetime(2020,1,1) 70 | 71 | df = web.DataReader(ticker, 'stooq', start, end) 72 | df 73 | 74 | Mutual Funds 75 | --------------- 76 | 77 | Another example, showing that this method can be used for more than just equities. 78 | 79 | .. code:: ipython3 80 | 81 | mutual_fund = 'SPY' 82 | start = datetime(1990,1,1) 83 | end = datetime(2020,1,1) 84 | 85 | df = web.DataReader(mutual_fund, 'stooq', start, end) 86 | df 87 | -------------------------------------------------------------------------------- /Docs/YahooFinance.rst: -------------------------------------------------------------------------------- 1 | .. _YahooFinance: 2 | 3 | Yahoo Finance 4 | ============= 5 | 6 | Yahoo! Finance is a component of Yahoo’s network. It is the most widely used business news website in the United States, featuring stock quotes, press announcements, financial reports, and original content, as well as financial news, data, and commentary. It provides market data, fundamental and option data, market analysis, and news for cryptocurrencies, fiat currencies, commodity futures, equities, and bonds. 7 | 8 | .. note:: 9 | Refer to `Yahoo Finance Jupyter Notebook `_ for more details. 10 | 11 | 12 | Table of Contents 13 | ----------------- 14 | 15 | - `Installation`_ 16 | - `Usage`_ 17 | - `Historical Price and Volume for 1 Stock`_ 18 | - `Adding Time Periods`_ 19 | - `Frequency Setting`_ 20 | - `Stock Split and Dividends`_ 21 | - `Importing Many Stocks`_ 22 | - `Financial Indices`_ 23 | - `Currencies`_ 24 | - `Cryptocurrencies`_ 25 | - `Mutual Funds`_ 26 | - `Treasury Rates`_ 27 | - `Stock Fundamentals`_ 28 | - `Financials`_ 29 | - `Put Call Options`_ 30 | - `Stream Realtime Data`_ 31 | 32 | Installation 33 | ------------ 34 | 35 | Install yfinance using pip: 36 | 37 | .. code:: ipython3 38 | 39 | pip install yfinance --upgrade --no-cache-dir 40 | 41 | .. note:: 42 | To install yfinance using conda, see `this `_ 43 | 44 | Usage 45 | ----- 46 | 47 | .. note:: 48 | YFinance automatically uses Pandas DataFrames. 49 | 50 | Import all necessary libraries: 51 | 52 | .. code:: ipython3 53 | 54 | import numpy as np 55 | import yfinance as yf 56 | 57 | Historical Price and Volume for 1 Stock 58 | --------------------------------------- 59 | 60 | Outputs a Pandas DataFrame containing the values for 61 | open, high, low, close, and volume (OHLCV) of an equity. 62 | 63 | .. code:: ipython3 64 | 65 | ticker = 'GE' 66 | yf.download(ticker) 67 | 68 | Adding Time Periods 69 | ------------------- 70 | 71 | Uses ``start`` and ``end`` to restrict the data from above to a given time period. 72 | 73 | .. code:: ipython3 74 | 75 | yf.download(ticker, start = "2014-01-01", end = "2018-12-31") 76 | GE = yf.download(ticker, start = "2014-01-01", end = "2018-12-31") 77 | GE.info() 78 | 79 | Output structure: 80 | 81 | .. 
parsed-literal:: 82 | 83 | 84 | DatetimeIndex: 1257 entries, 2014-01-02 to 2018-12-28 85 | Data columns (total 6 columns): 86 | Open 1257 non-null float64 87 | High 1257 non-null float64 88 | Low 1257 non-null float64 89 | Close 1257 non-null float64 90 | Adj Close 1257 non-null float64 91 | Volume 1257 non-null int64 92 | dtypes: float64(5), int64(1) 93 | memory usage: 68.7 KB 94 | 95 | Alternative, static time periods: 96 | 97 | .. code:: ipython3 98 | 99 | yf.download(ticker, period = "ytd") 100 | yf.download(ticker, period = "1mo") 101 | yf.download(ticker, period = "5d") 102 | yf.download(ticker, period = "10y") 103 | 104 | 105 | Frequency Setting 106 | ----------------- 107 | 108 | Outputs a similar Pandas DataFrame that breaks the OHLCV down into smaller 109 | minute or hour intervals. 110 | 111 | 112 | .. code:: ipython3 113 | 114 | yf.download('GE',period='1mo',interval='1h') 115 | yf.download('GE',period='1mo',interval='5m') 116 | GE = yf.download('GE',period='5d',interval='5m') 117 | 118 | You can even get pre and post market data using ``prepost``: 119 | 120 | .. code:: ipython3 121 | 122 | GE=yf.download('GE',prepost=True,period='5d',interval='5m') 123 | 124 | Stock Split and Dividends 125 | ------------------------- 126 | 127 | Gets the quarterly dividend data for the given ``ticker``. 128 | 129 | .. code:: ipython3 130 | 131 | ticker = "AAPL" 132 | # action = True for dividend and Stock Split 133 | AAPL = yf.download(ticker, period="10y", actions = True) 134 | AAPL.head() 135 | 136 | You can use Pandas to narrow the data down by date or other 137 | features, such as stock splits. 138 | 139 | .. code:: ipython3 140 | 141 | AAPL[AAPL["Dividends"]>0] 142 | AAPL.loc["2019-08-05":"2019-08-15"].diff() 143 | AAPL[AAPL["Stock Splits"] > 0] 144 | 145 | Importing Many Stocks 146 | --------------------- 147 | 148 | Use an array to get data on more than one stock. 149 | 150 | .. code:: ipython3 151 | 152 | ticker = ['GE', 'AAPL','FB'] 153 | yf.download(ticker, period="5y") 154 | 155 | .. code:: ipython3 156 | 157 | stock=yf.download(ticker, period="5y").Close 158 | 159 | 160 | Financial Indices 161 | ----------------- 162 | 163 | Getting OHLCV data on multiple indices with the ``download`` function. 164 | 165 | .. code:: ipython3 166 | 167 | index = ['^DJI', '^GSPC'] 168 | 169 | .. code:: ipython3 170 | 171 | stock = yf.download(index,period='10y').Close 172 | 173 | 174 | .. code:: ipython3 175 | 176 | #Total Return 177 | index = ['^DJITR', '^SP500TR'] 178 | 179 | .. code:: ipython3 180 | 181 | indexes = yf.download(index,period='10y').Close 182 | 183 | 184 | 185 | Currencies 186 | --------------- 187 | 188 | Getting currency OHLCV data with the ``download`` function. 189 | 190 | .. code:: ipython3 191 | 192 | #Tickers 193 | ticker1 = "EURUSD=X" 194 | ticker2 = "USDEUR=X" 195 | 196 | .. code:: ipython3 197 | 198 | yf.download(ticker1,period='5y') 199 | 200 | .. code:: ipython3 201 | 202 | yf.download(ticker2,period='5y') 203 | 204 | 205 | 206 | 207 | 208 | 209 | Cryptocurrencies 210 | ---------------- 211 | 212 | Getting crypto OHLCV data with the ``download`` function. 213 | 214 | .. code:: ipython3 215 | 216 | #Tickers 217 | ticker1 = ["BTC-USD", "ETH-USD"] 218 | 219 | .. code:: ipython3 220 | 221 | data = yf.download(ticker1,start='2019-08-01',end='2020-05-01') 222 | 223 | 224 | 225 | 226 | Mutual Funds 227 | --------------- 228 | 229 | Getting mutual fund data with the ``download`` function. 230 | 231 | .. 
code:: ipython3 232 | 233 | #Tickers 234 | #20+Y Treasury Bond ETF and Vivaldi Multi-Strategy Fund Class 235 | ticker1 = ["TLT", "OMOIX"] 236 | 237 | .. code:: ipython3 238 | 239 | data = yf.download(ticker1,start='2019-08-01',end='2020-05-01') 240 | 241 | 242 | 243 | 244 | Treasury Rates 245 | --------------- 246 | 247 | Getting treasury rates with the ``download`` function. 248 | 249 | .. code:: ipython3 250 | 251 | #10Y and 5Y Treasury Rates 252 | ticker1 = ["^TNX", "^FVX"] 253 | 254 | .. code:: ipython3 255 | 256 | data = yf.download(ticker1,period="5y") 257 | 258 | 259 | Stock Fundamentals 260 | ------------------ 261 | 262 | To get fundamentals, use the ``Ticker`` object to instantiate new 263 | values. 264 | 265 | .. code:: ipython3 266 | 267 | ticker ="DIS" 268 | dis = yf.Ticker(ticker) 269 | 270 | Simply list the current ticker 271 | 272 | .. code:: ipython3 273 | 274 | dis.ticker 275 | 276 | .. parsed-literal:: 277 | 278 | 'DIS' 279 | 280 | Outputs 150+ features on the ticker, including: 281 | ``sector``, ``website``, ``ebitda``, ``targetLowPrice``, ``currentRatio``, 282 | ``currentPrice``, ``debtToEquity``, and ``totalRevenue``. 283 | 284 | .. code:: ipython3 285 | 286 | data=dis.info 287 | 288 | Collect the information for several tickers into a single DataFrame (the frame is seeded with the ``DIS`` info from above). 289 | 290 | .. code:: ipython3 291 | 292 | ticker = ["MSFT","FB"] 293 | 294 | .. code:: ipython3 295 | 296 | import pandas as pd
df = pd.Series(dis.info,name="DIS").to_frame().T
for i in ticker: 297 | df.loc["{}".format(i)] = pd.Series(yf.Ticker(i).info) 298 | 299 | .. code:: ipython3 300 | 301 | df.info() 302 | 303 | Financials 304 | ---------- 305 | 306 | Designate your desired ticker. 307 | 308 | .. code:: ipython3 309 | 310 | ticker ="DIS" 311 | dis = yf.Ticker(ticker) 312 | 313 | Gets the balance sheet. 314 | 315 | .. code:: ipython3 316 | 317 | dis.balance_sheet 318 | 319 | Gets the income statement. 320 | 321 | .. code:: ipython3 322 | 323 | dis.financials 324 | 325 | Gets the statement of cash flows. 326 | 327 | .. code:: ipython3 328 | 329 | dis.cashflow 330 | 331 | Put Call Options 332 | ---------------- 333 | 334 | .. note:: 335 | This output does not default to a Pandas DataFrame. 336 | 337 | Designate your desired ticker. 338 | 339 | .. code:: ipython3 340 | 341 | ticker = "DIS" 342 | dis = yf.Ticker(ticker) 343 | 344 | Gets the option chain, including ``contractSymbol``, ``lastTradeDate``, ``strike``, 345 | ``lastPrice``, ``bid``, and ``ask``, for both calls and puts. 346 | 347 | .. code:: ipython3 348 | 349 | dis.option_chain() 350 | 351 | .. code:: ipython3 352 | 353 | calls = dis.option_chain()[0] 354 | calls 355 | 356 | .. code:: ipython3 357 | 358 | puts = dis.option_chain()[1] 359 | puts 360 | 361 | Stream Realtime Data 362 | -------------------- 363 | 364 | Continuously polls for the latest data at 1-minute intervals. 365 | 366 | .. code:: ipython3 367 | 368 | import time 369 | 370 | .. code:: ipython3 371 | 372 | ticker1 ="EURUSD=X" 373 | data = yf.download(ticker1,interval = '1m', period='1d') 374 | print(data.index[-1], data.iloc[-1,3]) 375 | # Poll every 5 seconds for the latest 1-minute close 376 | while True: 377 | time.sleep(5) 378 | data = yf.download(ticker1,interval = '1m', period='1d') 379 | print(data.index[-1], data.iloc[-1,3]) 380 | -------------------------------------------------------------------------------- /Docs/finviz.rst: -------------------------------------------------------------------------------- 1 | .. _finviz: 2 | 3 | FinViz 4 | ====== 5 | 6 | .. note:: 7 | This library is ideal for fundamentals and sentiment analysis projects. 8 | 9 | 10 | .. 
note:: 11 | Refer to `FinViz Jupyter Notebook `_ for more details. 12 | 13 | Table of Contents 14 | ----------------- 15 | 16 | - `Installation`_ 17 | - `Usage`_ 18 | - `Stock Fundamentals`_ 19 | - `Ticker Description`_ 20 | - `Multiple Tickers`_ 21 | - `Sentiment and News`_ 22 | - `Insider Trades`_ 23 | 24 | Installation 25 | ------------ 26 | 27 | Install with pip: 28 | 29 | .. code:: ipython3 30 | 31 | pip install finvizfinance 32 | 33 | Or install from github: 34 | 35 | .. code:: ipython3 36 | 37 | git clone https://github.com/lit26/finvizfinance.git 38 | 39 | Usage 40 | ----- 41 | 42 | Import all necessary libraries: 43 | 44 | .. code:: ipython3 45 | 46 | from finvizfinance.quote import finvizfinance 47 | import pandas as pd 48 | 49 | .. code:: ipython3 50 | 51 | stock = finvizfinance('tsla') 52 | 53 | Stock Fundamentals 54 | ------------------ 55 | 56 | Getting information (fundamentals, description, outer rating, stock news, inside trader) of an individual stock. 57 | 58 | .. code:: ipython3 59 | 60 | chart = stock.ticker_charts() 61 | chart 62 | 63 | .. code:: ipython3 64 | 65 | stock_fundament = stock.ticker_fundament() 66 | 67 | Ticker Description 68 | ------------------ 69 | 70 | Outputs a brief description of the chosen stock. 'Tesla, Inc. designs, develops, manufactures, ...' 71 | 72 | .. code:: ipython3 73 | 74 | description = stock.ticker_description() 75 | 76 | Multiple Tickers 77 | ---------------- 78 | 79 | Getting multiple tickers' information according to the filters. 80 | 81 | .. code:: ipython3 82 | 83 | from finvizfinance.screener.overview import Overview 84 | 85 | foverview = Overview() 86 | filters_dict = {'Index':'S&P 500','Sector':'Basic Materials'} 87 | foverview.set_filter(filters_dict=filters_dict) 88 | df = foverview.screener_view() 89 | 90 | Sentiment and News 91 | ------------------ 92 | 93 | Gets recent financial news, including a rating for sentiment. 94 | 95 | .. code:: ipython3 96 | 97 | outer_ratings_df = stock.ticker_outer_ratings() 98 | 99 | .. code:: ipython3 100 | 101 | news_df = stock.ticker_news() 102 | 103 | .. code:: ipython3 104 | 105 | from finvizfinance.news import News 106 | 107 | fnews = News() 108 | all_news = fnews.get_news() 109 | 110 | all_news['news'].head() # 'blogs' 111 | 112 | Insider Trades 113 | -------------- 114 | 115 | Outputs a Pandas DataFrame of insider trades, their relationship, cost, value, 116 | number of shares, and more. 117 | 118 | 119 | .. code:: ipython3 120 | 121 | inside_trader_df = stock.ticker_inside_trader() 122 | 123 | .. code:: ipython3 124 | 125 | from finvizfinance.insider import Insider 126 | 127 | finsider = Insider(option='top owner trade') 128 | # option: latest, top week, top owner trade 129 | # default: latest 130 | 131 | insider_trader = finsider.get_insider() 132 | 133 | 134 | 135 | -------------------------------------------------------------------------------- /Docs/oanda_example.cfg: -------------------------------------------------------------------------------- 1 | [oanda] 2 | access_token = ajskdbfqiansdbygouhqpoiwenfg4u879uefn <-- random token 3 | account_id = username 4 | account_type = practice -------------------------------------------------------------------------------- /Docs/quandl.rst: -------------------------------------------------------------------------------- 1 | .. _quandl: 2 | 3 | Quandl 4 | ========= 5 | 6 | Quandl has many data sources to get different types of stock market data. However, some are free and some are paid. 
Wiki is the free data source of Quandl to get the data of the end of the day prices of 3000+ US equities. It is curated by Quandl community and also provides information about the dividends and split. 7 | 8 | Quandl also provides paid data source of minute and lower frequencies. 9 | 10 | To get the stock market data, you need to first install the quandl module if it is not already installed using the pip command as shown below. 11 | 12 | You need to get your own API Key from quandl to get the stock market data using the below code. If you are facing issue in getting the API key then you can refer to this link. 13 | 14 | After you get your key, assign the variable QUANDL_API_KEY with that key. Then set the start date, end date and the ticker of the asset whose stock market data you want to fetch. 15 | 16 | The quandl get method takes this stock market data as input and returns the open, high, low, close, volume, adjusted values and other information. 17 | 18 | 19 | To do- Add details from following sites 20 | - https://blog.quantinsti.com/stock-market-data-analysis-python/ 21 | - https://towardsdatascience.com/python-i-have-tested-quandl-api-and-how-to-get-real-estates-economics-data-in-one-line-of-code-a13806ca9bb 22 | - https://medium.datadriveninvestor.com/financial-data-431b75975bb#cc62 23 | - Add more description into each of the component. 24 | - Add the details about how to see the list of all tickers available for download in each section. 25 | - Provide a link to the jupyter notebook for this. 26 | 27 | .. note:: 28 | Refer to `Quandl Jupyter Notebook `_ for more details. 29 | 30 | Table of Contents 31 | ----------------- 32 | 33 | - `Installation`_ 34 | - `Usage`_ 35 | - `Historical Price and Volume for 1 Stock`_ 36 | - `Adding Time Periods`_ 37 | - `Dividends`_ 38 | - `Cryptocurrencies`_ 39 | - `Mutual Funds`_ 40 | - `Treasury Rates`_ 41 | - `Stock Fundamentals`_ 42 | - `Futures and Options`_ 43 | 44 | Installation 45 | ------------ 46 | 47 | Install with pip: 48 | 49 | .. code:: ipython3 50 | 51 | pip install quandl 52 | 53 | Usage 54 | ----- 55 | 56 | .. note:: 57 | Before working with this API, you will need to obtain 58 | a key from `Nasdaq Data Link `_ 59 | 60 | Quandl is the library used for accessing the `Nasdaq Data Link `_ 61 | database, so all of the queries below follow a similar pattern that can be reproduced with 62 | any of the ID codes from the database. 63 | 64 | Import all necessary libraries: 65 | 66 | .. code:: ipython3 67 | 68 | import quandl 69 | import pandas as pd 70 | import numpy as np 71 | from datetime import datetime 72 | from matplotlib import pyplot as plt 73 | import seaborn as sns 74 | 75 | .. code:: ipython3 76 | 77 | # To get your API key, sign up for a free Quandl account. 78 | # Then, you can find your API key on Quandl account settings page. 79 | QUANDL_API_KEY = 'REPLACE-THIS-TEXT-WITH-A-REAL-API-KEY' 80 | 81 | 82 | # This is to prompt you to change the Quandl Key 83 | if QUANDL_API_KEY == 'REPLACE-THIS-TEXT-WITH-A-REAL-API-KEY': 84 | raise Exception("Please provide a valid Quandl API key!") 85 | 86 | .. code:: ipython3 87 | 88 | quandl.ApiConfig.api_key = QUANDL_API_KEY 89 | 90 | Historical Price and Volume for 1 Stock 91 | --------------------------------------- 92 | 93 | Outputs the OHLCV, as well as dividend data and adjusted OHLCV for the given ``ticker``. 94 | 95 | .. 
code:: ipython3 96 | 97 | # Set the start and end date 98 | start_date = '1990-01-01' 99 | end_date = '2018-03-01' 100 | 101 | # Set the ticker name 102 | ticker = 'AMZN' 103 | 104 | .. code:: ipython3 105 | 106 | data = quandl.get('WIKI/'+ticker) 107 | 108 | 109 | Adding Time Periods 110 | ------------------- 111 | 112 | Uses ``start_date`` and ``end_date`` to denote a time period for the query. 113 | 114 | .. code:: ipython3 115 | 116 | data = quandl.get('WIKI/'+ticker, 117 | start_date=start_date, 118 | end_date=end_date) 119 | data.head() 120 | 121 | 122 | Dividends 123 | --------- 124 | 125 | Outputs the Dividend and Real Dividend. 126 | 127 | .. code:: ipython3 128 | 129 | sp = quandl.get('YALE/SPCOMP', start_date='2015-04-01', end_date='2021-10-01') 130 | sp[['Dividend', 'Real Dividend']] 131 | 132 | 133 | Cryptocurrencies 134 | ---------------- 135 | 136 | Outputs the date and price of bitcoin. 137 | 138 | .. code:: ipython3 139 | 140 | # bitcoin price 141 | btc = quandl.get('BCHAIN/MKPRU', start_date='2020-12-29', end_date='2021-12-29') 142 | btc 143 | 144 | Mutual Funds 145 | --------------- 146 | 147 | Plots the mutual fund assets to GDP from the ``start_date`` to the ``end_date``. 148 | 149 | .. code:: ipython3 150 | 151 | # Mutual Fund Assets to GDP for World 152 | mf = quandl.get('FRED/DDDI071WA156NWDB', start_date='1980-04-01', end_date='2020-10-01') 153 | mf.plot(title = 'Mutual Fund Assets to GDP', figsize=(20, 6)) 154 | 155 | Treasury Rates 156 | --------------- 157 | 158 | Plots the real long-term treasury rates from the ``start_date`` to the ``end_date``. 159 | 160 | .. code:: ipython3 161 | 162 | mf = quandl.get('USTREASURY/REALLONGTERM', start_date='2000-04-01', end_date='2020-10-01') 163 | mf.plot(title = 'Treasury Real Long-Term Rates', figsize=(20, 6)) 164 | 165 | 166 | Stock Fundamentals 167 | ------------------ 168 | 169 | Outputs earnings, CPI, price, long interest rate, and PE ratio. 170 | 171 | .. code:: ipython3 172 | 173 | sp = quandl.get('YALE/SPCOMP', start_date='2015-04-01', end_date='2021-10-01') 174 | sp 175 | 176 | Futures and Options 177 | ------------------- 178 | 179 | Outputs various long, short, and spread data. 180 | 181 | .. code:: ipython3 182 | 183 | fo = quandl.get('CFTC/1170E1_FO_ALL', start_date='2015-04-01', end_date='2021-10-01') 184 | fo 185 | 186 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | ## Fetching-Financial-Data 2 | 3 | This repository includes a number of libraries and short examples to demonstrate how we can pull financial data (mostly for equities). It's basically everything you need to get that financial data and start performing analysis/building algorithms on top of that. 4 | 5 | Separate notebooks for each library have been included - 6 | 7 | 1. [yfinance](https://github.com/ranaroussi/yfinance) - uses a RESTful API to connect to Yahoo Finance and pull the required data. 8 | 2. [yahoofinancials](https://github.com/JECSand/yahoofinancials) - uses web scraping to pull data from Yahoo Finance, therefore can be slow when compared to yfinance. Also, we cannot get intraday data using this library (in yfinance, we can). 9 | 3. [alphavantage](https://github.com/RomelTorres/alpha_vantage) - Requires an API key (free). This is an amazing option for fetching intraday data; yfinance can also do this, but this is another alternative in case yfinance stops supporting it. A minimal usage sketch is shown below. 
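
   For example, a minimal sketch of pulling 5-minute intraday bars with alpha_vantage (the `'YOUR_API_KEY'` placeholder and the `MSFT` symbol below are illustrative assumptions, not part of this repo):

   ```python
   # Minimal sketch: fetch 5-minute intraday bars as a pandas DataFrame.
   # 'YOUR_API_KEY' is a placeholder - substitute your own free key.
   from alpha_vantage.timeseries import TimeSeries

   ts = TimeSeries(key='YOUR_API_KEY', output_format='pandas')
   data, meta_data = ts.get_intraday(symbol='MSFT', interval='5min', outputsize='compact')
   print(data.head())
   ```
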
10 | 11 | Get Your API Key here - https://www.alphavantage.co/support/#api-key 12 | 13 | 4. financial-data-webscraping - To scrape financial data such as Balance sheets, income statements and cashflow statements for fundamental analysis using BeautifulSoup. 14 | 5. [FundamentalAnalysis](https://github.com/JerBouma/FundamentalAnalysis) - Amazing library, does all the heavy-lifting for you, and does all the magic behind the scenes, to get you the data in a clean format, however this can be used only for a handful of stock exchanges(shown in notebook), if you are looking at stocks in the US, you just have the perfect tool for the task. This also required an API key, and remember that you can make only 250 Calls per API key, so you either need to upgrade or get multiple API keys. Amazing package, shout out to Jeroen Bouma. 15 | 16 | Get your API Key here - https://financialmodelingprep.com/developer/docs/ 17 | 18 | 19 | ### Our Recommendation 20 | 21 | * I personally prefer yfinance for technical analysis, because it has an easy-to-use API and very convenient most of the times. 22 | 23 | * For Fundamental analysis, FundamentalAnalysis package is the best, as it requires no data cleaning and can be used directly to get detailed financial statements of a company, however it has coverage limitations and doesn't cover a lot many stock exchanges, so you can choose between Web Scraping and FundamentalAnalysis package as per your requirement. 24 | -------------------------------------------------------------------------------- /conf.py: -------------------------------------------------------------------------------- 1 | import sys, os 2 | 3 | sys.path.insert(0, os.path.abspath('extensions')) 4 | 5 | extensions = [ 6 | "sphinx_design", 7 | "nbsphinx", 8 | 'sphinx_tabs.tabs', 9 | 'sphinx_panels', 10 | 'sphinx.ext.autosectionlabel', 11 | 12 | ] 13 | 14 | 15 | 16 | html_permalinks_icon = '#' 17 | html_theme = 'press' 18 | 19 | nbsphinx_allow_errors = True 20 | 21 | source_suffix = [".rst", ".md"] 22 | 23 | 24 | # panels_add_bootstrap_css = False 25 | 26 | # The master toctree document. 27 | master_doc = 'index' 28 | 29 | autosectionlabel_prefix_document = True 30 | -------------------------------------------------------------------------------- /feature_tracker_table.csv: -------------------------------------------------------------------------------- 1 | EQUITIES,FIXED INCOME,FX,COMMODITIES,CRYPTO,STOCK FUNDAMENTALS,MACROECONOMICS DATA,NEWS AND SENTIMENTS,FINANCIAL DATASETS,ALTERNATIVE DATA 2 | `YFinance `_,`YFinance `_,`YFinance `_,`YFinance `_,`YFinance `_,`YFinance `_,`Alphavantage `_,`Alphavantage `_,`YFinance `_,`YFinance `_ 3 | `Alphavantage `_,`Alphavantage `_,`Alphavantage `_,`Quandl `_,`Alphavantage `_,`Quandl `_,`Quandl `_,`Quandl `_,`Alphavantage `_,`Alphavantage `_ 4 | `Quandl `_,`Quandl `_,`Pandas `_,`Pandas `_,`Quandl `_,`Fundamental Analysis `_,`FRED `_,`FRED `_,`Quandl `_, 5 | `Fundamental Analysis `_,`FRED `_,`Oanda `_,`Stooq `_,`Pandas `_,`Pandas `_,`Pandas `_,`Pandas `_,`Fundamental Analysis `_, 6 | `Pandas `_,,,`IEX `_,`Stooq `_,`IEX `_,`IEX `_,`IEX `_,`Pandas `_, 7 | `Oanda `_,,,,`IEX `_,`FinViz `_,,`FinViz `_,`IEX `_, 8 | ,,,,`FinViz `_,,,,, -------------------------------------------------------------------------------- /homepage_tabs.rst: -------------------------------------------------------------------------------- 1 | .. tabs:: 2 | 3 | .. tab:: Equities 4 | 5 | .. panels:: 6 | 7 | .. 
link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Equities.html#single-stock---volume-and-price-data 8 | :text: Single Stock - Volume and Price Data 9 | :classes: stretched-link 10 | 11 | --- 12 | 13 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Equities.html#indices 14 | :text: Indices 15 | :classes: stretched-link 16 | 17 | --- 18 | 19 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Equities.html#stock-splits-and-dividends 20 | :text: Stock Splits and Dividends 21 | :classes: stretched-link 22 | 23 | --- 24 | 25 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Equities.html#technical-indicators 26 | :text: Technical Indicators 27 | :classes: stretched-link 28 | 29 | --- 30 | 31 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Equities.html#stock-fundamentals 32 | :text: Stock Fundamentals 33 | :classes: stretched-link 34 | 35 | --- 36 | 37 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Equities.html#financials 38 | :text: Financials 39 | :classes: stretched-link 40 | 41 | --- 42 | 43 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Equities.html#frequency-setting 44 | :text: (High)Frequency Setting 45 | :classes: stretched-link 46 | 47 | --- 48 | 49 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Equities.html#changing-time-period 50 | :text: Changing Time Period 51 | :classes: stretched-link 52 | 53 | --- 54 | 55 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Equities.html#realtime-data 56 | :text: Realtime Data 57 | :classes: stretched-link 58 | 59 | .. tab:: Fixed Income 60 | 61 | .. panels:: 62 | 63 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/FixedIncome.html#treasury-rates 64 | :text: Treasury Rates 65 | :classes: stretched-link 66 | 67 | --- 68 | 69 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/FixedIncome.html#mutual-funds 70 | :text: Mutual Funds 71 | :classes: stretched-link 72 | 73 | --- 74 | 75 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/FixedIncome.html#changing-time-period 76 | :text: Changing Time Period 77 | :classes: stretched-link 78 | 79 | --- 80 | 81 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/FixedIncome.html#realtime-data 82 | :text: Realtime Data 83 | :classes: stretched-link 84 | 85 | .. tab:: FX 86 | 87 | .. panels:: 88 | 89 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/FX.html#fx-spot---volume-and-price-data 90 | :text: FX Spot - Volume and Price Data 91 | :classes: stretched-link 92 | 93 | --- 94 | 95 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/FX.html#changing-time-period 96 | :text: Changing Time Period 97 | :classes: stretched-link 98 | 99 | --- 100 | 101 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/FX.html#realtime-data 102 | :text: Realtime Data 103 | :classes: stretched-link 104 | 105 | .. tab:: Commodities 106 | 107 | .. panels:: 108 | 109 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Commodities.html#single-commodity---volume-and-price-data 110 | :text: Single Commodity - Volume and Price Data 111 | :classes: stretched-link 112 | 113 | --- 114 | 115 | .. 
link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Commodities.html#changing-time-period 116 | :text: Changing Time Period 117 | :classes: stretched-link 118 | 119 | --- 120 | 121 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Commodities.html#realtime-data 122 | :text: Realtime Data 123 | :classes: stretched-link 124 | 125 | .. tab:: Crypto 126 | 127 | .. panels:: 128 | 129 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Crypto.html#single-cryptocurrency---volume-and-price-data 130 | :text: Single Cryptocurrency - Volume and Price Data 131 | :classes: stretched-link 132 | 133 | --- 134 | 135 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Crypto.html#changing-time-period 136 | :text: Changing Time Period 137 | :classes: stretched-link 138 | 139 | --- 140 | 141 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Crypto.html#realtime-data 142 | :text: Realtime Data 143 | :classes: stretched-link 144 | 145 | .. tab:: Stock Fundamentals 146 | 147 | .. panels:: 148 | 149 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Fundamentals.html#stock-fundamentals 150 | :text: Stock Fundamentals 151 | :classes: stretched-link 152 | 153 | --- 154 | 155 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Fundamentals.html#financial-ratios 156 | :text: Financial Ratios 157 | :classes: stretched-link 158 | 159 | --- 160 | 161 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Fundamentals.html#financials 162 | :text: Financials 163 | :classes: stretched-link 164 | 165 | --- 166 | 167 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Fundamentals.html#frequency-setting(including-high-frequency) 168 | :text: (High)Frequency Setting 169 | :classes: stretched-link 170 | 171 | --- 172 | 173 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Fundamentals.html#changing-time-period 174 | :text: Changing Time Period 175 | :classes: stretched-link 176 | 177 | --- 178 | 179 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Fundamentals.html#realtime-data 180 | :text: Realtime Data 181 | :classes: stretched-link 182 | 183 | .. tab:: Options and Futures 184 | 185 | .. panels:: 186 | 187 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/OptionFuture.html#options-and-future---volume-and-price-data 188 | :text: Options and Futures - Volume and Price Data 189 | :classes: stretched-link 190 | 191 | --- 192 | 193 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/OptionFuture.html#changing-time-period 194 | :text: Changing Time Period 195 | :classes: stretched-link 196 | 197 | --- 198 | 199 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/OptionFuture.html#realtime-data 200 | :text: Realtime Data 201 | :classes: stretched-link 202 | 203 | .. tab:: Macro Data 204 | 205 | .. panels:: 206 | 207 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Macroeconomic.html#economic-indicators 208 | :text: Economic Indicators 209 | :classes: stretched-link 210 | 211 | --- 212 | 213 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/Macroeconomic.html#treasury-rates 214 | :text: Treasury Rates 215 | :classes: stretched-link 216 | 217 | .. tab:: News/Sentiment 218 | 219 | .. panels:: 220 | 221 | ..
link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/News.html#sentiment-data 222 | :text: Sentiment Data 223 | :classes: stretched-link 224 | 225 | --- 226 | 227 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/News.html#insider-trades 228 | :text: Insider Trades 229 | :classes: stretched-link 230 | 231 | .. tab:: Alt Data 232 | 233 | .. panels:: 234 | 235 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/AlternativeData.html#volume-and-price-data 236 | :text: Volume and Price Data 237 | :classes: stretched-link 238 | 239 | --- 240 | 241 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/AlternativeData.html#importing-many-stocks 242 | :text: Importing Many Stocks 243 | :classes: stretched-link 244 | 245 | --- 246 | 247 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/AlternativeData.html#indices 248 | :text: Indices 249 | :classes: stretched-link 250 | 251 | --- 252 | 253 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/AlternativeData.html#stock-splits-and-dividends 254 | :text: Stock Splits and Dividends 255 | :classes: stretched-link 256 | 257 | --- 258 | 259 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/AlternativeData.html#technical-indicators 260 | :text: Technical Indicators 261 | :classes: stretched-link 262 | 263 | --- 264 | 265 | .. link-button:: https://finailabdatasets.readthedocs.io/en/latest/Docs/AlternativeData.html#which-algorithm-should-i-use-? 266 | :text: Algorithms 267 | :classes: stretched-link 268 | 269 | -------------------------------------------------------------------------------- /index.rst: -------------------------------------------------------------------------------- 1 | FinAILab's Datasets! - Free Financial data for building AI and Machine Learning in Finance 2 | ========================================================================================== 3 | 4 | This documentation contains information about the **free financial datasets** for thousands 5 | of asset classes, macroeconomic data, fundamentals and alternative data that can be used for building models on `FinAILab `_. 6 | 7 | This documentation shows you how to get massive amounts of financial data and explains how to **install the required libraries** and how to **download/import the data** with a few lines of Python code. 8 | 9 | The data covered in this documentation include: 10 | ----------------------------------------------- 11 | - Historical Price and Volume Data for 100,000+ Symbols/Instruments. 12 | - 50+ Exchanges all around the world. 13 | - Real-time and Historical Data (back to the 1960s) 14 | - High-frequency real-time Data 15 | - Foreign Exchange (FOREX): 150+ Currency Pairs 16 | - 500+ Cryptocurrencies 17 | - Commodities (Crude Oil, Gold, Silver, etc.) 18 | - Futures and Option data 19 | - Macroeconomic variables 20 | - Stock Options, Stock Splits and Dividends for 5000+ Stocks 21 | - Fundamentals, Metrics and Ratios for Stocks, Bonds, Indexes, Mutual Funds and ETFs 22 | - Balance Sheets, Cashflow and Profit and Loss Statements (P&L) 23 | - 50+ Technical Indicators (e.g. SMA, Bollinger Bands). 24 | 25 | Financial Datasets - Summary by source and type: 26 | ------------------------------------------------- 27 | 28 | 29 | .. include:: homepage_tabs.rst 30 | 31 | ..
toctree:: 32 | :maxdepth: 1 33 | :caption: Source 34 | 35 | Docs/YahooFinance 36 | Docs/Alphavantage 37 | Docs/FundamentalAnalysis 38 | Docs/quandl 39 | Docs/FRED 40 | Docs/Stooq 41 | Docs/IEX 42 | Docs/Oanda 43 | Docs/finviz 44 | 45 | .. toctree:: 46 | :maxdepth: 1 47 | :caption: Categories 48 | 49 | Docs/Equities 50 | Docs/FixedIncome 51 | Docs/FX 52 | Docs/Commodities 53 | Docs/Crypto 54 | Docs/Fundamentals 55 | Docs/OptionFuture 56 | Docs/Macroeconomic 57 | Docs/News 58 | Docs/AlternativeData 59 | 60 | 61 | 62 | Contributing 63 | ------------ 64 | 65 | For anyone interested in making FinAIML better, there are still some improvements 66 | that need to be made. 67 | A full TODO list is available in the `roadmap `_. 68 | 69 | If you want to contribute, please go through `CONTRIBUTING.md `_ first. 70 | 71 | Indices and tables 72 | ------------------- 73 | 74 | * :ref:`genindex` 75 | * :ref:`search` 76 | * :ref:`modindex` 77 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | gym 2 | pandas 3 | sphinx 4 | nbsphinx 5 | sphinx-press-theme 6 | sphinx-tabs 7 | sphinx-panels 8 | sphinx-design 9 | ipykernel 10 | numpy 11 | yfinance 12 | alphavantage 13 | datetime 14 | finvizfinance 15 | requests 16 | seaborn 17 | matplotlib 18 | -------------------------------------------------------------------------------- /spelling_wordlist.txt: -------------------------------------------------------------------------------- 1 | py 2 | env 3 | atari 4 | argparse 5 | Argparse 6 | TensorFlow 7 | feedforward 8 | envs 9 | VecEnv 10 | pretrain 11 | petrained 12 | tf 13 | np 14 | mujoco 15 | cpu 16 | ndarray 17 | ndarrays 18 | timestep 19 | timesteps 20 | stepsize 21 | dataset 22 | adam 23 | fn 24 | normalisation 25 | Kullback 26 | Leibler 27 | boolean 28 | deserialized 29 | pretrained 30 | minibatch 31 | subprocesses 32 | ArgumentParser 33 | Tensorflow 34 | Gaussian 35 | approximator 36 | minibatches 37 | hyperparameters 38 | hyperparameter 39 | vectorized 40 | rl 41 | colab 42 | dataloader 43 | npz 44 | datasets 45 | vf 46 | logits 47 | num 48 | Utils 49 | backpropagate 50 | prepend 51 | NaN 52 | preprocessing 53 | Cloudpickle 54 | async 55 | multiprocess 56 | tensorflow 57 | mlp 58 | cnn 59 | neglogp 60 | tanh 61 | coef 62 | repo 63 | Huber 64 | params 65 | ppo 66 | arxiv 67 | Arxiv 68 | func 69 | DQN 70 | Uhlenbeck 71 | Ornstein 72 | multithread 73 | cancelled 74 | Tensorboard 75 | parallelize 76 | customising 77 | serializable 78 | Multiprocessed 79 | cartpole 80 | toolset 81 | lstm 82 | rescale 83 | ffmpeg 84 | avconv 85 | unnormalized 86 | Github 87 | pre 88 | preprocess 89 | backend 90 | attr 91 | preprocess 92 | Antonin 93 | Raffin 94 | araffin 95 | Homebrew 96 | Numpy 97 | Theano 98 | rollout 99 | kfac 100 | Piecewise 101 | csv 102 | nvidia 103 | visdom 104 | tensorboard 105 | preprocessed 106 | namespace 107 | sklearn 108 | GoalEnv 109 | BaseCallback 110 | Keras 111 | --------------------------------------------------------------------------------