├── .gitmodules
├── LICENSE
├── README.MD
├── check_db.pl
├── datasets.sh
├── datasets_info.pl
└── pairs.conf

--------------------------------------------------------------------------------
/.gitmodules:
--------------------------------------------------------------------------------
[submodule "BacktestTool"]
	path = BacktestTool
	url = https://github.com/xFFFFF/Gekko-BacktestTool.git
[submodule "gdrive"]
	path = gdrive
	url = https://github.com/prasmussen/gdrive
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
MIT License

Copyright (c) 2018 Filip la Gre

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
--------------------------------------------------------------------------------
/README.MD:
--------------------------------------------------------------------------------
# What is it?
[![Powered by Gekko-BacktestTool](https://img.shields.io/badge/Powered%20by-Gekko--BacktestTool-blue.svg)](https://github.com/xFFFFF/Gekko-BacktestTool) ![Perl](https://img.shields.io/badge/Made%20in-Perl-blue.svg) [![HitCount](http://hits.dwyl.com/xFFFFF/Gekko-Datasets.svg)](http://hits.dwyl.com/xFFFFF/Gekko-Datasets) [![GA](https://ga-beacon.appspot.com/UA-118674108-2/r)](https://github.com/xFFFFF/Gekko-Datasets)
[![Donate with Bitcoin](https://en.cryptobadges.io/badge/small/32G2cYTNFJ8heKUbALWSgGvYQikyJ9dHZp)](https://en.cryptobadges.io/donate/32G2cYTNFJ8heKUbALWSgGvYQikyJ9dHZp)
[![Donate with Ethereum](https://en.cryptobadges.io/badge/small/0x50b7611b6dC8a4073cB4eF12A6b045f644c3a3Aa)](https://en.cryptobadges.io/donate/0x50b7611b6dC8a4073cB4eF12A6b045f644c3a3Aa)

Ready-to-use SQLite dump files with price history for the [Gekko trading bot](https://github.com/askmike/gekko), meant for backtesting. Just copy a downloaded file into Gekko's *history* directory and you get, for example, the full history of all Binance BTC pairs.

The data is organized in two ways. To keep Gekko responsive, the full-history data is split into separate files by exchange and base currency; each database file contains every available asset for that exchange-currency combination. Alternatively, the recent-history datasets (last 7, 14, 30 or 60 days) pack all currencies and assets of a given exchange into a single db file.

The files are updated with new data every day after 23:15 GMT.

Currently available datasets:
- **[Binance](https://www.binance.com/?ref=17905068)** - *BTC*, *BNB*, *ETH*, *USDT* (full history)
- **Bitfinex** - *BTC*, *ETH*, *USD*, *EUR*, *GBP*, *JPY* (full history)
- **Poloniex** - *BTC*, *ETH*, *XMR* (full history), *USDT* (from 2017-07-01)
- **GDAX** - *USD*, *BTC*, *EUR*, *GBP* (full history)
- **Kraken** - *XBT*, *ETH*, *USD*, *EUR*, *CAD*, *GBP*, *JPY* (full history)

Available soon / currently importing:
- **Poloniex** - *USDT* (full history)
- **BitX (aka Luno)** - *MYR*, *NGN*, *ZAR* (full history)

## Out of free space
My Google Drive is full - the datasets no longer fit in the free 15 GB, which is why some of the files cannot be found there. If you want to support the project, you can donate an unused Drive account or set up a new one (Google requires a phone number for verification, and I have no spare numbers). Then create an API key following this [tutorial](http://www.iperiusbackup.net/en/how-to-enable-google-drive-api-and-get-client-credentials/).

# Download
Dumps are zip-compressed and stored on Google Drive; detailed information about the candles is in the accompanying .info files.

Last 7, 14, 30, 60 days: [drive.google.com](https://goo.gl/dzKLmz)
Full history: [drive.google.com](https://goo.gl/KVpVVR)

The current size of all uncompressed full-period databases is about 21 GB.

# Install
1. Go to Google Drive [here](https://goo.gl/dzKLmz) or [here](https://goo.gl/KVpVVR)
2. Select the dataset you are interested in, right-click it and choose Download
![Download](https://i.imgur.com/bg7Nrzt.jpg)
3. Go to your Gekko main directory
4. Create a *history* folder in Gekko's main directory if it does not exist
![History folder](https://i.imgur.com/Ct6fnvn.jpg)
5. Uncompress the downloaded file and copy the db file (e.g. binance_0.1.db) into the *history* folder
6. Restart Gekko
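If you prefer the terminal, steps 3-6 boil down to the sketch below (the archive name, download location and Gekko path are assumptions - substitute your own):

```bash
cd ~/gekko                                      # your Gekko main directory (assumed path)
mkdir -p history                                # create the history folder if missing
unzip ~/Downloads/binance-btc.zip -d history    # extracts e.g. binance_0.1.db
# finally, restart Gekko so it picks up the new database
```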
# Local datasets update
![BacktestTool logo](https://camo.githubusercontent.com/07020ba43de383ba3b49ccb87424fa5683e5ba2b/68747470733a2f2f692e696d6775722e636f6d2f473344637637692e706e67)

To keep a local dataset current yourself via exchange API import, I recommend the [Gekko-BacktestTool](https://github.com/xFFFFF/Gekko-BacktestTool) app. With one command you can import all new candles. For example, for the binance-usdt dataset use:
`./backtest.pl -i -p binance:USDT:ALL -f last -t now`
The rest is done automatically.

# Do you want to make a mirror?
Feel free to process or modify the data contained in this repository. For those interested, I provide the scripts that update the datasets from the last stored candle to the current time and upload the dumps to Google Drive. The [datasets.sh](datasets.sh) (main script) and [datasets_info.pl](datasets_info.pl) (generates detailed information about the candles) files live in the /root/gekko directory, and my separate Gekko copies for each dataset sit in the following subdirectories: binance-usdt, binance-btc, binance-bnb, binance-eth, poloniex-usdt, poloniex-xmr. Add [datasets.sh](datasets.sh) to cron (see the sketch below) and the rest happens by itself. The scripts require: [Gekko](https://github.com/askmike/gekko), [Gekko-BacktestTool](https://github.com/xFFFFF/Gekko-BacktestTool) and [Gdrive](https://github.com/prasmussen/gdrive).
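A minimal crontab sketch for the setup above - the /root/gekko path comes from this README, while the log file and the exact time are assumptions (new data appears daily after 23:15 GMT):

```bash
# Nightly dataset update; adjust the time to your own schedule.
30 23 * * * /root/gekko/datasets.sh >> /root/gekko/datasets.log 2>&1
```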
# See also
- [Free strategies for Gekko](https://github.com/xFFFFF/Gekko-Strategies)
- [Gekko's BacktestTool](https://github.com/xFFFFF/Gekko-BacktestTool)
--------------------------------------------------------------------------------
/check_db.pl:
--------------------------------------------------------------------------------
#!/usr/bin/perl -w
# Scans all databases listed in pairs.conf for missing candles and writes
# the backtest.pl import command that would fill each gap to a per-pair
# backtest_missed.sh file.
use strict;
use DBI;
use POSIX qw(strftime);

open my $fh, '<', 'pairs.conf' or die "Cannot open pairs.conf: $!";
chomp(my @pairs = <$fh>);
close $fh;

foreach my $pair (@pairs) {
    my ($exchange) = split /-/, $pair;
    my $dbh = DBI->connect("dbi:SQLite:dbname=$pair/history/${exchange}_0.1.db",
        "", "", { RaiseError => 0 }) or die $DBI::errstr;

    # Start each run with a fresh command file for this pair.
    unlink "$pair/backtest_missed.sh" if -e "$pair/backtest_missed.sh";

    my $sth = $dbh->prepare(q(SELECT name FROM sqlite_master
        WHERE type='table' AND name LIKE '%candles%';));
    $sth->execute();
    while (my ($table) = $sth->fetchrow_array()) {
        # Table names look like candles_<currency>_<asset>, matching
        # BacktestTool's exchange:currency:asset pair format.
        my (undef, $currency, $asset) = split /_/, $table;
        my $sth2 = $dbh->prepare(qq(SELECT start FROM $table ORDER BY start ASC;));
        $sth2->execute() or die $DBI::errstr;
        my $fmt = '%Y-%m-%d %H:%M:%S';
        my $expected;
        while (my ($start) = $sth2->fetchrow_array()) {
            # Candles are one minute apart; a jump past the expected start
            # (previous start + 60 s) means candles are missing.
            if (defined $expected && $start != $expected) {
                my $f = strftime($fmt, gmtime($expected));
                my $t = strftime($fmt, gmtime($start + 60));
                print "perl backtest.pl -i -p $exchange:$currency:$asset -f $f -t $t\n";
                open my $out, '>>', "$pair/backtest_missed.sh"
                    or die "Cannot open backtest_missed.sh: $!";
                print $out "perl backtest.pl -i -p $exchange:$currency:$asset -f '$f' -t '$t'\n";
                close $out;
            }
            $expected = $start + 60;
        }
    }
}
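# Usage sketch -- run from the directory holding pairs.conf and the per-pair
# Gekko copies (/root/gekko in the mirror setup); the pair name below is
# illustrative:
#
#   perl check_db.pl                  # print the import commands and write
#                                     # them to <pair>/backtest_missed.sh
#   cd binance-usdt && sh backtest_missed.sh    # re-import the missing candles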
--------------------------------------------------------------------------------
/datasets.sh:
--------------------------------------------------------------------------------
#!/usr/local/bin/bash
export PATH="/usr/local/bin:/usr/bin:/bin"

# Path to the parent dir of the per-pair Gekko directories
path='/root/gekko'
all=0

# Dir names of the Gekko copies, one per line
readarray -t array < $path/pairs.conf

echo "This is Gekko Trading Bot datasets for backtests.
Webpage with always up-to-date information: https://github.com/xFFFFF/Gekko-Datasets

UNCOMPRESSED FILES SIZE" > $path/README.TXT

# Delete the old copy of a file on Google Drive, then upload the new one
# into the defined directory
function drive () {
	gdrive delete `gdrive list -m 1000 | grep $1 | cut -d ' ' -f1 | head -1`
	gdrive upload -p 1cdaEPTA2Z_DJWCkbfidlSJVg8gJinK78 $1
}

for i in "${array[@]}"
do
	cd $path/$i

	# Run Gekko-BacktestTool in update-candles mode
	if [[ $i == gdax* ]] || [[ $i == poloniex* ]]; then
		perl $path/$i/backtest.pl -i -f last -t now
	else
		perl $path/$i/backtest.pl -i -p `echo $i | cut -d'-' -f1`:`echo $i | cut -d'-' -f2 | awk '{print toupper($0)}'`:ALL -f last -t now
	fi

	# Generate the .info file with candle statistics
	cd history
	fname="`echo $i | cut -d'-' -f1`_0.1.db"
	echo "DB filename: $fname" > $i.info
	echo "Size: `ls -hl $fname | awk '{print $5}'` (`ls -l $fname | awk '{print $5}'` bytes)
" >> $i.info
	perl $path/datasets_info.pl $path/$i/history/$fname >> $i.info
	size=`ls -l $fname | awk '{print $5}'`
	((sizeh=$size / 1024 / 1024))
	echo "$i: $sizeh MB" >> $path/README.TXT
	((all=all+size))

	# Replace the old .info file on Google Drive
	drive $i.info
	rm $i.info

	# Prepare and upload Gekko's database
	[ -f $i.zip ] && rm $i.zip
	zip -9 $i.zip $fname
	drive $i.zip

done

# Last data for the readme
((allh=$all / 1024 / 1024))
echo "SUM: $allh MB" >> $path/README.TXT
drive "$path/README.TXT"
--------------------------------------------------------------------------------
/datasets_info.pl:
--------------------------------------------------------------------------------
#!/usr/bin/perl
# Prints summary statistics for every candle table in the Gekko SQLite
# database given as the first argument.
use strict;
use warnings;
use DBI;

my $db = shift @ARGV or die "usage: $0 path/to/database.db\n";
my $dbh = DBI->connect("dbi:SQLite:dbname=$db", "", "", { RaiseError => 0 });

my $tables = $dbh->prepare(q(SELECT name FROM sqlite_master
    WHERE type='table' AND name LIKE '%candles%';));
$tables->execute();
while (my ($table) = $tables->fetchrow_array()) {
    print "$table\n";
    # Label / query pairs, printed in order for each table.
    my @stats = (
        [ 'start',         qq(SELECT datetime(start, 'unixepoch') FROM `main`.$table ORDER BY start ASC LIMIT 1;) ],
        [ 'end',           qq(SELECT datetime(start, 'unixepoch') FROM `main`.$table ORDER BY start DESC LIMIT 1;) ],
        [ 'candles',       qq(SELECT count(*) FROM $table;) ],
        [ 'average price', qq(SELECT avg(vwp) FROM `main`.$table;) ],
        [ 'lowest price',  qq(SELECT min(low) FROM `main`.$table;) ],
        [ 'highest price', qq(SELECT max(high) FROM `main`.$table;) ],
        [ 'trades',        qq(SELECT sum(trades) FROM `main`.$table;) ],
        [ 'volume',        qq(SELECT sum(volume) FROM `main`.$table;) ],
    );
    foreach my $stat (@stats) {
        my ($label, $query) = @$stat;
        my $sth = $dbh->prepare($query);
        $sth->execute();
        my ($value) = $sth->fetchrow_array();
        print "$label: " . ($value // '') . "\n";   # empty tables yield NULL
    }
    print "\n";
}
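# Usage sketch (the db path below is illustrative -- point it at any of the
# dumps, the same way datasets.sh invokes this script):
#
#   perl datasets_info.pl /root/gekko/binance-btc/history/binance_0.1.db
#
# For each candle table it prints the first and last candle time, candle
# count, average price (vwp), lowest and highest price, trade count and
# total volume.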
print "highest price: @row2\n"; 53 | } 54 | my $stmt = qq(SELECT sum(trades) FROM `main`.@row;); 55 | my $sth = $dbh->prepare( $stmt ); 56 | my $rv = $sth->execute(); 57 | while (@row2 = $sth->fetchrow_array()) { 58 | print "trades: @row2\n"; 59 | } 60 | my $stmt = qq(SELECT sum(volume) FROM `main`.@row;); 61 | my $sth = $dbh->prepare( $stmt ); 62 | my $rv = $sth->execute(); 63 | while (@row2 = $sth->fetchrow_array()) { 64 | print "volume: @row2\n\n"; 65 | } 66 | } 67 | -------------------------------------------------------------------------------- /pairs.conf: -------------------------------------------------------------------------------- 1 | binance-bnb 2 | binance-btc 3 | binance-eth 4 | binance-usdt 5 | bitfinex-eth 6 | bitfinex-eur 7 | bitfinex-gbp 8 | bitfinex-jpy 9 | gdax-eur 10 | gdax-gbp 11 | kraken-cad 12 | kraken-eth 13 | kraken-eur 14 | kraken-gbp 15 | kraken-jpy 16 | kraken-usd 17 | kraken-xbt 18 | poloniex-eth 19 | poloniex-usdt 20 | poloniex-xmr --------------------------------------------------------------------------------