Merge branch 'develop' into db_keep_orders

Matthias 2020-09-01 07:51:16 +02:00
commit d6d3a02a23
56 changed files with 1038 additions and 423 deletions


@ -123,7 +123,6 @@ Telegram is not mandatory. However, this is a great way to control your bot. Mor
- `/help`: Show help message
- `/version`: Show version

## Development branches

The project is currently set up in two main branches:


@ -5,6 +5,9 @@ This page explains the different parameters of the bot and how to run it.
!!! Note
    If you've used `setup.sh`, don't forget to activate your virtual environment (`source .env/bin/activate`) before running freqtrade commands.

!!! Warning "Up-to-date clock"
    The clock on the system running the bot must be accurate, synchronized to an NTP server frequently enough to avoid problems with communication to the exchanges.
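For example, on most systemd-based Linux hosts you can check and enable automatic time synchronization like this (the exact tooling is an assumption and depends on your distribution):

``` bash
# show the current clock and NTP synchronization status
timedatectl status
# enable automatic time synchronization via systemd-timesyncd
sudo timedatectl set-ntp true
```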
## Bot commands

```


@ -15,61 +15,91 @@ Otherwise `--exchange` becomes mandatory.
### Usage

```
usage: freqtrade download-data [-h] [-v] [--logfile FILE] [-V] [-c PATH]
                               [-d PATH] [--userdir PATH]
                               [-p PAIRS [PAIRS ...]] [--pairs-file FILE]
                               [--days INT] [--dl-trades]
                               [--exchange EXCHANGE]
                               [-t {1m,3m,5m,15m,30m,1h,2h,4h,6h,8h,12h,1d,3d,1w} [{1m,3m,5m,15m,30m,1h,2h,4h,6h,8h,12h,1d,3d,1w} ...]]
                               [--erase]
                               [--data-format-ohlcv {json,jsongz,hdf5}]
                               [--data-format-trades {json,jsongz,hdf5}]

optional arguments:
  -h, --help            show this help message and exit
  -p PAIRS [PAIRS ...], --pairs PAIRS [PAIRS ...]
                        Show profits for only these pairs. Pairs are space-
                        separated.
  --pairs-file FILE     File containing a list of pairs to download.
  --days INT            Download data for given number of days.
  --dl-trades           Download trades instead of OHLCV data. The bot will
                        resample trades to the desired timeframe as specified
                        as --timeframes/-t.
  --exchange EXCHANGE   Exchange name (default: `bittrex`). Only valid if no
                        config is provided.
  -t {1m,3m,5m,15m,30m,1h,2h,4h,6h,8h,12h,1d,3d,1w} [{1m,3m,5m,15m,30m,1h,2h,4h,6h,8h,12h,1d,3d,1w} ...], --timeframes {1m,3m,5m,15m,30m,1h,2h,4h,6h,8h,12h,1d,3d,1w} [{1m,3m,5m,15m,30m,1h,2h,4h,6h,8h,12h,1d,3d,1w} ...]
                        Specify which tickers to download. Space-separated
                        list. Default: `1m 5m`.
  --erase               Clean all existing data for the selected
                        exchange/pairs/timeframes.
  --data-format-ohlcv {json,jsongz,hdf5}
                        Storage format for downloaded candle (OHLCV) data.
                        (default: `json`).
  --data-format-trades {json,jsongz,hdf5}
                        Storage format for downloaded trades data. (default:
                        `jsongz`).

Common arguments:
  -v, --verbose         Verbose mode (-vv for more, -vvv to get all messages).
  --logfile FILE        Log to the file specified. Special values are:
                        'syslog', 'journald'. See the documentation for more
                        details.
  -V, --version         show program's version number and exit
  -c PATH, --config PATH
                        Specify configuration file (default:
                        `userdir/config.json` or `config.json` whichever
                        exists). Multiple --config options may be used. Can be
                        set to `-` to read config from stdin.
  -d PATH, --datadir PATH
                        Path to directory with historical backtesting data.
  --userdir PATH, --user-data-dir PATH
                        Path to userdata directory.
```
### Data format

Freqtrade currently supports 3 data-formats for both OHLCV and trades data:

* `json` (plain "text" json files)
* `jsongz` (a gzip-zipped version of json files)
* `hdf5` (a high performance datastore)

By default, OHLCV data is stored as `json` data, while trades data is stored as `jsongz` data.

This can be changed via the `--data-format-ohlcv` and `--data-format-trades` command line arguments respectively.
To persist this change, you should also add the following snippet to your configuration, so you don't have to insert the above arguments each time:

``` jsonc
    // ...
    "dataformat_ohlcv": "hdf5",
    "dataformat_trades": "hdf5",
    // ...
```

If the default data-format has been changed during download, then the keys `dataformat_ohlcv` and `dataformat_trades` in the configuration file need to be adjusted to the selected data-format as well.

!!! Note
    You can convert between data-formats using the [convert-data](#sub-command-convert-data) and [convert-trade-data](#sub-command-convert-trade-data) methods.
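As a rough example, a download run that stores candles in the `hdf5` format could look like this (exchange and pairs are placeholders):

``` bash
freqtrade download-data --exchange binance -p ETH/BTC XRP/BTC -t 5m 1h --data-format-ohlcv hdf5
```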
#### Sub-command convert data

```
usage: freqtrade convert-data [-h] [-v] [--logfile FILE] [-V] [-c PATH]
                              [-d PATH] [--userdir PATH]
                              [-p PAIRS [PAIRS ...]] --format-from
                              {json,jsongz,hdf5} --format-to
                              {json,jsongz,hdf5} [--erase]
                              [-t {1m,3m,5m,15m,30m,1h,2h,4h,6h,8h,12h,1d,3d,1w} [{1m,3m,5m,15m,30m,1h,2h,4h,6h,8h,12h,1d,3d,1w} ...]]

optional arguments:
@ -77,9 +107,9 @@ optional arguments:
  -p PAIRS [PAIRS ...], --pairs PAIRS [PAIRS ...]
                        Show profits for only these pairs. Pairs are space-
                        separated.
  --format-from {json,jsongz,hdf5}
                        Source format for data conversion.
  --format-to {json,jsongz,hdf5}
                        Destination format for data conversion.
  --erase               Clean all existing data for the selected
                        exchange/pairs/timeframes.
@ -94,9 +124,10 @@ Common arguments:
                        details.
  -V, --version         show program's version number and exit
  -c PATH, --config PATH
                        Specify configuration file (default:
                        `userdir/config.json` or `config.json` whichever
                        exists). Multiple --config options may be used. Can be
                        set to `-` to read config from stdin.
  -d PATH, --datadir PATH
                        Path to directory with historical backtesting data.
  --userdir PATH, --user-data-dir PATH
@ -112,23 +143,23 @@ It'll also remove original json data files (`--erase` parameter).
freqtrade convert-data --format-from json --format-to jsongz --datadir ~/.freqtrade/data/binance -t 5m 15m --erase
```

#### Sub-command convert trade data

```
usage: freqtrade convert-trade-data [-h] [-v] [--logfile FILE] [-V] [-c PATH]
                                    [-d PATH] [--userdir PATH]
                                    [-p PAIRS [PAIRS ...]] --format-from
                                    {json,jsongz,hdf5} --format-to
                                    {json,jsongz,hdf5} [--erase]

optional arguments:
  -h, --help            show this help message and exit
  -p PAIRS [PAIRS ...], --pairs PAIRS [PAIRS ...]
                        Show profits for only these pairs. Pairs are space-
                        separated.
  --format-from {json,jsongz,hdf5}
                        Source format for data conversion.
  --format-to {json,jsongz,hdf5}
                        Destination format for data conversion.
  --erase               Clean all existing data for the selected
                        exchange/pairs/timeframes.
@ -140,13 +171,15 @@ Common arguments:
                        details.
  -V, --version         show program's version number and exit
  -c PATH, --config PATH
                        Specify configuration file (default:
                        `userdir/config.json` or `config.json` whichever
                        exists). Multiple --config options may be used. Can be
                        set to `-` to read config from stdin.
  -d PATH, --datadir PATH
                        Path to directory with historical backtesting data.
  --userdir PATH, --user-data-dir PATH
                        Path to userdata directory.
```

##### Example converting trades
@ -158,21 +191,21 @@ It'll also remove original jsongz data files (`--erase` parameter).
freqtrade convert-trade-data --format-from jsongz --format-to json --datadir ~/.freqtrade/data/kraken --erase
```

### Sub-command list-data

You can get a list of downloaded data using the `list-data` sub-command.

```
usage: freqtrade list-data [-h] [-v] [--logfile FILE] [-V] [-c PATH] [-d PATH]
                           [--userdir PATH] [--exchange EXCHANGE]
                           [--data-format-ohlcv {json,jsongz,hdf5}]
                           [-p PAIRS [PAIRS ...]]

optional arguments:
  -h, --help            show this help message and exit
  --exchange EXCHANGE   Exchange name (default: `bittrex`). Only valid if no
                        config is provided.
  --data-format-ohlcv {json,jsongz,hdf5}
                        Storage format for downloaded candle (OHLCV) data.
                        (default: `json`).
  -p PAIRS [PAIRS ...], --pairs PAIRS [PAIRS ...]
@ -194,6 +227,7 @@ Common arguments:
                        Path to directory with historical backtesting data.
  --userdir PATH, --user-data-dir PATH
                        Path to userdata directory.
```

#### Example list-data
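The full example output is not part of this hunk; a minimal invocation, assuming a Binance data directory, might be:

``` bash
freqtrade list-data --exchange binance
```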
@ -249,7 +283,7 @@ This will download historical candle (OHLCV) data for all the currency pairs you
### Other Notes

- To use a different directory than the exchange specific default, use `--datadir user_data/data/some_directory`.
- To change the exchange used to download the historical data from, please use a different configuration file (you'll probably need to adjust rate limits etc.).
- To use `pairs.json` from some other directory, use `--pairs-file some_other_dir/pairs.json`.
- To download historical candle (OHLCV) data for only 10 days, use `--days 10` (defaults to 30 days).
- Use `--timeframes` to specify which timeframes to download the historical candle (OHLCV) data for. The default is `--timeframes 1m 5m`, which will download 1-minute and 5-minute data (see the combined example below).
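For instance, combining several of the notes above (exchange, pairs, day count and directory are placeholders):

``` bash
freqtrade download-data --exchange binance --pairs ETH/BTC XRP/BTC --days 10 -t 1m 5m 1h --datadir user_data/data/some_directory
```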
@ -257,7 +291,7 @@ This will download historical candle (OHLCV) data for all the currency pairs you
### Trades (tick) data

By default, the `download-data` sub-command downloads Candles (OHLCV) data. Some exchanges also provide historic trade-data via their API.
This data can be useful if you need many different timeframes, since it is only downloaded once, and then resampled locally to the desired timeframes.

Since this data is rather large, the files are gzip-compressed by default. They are stored in your data-directory with the naming convention of `<pair>-trades.json.gz` (`ETH_BTC-trades.json.gz`). Incremental mode is also supported, as for historic OHLCV data, so downloading the data once per week with `--days 8` will create an incremental data-repository.
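A sketch of such an incremental trades download, assuming Kraken as the exchange, could be:

``` bash
# run roughly once per week; --days 8 gives a day of overlap for the incremental update
freqtrade download-data --exchange kraken --pairs ETH/BTC --days 8 --dl-trades
```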


@ -9,21 +9,20 @@ and are no longer supported. Please avoid their usage in your configuration.
### the `--refresh-pairs-cached` command line option

`--refresh-pairs-cached` in the context of backtesting, hyperopt and edge allowed refreshing candle data for backtesting.
Since this leads to much confusion, and slows down backtesting (while not being part of backtesting), this has been singled out as a separate freqtrade sub-command `freqtrade download-data`.

This command line option was deprecated in 2019.7-dev (develop branch) and removed in 2019.9.

### The **--dynamic-whitelist** command line option

This command line option was deprecated in 2018 and removed in freqtrade 2019.6-dev (develop branch) and in freqtrade 2019.7.

### the `--live` command line option

`--live` in the context of backtesting allowed downloading the latest tick data for backtesting.
It only downloaded the latest 500 candles, so it was ineffective in getting good backtest data.
Removed in 2019.7-dev (develop branch) and in freqtrade 2019.8.

### Allow running multiple pairlists in sequence
@ -31,6 +30,6 @@ The former `"pairlist"` section in the configuration has been removed, and is re
The old section of configuration parameters (`"pairlist"`) has been deprecated in 2019.11 and has been removed in 2020.4.

### deprecation of bidVolume and askVolume from volume-pairlist

Since only quoteVolume can be compared between assets, the other options (bidVolume, askVolume) have been deprecated in 2020.4.


@ -52,6 +52,7 @@ The fastest and easiest way to start up is to use docker-compose.develop which g
* [docker-compose](https://docs.docker.com/compose/install/)

#### Starting the bot

##### Use the develop dockerfile

``` bash
@ -74,7 +75,7 @@ docker-compose up
docker-compose build
```

##### Executing (effectively SSH into the container)

The `exec` command requires that the container is already running. If you want to start it, that can be done with `docker-compose up` or `docker-compose run freqtrade_develop`.
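A typical way to get a shell inside the running development container might be (service name taken from the compose setup mentioned above; adjust if yours differs):

``` bash
docker-compose exec freqtrade_develop /bin/bash
```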
@ -129,7 +130,7 @@ First of all, have a look at the [VolumePairList](https://github.com/freqtrade/f
This is a simple Handler, which however serves as a good example on how to start developing.

Next, modify the class-name of the Handler (ideally align this with the module filename).

The base-class provides an instance of the exchange (`self._exchange`), the pairlist manager (`self._pairlistmanager`), as well as the main configuration (`self._config`), the pairlist dedicated configuration (`self._pairlistconfig`) and the absolute position within the list of pairlists.
@ -149,7 +150,7 @@ Configuration for the chain of Pairlist Handlers is done in the bot configuratio
By convention, `"number_assets"` is used to specify the maximum number of pairs to keep in the pairlist. Please follow this to ensure a consistent user experience.

Additional parameters can be configured as needed. For instance, `VolumePairList` uses `"sort_key"` to specify the sorting value - however feel free to specify whatever is necessary for your great algorithm to be successful and dynamic.

#### short_desc
@ -165,7 +166,7 @@ This is called with each iteration of the bot (only if the Pairlist Handler is a
It must return the resulting pairlist (which may then be passed into the chain of Pairlist Handlers).

Validations are optional, the parent class exposes a `_verify_blacklist(pairlist)` and `_whitelist_for_active_markets(pairlist)` to do default filtering. Use this if you limit your result to a certain number of pairs - so the end-result is not shorter than expected.

#### filter_pairlist
@ -173,13 +174,13 @@ This method is called for each Pairlist Handler in the chain by the pairlist man
This is called with each iteration of the bot - so consider implementing caching for compute/network heavy calculations.

It gets passed a pairlist (which can be the result of previous pairlists) as well as `tickers`, a pre-fetched version of `get_tickers()`.

The default implementation in the base class simply calls the `_validate_pair()` method for each pair in the pairlist, but you may override it. So you should either implement the `_validate_pair()` in your Pairlist Handler or override `filter_pairlist()` to do something else.

If overridden, it must return the resulting pairlist (which may then be passed into the next Pairlist Handler in the chain).

Validations are optional, the parent class exposes a `_verify_blacklist(pairlist)` and `_whitelist_for_active_markets(pairlist)` to do default filters. Use this if you limit your result to a certain number of pairs - so the end result is not shorter than expected.

In `VolumePairList`, this implements different methods of sorting, and does early validation so only the expected number of pairs is returned.
@ -203,7 +204,7 @@ Most exchanges supported by CCXT should work out of the box.
Check if the new exchange supports Stoploss on Exchange orders through their API.

Since CCXT does not provide unification for Stoploss On Exchange yet, we'll need to implement the exchange-specific parameters ourselves. Best look at `binance.py` for an example implementation of this. You'll need to dig through the documentation of the Exchange's API on how exactly this can be done. [CCXT Issues](https://github.com/ccxt/ccxt/issues) may also provide great help, since others may have implemented something similar for their projects.

### Incomplete candles
@ -276,6 +277,7 @@ git checkout -b new_release <commitid>
Determine if crucial bugfixes have been made between this commit and the current state, and cherry-pick these if needed.

* Merge the release branch (master) into this branch.
* Edit `freqtrade/__init__.py` and add the version matching the current date (for example `2019.7` for July 2019). Minor versions can be `2019.7.1` should we need to do a second release that month. Version numbers must follow allowed versions from PEP0440 to avoid failures pushing to pypi.
* Commit this part
* Push that branch to the remote and create a PR against the master branch
@ -283,14 +285,14 @@ Determine if crucial bugfixes have been made between this commit and the current
### Create changelog from git commits

!!! Note
    Make sure that the master branch is up-to-date!

``` bash
# Needs to be done before merging / pulling that branch.
git log --oneline --no-decorate --no-merges master..new_release
```

To keep the release-log short, best wrap the full git changelog into a collapsible details section.

```markdown
<details>
@ -314,6 +316,9 @@ Once the PR against master is merged (best right after merging):
### pypi

!!! Note
    This process is now automated as part of Github Actions.

To create a pypi release, please run the following commands:

Additional requirement: `wheel`, `twine` (for uploading), account on pypi with proper permissions.
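The exact commands are not shown in this hunk; a conventional build-and-upload sequence with `wheel` and `twine` (an assumption, not necessarily the project's exact release script) looks like:

``` bash
# build source and wheel distributions
python setup.py sdist bdist_wheel
# upload to pypi (requires a pypi account with proper permissions)
twine upload dist/*
```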


@ -12,6 +12,9 @@ Optionally, [docker-compose](https://docs.docker.com/compose/install/) should be
Once you have Docker installed, simply prepare the config file (e.g. `config.json`) and run the image for `freqtrade` as explained below.

!!! Warning "Up-to-date clock"
    The clock on the system running the bot must be accurate, synchronized to an NTP server frequently enough to avoid problems with communication to the exchanges.

## Freqtrade with docker-compose

Freqtrade provides an official Docker image on [Dockerhub](https://hub.docker.com/r/freqtradeorg/freqtrade/), as well as a [docker-compose file](https://github.com/freqtrade/freqtrade/blob/develop/docker-compose.yml) ready for usage.
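A minimal way to try the official image, assuming the compose file is used unmodified and fetched via the standard raw-GitHub URL, is:

``` bash
# fetch the example compose file and start the bot in the background
curl -o docker-compose.yml https://raw.githubusercontent.com/freqtrade/freqtrade/develop/docker-compose.yml
docker-compose up -d
```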


@ -37,13 +37,9 @@ Freqtrade is a crypto-currency algorithmic trading software developed in python
## Requirements

### Hardware requirements

To run this bot we recommend a Linux cloud instance with a minimum of:

- 2GB RAM
- 1GB disk space


@ -18,6 +18,9 @@ Click each one for install guide:
We also recommend setting up a [Telegram bot](telegram-usage.md#setup-your-telegram-bot), which is optional but useful for controlling the bot.

!!! Warning "Up-to-date clock"
    The clock on the system running the bot must be accurate, synchronized to an NTP server frequently enough to avoid problems with communication to the exchanges.

## Quick start

Freqtrade provides the Linux/MacOS Easy Installation script to install all dependencies and help you configure the bot.


@ -1,2 +1,2 @@
mkdocs-material==5.5.11
mdx_truly_sane_lists==1.2


@ -116,6 +116,7 @@ python3 scripts/rest_client.py --config rest_config.json <command> [optional par
| `trades` | List last trades.
| `delete_trade <trade_id>` | Remove trade from the database. Tries to close open orders. Requires manual handling of this trade on the exchange.
| `show_config` | Shows part of the current configuration with relevant settings to operation
| `logs` | Shows last log messages
| `status` | Lists all open trades
| `count` | Displays number of trades used and available
| `profit` | Display a summary of your profit/loss from close trades and some stats about your performance
@ -138,78 +139,83 @@ python3 scripts/rest_client.py help
``` output
Possible commands:

balance
    Get the account balance.

blacklist
    Show the current blacklist.

    :param add: List of coins to add (example: "BNB/BTC")

count
    Return the amount of open trades.

daily
    Return the amount of open trades.

delete_trade
    Delete trade from the database.
    Tries to close open orders. Requires manual handling of this asset on the exchange.

    :param trade_id: Deletes the trade with this ID from the database.

edge
    Return information about edge.

forcebuy
    Buy an asset.

    :param pair: Pair to buy (ETH/BTC)
    :param price: Optional - price to buy

forcesell
    Force-sell a trade.

    :param tradeid: Id of the trade (can be received via status command)

logs
    Show latest logs.

    :param limit: Limits log messages to the last <limit> logs. No limit to get all the trades.

performance
    Return the performance of the different coins.

profit
    Return the profit summary.

reload_config
    Reload configuration.

show_config
    Returns part of the configuration, relevant for trading operations.

start
    Start the bot if it's in the stopped state.

status
    Get the status of open trades.

stop
    Stop the bot. Use `start` to restart.

stopbuy
    Stop buying (but handle sells gracefully). Use `reload_config` to reset.

trades
    Return trades history.

    :param limit: Limits trades to the X last trades. No limit to get all the trades.

version
    Return the version of the bot.

whitelist
    Show the current whitelist.
```
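For example, the new `logs` command can be called through the helper script like this (config filename as used earlier in this document):

``` bash
python3 scripts/rest_client.py --config rest_config.json logs 10
```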
## Advanced API usage using JWT tokens


@ -1,104 +1,59 @@
# Sandbox API testing

Some exchanges provide sandboxes or testbeds for risk-free testing, while running the bot against a real exchange.
With some configuration, freqtrade (in combination with ccxt) provides access to these.

This document is an overview of how to configure Freqtrade to be used with sandboxes.
This can be useful to developers and traders alike.

## Exchanges known to have a sandbox / testnet

* [binance](https://testnet.binance.vision/)
* [coinbasepro](https://public.sandbox.pro.coinbase.com)
* [gemini](https://exchange.sandbox.gemini.com/)
* [huobipro](https://www.testnet.huobi.pro/)
* [kucoin](https://sandbox.kucoin.com/)
* [phemex](https://testnet.phemex.com/)

!!! Note
    We did not test correct functioning of all of the above testnets. Please report your experiences with each sandbox.
---

## Configure a Sandbox account

When testing your API connectivity, make sure to use the appropriate sandbox / testnet URL.

In general, you should follow these steps to enable an exchange's sandbox:

* Figure out if an exchange has a sandbox (most likely by using google or the exchange's support documents)
* Create a sandbox account (often the sandbox-account requires separate registration)
* [Add some test assets to account](#add-test-funds)
* Create API keys

### Add test funds

Usually, sandbox exchanges allow depositing funds directly via web-interface.
You should make sure to have a realistic amount of funds available to your test-account, so results are representative of your real account funds.

!!! Warning
    Test exchanges will **NEVER** require your real credit card or banking details!

## Configure freqtrade to use an exchange's sandbox

### Sandbox URLs
Freqtrade makes use of CCXT, which in turn provides a list of URLs to Freqtrade.
These include `['test']` and `['api']`.

* `[Test]`, if available, will point to an exchange's sandbox.
* `[Api]` is normally used, and resolves to the live API target on the exchange.

To make use of the sandbox / test, add `"sandbox": true` to your config.json:

```json
  "exchange": {
        "name": "coinbasepro",
        "sandbox": true,
        "key": "5wowfxemogxeowo;heiohgmd",
        "secret": "/ZMH1P62rCVmwefewrgcewX8nh4gob+lywxfwfxwwfxwfNsH1ySgvWCUR/w==",
@ -106,36 +61,57 @@ To make use of sandbox / test add "sandbox": true, to your config.json
"outdated_offset": 5 "outdated_offset": 5
"pair_whitelist": [ "pair_whitelist": [
"BTC/USD" "BTC/USD"
]
},
"datadir": "user_data/data/coinbasepro_sandbox"
``` ```
Also insert your Also the following information:
- api-key (noted earlier) * api-key (created for the sandbox webpage)
- api-secret (noted earlier) * api-secret (noted earlier)
- password (the passphrase - noted earlier) * password (the passphrase - noted earlier)
!!! Tip "Different data directory"
We also recommend to set `datadir` to something identifying downloaded data as sandbox data, to avoid having sandbox data mixed with data from the real exchange.
This can be done by adding the `"datadir"` key to the configuration.
Now, whenever you use this configuration, your data directory will be set to this directory.
--- ---
## You should now be ready to test your sandbox ## You should now be ready to test your sandbox
Ensure Freqtrade logs show the sandbox URL, and trades made are shown in sandbox. Ensure Freqtrade logs show the sandbox URL, and trades made are shown in sandbox. Also make sure to select a pair which shows at least some decent value (which very often is BTC/<somestablecoin>).
** Typically the BTC/USD has the most activity in sandbox to test against.
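Starting the bot against the sandbox then works like any other run; for example (config filename is a placeholder):

``` bash
freqtrade trade --config config_sandbox.json
```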
## Common problems with sandbox exchanges

Sandbox exchange instances often have very low volume, which can cause some problems which usually are not seen on a real exchange instance.

### Old Candles problem

Since sandboxes often have low volume, candles can be quite old and show no volume.
To disable the error "Outdated history for pair ...", best increase the parameter `"outdated_offset"` to a number that seems realistic for the sandbox you're using.

### Unfilled orders

Sandboxes often have very low volumes - which means that many trades can go unfilled, or can go unfilled for a very long time.

To mitigate this, you can try to match the first order on the opposite orderbook side using the following configuration:

``` jsonc
  "order_types": {
    "buy": "limit",
    "sell": "limit"
    // ...
  },
  "bid_strategy": {
    "price_side": "ask",
    // ...
  },
  "ask_strategy": {
    "price_side": "bid",
    // ...
  },
```

The configuration is similar to the suggested configuration for market orders - however by using limit orders you can avoid moving the price too much, and you can set the worst price you might get.


@ -54,6 +54,7 @@ official commands. You can ask at any moment for help with `/help`.
| `/stopbuy` | Stops the trader from opening new trades. Gracefully closes open trades according to their rules.
| `/reload_config` | Reloads the configuration file
| `/show_config` | Shows part of the current configuration with relevant settings to operation
| `/logs [limit]` | Show last log messages.
| `/status` | Lists all open trades
| `/status table` | List all open trades in a table format. Pending buy orders are marked with an asterisk (*) Pending sell orders are marked with a double asterisk (**)
| `/trades [limit]` | List all recently closed trades in a table format.


@ -15,7 +15,7 @@ ARGS_STRATEGY = ["strategy", "strategy_path"]
ARGS_TRADE = ["db_url", "sd_notify", "dry_run"] ARGS_TRADE = ["db_url", "sd_notify", "dry_run"]
ARGS_COMMON_OPTIMIZE = ["timeframe", "timerange", ARGS_COMMON_OPTIMIZE = ["timeframe", "timerange", "dataformat_ohlcv",
"max_open_trades", "stake_amount", "fee"] "max_open_trades", "stake_amount", "fee"]
ARGS_BACKTEST = ARGS_COMMON_OPTIMIZE + ["position_stacking", "use_max_market_positions", ARGS_BACKTEST = ARGS_COMMON_OPTIMIZE + ["position_stacking", "use_max_market_positions",


@ -35,8 +35,8 @@ def start_download_data(args: Dict[str, Any]) -> None:
"Downloading data requires a list of pairs. " "Downloading data requires a list of pairs. "
"Please check the documentation on how to configure this.") "Please check the documentation on how to configure this.")
logger.info(f'About to download pairs: {config["pairs"]}, ' logger.info(f"About to download pairs: {config['pairs']}, "
f'intervals: {config["timeframes"]} to {config["datadir"]}') f"intervals: {config['timeframes']} to {config['datadir']}")
pairs_not_available: List[str] = [] pairs_not_available: List[str] = []
@ -51,21 +51,21 @@ def start_download_data(args: Dict[str, Any]) -> None:
        if config.get('download_trades'):
            pairs_not_available = refresh_backtest_trades_data(
                exchange, pairs=config['pairs'], datadir=config['datadir'],
                timerange=timerange, erase=bool(config.get('erase')),
                data_format=config['dataformat_trades'])

            # Convert downloaded trade data to different timeframes
            convert_trades_to_ohlcv(
                pairs=config['pairs'], timeframes=config['timeframes'],
                datadir=config['datadir'], timerange=timerange, erase=bool(config.get('erase')),
                data_format_ohlcv=config['dataformat_ohlcv'],
                data_format_trades=config['dataformat_trades'],
            )
        else:
            pairs_not_available = refresh_backtest_ohlcv_data(
                exchange, pairs=config['pairs'], timeframes=config['timeframes'],
                datadir=config['datadir'], timerange=timerange, erase=bool(config.get('erase')),
                data_format=config['dataformat_ohlcv'])

    except KeyboardInterrupt:


@ -75,7 +75,7 @@ def start_new_strategy(args: Dict[str, Any]) -> None:
if args["strategy"] == "DefaultStrategy": if args["strategy"] == "DefaultStrategy":
raise OperationalException("DefaultStrategy is not allowed as name.") raise OperationalException("DefaultStrategy is not allowed as name.")
new_path = config['user_data_dir'] / USERPATH_STRATEGIES / (args["strategy"] + ".py") new_path = config['user_data_dir'] / USERPATH_STRATEGIES / (args['strategy'] + '.py')
if new_path.exists(): if new_path.exists():
raise OperationalException(f"`{new_path}` already exists. " raise OperationalException(f"`{new_path}` already exists. "
@ -125,11 +125,11 @@ def start_new_hyperopt(args: Dict[str, Any]) -> None:
    config = setup_utils_configuration(args, RunMode.UTIL_NO_EXCHANGE)

    if 'hyperopt' in args and args['hyperopt']:
        if args['hyperopt'] == 'DefaultHyperopt':
            raise OperationalException("DefaultHyperopt is not allowed as name.")

        new_path = config['user_data_dir'] / USERPATH_HYPEROPTS / (args['hyperopt'] + '.py')

        if new_path.exists():
            raise OperationalException(f"`{new_path}` already exists. "


@ -54,7 +54,7 @@ class Configuration:
        :param files: List of file paths
        :return: configuration dictionary
        """
        c = Configuration({'config': files}, RunMode.OTHER)
        return c.get_config()

    def load_from_files(self, files: List[str]) -> Dict[str, Any]:
@ -123,10 +123,10 @@ class Configuration:
        the -v/--verbose, --logfile options
        """
        # Log level
        config.update({'verbosity': self.args.get('verbosity', 0)})

        if 'logfile' in self.args and self.args['logfile']:
            config.update({'logfile': self.args['logfile']})

        setup_logging(config)
@ -149,22 +149,22 @@ class Configuration:
    def _process_common_options(self, config: Dict[str, Any]) -> None:

        # Set strategy if not specified in config and or if it's non default
        if self.args.get('strategy') or not config.get('strategy'):
            config.update({'strategy': self.args.get('strategy')})

        self._args_to_config(config, argname='strategy_path',
                             logstring='Using additional Strategy lookup path: {}')

        if ('db_url' in self.args and self.args['db_url'] and
                self.args['db_url'] != constants.DEFAULT_DB_PROD_URL):
            config.update({'db_url': self.args['db_url']})
            logger.info('Parameter --db-url detected ...')

        if config.get('forcebuy_enable', False):
            logger.warning('`forcebuy` RPC message enabled.')

        # Support for sd_notify
        if 'sd_notify' in self.args and self.args['sd_notify']:
            config['internals'].update({'sd_notify': True})

    def _process_datadir_options(self, config: Dict[str, Any]) -> None:
@ -173,24 +173,24 @@ class Configuration:
        --user-data, --datadir
        """
        # Check exchange parameter here - otherwise `datadir` might be wrong.
        if 'exchange' in self.args and self.args['exchange']:
            config['exchange']['name'] = self.args['exchange']
            logger.info(f"Using exchange {config['exchange']['name']}")

        if 'pair_whitelist' not in config['exchange']:
            config['exchange']['pair_whitelist'] = []

        if 'user_data_dir' in self.args and self.args['user_data_dir']:
            config.update({'user_data_dir': self.args['user_data_dir']})
        elif 'user_data_dir' not in config:
            # Default to cwd/user_data (legacy option ...)
            config.update({'user_data_dir': str(Path.cwd() / 'user_data')})

        # reset to user_data_dir so this contains the absolute path.
        config['user_data_dir'] = create_userdata_dir(config['user_data_dir'], create_dir=False)
        logger.info('Using user-data directory: %s ...', config['user_data_dir'])

        config.update({'datadir': create_datadir(config, self.args.get('datadir', None))})
        logger.info('Using data directory: %s ...', config.get('datadir'))

        if self.args.get('exportfilename'):
@ -219,8 +219,8 @@ class Configuration:
            config.update({'use_max_market_positions': False})
            logger.info('Parameter --disable-max-market-positions detected ...')
            logger.info('max_open_trades set to unlimited ...')
        elif 'max_open_trades' in self.args and self.args['max_open_trades']:
            config.update({'max_open_trades': self.args['max_open_trades']})
            logger.info('Parameter --max-open-trades detected, '
                        'overriding max_open_trades to: %s ...', config.get('max_open_trades'))
        elif config['runmode'] in NON_UTIL_MODES:
@ -447,12 +447,12 @@ class Configuration:
            config['pairs'].sort()
            return

        if 'config' in self.args and self.args['config']:
            logger.info("Using pairlist from configuration.")
            config['pairs'] = config.get('exchange', {}).get('pair_whitelist')
        else:
            # Fall back to /dl_path/pairs.json
            pairs_file = config['datadir'] / 'pairs.json'
            if pairs_file.exists():
                with pairs_file.open('r') as f:
                    config['pairs'] = json_load(f)


@ -24,7 +24,7 @@ ORDERTIF_POSSIBILITIES = ['gtc', 'fok', 'ioc']
AVAILABLE_PAIRLISTS = ['StaticPairList', 'VolumePairList',
                       'AgeFilter', 'PrecisionFilter', 'PriceFilter',
                       'ShuffleFilter', 'SpreadFilter']
AVAILABLE_DATAHANDLERS = ['json', 'jsongz', 'hdf5']
DRY_RUN_WALLET = 1000
DATETIME_PRINT_FORMAT = '%Y-%m-%d %H:%M:%S'
MATH_CLOSE_PREC = 1e-14  # Precision used for float comparisons


@ -208,7 +208,7 @@ def load_trades_from_db(db_url: str, strategy: Optional[str] = None) -> pd.DataF
def load_trades(source: str, db_url: str, exportfilename: Path,
                no_trades: bool = False, strategy: Optional[str] = None) -> pd.DataFrame:
    """
    Based on configuration option 'trade_source':
    * loads data from DB (using `db_url`)
    * loads data from backtestfile (using `exportfilename`)
    :param source: "DB" or "file" - specify source to load from


@ -255,6 +255,7 @@ def convert_ohlcv_format(config: Dict[str, Any], convert_from: str, convert_to:
drop_incomplete=False, drop_incomplete=False,
startup_candles=0) startup_candles=0)
logger.info(f"Converting {len(data)} candles for {pair}") logger.info(f"Converting {len(data)} candles for {pair}")
if len(data) > 0:
trg.ohlcv_store(pair=pair, timeframe=timeframe, data=data) trg.ohlcv_store(pair=pair, timeframe=timeframe, data=data)
if erase and convert_from != convert_to: if erase and convert_from != convert_to:
logger.info(f"Deleting source data for {pair} / {timeframe}") logger.info(f"Deleting source data for {pair} / {timeframe}")


@ -39,6 +39,12 @@ class DataProvider:
""" """
self.__cached_pairs[(pair, timeframe)] = (dataframe, Arrow.utcnow().datetime) self.__cached_pairs[(pair, timeframe)] = (dataframe, Arrow.utcnow().datetime)
def add_pairlisthandler(self, pairlists) -> None:
"""
Allow adding pairlisthandler after initialization
"""
self._pairlists = pairlists
def refresh(self, def refresh(self,
pairlist: ListPairsWithTimeframes, pairlist: ListPairsWithTimeframes,
helping_pairs: ListPairsWithTimeframes = None) -> None: helping_pairs: ListPairsWithTimeframes = None) -> None:


@ -0,0 +1,211 @@
import logging
import re
from pathlib import Path
from typing import List, Optional
import pandas as pd
from freqtrade import misc
from freqtrade.configuration import TimeRange
from freqtrade.constants import (DEFAULT_DATAFRAME_COLUMNS,
DEFAULT_TRADES_COLUMNS,
ListPairsWithTimeframes)
from .idatahandler import IDataHandler, TradeList
logger = logging.getLogger(__name__)
class HDF5DataHandler(IDataHandler):
_columns = DEFAULT_DATAFRAME_COLUMNS
@classmethod
def ohlcv_get_available_data(cls, datadir: Path) -> ListPairsWithTimeframes:
"""
Returns a list of all pairs with ohlcv data available in this datadir
:param datadir: Directory to search for ohlcv files
:return: List of Tuples of (pair, timeframe)
"""
_tmp = [re.search(r'^([a-zA-Z_]+)\-(\d+\S+)(?=.h5)', p.name)
for p in datadir.glob("*.h5")]
return [(match[1].replace('_', '/'), match[2]) for match in _tmp
if match and len(match.groups()) > 1]
@classmethod
def ohlcv_get_pairs(cls, datadir: Path, timeframe: str) -> List[str]:
"""
Returns a list of all pairs with ohlcv data available in this datadir
for the specified timeframe
:param datadir: Directory to search for ohlcv files
:param timeframe: Timeframe to search pairs for
:return: List of Pairs
"""
_tmp = [re.search(r'^(\S+)(?=\-' + timeframe + '.h5)', p.name)
for p in datadir.glob(f"*{timeframe}.h5")]
# Check if regex found something and only return these results
return [match[0].replace('_', '/') for match in _tmp if match]
def ohlcv_store(self, pair: str, timeframe: str, data: pd.DataFrame) -> None:
"""
Store data in hdf5 file.
:param pair: Pair - used to generate filename
:param timeframe: Timeframe - used to generate filename
:param data: Dataframe containing OHLCV data
:return: None
"""
key = self._pair_ohlcv_key(pair, timeframe)
_data = data.copy()
filename = self._pair_data_filename(self._datadir, pair, timeframe)
ds = pd.HDFStore(filename, mode='a', complevel=9, complib='blosc')
ds.put(key, _data.loc[:, self._columns], format='table', data_columns=['date'])
ds.close()
def _ohlcv_load(self, pair: str, timeframe: str,
timerange: Optional[TimeRange] = None) -> pd.DataFrame:
"""
Internal method used to load data for one pair from disk.
Implements the loading and conversion to a Pandas dataframe.
Timerange trimming and dataframe validation happens outside of this method.
:param pair: Pair to load data
:param timeframe: Timeframe (e.g. "5m")
:param timerange: Limit data to be loaded to this timerange.
Optionally implemented by subclasses to avoid loading
all data where possible.
:return: DataFrame with ohlcv data, or empty DataFrame
"""
key = self._pair_ohlcv_key(pair, timeframe)
filename = self._pair_data_filename(self._datadir, pair, timeframe)
if not filename.exists():
return pd.DataFrame(columns=self._columns)
where = []
if timerange:
if timerange.starttype == 'date':
where.append(f"date >= Timestamp({timerange.startts * 1e9})")
if timerange.stoptype == 'date':
where.append(f"date < Timestamp({timerange.stopts * 1e9})")
pairdata = pd.read_hdf(filename, key=key, mode="r", where=where)
if list(pairdata.columns) != self._columns:
raise ValueError("Wrong dataframe format")
pairdata = pairdata.astype(dtype={'open': 'float', 'high': 'float',
'low': 'float', 'close': 'float', 'volume': 'float'})
return pairdata
def ohlcv_purge(self, pair: str, timeframe: str) -> bool:
"""
Remove data for this pair
:param pair: Delete data for this pair.
:param timeframe: Timeframe (e.g. "5m")
:return: True when deleted, false if file did not exist.
"""
filename = self._pair_data_filename(self._datadir, pair, timeframe)
if filename.exists():
filename.unlink()
return True
return False
def ohlcv_append(self, pair: str, timeframe: str, data: pd.DataFrame) -> None:
"""
Append data to existing data structures
:param pair: Pair
:param timeframe: Timeframe this ohlcv data is for
:param data: Data to append.
"""
raise NotImplementedError()
@classmethod
def trades_get_pairs(cls, datadir: Path) -> List[str]:
"""
Returns a list of all pairs for which trade data is available in this datadir
:param datadir: Directory to search for trades files
:return: List of Pairs
"""
_tmp = [re.search(r'^(\S+)(?=\-trades.h5)', p.name)
for p in datadir.glob("*trades.h5")]
# Check if regex found something and only return these results to avoid exceptions.
return [match[0].replace('_', '/') for match in _tmp if match]
def trades_store(self, pair: str, data: TradeList) -> None:
"""
Store trades data (list of Dicts) to file
:param pair: Pair - used for filename
:param data: List of Lists containing trade data,
column sequence as in DEFAULT_TRADES_COLUMNS
"""
key = self._pair_trades_key(pair)
ds = pd.HDFStore(self._pair_trades_filename(self._datadir, pair),
mode='a', complevel=9, complib='blosc')
ds.put(key, pd.DataFrame(data, columns=DEFAULT_TRADES_COLUMNS),
format='table', data_columns=['timestamp'])
ds.close()
def trades_append(self, pair: str, data: TradeList):
"""
Append data to existing files
:param pair: Pair - used for filename
:param data: List of Lists containing trade data,
column sequence as in DEFAULT_TRADES_COLUMNS
"""
raise NotImplementedError()
def _trades_load(self, pair: str, timerange: Optional[TimeRange] = None) -> TradeList:
"""
Load a pair from h5 file.
:param pair: Load trades for this pair
:param timerange: Timerange to load trades for - currently not implemented
:return: List of trades
"""
key = self._pair_trades_key(pair)
filename = self._pair_trades_filename(self._datadir, pair)
if not filename.exists():
return []
where = []
if timerange:
if timerange.starttype == 'date':
where.append(f"timestamp >= {timerange.startts * 1e3}")
if timerange.stoptype == 'date':
where.append(f"timestamp < {timerange.stopts * 1e3}")
trades = pd.read_hdf(filename, key=key, mode="r", where=where)
return trades.values.tolist()
def trades_purge(self, pair: str) -> bool:
"""
Remove data for this pair
:param pair: Delete data for this pair.
:return: True when deleted, false if file did not exist.
"""
filename = self._pair_trades_filename(self._datadir, pair)
if filename.exists():
filename.unlink()
return True
return False
@classmethod
def _pair_ohlcv_key(cls, pair: str, timeframe: str) -> str:
return f"{pair}/ohlcv/tf_{timeframe}"
@classmethod
def _pair_trades_key(cls, pair: str) -> str:
return f"{pair}/trades"
@classmethod
def _pair_data_filename(cls, datadir: Path, pair: str, timeframe: str) -> Path:
pair_s = misc.pair_to_filename(pair)
filename = datadir.joinpath(f'{pair_s}-{timeframe}.h5')
return filename
@classmethod
def _pair_trades_filename(cls, datadir: Path, pair: str) -> Path:
pair_s = misc.pair_to_filename(pair)
filename = datadir.joinpath(f'{pair_s}-trades.h5')
return filename
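
Taken together, the handler above mirrors the JSON handlers' interface: filenames are derived from the pair and timeframe, OHLCV frames go into `<pair>-<timeframe>.h5` and trades into `<pair>-trades.h5`, and the HDFStore `where` clauses allow timerange filtering at load time. A minimal usage sketch follows; the data directory and pair are placeholders, and the files are assumed to have been written beforehand by `download-data` with `--data-format-ohlcv hdf5`:

```
from pathlib import Path

from freqtrade.data.history.hdf5datahandler import HDF5DataHandler

# Placeholder datadir; assumes ETH_BTC-5m.h5 was previously written via
# `freqtrade download-data --data-format-ohlcv hdf5`.
datadir = Path("user_data/data/binance")
dh = HDF5DataHandler(datadir)

ohlcv = dh.ohlcv_load("ETH/BTC", "5m")      # public wrapper around _ohlcv_load
if not ohlcv.empty:
    dh.ohlcv_store("ETH/NEW", "5m", ohlcv)  # writes ETH_NEW-5m.h5

# List (pair, timeframe) combinations available in this datadir.
print(HDF5DataHandler.ohlcv_get_available_data(datadir))
```

Because `ohlcv_store` uses `format='table'` with `date` as a data column, the `where` filter in `_ohlcv_load` can be pushed down to PyTables instead of loading the full dataset into memory first.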


@ -9,7 +9,8 @@ from pandas import DataFrame
from freqtrade.configuration import TimeRange from freqtrade.configuration import TimeRange
from freqtrade.constants import DEFAULT_DATAFRAME_COLUMNS from freqtrade.constants import DEFAULT_DATAFRAME_COLUMNS
from freqtrade.data.converter import (ohlcv_to_dataframe, from freqtrade.data.converter import (clean_ohlcv_dataframe,
ohlcv_to_dataframe,
trades_remove_duplicates, trades_remove_duplicates,
trades_to_ohlcv) trades_to_ohlcv)
from freqtrade.data.history.idatahandler import IDataHandler, get_datahandler from freqtrade.data.history.idatahandler import IDataHandler, get_datahandler
@ -202,7 +203,10 @@ def _download_pair_history(datadir: Path,
if data.empty: if data.empty:
data = new_dataframe data = new_dataframe
else: else:
data = data.append(new_dataframe) # Run cleaning again to ensure there were no duplicate candles
# Especially between existing and new data.
data = clean_ohlcv_dataframe(data.append(new_dataframe), timeframe, pair,
fill_missing=False, drop_incomplete=False)
logger.debug("New Start: %s", logger.debug("New Start: %s",
f"{data.iloc[0]['date']:%Y-%m-%d %H:%M:%S}" if not data.empty else 'None') f"{data.iloc[0]['date']:%Y-%m-%d %H:%M:%S}" if not data.empty else 'None')


@ -50,9 +50,7 @@ class IDataHandler(ABC):
@abstractmethod @abstractmethod
def ohlcv_store(self, pair: str, timeframe: str, data: DataFrame) -> None: def ohlcv_store(self, pair: str, timeframe: str, data: DataFrame) -> None:
""" """
Store data in json format "values". Store ohlcv data.
format looks as follows:
[[<date>,<open>,<high>,<low>,<close>]]
:param pair: Pair - used to generate filename :param pair: Pair - used to generate filename
:timeframe: Timeframe - used to generate filename :timeframe: Timeframe - used to generate filename
:data: Dataframe containing OHLCV data :data: Dataframe containing OHLCV data
@ -239,6 +237,9 @@ def get_datahandlerclass(datatype: str) -> Type[IDataHandler]:
elif datatype == 'jsongz': elif datatype == 'jsongz':
from .jsondatahandler import JsonGzDataHandler from .jsondatahandler import JsonGzDataHandler
return JsonGzDataHandler return JsonGzDataHandler
elif datatype == 'hdf5':
from .hdf5datahandler import HDF5DataHandler
return HDF5DataHandler
else: else:
raise ValueError(f"No datahandler for datatype {datatype} available.") raise ValueError(f"No datahandler for datatype {datatype} available.")


@ -20,6 +20,7 @@ BAD_EXCHANGES = {
"Details in https://github.com/freqtrade/freqtrade/issues/1983", "Details in https://github.com/freqtrade/freqtrade/issues/1983",
"hitbtc": "This API cannot be used with Freqtrade. " "hitbtc": "This API cannot be used with Freqtrade. "
"Use `hitbtc2` exchange id to access this exchange.", "Use `hitbtc2` exchange id to access this exchange.",
"phemex": "Does not provide history. ",
**dict.fromkeys([ **dict.fromkeys([
'adara', 'adara',
'anxpro', 'anxpro',


@ -86,8 +86,8 @@ class Exchange:
# Deep merge ft_has with default ft_has options # Deep merge ft_has with default ft_has options
self._ft_has = deep_merge_dicts(self._ft_has, deepcopy(self._ft_has_default)) self._ft_has = deep_merge_dicts(self._ft_has, deepcopy(self._ft_has_default))
if exchange_config.get("_ft_has_params"): if exchange_config.get('_ft_has_params'):
self._ft_has = deep_merge_dicts(exchange_config.get("_ft_has_params"), self._ft_has = deep_merge_dicts(exchange_config.get('_ft_has_params'),
self._ft_has) self._ft_has)
logger.info("Overriding exchange._ft_has with config params, result: %s", self._ft_has) logger.info("Overriding exchange._ft_has with config params, result: %s", self._ft_has)


@ -541,7 +541,9 @@ class FreqtradeBot:
""" """
logger.debug(f"create_trade for pair {pair}") logger.debug(f"create_trade for pair {pair}")
if self.strategy.is_pair_locked(pair): analyzed_df, _ = self.dataprovider.get_analyzed_dataframe(pair, self.strategy.timeframe)
if self.strategy.is_pair_locked(
pair, analyzed_df.iloc[-1]['date'] if len(analyzed_df) > 0 else None):
logger.info(f"Pair {pair} is currently locked.") logger.info(f"Pair {pair} is currently locked.")
return False return False
@ -552,7 +554,6 @@ class FreqtradeBot:
return False return False
# running get_signal on historical data fetched # running get_signal on historical data fetched
analyzed_df, _ = self.dataprovider.get_analyzed_dataframe(pair, self.strategy.timeframe)
(buy, sell) = self.strategy.get_signal(pair, self.strategy.timeframe, analyzed_df) (buy, sell) = self.strategy.get_signal(pair, self.strategy.timeframe, analyzed_df)
if buy and not sell: if buy and not sell:
@ -955,7 +956,7 @@ class FreqtradeBot:
stop_price = trade.open_rate * (1 + stoploss) stop_price = trade.open_rate * (1 + stoploss)
if self.create_stoploss_order(trade=trade, stop_price=stop_price): if self.create_stoploss_order(trade=trade, stop_price=stop_price):
trade.stoploss_last_update = datetime.now() trade.stoploss_last_update = datetime.utcnow()
return False return False
# If stoploss order is canceled for some reason we add it # If stoploss order is canceled for some reason we add it


@ -1,14 +1,18 @@
import logging import logging
import sys import sys
from logging import Formatter from logging import Formatter
from logging.handlers import RotatingFileHandler, SysLogHandler from logging.handlers import (BufferingHandler, RotatingFileHandler,
from typing import Any, Dict, List SysLogHandler)
from typing import Any, Dict
from freqtrade.exceptions import OperationalException from freqtrade.exceptions import OperationalException
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
LOGFORMAT = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
# Initialize bufferhandler - will be used for /log endpoints
bufferHandler = BufferingHandler(1000)
bufferHandler.setFormatter(Formatter(LOGFORMAT))
def _set_loggers(verbosity: int = 0, api_verbosity: str = 'info') -> None: def _set_loggers(verbosity: int = 0, api_verbosity: str = 'info') -> None:
@ -33,17 +37,31 @@ def _set_loggers(verbosity: int = 0, api_verbosity: str = 'info') -> None:
) )
def setup_logging_pre() -> None:
"""
Early setup for logging.
Uses INFO loglevel and only the Streamhandler.
Early messages (before proper logging setup) will therefore only be sent to additional
logging handlers after the real initialization, because we don't know which
ones the user desires beforehand.
"""
logging.basicConfig(
level=logging.INFO,
format=LOGFORMAT,
handlers=[logging.StreamHandler(sys.stderr), bufferHandler]
)
def setup_logging(config: Dict[str, Any]) -> None: def setup_logging(config: Dict[str, Any]) -> None:
""" """
Process -v/--verbose, --logfile options Process -v/--verbose, --logfile options
""" """
# Log level # Log level
verbosity = config['verbosity'] verbosity = config['verbosity']
logging.root.addHandler(bufferHandler)
# Log to stderr
log_handlers: List[logging.Handler] = [logging.StreamHandler(sys.stderr)]
logfile = config.get('logfile') logfile = config.get('logfile')
if logfile: if logfile:
s = logfile.split(':') s = logfile.split(':')
if s[0] == 'syslog': if s[0] == 'syslog':
@ -58,28 +76,27 @@ def setup_logging(config: Dict[str, Any]) -> None:
# to perform reduction of repeating messages if this is set in the # to perform reduction of repeating messages if this is set in the
# syslog config. The messages should be equal for this. # syslog config. The messages should be equal for this.
handler.setFormatter(Formatter('%(name)s - %(levelname)s - %(message)s')) handler.setFormatter(Formatter('%(name)s - %(levelname)s - %(message)s'))
log_handlers.append(handler) logging.root.addHandler(handler)
elif s[0] == 'journald': elif s[0] == 'journald':
try: try:
from systemd.journal import JournaldLogHandler from systemd.journal import JournaldLogHandler
except ImportError: except ImportError:
raise OperationalException("You need the systemd python package be installed in " raise OperationalException("You need the systemd python package be installed in "
"order to use logging to journald.") "order to use logging to journald.")
handler = JournaldLogHandler() handler_jd = JournaldLogHandler()
# No datetime field for logging into journald, to allow syslog # No datetime field for logging into journald, to allow syslog
# to perform reduction of repeating messages if this is set in the # to perform reduction of repeating messages if this is set in the
# syslog config. The messages should be equal for this. # syslog config. The messages should be equal for this.
handler.setFormatter(Formatter('%(name)s - %(levelname)s - %(message)s')) handler_jd.setFormatter(Formatter('%(name)s - %(levelname)s - %(message)s'))
log_handlers.append(handler) logging.root.addHandler(handler_jd)
else: else:
log_handlers.append(RotatingFileHandler(logfile, handler_rf = RotatingFileHandler(logfile,
maxBytes=1024 * 1024, # 1Mb maxBytes=1024 * 1024 * 10, # 10Mb
backupCount=10)) backupCount=10)
handler_rf.setFormatter(Formatter(LOGFORMAT))
logging.root.addHandler(handler_rf)
logging.basicConfig( logging.root.setLevel(logging.INFO if verbosity < 1 else logging.DEBUG)
level=logging.INFO if verbosity < 1 else logging.DEBUG,
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
handlers=log_handlers
)
_set_loggers(verbosity, config.get('api_server', {}).get('verbosity', 'info')) _set_loggers(verbosity, config.get('api_server', {}).get('verbosity', 'info'))
logger.info('Verbosity set to %s', verbosity) logger.info('Verbosity set to %s', verbosity)
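
The net effect of this refactor is that handlers are attached to the root logger incrementally: `setup_logging_pre()` installs a stderr handler plus the shared `bufferHandler` as early as possible, and `setup_logging()` later adds syslog, journald, or rotating-file handlers on top without recreating the basic configuration. A standalone sketch of the buffering part, independent of freqtrade's module layout (the names below are illustrative):

```
# Sketch: a capacity-bounded BufferingHandler keeps recent LogRecords in memory,
# which an RPC endpoint can later read back from handler.buffer.
import logging
from logging.handlers import BufferingHandler

fmt = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
buffer_handler = BufferingHandler(1000)            # keep up to 1000 records
buffer_handler.setFormatter(logging.Formatter(fmt))

logging.basicConfig(level=logging.INFO, format=fmt,
                    handlers=[logging.StreamHandler(), buffer_handler])

logging.getLogger("demo").info("hello from the buffer")
print([r.getMessage() for r in buffer_handler.buffer][-5:])   # last few records
```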


@ -3,18 +3,17 @@
Main Freqtrade bot script. Main Freqtrade bot script.
Read the documentation to know what cli arguments you need. Read the documentation to know what cli arguments you need.
""" """
import logging
from freqtrade.exceptions import FreqtradeException, OperationalException
import sys import sys
from typing import Any, List
# check min. python version # check min. python version
if sys.version_info < (3, 6): if sys.version_info < (3, 6):
sys.exit("Freqtrade requires Python version >= 3.6") sys.exit("Freqtrade requires Python version >= 3.6")
# flake8: noqa E402
import logging
from typing import Any, List
from freqtrade.commands import Arguments from freqtrade.commands import Arguments
from freqtrade.exceptions import FreqtradeException, OperationalException
from freqtrade.loggers import setup_logging_pre
logger = logging.getLogger('freqtrade') logger = logging.getLogger('freqtrade')
@ -28,6 +27,7 @@ def main(sysargv: List[str] = None) -> None:
return_code: Any = 1 return_code: Any = 1
try: try:
setup_logging_pre()
arguments = Arguments(sysargv) arguments = Arguments(sysargv)
args = arguments.get_parsed_arg() args = arguments.get_parsed_arg()


@ -96,6 +96,7 @@ class Backtesting:
"PrecisionFilter not allowed for backtesting multiple strategies." "PrecisionFilter not allowed for backtesting multiple strategies."
) )
dataprovider.add_pairlisthandler(self.pairlists)
self.pairlists.refresh_pairlist() self.pairlists.refresh_pairlist()
if len(self.pairlists.whitelist) == 0: if len(self.pairlists.whitelist) == 0:


@ -38,15 +38,15 @@ def init_plotscript(config):
""" """
if "pairs" in config: if "pairs" in config:
pairs = config["pairs"] pairs = config['pairs']
else: else:
pairs = config["exchange"]["pair_whitelist"] pairs = config['exchange']['pair_whitelist']
# Set timerange to use # Set timerange to use
timerange = TimeRange.parse_timerange(config.get("timerange")) timerange = TimeRange.parse_timerange(config.get('timerange'))
data = load_data( data = load_data(
datadir=config.get("datadir"), datadir=config.get('datadir'),
pairs=pairs, pairs=pairs,
timeframe=config.get('timeframe', '5m'), timeframe=config.get('timeframe', '5m'),
timerange=timerange, timerange=timerange,
@ -67,7 +67,7 @@ def init_plotscript(config):
db_url=config.get('db_url'), db_url=config.get('db_url'),
exportfilename=filename, exportfilename=filename,
no_trades=no_trades, no_trades=no_trades,
strategy=config.get("strategy"), strategy=config.get('strategy'),
) )
trades = trim_dataframe(trades, timerange, 'open_date') trades = trim_dataframe(trades, timerange, 'open_date')
@ -491,13 +491,13 @@ def load_and_plot_trades(config: Dict[str, Any]):
pair=pair, pair=pair,
data=df_analyzed, data=df_analyzed,
trades=trades_pair, trades=trades_pair,
indicators1=config.get("indicators1", []), indicators1=config.get('indicators1', []),
indicators2=config.get("indicators2", []), indicators2=config.get('indicators2', []),
plot_config=strategy.plot_config if hasattr(strategy, 'plot_config') else {} plot_config=strategy.plot_config if hasattr(strategy, 'plot_config') else {}
) )
store_plot_file(fig, filename=generate_plot_filename(pair, config['timeframe']), store_plot_file(fig, filename=generate_plot_filename(pair, config['timeframe']),
directory=config['user_data_dir'] / "plot") directory=config['user_data_dir'] / 'plot')
logger.info('End of plotting process. %s plots generated', pair_counter) logger.info('End of plotting process. %s plots generated', pair_counter)
@ -514,7 +514,7 @@ def plot_profit(config: Dict[str, Any]) -> None:
# Filter trades to relevant pairs # Filter trades to relevant pairs
# Remove open pairs - we don't know the profit yet so can't calculate profit for these. # Remove open pairs - we don't know the profit yet so can't calculate profit for these.
# Also, If only one open pair is left, then the profit-generation would fail. # Also, If only one open pair is left, then the profit-generation would fail.
trades = trades[(trades['pair'].isin(plot_elements["pairs"])) trades = trades[(trades['pair'].isin(plot_elements['pairs']))
& (~trades['close_date'].isnull()) & (~trades['close_date'].isnull())
] ]
if len(trades) == 0: if len(trades) == 0:
@ -523,7 +523,7 @@ def plot_profit(config: Dict[str, Any]) -> None:
# Create an average close price of all the pairs that were involved. # Create an average close price of all the pairs that were involved.
# this could be useful to gauge the overall market trend # this could be useful to gauge the overall market trend
fig = generate_profit_graph(plot_elements["pairs"], plot_elements["ohlcv"], fig = generate_profit_graph(plot_elements['pairs'], plot_elements['ohlcv'],
trades, config.get('timeframe', '5m')) trades, config.get('timeframe', '5m'))
store_plot_file(fig, filename='freqtrade-profit-plot.html', store_plot_file(fig, filename='freqtrade-profit-plot.html',
directory=config['user_data_dir'] / "plot", auto_open=True) directory=config['user_data_dir'] / 'plot', auto_open=True)


@ -187,6 +187,7 @@ class ApiServer(RPC):
self.app.add_url_rule(f'{BASE_URI}/count', 'count', view_func=self._count, methods=['GET']) self.app.add_url_rule(f'{BASE_URI}/count', 'count', view_func=self._count, methods=['GET'])
self.app.add_url_rule(f'{BASE_URI}/daily', 'daily', view_func=self._daily, methods=['GET']) self.app.add_url_rule(f'{BASE_URI}/daily', 'daily', view_func=self._daily, methods=['GET'])
self.app.add_url_rule(f'{BASE_URI}/edge', 'edge', view_func=self._edge, methods=['GET']) self.app.add_url_rule(f'{BASE_URI}/edge', 'edge', view_func=self._edge, methods=['GET'])
self.app.add_url_rule(f'{BASE_URI}/logs', 'log', view_func=self._get_logs, methods=['GET'])
self.app.add_url_rule(f'{BASE_URI}/profit', 'profit', self.app.add_url_rule(f'{BASE_URI}/profit', 'profit',
view_func=self._profit, methods=['GET']) view_func=self._profit, methods=['GET'])
self.app.add_url_rule(f'{BASE_URI}/performance', 'performance', self.app.add_url_rule(f'{BASE_URI}/performance', 'performance',
@ -349,6 +350,18 @@ class ApiServer(RPC):
return self.rest_dump(stats) return self.rest_dump(stats)
@require_login
@rpc_catch_errors
def _get_logs(self):
"""
Returns latest logs
get:
param:
limit: Only get a certain number of records
"""
limit = int(request.args.get('limit', 0)) or None
return self.rest_dump(self._rpc_get_logs(limit))
@require_login @require_login
@rpc_catch_errors @rpc_catch_errors
def _edge(self): def _edge(self):


@ -11,9 +11,9 @@ from typing import Any, Dict, List, Optional, Tuple, Union
import arrow import arrow
from numpy import NAN, mean from numpy import NAN, mean
from freqtrade.exceptions import (ExchangeError, from freqtrade.exceptions import ExchangeError, PricingError
PricingError)
from freqtrade.exchange import timeframe_to_minutes, timeframe_to_msecs from freqtrade.exchange import timeframe_to_minutes, timeframe_to_msecs
from freqtrade.loggers import bufferHandler
from freqtrade.misc import shorten_date from freqtrade.misc import shorten_date
from freqtrade.persistence import Trade from freqtrade.persistence import Trade
from freqtrade.rpc.fiat_convert import CryptoToFiatConverter from freqtrade.rpc.fiat_convert import CryptoToFiatConverter
@ -158,6 +158,7 @@ class RPC:
current_profit_abs=current_profit_abs, current_profit_abs=current_profit_abs,
stoploss_current_dist=stoploss_current_dist, stoploss_current_dist=stoploss_current_dist,
stoploss_current_dist_ratio=round(stoploss_current_dist_ratio, 8), stoploss_current_dist_ratio=round(stoploss_current_dist_ratio, 8),
stoploss_current_dist_pct=round(stoploss_current_dist_ratio * 100, 2),
stoploss_entry_dist=stoploss_entry_dist, stoploss_entry_dist=stoploss_entry_dist,
stoploss_entry_dist_ratio=round(stoploss_entry_dist_ratio, 8), stoploss_entry_dist_ratio=round(stoploss_entry_dist_ratio, 8),
open_order='({} {} rem={:.8f})'.format( open_order='({} {} rem={:.8f})'.format(
@ -631,6 +632,24 @@ class RPC:
} }
return res return res
def _rpc_get_logs(self, limit: Optional[int]) -> Dict[str, Any]:
"""Returns the last X logs"""
if limit:
buffer = bufferHandler.buffer[-limit:]
else:
buffer = bufferHandler.buffer
records = [[datetime.fromtimestamp(r.created).strftime("%Y-%m-%d %H:%M:%S"),
r.created * 1000, r.name, r.levelname,
r.message + ('\n' + r.exc_text if r.exc_text else '')]
for r in buffer]
# Log format:
# [logtime-formatted, logepoch, logger-name, loglevel, message \n + exception]
# e.g. ["2020-08-27 11:35:01", 1598520901097.9397,
# "freqtrade.worker", "INFO", "Starting worker develop"]
return {'log_count': len(records), 'logs': records}
def _rpc_edge(self) -> List[Dict[str, Any]]: def _rpc_edge(self) -> List[Dict[str, Any]]:
""" Returns information related to Edge """ """ Returns information related to Edge """
if not self._freqtrade.edge: if not self._freqtrade.edge:


@ -12,6 +12,7 @@ from tabulate import tabulate
from telegram import ParseMode, ReplyKeyboardMarkup, Update from telegram import ParseMode, ReplyKeyboardMarkup, Update
from telegram.error import NetworkError, TelegramError from telegram.error import NetworkError, TelegramError
from telegram.ext import CallbackContext, CommandHandler, Updater from telegram.ext import CallbackContext, CommandHandler, Updater
from telegram.utils.helpers import escape_markdown
from freqtrade.__init__ import __version__ from freqtrade.__init__ import __version__
from freqtrade.rpc import RPC, RPCException, RPCMessageType from freqtrade.rpc import RPC, RPCException, RPCMessageType
@ -103,6 +104,7 @@ class Telegram(RPC):
CommandHandler('stopbuy', self._stopbuy), CommandHandler('stopbuy', self._stopbuy),
CommandHandler('whitelist', self._whitelist), CommandHandler('whitelist', self._whitelist),
CommandHandler('blacklist', self._blacklist), CommandHandler('blacklist', self._blacklist),
CommandHandler('logs', self._logs),
CommandHandler('edge', self._edge), CommandHandler('edge', self._edge),
CommandHandler('help', self._help), CommandHandler('help', self._help),
CommandHandler('version', self._version), CommandHandler('version', self._version),
@ -239,17 +241,18 @@ class Telegram(RPC):
("*Close Profit:* `{close_profit_pct}`" ("*Close Profit:* `{close_profit_pct}`"
if r['close_profit_pct'] is not None else ""), if r['close_profit_pct'] is not None else ""),
"*Current Profit:* `{current_profit_pct:.2f}%`", "*Current Profit:* `{current_profit_pct:.2f}%`",
]
if (r['stop_loss'] != r['initial_stop_loss']
and r['initial_stop_loss_pct'] is not None):
# Adding initial stoploss only if it is different from stoploss # Adding initial stoploss only if it is different from stoploss
"*Initial Stoploss:* `{initial_stop_loss:.8f}` " + lines.append("*Initial Stoploss:* `{initial_stop_loss:.8f}` "
("`({initial_stop_loss_pct:.2f}%)`") if ( "`({initial_stop_loss_pct:.2f}%)`")
r['stop_loss'] != r['initial_stop_loss']
and r['initial_stop_loss_pct'] is not None) else "",
# Adding stoploss and stoploss percentage only if it is not None # Adding stoploss and stoploss percentage only if it is not None
"*Stoploss:* `{stop_loss:.8f}` " + lines.append("*Stoploss:* `{stop_loss:.8f}` " +
("`({stop_loss_pct:.2f}%)`" if r['stop_loss_pct'] else ""), ("`({stop_loss_pct:.2f}%)`" if r['stop_loss_pct'] else ""))
] lines.append("*Stoploss distance:* `{stoploss_current_dist:.8f}` "
"`({stoploss_current_dist_pct:.2f}%)`")
if r['open_order']: if r['open_order']:
if r['sell_order_status']: if r['sell_order_status']:
lines.append("*Open Order:* `{open_order}` - `{sell_order_status}`") lines.append("*Open Order:* `{open_order}` - `{sell_order_status}`")
@ -637,6 +640,38 @@ class Telegram(RPC):
except RPCException as e: except RPCException as e:
self._send_msg(str(e)) self._send_msg(str(e))
@authorized_only
def _logs(self, update: Update, context: CallbackContext) -> None:
"""
Handler for /logs
Shows the latest logs
"""
try:
try:
limit = int(context.args[0])
except (TypeError, ValueError, IndexError):
limit = 10
logs = self._rpc_get_logs(limit)['logs']
msgs = ''
msg_template = "*{}* {}: {} \\- `{}`"
for logrec in logs:
msg = msg_template.format(escape_markdown(logrec[0], version=2),
escape_markdown(logrec[2], version=2),
escape_markdown(logrec[3], version=2),
escape_markdown(logrec[4], version=2))
if len(msgs + msg) + 10 >= MAX_TELEGRAM_MESSAGE_LENGTH:
# Send message immediately if it would become too long
self._send_msg(msgs, parse_mode=ParseMode.MARKDOWN_V2)
msgs = msg + '\n'
else:
# Append message to messages to send
msgs += msg + '\n'
if msgs:
self._send_msg(msgs, parse_mode=ParseMode.MARKDOWN_V2)
except RPCException as e:
self._send_msg(str(e))
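
`_logs` formats each record for MarkdownV2 and flushes a message whenever appending the next record would exceed `MAX_TELEGRAM_MESSAGE_LENGTH`. A generic sketch of that chunking idea, with an illustrative length cap:

```
# Generic chunking sketch (the cap below is illustrative, not the real constant).
MAX_LEN = 4096

def chunk_lines(lines, max_len=MAX_LEN):
    msgs, current = [], ""
    for line in lines:
        if len(current) + len(line) + 1 >= max_len:
            msgs.append(current)     # flush before the message gets too long
            current = ""
        current += line + "\n"
    if current:
        msgs.append(current)
    return msgs

print(chunk_lines([f"log line {i}" for i in range(5)], max_len=30))
```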
@authorized_only @authorized_only
def _edge(self, update: Update, context: CallbackContext) -> None: def _edge(self, update: Update, context: CallbackContext) -> None:
""" """
@ -682,6 +717,7 @@ class Telegram(RPC):
"*/stopbuy:* `Stops buying, but handles open trades gracefully` \n" "*/stopbuy:* `Stops buying, but handles open trades gracefully` \n"
"*/reload_config:* `Reload configuration file` \n" "*/reload_config:* `Reload configuration file` \n"
"*/show_config:* `Show running configuration` \n" "*/show_config:* `Show running configuration` \n"
"*/logs [limit]:* `Show latest logs - defaults to 10` \n"
"*/whitelist:* `Show current whitelist` \n" "*/whitelist:* `Show current whitelist` \n"
"*/blacklist [pair]:* `Show current blacklist, or adds one or more pairs " "*/blacklist [pair]:* `Show current blacklist, or adds one or more pairs "
"to the blacklist.` \n" "to the blacklist.` \n"


@ -14,8 +14,9 @@ from pandas import DataFrame
from freqtrade.constants import ListPairsWithTimeframes from freqtrade.constants import ListPairsWithTimeframes
from freqtrade.data.dataprovider import DataProvider from freqtrade.data.dataprovider import DataProvider
from freqtrade.exceptions import StrategyError, OperationalException from freqtrade.exceptions import OperationalException, StrategyError
from freqtrade.exchange import timeframe_to_minutes from freqtrade.exchange import timeframe_to_minutes
from freqtrade.exchange.exchange import timeframe_to_next_date
from freqtrade.persistence import Trade from freqtrade.persistence import Trade
from freqtrade.strategy.strategy_wrapper import strategy_safe_wrapper from freqtrade.strategy.strategy_wrapper import strategy_safe_wrapper
from freqtrade.wallets import Wallets from freqtrade.wallets import Wallets
@ -297,13 +298,25 @@ class IStrategy(ABC):
if pair in self._pair_locked_until: if pair in self._pair_locked_until:
del self._pair_locked_until[pair] del self._pair_locked_until[pair]
def is_pair_locked(self, pair: str) -> bool: def is_pair_locked(self, pair: str, candle_date: datetime = None) -> bool:
""" """
Checks if a pair is currently locked Checks if a pair is currently locked
The 2nd, optional parameter ensures that locks are applied until a new candle arrives,
rather than expiring at e.g. 14:00:00 while the next candle only arrives at 14:00:02,
which would leave a 2 second gap for a buy to happen on an old signal.
:param pair: Pair to check
:param candle_date: Date of the last candle. Optional, defaults to current date
:returns: locking state of the pair in question.
""" """
if pair not in self._pair_locked_until: if pair not in self._pair_locked_until:
return False return False
if not candle_date:
return self._pair_locked_until[pair] >= datetime.now(timezone.utc) return self._pair_locked_until[pair] >= datetime.now(timezone.utc)
else:
# Locking should happen until a new candle arrives
lock_time = timeframe_to_next_date(self.timeframe, candle_date)
# lock_time = candle_date + timedelta(minutes=timeframe_to_minutes(self.timeframe))
return self._pair_locked_until[pair] > lock_time
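
In other words, when a candle date is available the check no longer compares the lock expiry against wall-clock time; it compares it against the start of the candle that follows the signal candle. A small sketch of that boundary calculation (dates are illustrative):

```
from datetime import datetime, timezone

from freqtrade.exchange.exchange import timeframe_to_next_date

signal_candle = datetime(2020, 8, 27, 13, 55, tzinfo=timezone.utc)
next_candle_start = timeframe_to_next_date("5m", signal_candle)
print(next_candle_start)   # 2020-08-27 14:00:00+00:00
# A lock expiring after this boundary still blocks buys based on the 13:55 signal,
# even if the wall clock has already passed the lock's expiry by a few seconds.
```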
def analyze_ticker(self, dataframe: DataFrame, metadata: dict) -> DataFrame: def analyze_ticker(self, dataframe: DataFrame, metadata: dict) -> DataFrame:
""" """
@ -434,7 +447,7 @@ class IStrategy(ABC):
if latest_date < (arrow.utcnow().shift(minutes=-(timeframe_minutes * 2 + offset))): if latest_date < (arrow.utcnow().shift(minutes=-(timeframe_minutes * 2 + offset))):
logger.warning( logger.warning(
'Outdated history for pair %s. Last tick is %s minutes old', 'Outdated history for pair %s. Last tick is %s minutes old',
pair, (arrow.utcnow() - latest_date).seconds // 60 pair, int((arrow.utcnow() - latest_date).total_seconds() // 60)
) )
return False, False return False, False


@ -222,7 +222,7 @@ def crossed(series1, series2, direction=None):
if isinstance(series1, np.ndarray): if isinstance(series1, np.ndarray):
series1 = pd.Series(series1) series1 = pd.Series(series1)
if isinstance(series2, (float, int, np.ndarray)): if isinstance(series2, (float, int, np.ndarray, np.integer, np.floating)):
series2 = pd.Series(index=series1.index, data=series2) series2 = pd.Series(index=series1.index, data=series2)
if direction is None or direction == "above": if direction is None or direction == "above":
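
The widened `isinstance` check matters because numpy integer scalars are not subclasses of Python's `int`, so a threshold such as `np.int64(2)` previously fell through and was never broadcast into a comparison Series. A toy illustration (values are arbitrary):

```
import numpy as np
import pandas as pd

threshold = np.int64(2)
print(isinstance(threshold, int))                    # False
print(isinstance(threshold, (int, np.integer)))      # True

series1 = pd.Series([1, 2, 3])
series2 = pd.Series(index=series1.index, data=threshold)   # scalar broadcast
print(series2.tolist())                              # [2, 2, 2]
```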


@ -1,9 +1,9 @@
# requirements without requirements installable via conda # requirements without requirements installable via conda
# mainly used for Raspberry pi installs # mainly used for Raspberry pi installs
ccxt==1.33.18 ccxt==1.33.72
SQLAlchemy==1.3.18 SQLAlchemy==1.3.19
python-telegram-bot==12.8 python-telegram-bot==12.8
arrow==0.15.8 arrow==0.16.0
cachetools==4.1.1 cachetools==4.1.1
requests==2.24.0 requests==2.24.0
urllib3==1.25.10 urllib3==1.25.10
@ -13,6 +13,8 @@ TA-Lib==0.4.18
tabulate==0.8.7 tabulate==0.8.7
pycoingecko==1.3.0 pycoingecko==1.3.0
jinja2==2.11.2 jinja2==2.11.2
tables==3.6.1
blosc==1.9.1
# find first, C search in arrays # find first, C search in arrays
py_find_1st==1.1.4 py_find_1st==1.1.4
@ -26,10 +28,10 @@ sdnotify==0.3.2
# Api server # Api server
flask==1.1.2 flask==1.1.2
flask-jwt-extended==3.24.1 flask-jwt-extended==3.24.1
flask-cors==3.0.8 flask-cors==3.0.9
# Support for colorized terminal output # Support for colorized terminal output
colorama==0.4.3 colorama==0.4.3
# Building config files interactively # Building config files interactively
questionary==1.5.2 questionary==1.5.2
prompt-toolkit==3.0.6 prompt-toolkit==3.0.7


@ -11,7 +11,7 @@ mypy==0.782
pytest==6.0.1 pytest==6.0.1
pytest-asyncio==0.14.0 pytest-asyncio==0.14.0
pytest-cov==2.10.1 pytest-cov==2.10.1
pytest-mock==3.2.0 pytest-mock==3.3.1
pytest-random-order==1.0.4 pytest-random-order==1.0.4
# Convert jupyter notebooks to markdown documents # Convert jupyter notebooks to markdown documents


@ -7,4 +7,4 @@ scikit-learn==0.23.1
scikit-optimize==0.7.4 scikit-optimize==0.7.4
filelock==3.0.12 filelock==3.0.12
joblib==0.16.0 joblib==0.16.0
progressbar2==3.51.4 progressbar2==3.52.1


@ -2,4 +2,4 @@
-r requirements-common.txt -r requirements-common.txt
numpy==1.19.1 numpy==1.19.1
pandas==1.1.0 pandas==1.1.1


@ -159,6 +159,14 @@ class FtRestClient():
""" """
return self._get("show_config") return self._get("show_config")
def logs(self, limit=None):
"""Show latest logs.
:param limit: Limits log messages to the last <limit> logs. Omit to get the entire log.
:return: json object
"""
return self._get("logs", params={"limit": limit} if limit else 0)
def trades(self, limit=None): def trades(self, limit=None):
"""Return trades history. """Return trades history.
@ -276,11 +284,11 @@ def main(args):
print_commands() print_commands()
sys.exit() sys.exit()
config = load_config(args["config"]) config = load_config(args['config'])
url = config.get("api_server", {}).get("server_url", "127.0.0.1") url = config.get('api_server', {}).get('server_url', '127.0.0.1')
port = config.get("api_server", {}).get("listen_port", "8080") port = config.get('api_server', {}).get('listen_port', '8080')
username = config.get("api_server", {}).get("username") username = config.get('api_server', {}).get('username')
password = config.get("api_server", {}).get("password") password = config.get('api_server', {}).get('password')
server_url = f"http://{url}:{port}" server_url = f"http://{url}:{port}"
client = FtRestClient(server_url, username, password) client = FtRestClient(server_url, username, password)


@ -85,6 +85,8 @@ setup(name='freqtrade',
# from requirements.txt # from requirements.txt
'numpy', 'numpy',
'pandas', 'pandas',
'tables',
'blosc',
], ],
extras_require={ extras_require={
'api': api, 'api': api,


@ -78,7 +78,7 @@ def patch_exchange(mocker, api_mock=None, id='bittrex', mock_markets=True) -> No
def get_patched_exchange(mocker, config, api_mock=None, id='bittrex', def get_patched_exchange(mocker, config, api_mock=None, id='bittrex',
mock_markets=True) -> Exchange: mock_markets=True) -> Exchange:
patch_exchange(mocker, api_mock, id, mock_markets) patch_exchange(mocker, api_mock, id, mock_markets)
config["exchange"]["name"] = id config['exchange']['name'] = id
try: try:
exchange = ExchangeResolver.load_exchange(id, config) exchange = ExchangeResolver.load_exchange(id, config)
except ImportError: except ImportError:


@ -12,7 +12,9 @@ from pandas import DataFrame
from pandas.testing import assert_frame_equal from pandas.testing import assert_frame_equal
from freqtrade.configuration import TimeRange from freqtrade.configuration import TimeRange
from freqtrade.constants import AVAILABLE_DATAHANDLERS
from freqtrade.data.converter import ohlcv_to_dataframe from freqtrade.data.converter import ohlcv_to_dataframe
from freqtrade.data.history.hdf5datahandler import HDF5DataHandler
from freqtrade.data.history.history_utils import ( from freqtrade.data.history.history_utils import (
_download_pair_history, _download_trades_history, _download_pair_history, _download_trades_history,
_load_cached_data_for_updating, convert_trades_to_ohlcv, get_timerange, _load_cached_data_for_updating, convert_trades_to_ohlcv, get_timerange,
@ -620,7 +622,7 @@ def test_convert_trades_to_ohlcv(mocker, default_conf, testdatadir, caplog):
_clean_test_file(file5) _clean_test_file(file5)
def test_jsondatahandler_ohlcv_get_pairs(testdatadir): def test_datahandler_ohlcv_get_pairs(testdatadir):
pairs = JsonDataHandler.ohlcv_get_pairs(testdatadir, '5m') pairs = JsonDataHandler.ohlcv_get_pairs(testdatadir, '5m')
# Convert to set to avoid failures due to sorting # Convert to set to avoid failures due to sorting
assert set(pairs) == {'UNITTEST/BTC', 'XLM/BTC', 'ETH/BTC', 'TRX/BTC', 'LTC/BTC', assert set(pairs) == {'UNITTEST/BTC', 'XLM/BTC', 'ETH/BTC', 'TRX/BTC', 'LTC/BTC',
@ -630,8 +632,11 @@ def test_jsondatahandler_ohlcv_get_pairs(testdatadir):
pairs = JsonGzDataHandler.ohlcv_get_pairs(testdatadir, '8m') pairs = JsonGzDataHandler.ohlcv_get_pairs(testdatadir, '8m')
assert set(pairs) == {'UNITTEST/BTC'} assert set(pairs) == {'UNITTEST/BTC'}
pairs = HDF5DataHandler.ohlcv_get_pairs(testdatadir, '5m')
assert set(pairs) == {'UNITTEST/BTC'}
def test_jsondatahandler_ohlcv_get_available_data(testdatadir):
def test_datahandler_ohlcv_get_available_data(testdatadir):
paircombs = JsonDataHandler.ohlcv_get_available_data(testdatadir) paircombs = JsonDataHandler.ohlcv_get_available_data(testdatadir)
# Convert to set to avoid failures due to sorting # Convert to set to avoid failures due to sorting
assert set(paircombs) == {('UNITTEST/BTC', '5m'), ('ETH/BTC', '5m'), ('XLM/BTC', '5m'), assert set(paircombs) == {('UNITTEST/BTC', '5m'), ('ETH/BTC', '5m'), ('XLM/BTC', '5m'),
@ -643,6 +648,8 @@ def test_jsondatahandler_ohlcv_get_available_data(testdatadir):
paircombs = JsonGzDataHandler.ohlcv_get_available_data(testdatadir) paircombs = JsonGzDataHandler.ohlcv_get_available_data(testdatadir)
assert set(paircombs) == {('UNITTEST/BTC', '8m')} assert set(paircombs) == {('UNITTEST/BTC', '8m')}
paircombs = HDF5DataHandler.ohlcv_get_available_data(testdatadir)
assert set(paircombs) == {('UNITTEST/BTC', '5m')}
def test_jsondatahandler_trades_get_pairs(testdatadir): def test_jsondatahandler_trades_get_pairs(testdatadir):
@ -653,15 +660,17 @@ def test_jsondatahandler_trades_get_pairs(testdatadir):
def test_jsondatahandler_ohlcv_purge(mocker, testdatadir): def test_jsondatahandler_ohlcv_purge(mocker, testdatadir):
mocker.patch.object(Path, "exists", MagicMock(return_value=False)) mocker.patch.object(Path, "exists", MagicMock(return_value=False))
mocker.patch.object(Path, "unlink", MagicMock()) unlinkmock = mocker.patch.object(Path, "unlink", MagicMock())
dh = JsonGzDataHandler(testdatadir) dh = JsonGzDataHandler(testdatadir)
assert not dh.ohlcv_purge('UNITTEST/NONEXIST', '5m') assert not dh.ohlcv_purge('UNITTEST/NONEXIST', '5m')
assert unlinkmock.call_count == 0
mocker.patch.object(Path, "exists", MagicMock(return_value=True)) mocker.patch.object(Path, "exists", MagicMock(return_value=True))
assert dh.ohlcv_purge('UNITTEST/NONEXIST', '5m') assert dh.ohlcv_purge('UNITTEST/NONEXIST', '5m')
assert unlinkmock.call_count == 1
def test_jsondatahandler_trades_load(mocker, testdatadir, caplog): def test_jsondatahandler_trades_load(testdatadir, caplog):
dh = JsonGzDataHandler(testdatadir) dh = JsonGzDataHandler(testdatadir)
logmsg = "Old trades format detected - converting" logmsg = "Old trades format detected - converting"
dh.trades_load('XRP/ETH') dh.trades_load('XRP/ETH')
@ -674,26 +683,144 @@ def test_jsondatahandler_trades_load(mocker, testdatadir, caplog):
def test_jsondatahandler_trades_purge(mocker, testdatadir): def test_jsondatahandler_trades_purge(mocker, testdatadir):
mocker.patch.object(Path, "exists", MagicMock(return_value=False)) mocker.patch.object(Path, "exists", MagicMock(return_value=False))
mocker.patch.object(Path, "unlink", MagicMock()) unlinkmock = mocker.patch.object(Path, "unlink", MagicMock())
dh = JsonGzDataHandler(testdatadir) dh = JsonGzDataHandler(testdatadir)
assert not dh.trades_purge('UNITTEST/NONEXIST') assert not dh.trades_purge('UNITTEST/NONEXIST')
assert unlinkmock.call_count == 0
mocker.patch.object(Path, "exists", MagicMock(return_value=True)) mocker.patch.object(Path, "exists", MagicMock(return_value=True))
assert dh.trades_purge('UNITTEST/NONEXIST') assert dh.trades_purge('UNITTEST/NONEXIST')
assert unlinkmock.call_count == 1
def test_jsondatahandler_ohlcv_append(testdatadir): @pytest.mark.parametrize('datahandler', AVAILABLE_DATAHANDLERS)
dh = JsonGzDataHandler(testdatadir) def test_datahandler_ohlcv_append(datahandler, testdatadir):
dh = get_datahandler(testdatadir, datahandler)
with pytest.raises(NotImplementedError): with pytest.raises(NotImplementedError):
dh.ohlcv_append('UNITTEST/ETH', '5m', DataFrame()) dh.ohlcv_append('UNITTEST/ETH', '5m', DataFrame())
def test_jsondatahandler_trades_append(testdatadir): @pytest.mark.parametrize('datahandler', AVAILABLE_DATAHANDLERS)
dh = JsonGzDataHandler(testdatadir) def test_datahandler_trades_append(datahandler, testdatadir):
dh = get_datahandler(testdatadir, datahandler)
with pytest.raises(NotImplementedError): with pytest.raises(NotImplementedError):
dh.trades_append('UNITTEST/ETH', []) dh.trades_append('UNITTEST/ETH', [])
def test_hdf5datahandler_trades_get_pairs(testdatadir):
pairs = HDF5DataHandler.trades_get_pairs(testdatadir)
# Convert to set to avoid failures due to sorting
assert set(pairs) == {'XRP/ETH'}
def test_hdf5datahandler_trades_load(testdatadir):
dh = HDF5DataHandler(testdatadir)
trades = dh.trades_load('XRP/ETH')
assert isinstance(trades, list)
trades1 = dh.trades_load('UNITTEST/NONEXIST')
assert trades1 == []
# data goes from 2019-10-11 - 2019-10-13
timerange = TimeRange.parse_timerange('20191011-20191012')
trades2 = dh._trades_load('XRP/ETH', timerange)
assert len(trades) > len(trades2)
# unfiltered load has trades before starttime
assert len([t for t in trades if t[0] < timerange.startts * 1000]) >= 0
# filtered list does not have trades before starttime
assert len([t for t in trades2 if t[0] < timerange.startts * 1000]) == 0
# unfiltered load has trades after endtime
assert len([t for t in trades if t[0] > timerange.stopts * 1000]) > 0
# filtered list does not have trades after endtime
assert len([t for t in trades2 if t[0] > timerange.stopts * 1000]) == 0
def test_hdf5datahandler_trades_store(testdatadir):
dh = HDF5DataHandler(testdatadir)
trades = dh.trades_load('XRP/ETH')
dh.trades_store('XRP/NEW', trades)
file = testdatadir / 'XRP_NEW-trades.h5'
assert file.is_file()
# Load trades back
trades_new = dh.trades_load('XRP/NEW')
assert len(trades_new) == len(trades)
assert trades[0][0] == trades_new[0][0]
assert trades[0][1] == trades_new[0][1]
# assert trades[0][2] == trades_new[0][2] # This is nan - so comparison does not make sense
assert trades[0][3] == trades_new[0][3]
assert trades[0][4] == trades_new[0][4]
assert trades[0][5] == trades_new[0][5]
assert trades[0][6] == trades_new[0][6]
assert trades[-1][0] == trades_new[-1][0]
assert trades[-1][1] == trades_new[-1][1]
# assert trades[-1][2] == trades_new[-1][2] # This is nan - so comparison does not make sense
assert trades[-1][3] == trades_new[-1][3]
assert trades[-1][4] == trades_new[-1][4]
assert trades[-1][5] == trades_new[-1][5]
assert trades[-1][6] == trades_new[-1][6]
_clean_test_file(file)
def test_hdf5datahandler_trades_purge(mocker, testdatadir):
mocker.patch.object(Path, "exists", MagicMock(return_value=False))
unlinkmock = mocker.patch.object(Path, "unlink", MagicMock())
dh = HDF5DataHandler(testdatadir)
assert not dh.trades_purge('UNITTEST/NONEXIST')
assert unlinkmock.call_count == 0
mocker.patch.object(Path, "exists", MagicMock(return_value=True))
assert dh.trades_purge('UNITTEST/NONEXIST')
assert unlinkmock.call_count == 1
def test_hdf5datahandler_ohlcv_load_and_resave(testdatadir):
dh = HDF5DataHandler(testdatadir)
ohlcv = dh.ohlcv_load('UNITTEST/BTC', '5m')
assert isinstance(ohlcv, DataFrame)
assert len(ohlcv) > 0
file = testdatadir / 'UNITTEST_NEW-5m.h5'
assert not file.is_file()
dh.ohlcv_store('UNITTEST/NEW', '5m', ohlcv)
assert file.is_file()
assert not ohlcv[ohlcv['date'] < '2018-01-15'].empty
# Data goes from 2018-01-10 - 2018-01-30
timerange = TimeRange.parse_timerange('20180115-20180119')
# Call private function to ensure timerange is filtered in hdf5
ohlcv = dh._ohlcv_load('UNITTEST/BTC', '5m', timerange)
ohlcv1 = dh._ohlcv_load('UNITTEST/NEW', '5m', timerange)
assert len(ohlcv) == len(ohlcv1)
assert ohlcv.equals(ohlcv1)
assert ohlcv[ohlcv['date'] < '2018-01-15'].empty
assert ohlcv[ohlcv['date'] > '2018-01-19'].empty
_clean_test_file(file)
# Try loading a non-existing file
ohlcv = dh.ohlcv_load('UNITTEST/NONEXIST', '5m')
assert ohlcv.empty
def test_hdf5datahandler_ohlcv_purge(mocker, testdatadir):
mocker.patch.object(Path, "exists", MagicMock(return_value=False))
unlinkmock = mocker.patch.object(Path, "unlink", MagicMock())
dh = HDF5DataHandler(testdatadir)
assert not dh.ohlcv_purge('UNITTEST/NONEXIST', '5m')
assert unlinkmock.call_count == 0
mocker.patch.object(Path, "exists", MagicMock(return_value=True))
assert dh.ohlcv_purge('UNITTEST/NONEXIST', '5m')
assert unlinkmock.call_count == 1
def test_gethandlerclass(): def test_gethandlerclass():
cl = get_datahandlerclass('json') cl = get_datahandlerclass('json')
assert cl == JsonDataHandler assert cl == JsonDataHandler
@ -702,6 +829,9 @@ def test_gethandlerclass():
assert cl == JsonGzDataHandler assert cl == JsonGzDataHandler
assert issubclass(cl, IDataHandler) assert issubclass(cl, IDataHandler)
assert issubclass(cl, JsonDataHandler) assert issubclass(cl, JsonDataHandler)
cl = get_datahandlerclass('hdf5')
assert cl == HDF5DataHandler
assert issubclass(cl, IDataHandler)
with pytest.raises(ValueError, match=r"No datahandler for .*"): with pytest.raises(ValueError, match=r"No datahandler for .*"):
get_datahandlerclass('DeadBeef') get_datahandlerclass('DeadBeef')
@ -713,3 +843,6 @@ def test_get_datahandler(testdatadir):
assert type(dh) == JsonGzDataHandler assert type(dh) == JsonGzDataHandler
dh1 = get_datahandler(testdatadir, 'jsongz', dh) dh1 = get_datahandler(testdatadir, 'jsongz', dh)
assert id(dh1) == id(dh) assert id(dh1) == id(dh)
dh = get_datahandler(testdatadir, 'hdf5')
assert type(dh) == HDF5DataHandler


@ -359,6 +359,7 @@ def test_backtesting_start(default_conf, mocker, testdatadir, caplog) -> None:
] ]
for line in exists: for line in exists:
assert log_has(line, caplog) assert log_has(line, caplog)
assert backtesting.strategy.dp._pairlists is not None
def test_backtesting_start_no_data(default_conf, mocker, caplog, testdatadir) -> None: def test_backtesting_start_no_data(default_conf, mocker, caplog, testdatadir) -> None:


@ -101,6 +101,7 @@ def test_rpc_trade_status(default_conf, ticker, fee, mocker) -> None:
'initial_stop_loss_ratio': -0.1, 'initial_stop_loss_ratio': -0.1,
'stoploss_current_dist': -1.1080000000000002e-06, 'stoploss_current_dist': -1.1080000000000002e-06,
'stoploss_current_dist_ratio': -0.10081893, 'stoploss_current_dist_ratio': -0.10081893,
'stoploss_current_dist_pct': -10.08,
'stoploss_entry_dist': -0.00010475, 'stoploss_entry_dist': -0.00010475,
'stoploss_entry_dist_ratio': -0.10448878, 'stoploss_entry_dist_ratio': -0.10448878,
'open_order': None, 'open_order': None,
@ -165,6 +166,7 @@ def test_rpc_trade_status(default_conf, ticker, fee, mocker) -> None:
'initial_stop_loss_ratio': -0.1, 'initial_stop_loss_ratio': -0.1,
'stoploss_current_dist': ANY, 'stoploss_current_dist': ANY,
'stoploss_current_dist_ratio': ANY, 'stoploss_current_dist_ratio': ANY,
'stoploss_current_dist_pct': ANY,
'stoploss_entry_dist': -0.00010475, 'stoploss_entry_dist': -0.00010475,
'stoploss_entry_dist_ratio': -0.10448878, 'stoploss_entry_dist_ratio': -0.10448878,
'open_order': None, 'open_order': None,


@ -10,10 +10,12 @@ from flask import Flask
from requests.auth import _basic_auth_str from requests.auth import _basic_auth_str
from freqtrade.__init__ import __version__ from freqtrade.__init__ import __version__
from freqtrade.loggers import setup_logging, setup_logging_pre
from freqtrade.persistence import Trade from freqtrade.persistence import Trade
from freqtrade.rpc.api_server import BASE_URI, ApiServer from freqtrade.rpc.api_server import BASE_URI, ApiServer
from freqtrade.state import State from freqtrade.state import State
from tests.conftest import get_patched_freqtradebot, log_has, patch_get_signal, create_mock_trades from tests.conftest import (create_mock_trades, get_patched_freqtradebot,
log_has, patch_get_signal)
_TEST_USER = "FreqTrader" _TEST_USER = "FreqTrader"
_TEST_PASS = "SuperSecurePassword1!" _TEST_PASS = "SuperSecurePassword1!"
@ -21,6 +23,9 @@ _TEST_PASS = "SuperSecurePassword1!"
@pytest.fixture @pytest.fixture
def botclient(default_conf, mocker): def botclient(default_conf, mocker):
setup_logging_pre()
setup_logging(default_conf)
default_conf.update({"api_server": {"enabled": True, default_conf.update({"api_server": {"enabled": True,
"listen_ip_address": "127.0.0.1", "listen_ip_address": "127.0.0.1",
"listen_port": 8080, "listen_port": 8080,
@ -87,20 +92,20 @@ def test_api_unauthorized(botclient):
assert rc.json == {'error': 'Unauthorized'} assert rc.json == {'error': 'Unauthorized'}
# Change only username # Change only username
ftbot.config['api_server']['username'] = "Ftrader" ftbot.config['api_server']['username'] = 'Ftrader'
rc = client_get(client, f"{BASE_URI}/version") rc = client_get(client, f"{BASE_URI}/version")
assert_response(rc, 401) assert_response(rc, 401)
assert rc.json == {'error': 'Unauthorized'} assert rc.json == {'error': 'Unauthorized'}
# Change only password # Change only password
ftbot.config['api_server']['username'] = _TEST_USER ftbot.config['api_server']['username'] = _TEST_USER
ftbot.config['api_server']['password'] = "WrongPassword" ftbot.config['api_server']['password'] = 'WrongPassword'
rc = client_get(client, f"{BASE_URI}/version") rc = client_get(client, f"{BASE_URI}/version")
assert_response(rc, 401) assert_response(rc, 401)
assert rc.json == {'error': 'Unauthorized'} assert rc.json == {'error': 'Unauthorized'}
ftbot.config['api_server']['username'] = "Ftrader" ftbot.config['api_server']['username'] = 'Ftrader'
ftbot.config['api_server']['password'] = "WrongPassword" ftbot.config['api_server']['password'] = 'WrongPassword'
rc = client_get(client, f"{BASE_URI}/version") rc = client_get(client, f"{BASE_URI}/version")
assert_response(rc, 401) assert_response(rc, 401)
@ -423,6 +428,34 @@ def test_api_delete_trade(botclient, mocker, fee, markets):
assert stoploss_mock.call_count == 1 assert stoploss_mock.call_count == 1
def test_api_logs(botclient):
ftbot, client = botclient
rc = client_get(client, f"{BASE_URI}/logs")
assert_response(rc)
assert len(rc.json) == 2
assert 'logs' in rc.json
# Using a fixed comparison here would make this test fail!
assert rc.json['log_count'] > 10
assert len(rc.json['logs']) == rc.json['log_count']
assert isinstance(rc.json['logs'][0], list)
# date
assert isinstance(rc.json['logs'][0][0], str)
# created_timestamp
assert isinstance(rc.json['logs'][0][1], float)
assert isinstance(rc.json['logs'][0][2], str)
assert isinstance(rc.json['logs'][0][3], str)
assert isinstance(rc.json['logs'][0][4], str)
rc = client_get(client, f"{BASE_URI}/logs?limit=5")
assert_response(rc)
assert len(rc.json) == 2
assert 'logs' in rc.json
# Using a fixed comparison here would make this test fail!
assert rc.json['log_count'] == 5
assert len(rc.json['logs']) == rc.json['log_count']
def test_api_edge_disabled(botclient, mocker, ticker, fee, markets): def test_api_edge_disabled(botclient, mocker, ticker, fee, markets):
ftbot, client = botclient ftbot, client = botclient
patch_get_signal(ftbot, (True, False)) patch_get_signal(ftbot, (True, False))
@ -600,6 +633,7 @@ def test_api_status(botclient, mocker, ticker, fee, markets):
'initial_stop_loss_ratio': -0.1, 'initial_stop_loss_ratio': -0.1,
'stoploss_current_dist': -1.1080000000000002e-06, 'stoploss_current_dist': -1.1080000000000002e-06,
'stoploss_current_dist_ratio': -0.10081893, 'stoploss_current_dist_ratio': -0.10081893,
'stoploss_current_dist_pct': -10.08,
'stoploss_entry_dist': -0.00010475, 'stoploss_entry_dist': -0.00010475,
'stoploss_entry_dist_ratio': -0.10448878, 'stoploss_entry_dist_ratio': -0.10448878,
'trade_id': 1, 'trade_id': 1,
@ -676,7 +710,7 @@ def test_api_forcebuy(botclient, mocker, fee):
assert rc.json == {"error": "Error querying _forcebuy: Forcebuy not enabled."} assert rc.json == {"error": "Error querying _forcebuy: Forcebuy not enabled."}
# enable forcebuy # enable forcebuy
ftbot.config["forcebuy_enable"] = True ftbot.config['forcebuy_enable'] = True
fbuy_mock = MagicMock(return_value=None) fbuy_mock = MagicMock(return_value=None)
mocker.patch("freqtrade.rpc.RPC._rpc_forcebuy", fbuy_mock) mocker.patch("freqtrade.rpc.RPC._rpc_forcebuy", fbuy_mock)


@ -16,6 +16,7 @@ from telegram.error import NetworkError
from freqtrade import __version__ from freqtrade import __version__
from freqtrade.edge import PairInfo from freqtrade.edge import PairInfo
from freqtrade.freqtradebot import FreqtradeBot from freqtrade.freqtradebot import FreqtradeBot
from freqtrade.loggers import setup_logging
from freqtrade.persistence import Trade from freqtrade.persistence import Trade
from freqtrade.rpc import RPCMessageType from freqtrade.rpc import RPCMessageType
from freqtrade.rpc.telegram import Telegram, authorized_only from freqtrade.rpc.telegram import Telegram, authorized_only
@@ -76,7 +77,7 @@ def test_telegram_init(default_conf, mocker, caplog) -> None:
         "['balance'], ['start'], ['stop'], ['forcesell'], ['forcebuy'], ['trades'], "
         "['delete'], ['performance'], ['daily'], ['count'], ['reload_config', "
         "'reload_conf'], ['show_config', 'show_conf'], ['stopbuy'], "
-        "['whitelist'], ['blacklist'], ['edge'], ['help'], ['version']]")
+        "['whitelist'], ['blacklist'], ['logs'], ['edge'], ['help'], ['version']]")
     assert log_has(message_str, caplog)

@@ -145,7 +146,7 @@ def test_authorized_only_exception(default_conf, mocker, caplog) -> None:
     assert log_has('Exception occurred within Telegram module', caplog)

-def test_status(default_conf, update, mocker, fee, ticker,) -> None:
+def test_telegram_status(default_conf, update, mocker, fee, ticker,) -> None:
     update.message.chat.id = "123"
     default_conf['telegram']['enabled'] = False
     default_conf['telegram']['chat_id'] = "123"

@@ -175,6 +176,8 @@ def test_status(default_conf, update, mocker, fee, ticker,) -> None:
             'stop_loss': 1.099e-05,
             'sell_order_status': None,
             'initial_stop_loss_pct': -0.05,
+            'stoploss_current_dist': 1e-08,
+            'stoploss_current_dist_pct': -0.02,
             'stop_loss_pct': -0.01,
             'open_order': '(limit buy rem=0.00000000)'
         }]),
@@ -1105,6 +1108,40 @@ def test_blacklist_static(default_conf, update, mocker) -> None:
     assert freqtradebot.pairlists.blacklist == ["DOGE/BTC", "HOT/BTC", "ETH/BTC"]
+
+
+def test_telegram_logs(default_conf, update, mocker) -> None:
+    msg_mock = MagicMock()
+    mocker.patch.multiple(
+        'freqtrade.rpc.telegram.Telegram',
+        _init=MagicMock(),
+        _send_msg=msg_mock
+    )
+    setup_logging(default_conf)
+
+    freqtradebot = get_patched_freqtradebot(mocker, default_conf)
+    telegram = Telegram(freqtradebot)
+
+    context = MagicMock()
+    context.args = []
+    telegram._logs(update=update, context=context)
+    assert msg_mock.call_count == 1
+    assert "freqtrade\\.rpc\\.telegram" in msg_mock.call_args_list[0][0][0]
+    msg_mock.reset_mock()
+
+    context.args = ["1"]
+    telegram._logs(update=update, context=context)
+    assert msg_mock.call_count == 1
+    msg_mock.reset_mock()
+
+    # Test with changed MaxMessageLength
+    mocker.patch('freqtrade.rpc.telegram.MAX_TELEGRAM_MESSAGE_LENGTH', 200)
+    context = MagicMock()
+    context.args = []
+    telegram._logs(update=update, context=context)
+    # Called at least 3 times. Exact times will change with unrelated changes to setup messages
+    # Therefore we don't test for this explicitly.
+    assert msg_mock.call_count > 3
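Note: the last block of this test relies on long `/logs` output being split across several Telegram messages once the formatted text would exceed `MAX_TELEGRAM_MESSAGE_LENGTH`. A rough sketch of such record-wise chunking; the constant value, helper name and sample records are illustrative rather than taken from the project:

```python
MAX_TELEGRAM_MESSAGE_LENGTH = 4096  # Telegram's hard per-message limit


def chunk_log_records(records, limit=MAX_TELEGRAM_MESSAGE_LENGTH):
    """Group formatted log lines into message bodies that stay below `limit` characters."""
    chunks, current = [], ""
    for record in records:
        line = record + "\n"
        if current and len(current) + len(line) > limit:
            chunks.append(current)
            current = ""
        current += line
    if current:
        chunks.append(current)
    return chunks


# With the limit patched down to 200 (as in the test above), a longer log dump
# is sent as multiple messages instead of a single one.
records = [f"2020-09-01 12:00:0{i} - freqtrade.worker - INFO - heartbeat {i}" for i in range(10)]
print(len(chunk_log_records(records, limit=200)))
```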

 def test_edge_disabled(default_conf, update, mocker) -> None:
     msg_mock = MagicMock()
     mocker.patch.multiple(

View File

@@ -1,6 +1,7 @@
 # pragma pylint: disable=missing-docstring, C0103
 import logging
+from datetime import datetime, timedelta, timezone
 from unittest.mock import MagicMock

 import arrow

@@ -8,12 +9,12 @@ import pytest
 from pandas import DataFrame

 from freqtrade.configuration import TimeRange
+from freqtrade.data.dataprovider import DataProvider
 from freqtrade.data.history import load_data
 from freqtrade.exceptions import StrategyError
 from freqtrade.persistence import Trade
 from freqtrade.resolvers import StrategyResolver
 from freqtrade.strategy.strategy_wrapper import strategy_safe_wrapper
-from freqtrade.data.dataprovider import DataProvider
 from tests.conftest import log_has, log_has_re
 from .strats.default_strategy import DefaultStrategy
@@ -387,6 +388,31 @@ def test_is_pair_locked(default_conf):
     strategy.unlock_pair(pair)
     assert not strategy.is_pair_locked(pair)
+
+    pair = 'BTC/USDT'
+    # Lock until 14:30
+    lock_time = datetime(2020, 5, 1, 14, 30, 0, tzinfo=timezone.utc)
+    strategy.lock_pair(pair, lock_time)
+    # Lock is in the past ...
+    assert not strategy.is_pair_locked(pair)
+    # latest candle is from 14:20, lock goes to 14:30
+    assert strategy.is_pair_locked(pair, lock_time + timedelta(minutes=-10))
+    assert strategy.is_pair_locked(pair, lock_time + timedelta(minutes=-50))
+
+    # latest candle is from 14:25 (lock should be lifted)
+    # Since this is the "new candle" available at 14:30
+    assert not strategy.is_pair_locked(pair, lock_time + timedelta(minutes=-4))
+
+    # Should not be locked after time expired
+    assert not strategy.is_pair_locked(pair, lock_time + timedelta(minutes=10))
+
+    # Change timeframe to 15m
+    strategy.timeframe = '15m'
+    # Candle from 14:14 - lock goes until 14:30
+    assert strategy.is_pair_locked(pair, lock_time + timedelta(minutes=-16))
+    assert strategy.is_pair_locked(pair, lock_time + timedelta(minutes=-15, seconds=-2))
+    # Candle from 14:15 - lock goes until 14:30
+    assert not strategy.is_pair_locked(pair, lock_time + timedelta(minutes=-15))
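Note: these assertions pin down how pair locks interact with candle boundaries. A lock that runs until 14:30 still applies while the candle opened at 14:20 is being processed (its successor opens at 14:25, before the lock expires), but no longer applies for the candle opened at 14:25, whose successor only becomes available at 14:30. A standalone sketch of that rule, under the assumption that the check compares the lock end against the next candle open; helper names here are illustrative, not the project's API:

```python
from datetime import datetime, timedelta, timezone

TIMEFRAME_MINUTES = {'5m': 5, '15m': 15}  # only what this example needs


def next_candle_open(timeframe: str, candle_date: datetime) -> datetime:
    """Round candle_date up to the open time of the following candle."""
    step = timedelta(minutes=TIMEFRAME_MINUTES[timeframe])
    epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    return epoch + ((candle_date - epoch) // step + 1) * step


def lock_applies(lock_until: datetime, timeframe: str, candle_date: datetime) -> bool:
    # The lock still matters as long as it ends after the next candle opens.
    return lock_until > next_candle_open(timeframe, candle_date)


lock_until = datetime(2020, 5, 1, 14, 30, tzinfo=timezone.utc)
assert lock_applies(lock_until, '5m', lock_until - timedelta(minutes=10))       # candle from 14:20
assert not lock_applies(lock_until, '5m', lock_until - timedelta(minutes=4))    # evaluated at 14:26
assert lock_applies(lock_until, '15m', lock_until - timedelta(minutes=16))      # candle from 14:14
assert not lock_applies(lock_until, '15m', lock_until - timedelta(minutes=15))  # candle from 14:15
```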

 def test_is_informative_pairs_callback(default_conf):
     default_conf.update({'strategy': 'TestStrategyLegacy'})

View File

@@ -19,64 +19,64 @@ def test_parse_args_none() -> None:

 def test_parse_args_defaults(mocker) -> None:
-    mocker.patch.object(Path, "is_file", MagicMock(side_effect=[False, True]))
+    mocker.patch.object(Path, 'is_file', MagicMock(side_effect=[False, True]))
     args = Arguments(['trade']).get_parsed_arg()
-    assert args["config"] == ['config.json']
-    assert args["strategy_path"] is None
-    assert args["datadir"] is None
-    assert args["verbosity"] == 0
+    assert args['config'] == ['config.json']
+    assert args['strategy_path'] is None
+    assert args['datadir'] is None
+    assert args['verbosity'] == 0


 def test_parse_args_default_userdatadir(mocker) -> None:
-    mocker.patch.object(Path, "is_file", MagicMock(return_value=True))
+    mocker.patch.object(Path, 'is_file', MagicMock(return_value=True))
     args = Arguments(['trade']).get_parsed_arg()
     # configuration defaults to user_data if that is available.
-    assert args["config"] == [str(Path('user_data/config.json'))]
-    assert args["strategy_path"] is None
-    assert args["datadir"] is None
-    assert args["verbosity"] == 0
+    assert args['config'] == [str(Path('user_data/config.json'))]
+    assert args['strategy_path'] is None
+    assert args['datadir'] is None
+    assert args['verbosity'] == 0


 def test_parse_args_userdatadir(mocker) -> None:
-    mocker.patch.object(Path, "is_file", MagicMock(return_value=True))
+    mocker.patch.object(Path, 'is_file', MagicMock(return_value=True))
     args = Arguments(['trade', '--user-data-dir', 'user_data']).get_parsed_arg()
     # configuration defaults to user_data if that is available.
-    assert args["config"] == [str(Path('user_data/config.json'))]
-    assert args["strategy_path"] is None
-    assert args["datadir"] is None
-    assert args["verbosity"] == 0
+    assert args['config'] == [str(Path('user_data/config.json'))]
+    assert args['strategy_path'] is None
+    assert args['datadir'] is None
+    assert args['verbosity'] == 0


 def test_parse_args_config() -> None:
     args = Arguments(['trade', '-c', '/dev/null']).get_parsed_arg()
-    assert args["config"] == ['/dev/null']
+    assert args['config'] == ['/dev/null']

     args = Arguments(['trade', '--config', '/dev/null']).get_parsed_arg()
-    assert args["config"] == ['/dev/null']
+    assert args['config'] == ['/dev/null']

     args = Arguments(['trade', '--config', '/dev/null',
                       '--config', '/dev/zero'],).get_parsed_arg()
-    assert args["config"] == ['/dev/null', '/dev/zero']
+    assert args['config'] == ['/dev/null', '/dev/zero']


 def test_parse_args_db_url() -> None:
     args = Arguments(['trade', '--db-url', 'sqlite:///test.sqlite']).get_parsed_arg()
-    assert args["db_url"] == 'sqlite:///test.sqlite'
+    assert args['db_url'] == 'sqlite:///test.sqlite'


 def test_parse_args_verbose() -> None:
     args = Arguments(['trade', '-v']).get_parsed_arg()
-    assert args["verbosity"] == 1
+    assert args['verbosity'] == 1

     args = Arguments(['trade', '--verbose']).get_parsed_arg()
-    assert args["verbosity"] == 1
+    assert args['verbosity'] == 1


 def test_common_scripts_options() -> None:
     args = Arguments(['download-data', '-p', 'ETH/BTC', 'XRP/BTC']).get_parsed_arg()
-    assert args["pairs"] == ['ETH/BTC', 'XRP/BTC']
-    assert "func" in args
+    assert args['pairs'] == ['ETH/BTC', 'XRP/BTC']
+    assert 'func' in args


 def test_parse_args_version() -> None:
@@ -91,7 +91,7 @@ def test_parse_args_invalid() -> None:

 def test_parse_args_strategy() -> None:
     args = Arguments(['trade', '--strategy', 'SomeStrategy']).get_parsed_arg()
-    assert args["strategy"] == 'SomeStrategy'
+    assert args['strategy'] == 'SomeStrategy'


 def test_parse_args_strategy_invalid() -> None:

@@ -101,7 +101,7 @@ def test_parse_args_strategy_invalid() -> None:

 def test_parse_args_strategy_path() -> None:
     args = Arguments(['trade', '--strategy-path', '/some/path']).get_parsed_arg()
-    assert args["strategy_path"] == '/some/path'
+    assert args['strategy_path'] == '/some/path'


 def test_parse_args_strategy_path_invalid() -> None:

@@ -127,13 +127,13 @@ def test_parse_args_backtesting_custom() -> None:
         'SampleStrategy'
     ]
     call_args = Arguments(args).get_parsed_arg()
-    assert call_args["config"] == ['test_conf.json']
-    assert call_args["verbosity"] == 0
-    assert call_args["command"] == 'backtesting'
-    assert call_args["func"] is not None
-    assert call_args["timeframe"] == '1m'
-    assert type(call_args["strategy_list"]) is list
-    assert len(call_args["strategy_list"]) == 2
+    assert call_args['config'] == ['test_conf.json']
+    assert call_args['verbosity'] == 0
+    assert call_args['command'] == 'backtesting'
+    assert call_args['func'] is not None
+    assert call_args['timeframe'] == '1m'
+    assert type(call_args['strategy_list']) is list
+    assert len(call_args['strategy_list']) == 2


 def test_parse_args_hyperopt_custom() -> None:

@@ -144,13 +144,13 @@ def test_parse_args_hyperopt_custom() -> None:
         '--spaces', 'buy'
     ]
     call_args = Arguments(args).get_parsed_arg()
-    assert call_args["config"] == ['test_conf.json']
-    assert call_args["epochs"] == 20
-    assert call_args["verbosity"] == 0
-    assert call_args["command"] == 'hyperopt'
-    assert call_args["spaces"] == ['buy']
-    assert call_args["func"] is not None
-    assert callable(call_args["func"])
+    assert call_args['config'] == ['test_conf.json']
+    assert call_args['epochs'] == 20
+    assert call_args['verbosity'] == 0
+    assert call_args['command'] == 'hyperopt'
+    assert call_args['spaces'] == ['buy']
+    assert call_args['func'] is not None
+    assert callable(call_args['func'])


 def test_download_data_options() -> None:
@@ -163,10 +163,10 @@ def test_download_data_options() -> None:
     ]
     pargs = Arguments(args).get_parsed_arg()
-    assert pargs["pairs_file"] == 'file_with_pairs'
-    assert pargs["datadir"] == 'datadir/directory'
-    assert pargs["days"] == 30
-    assert pargs["exchange"] == 'binance'
+    assert pargs['pairs_file'] == 'file_with_pairs'
+    assert pargs['datadir'] == 'datadir/directory'
+    assert pargs['days'] == 30
+    assert pargs['exchange'] == 'binance'


 def test_plot_dataframe_options() -> None:

@@ -180,10 +180,10 @@ def test_plot_dataframe_options() -> None:
     ]
     pargs = Arguments(args).get_parsed_arg()
-    assert pargs["indicators1"] == ["sma10", "sma100"]
-    assert pargs["indicators2"] == ["macd", "fastd", "fastk"]
-    assert pargs["plot_limit"] == 30
-    assert pargs["pairs"] == ["UNITTEST/BTC"]
+    assert pargs['indicators1'] == ['sma10', 'sma100']
+    assert pargs['indicators2'] == ['macd', 'fastd', 'fastk']
+    assert pargs['plot_limit'] == 30
+    assert pargs['pairs'] == ['UNITTEST/BTC']


 def test_plot_profit_options() -> None:
@@ -191,66 +191,66 @@ def test_plot_profit_options() -> None:
         'plot-profit',
         '-p', 'UNITTEST/BTC',
         '--trade-source', 'DB',
-        "--db-url", "sqlite:///whatever.sqlite",
+        '--db-url', 'sqlite:///whatever.sqlite',
     ]
     pargs = Arguments(args).get_parsed_arg()
-    assert pargs["trade_source"] == "DB"
-    assert pargs["pairs"] == ["UNITTEST/BTC"]
-    assert pargs["db_url"] == "sqlite:///whatever.sqlite"
+    assert pargs['trade_source'] == 'DB'
+    assert pargs['pairs'] == ['UNITTEST/BTC']
+    assert pargs['db_url'] == 'sqlite:///whatever.sqlite'


 def test_config_notallowed(mocker) -> None:
-    mocker.patch.object(Path, "is_file", MagicMock(return_value=False))
+    mocker.patch.object(Path, 'is_file', MagicMock(return_value=False))
     args = [
         'create-userdir',
     ]
     pargs = Arguments(args).get_parsed_arg()
-    assert "config" not in pargs
+    assert 'config' not in pargs

     # When file exists:
-    mocker.patch.object(Path, "is_file", MagicMock(return_value=True))
+    mocker.patch.object(Path, 'is_file', MagicMock(return_value=True))
     args = [
         'create-userdir',
     ]
     pargs = Arguments(args).get_parsed_arg()
     # config is not added even if it exists, since create-userdir is in the notallowed list
-    assert "config" not in pargs
+    assert 'config' not in pargs


 def test_config_notrequired(mocker) -> None:
-    mocker.patch.object(Path, "is_file", MagicMock(return_value=False))
+    mocker.patch.object(Path, 'is_file', MagicMock(return_value=False))
     args = [
         'download-data',
     ]
     pargs = Arguments(args).get_parsed_arg()
-    assert pargs["config"] is None
+    assert pargs['config'] is None

     # When file exists:
-    mocker.patch.object(Path, "is_file", MagicMock(side_effect=[False, True]))
+    mocker.patch.object(Path, 'is_file', MagicMock(side_effect=[False, True]))
     args = [
         'download-data',
     ]
     pargs = Arguments(args).get_parsed_arg()
     # config is added if it exists
-    assert pargs["config"] == ['config.json']
+    assert pargs['config'] == ['config.json']


 def test_check_int_positive() -> None:
-    assert check_int_positive("3") == 3
-    assert check_int_positive("1") == 1
-    assert check_int_positive("100") == 100
+    assert check_int_positive('3') == 3
+    assert check_int_positive('1') == 1
+    assert check_int_positive('100') == 100

     with pytest.raises(argparse.ArgumentTypeError):
-        check_int_positive("-2")
+        check_int_positive('-2')

     with pytest.raises(argparse.ArgumentTypeError):
-        check_int_positive("0")
+        check_int_positive('0')

     with pytest.raises(argparse.ArgumentTypeError):
-        check_int_positive("3.5")
+        check_int_positive('3.5')

     with pytest.raises(argparse.ArgumentTypeError):
-        check_int_positive("DeadBeef")
+        check_int_positive('DeadBeef')
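For reference, `check_int_positive` as exercised above accepts strings that represent strictly positive integers and raises `argparse.ArgumentTypeError` for everything else. A minimal sketch consistent with these assertions (not necessarily the project's exact implementation or error message):

```python
import argparse


def check_int_positive(value: str) -> int:
    # Accept strictly positive integer strings; reject zero, negatives and non-integers.
    try:
        uint = int(value)
        if uint <= 0:
            raise ValueError
    except ValueError:
        raise argparse.ArgumentTypeError(
            f"{value} is invalid for this parameter, should be a positive integer value"
        )
    return uint
```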

View File

@@ -21,7 +21,7 @@ from freqtrade.configuration.deprecated_settings import (
 from freqtrade.configuration.load_config import load_config_file, log_config_error_range
 from freqtrade.constants import DEFAULT_DB_DRYRUN_URL, DEFAULT_DB_PROD_URL
 from freqtrade.exceptions import OperationalException
-from freqtrade.loggers import _set_loggers, setup_logging
+from freqtrade.loggers import _set_loggers, setup_logging, setup_logging_pre
 from freqtrade.state import RunMode
 from tests.conftest import (log_has, log_has_re,
                             patched_configuration_load_config_file)

@@ -674,10 +674,12 @@ def test_set_loggers_syslog(mocker):
         'logfile': 'syslog:/dev/log',
     }

+    setup_logging_pre()
     setup_logging(config)
-    assert len(logger.handlers) == 2
+    assert len(logger.handlers) == 3
     assert [x for x in logger.handlers if type(x) == logging.handlers.SysLogHandler]
     assert [x for x in logger.handlers if type(x) == logging.StreamHandler]
+    assert [x for x in logger.handlers if type(x) == logging.handlers.BufferingHandler]
     # reset handlers to not break pytest
     logger.handlers = orig_handlers
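Note: the extra handler expected here is an in-memory buffering handler registered early via `setup_logging_pre()`, so that recent log records can later be served through the new log commands. A rough sketch of the underlying idea; the capacity and the plain `BufferingHandler` are illustrative, and the project may well use its own subclass:

```python
import logging
from logging.handlers import BufferingHandler

# Illustrative capacity - the value used by the bot may differ.
bufferhandler = BufferingHandler(1000)
logging.getLogger().addHandler(bufferhandler)

logging.getLogger("demo").warning("something happened")

# Buffered records can later be rendered as (date, created, name, level, message) entries.
for rec in bufferhandler.buffer:
    print(rec.created, rec.name, rec.levelname, rec.getMessage())
```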
@@ -727,7 +729,10 @@ def test_set_logfile(default_conf, mocker):
     assert validated_conf['logfile'] == "test_file.log"
     f = Path("test_file.log")
     assert f.is_file()
-    f.unlink()
+    try:
+        f.unlink()
+    except Exception:
+        pass


 def test_load_config_warn_forcebuy(default_conf, mocker, caplog) -> None:

@@ -1005,7 +1010,7 @@ def test_pairlist_resolving_fallback(mocker):
     args = Arguments(arglist).get_parsed_arg()
     # Fix flaky tests if config.json exists
-    args["config"] = None
+    args['config'] = None

     configuration = Configuration(args, RunMode.OTHER)
     config = configuration.get_config()

tests/test_indicators.py (new file, 18 additions)
View File

@@ -0,0 +1,18 @@
+import freqtrade.vendor.qtpylib.indicators as qtpylib
+import numpy as np
+import pandas as pd
+
+
+def test_crossed_numpy_types():
+    """
+    This test is only present since this method currently diverges from the qtpylib implementation.
+    And we must ensure to not break this again once we update from the original source.
+    """
+    series = pd.Series([56, 97, 19, 76, 65, 25, 87, 91, 79, 79])
+    expected_result = pd.Series([False, True, False, True, False, False, True, False, False, False])
+
+    assert qtpylib.crossed_above(series, 60).equals(expected_result)
+    assert qtpylib.crossed_above(series, 60.0).equals(expected_result)
+    assert qtpylib.crossed_above(series, np.int32(60)).equals(expected_result)
+    assert qtpylib.crossed_above(series, np.int64(60)).equals(expected_result)
+    assert qtpylib.crossed_above(series, np.float64(60.0)).equals(expected_result)
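Note: `crossed_above(series, value)` flags the positions where the series moves from at-or-below the reference value to above it; for the series used above and a threshold of 60, that happens at positions 1, 3 and 6, whether the threshold is a plain Python number or a numpy scalar. A standalone sketch of the same check (not the vendored qtpylib implementation itself):

```python
import numpy as np
import pandas as pd


def crossed_above_sketch(series: pd.Series, value) -> pd.Series:
    # Broadcast the reference value (int, float or numpy scalar) to a constant series,
    # then flag rows where the previous value was at-or-below it and the current one is above.
    ref = pd.Series(np.full(len(series), value), index=series.index)
    return (series > ref) & (series.shift(1) <= ref.shift(1))


series = pd.Series([56, 97, 19, 76, 65, 25, 87, 91, 79, 79])
print(crossed_above_sketch(series, np.int64(60)))  # True at positions 1, 3 and 6
```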

View File

@@ -44,19 +44,19 @@ def test_parse_args_backtesting(mocker) -> None:

 def test_main_start_hyperopt(mocker) -> None:
-    mocker.patch.object(Path, "is_file", MagicMock(side_effect=[False, True]))
+    mocker.patch.object(Path, 'is_file', MagicMock(side_effect=[False, True]))
     hyperopt_mock = mocker.patch('freqtrade.commands.start_hyperopt', MagicMock())
-    hyperopt_mock.__name__ = PropertyMock("start_hyperopt")
+    hyperopt_mock.__name__ = PropertyMock('start_hyperopt')
     # it's sys.exit(0) at the end of hyperopt
     with pytest.raises(SystemExit):
         main(['hyperopt'])

     assert hyperopt_mock.call_count == 1
     call_args = hyperopt_mock.call_args[0][0]
-    assert call_args["config"] == ['config.json']
-    assert call_args["verbosity"] == 0
-    assert call_args["command"] == 'hyperopt'
-    assert call_args["func"] is not None
-    assert callable(call_args["func"])
+    assert call_args['config'] == ['config.json']
+    assert call_args['verbosity'] == 0
+    assert call_args['command'] == 'hyperopt'
+    assert call_args['func'] is not None
+    assert callable(call_args['func'])


 def test_main_fatal_exception(mocker, default_conf, caplog) -> None:

View File

@@ -362,22 +362,22 @@ def test_start_plot_profit(mocker):

 def test_start_plot_profit_error(mocker):
     args = [
-        "plot-profit",
-        "--pairs", "ETH/BTC"
+        'plot-profit',
+        '--pairs', 'ETH/BTC'
     ]
     argsp = get_args(args)
     # Make sure we use no config. Details: #2241
     # not resetting config causes random failures if config.json exists
-    argsp["config"] = []
+    argsp['config'] = []
     with pytest.raises(OperationalException):
         start_plot_profit(argsp)


 def test_plot_profit(default_conf, mocker, testdatadir, caplog):
     default_conf['trade_source'] = 'file'
-    default_conf["datadir"] = testdatadir
-    default_conf['exportfilename'] = testdatadir / "backtest-result_test_nofile.json"
-    default_conf['pairs'] = ["ETH/BTC", "LTC/BTC"]
+    default_conf['datadir'] = testdatadir
+    default_conf['exportfilename'] = testdatadir / 'backtest-result_test_nofile.json'
+    default_conf['pairs'] = ['ETH/BTC', 'LTC/BTC']

     profit_mock = MagicMock()
     store_mock = MagicMock()

View File

@@ -1,5 +1,3 @@
 import talib.abstract as ta
 import pandas as pd

tests/testdata/UNITTEST_BTC-5m.h5 (new vendored binary file, content not shown)

tests/testdata/XRP_ETH-trades.h5 (new vendored binary file, content not shown)
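Note: these `.h5` fixtures appear to back HDF5-based storage of OHLCV candles and trades in the test data. A quick way to peek into such a file with pandas (requires the optional `tables` dependency); listing the store's keys avoids guessing them, since the key layout is not documented by this commit:

```python
import pandas as pd

# Open the HDF5 store read-only and inspect it before loading anything.
with pd.HDFStore("tests/testdata/UNITTEST_BTC-5m.h5", mode="r") as store:
    keys = store.keys()
    print(keys)          # available datasets inside the file
    df = store[keys[0]]  # load the first one as a DataFrame

print(df.head())
```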