Merge branch 'develop' into volumeList_enhanced_filter

Matthias 2019-10-30 16:41:17 +01:00
commit dee9b84322
29 changed files with 434 additions and 139 deletions

View File

@ -44,7 +44,7 @@
"ZEC/BTC", "ZEC/BTC",
"XLM/BTC", "XLM/BTC",
"NXT/BTC", "NXT/BTC",
"POWR/BTC", "TRX/BTC",
"ADA/BTC", "ADA/BTC",
"XMR/BTC" "XMR/BTC"
], ],

View File

@ -81,7 +81,7 @@
"ZEC/BTC", "ZEC/BTC",
"XLM/BTC", "XLM/BTC",
"NXT/BTC", "NXT/BTC",
"POWR/BTC", "TRX/BTC",
"ADA/BTC", "ADA/BTC",
"XMR/BTC" "XMR/BTC"
], ],

View File

@ -72,6 +72,8 @@ The exported trades can be used for [further analysis](#further-backtest-result-
freqtrade backtesting --export trades --export-filename=backtest_samplestrategy.json freqtrade backtesting --export trades --export-filename=backtest_samplestrategy.json
``` ```
Please also read about the [strategy startup period](strategy-customization.md#strategy-startup-period).
#### Supplying custom fee value #### Supplying custom fee value
Sometimes your account has certain fee rebates (fee reductions starting with a certain account size or monthly volume), which are not visible to ccxt. Sometimes your account has certain fee rebates (fee reductions starting with a certain account size or monthly volume), which are not visible to ccxt.

View File

@ -3,74 +3,101 @@
The `stoploss` configuration parameter is loss in percentage that should trigger a sale. The `stoploss` configuration parameter is loss in percentage that should trigger a sale.
For example, value `-0.10` will cause immediate sell if the profit dips below -10% for a given trade. This parameter is optional. For example, value `-0.10` will cause immediate sell if the profit dips below -10% for a given trade. This parameter is optional.
Most of the strategy files already include the optimal `stoploss` Most of the strategy files already include the optimal `stoploss` value.
value. This parameter is optional. If you use it in the configuration file, it will take over the
`stoploss` value from the strategy file.
## Stop Loss support !!! Info
All stoploss properties mentioned in this file can be set in the Strategy, or in the configuration. Configuration values will override the strategy values.
## Stop Loss Types
At this stage the bot contains the following stoploss support modes: At this stage the bot contains the following stoploss support modes:
1. static stop loss, defined in either the strategy or configuration. 1. Static stop loss.
2. trailing stop loss, defined in the configuration. 2. Trailing stop loss.
3. trailing stop loss, custom positive loss, defined in configuration. 3. Trailing stop loss, custom positive loss.
4. Trailing stop loss only once the trade has reached a certain offset.
!!! Note Those stoploss modes can be *on exchange* or *off exchange*. If the stoploss is *on exchange* it means a stoploss limit order is placed on the exchange immediately after the buy order fills successfully. This will protect you against sudden crashes in the market, as the order is in the queue immediately, and if the market goes down the order has a better chance of being fulfilled.
All stoploss properties can be configured in either Strategy or configuration. Configuration values override strategy values.
Those stoploss modes can be *on exchange* or *off exchange*. If the stoploss is *on exchange* it means a stoploss limit order is placed on the exchange immediately after buy order happens successfuly. This will protect you against sudden crashes in market as the order will be in the queue immediately and if market goes down then the order has more chance of being fulfilled. In case of stoploss on exchange there is another parameter called `stoploss_on_exchange_interval`. This configures the interval in seconds at which the bot will check the stoploss and update it if necessary.
In case of stoploss on exchange there is another parameter called `stoploss_on_exchange_interval`. This configures the interval in seconds at which the bot will check the stoploss and update it if necessary. As an example in case of trailing stoploss if the order is on the exchange and the market is going up then the bot automatically cancels the previous stoploss order and put a new one with a stop value higher than previous one. It is clear that the bot cannot do it every 5 seconds otherwise it gets banned. So this parameter will tell the bot how often it should update the stoploss order. The default value is 60 (1 minute). For example, assuming the stoploss is on exchange, and trailing stoploss is enabled, and the market is going up, then the bot automatically cancels the previous stoploss order and puts a new one with a stop value higher than the previous stoploss order.
The bot cannot do this every 5 seconds (at each iteration), otherwise it would get banned by the exchange.
So this parameter will tell the bot how often it should update the stoploss order. The default value is 60 (1 minute).
This same logic will reapply a stoploss order on the exchange should you cancel it accidentally.
!!! Note !!! Note
Stoploss on exchange is only supported for Binance as of now. Stoploss on exchange is only supported for Binance as of now.
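For reference, a minimal sketch of how these options might look as a strategy's `order_types` attribute (the key names appear elsewhere in this changeset; the values are purely illustrative):
``` python
# Illustrative values only: place the stoploss on the exchange as a limit order
# and let the bot check/update the stoploss order once per minute.
order_types = {
    'buy': 'limit',
    'sell': 'limit',
    'stoploss': 'limit',
    'stoploss_on_exchange': True,
    'stoploss_on_exchange_interval': 60,
}
```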
## Static Stop Loss ## Static Stop Loss
This is very simple, basically you define a stop loss of x in your strategy file or alternative in the configuration, which This is very simple, you define a stop loss of x (as a ratio of price, i.e. x * 100% of price). This will try to sell the asset once the loss exceeds the defined loss.
will overwrite the strategy definition. This will basically try to sell your asset, the second the loss exceeds the defined loss.
## Trailing Stop Loss ## Trailing Stop Loss
The initial value for this stop loss, is defined in your strategy or configuration. Just as you would define your Stop Loss normally. The initial value for this is `stoploss`, just as you would define your static Stop loss.
To enable this Feauture all you have to do is to define the configuration element: To enable trailing stoploss:
``` json ``` python
"trailing_stop" : True trailing_stop = True
``` ```
This will now activate an algorithm, which automatically moves your stop loss up every time the price of your asset increases. This will now activate an algorithm, which automatically moves the stop loss up every time the price of your asset increases.
For example, simplified math, For example, simplified math:
* you buy an asset at a price of 100$ * the bot buys an asset at a price of 100$
* your stop loss is defined at 2% * the stop loss is defined at 2%
* which means your stop loss, gets triggered once your asset dropped below 98$ * the stop loss would get triggered once the asset drops below 98$
* assuming your asset now increases to 102$ * assuming the asset now increases to 102$
* your stop loss, will now be 2% of 102$ or 99.96$ * the stop loss will now be 2% of 102$ or 99.96$
* now your asset drops in value to 101$, your stop loss, will still be 99.96$ * now the asset drops in value to 101$, the stop loss will still be 99.96$ and would trigger at 99.96$.
basically what this means is that your stop loss will be adjusted to be always be 2% of the highest observed price In summary: The stoploss will be adjusted to always be 2% of the highest observed price.
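A tiny Python sketch of that arithmetic (illustrative only, not freqtrade's internal implementation):
``` python
# Track the highest observed price and keep the stop 2% below it.
stoploss = -0.02
highest_price = 0.0

for price in [100.0, 102.0, 101.0]:
    highest_price = max(highest_price, price)
    stop_price = highest_price * (1 + stoploss)
    print(f"price={price:.2f}  stop={stop_price:.2f}")
# price=100.00  stop=98.00
# price=102.00  stop=99.96
# price=101.00  stop=99.96  (the stop never moves down)
```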
### Custom positive loss ### Custom positive stoploss
Due to demand, it is possible to have a default stop loss, when you are in the red with your buy, but once your profit surpasses a certain percentage, It is also possible to have a default stop loss, when you are in the red with your buy, but once your profit surpasses a certain percentage, the system will utilize a new stop loss, which can have a different value.
the system will utilize a new stop loss, which can be a different value. For example your default stop loss is 5%, but once you have 1.1% profit, For example your default stop loss is 5%, but once you have 1.1% profit, it will be changed to be only a 1% stop loss, which trails the green candles until it goes below them.
it will be changed to be only a 1% stop loss, which trails the green candles until it goes below them.
Both values can be configured in the main configuration file and requires `"trailing_stop": true` to be set to true. Both values require `trailing_stop` to be set to true.
``` json ``` python
"trailing_stop_positive": 0.01, trailing_stop_positive = 0.01
"trailing_stop_positive_offset": 0.011, trailing_stop_positive_offset = 0.011
"trailing_only_offset_is_reached": false
``` ```
The 0.01 would translate to a 1% stop loss, once you hit 1.1% profit. The 0.01 would translate to a 1% stop loss, once you hit 1.1% profit.
Before this, `stoploss` is used for the trailing stoploss.
You should also make sure to have this value (`trailing_stop_positive_offset`) lower than your minimal ROI, otherwise minimal ROI will apply first and sell your trade. Read the [next section](#trailing-only-once-offset-is-reached) to keep stoploss at 5% of the entry point.
If `"trailing_only_offset_is_reached": true` then the trailing stoploss is only activated once the offset is reached. Until then, the stoploss remains at the configured`stoploss`. !!! Tip
Make sure to have this value (`trailing_stop_positive_offset`) lower than minimal ROI, otherwise minimal ROI will apply first and sell the trade.
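For example (hypothetical values), keeping the offset below the smallest ROI target:
``` python
# Hypothetical values: the 1.1% offset stays below the smallest ROI target (2%),
# so the trailing stop can engage before minimal ROI closes the trade.
minimal_roi = {"0": 0.04, "30": 0.02}
trailing_stop = True
trailing_stop_positive = 0.01
trailing_stop_positive_offset = 0.011
```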
### Trailing only once offset is reached
It is also possible to use a static stoploss until the offset is reached, and then trail the trade to take profits once the market turns.
If `"trailing_only_offset_is_reached": true` then the trailing stoploss is only activated once the offset is reached. Until then, the stoploss remains at the configured `stoploss`.
This option can be used with or without `trailing_stop_positive`, but uses `trailing_stop_positive_offset` as offset.
``` python
trailing_stop_positive_offset = 0.011
trailing_only_offset_is_reached = True
```
Simplified example:
``` python
stoploss = 0.05
trailing_stop_positive_offset = 0.03
trailing_only_offset_is_reached = True
```
* the bot buys an asset at a price of 100$
* the stop loss is defined at 5%
* the stop loss will remain at 95$ (5% below the open price) until profit reaches +3%
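A simplified Python sketch of how the effective stop behaves with these settings (assuming `trailing_stop_positive = 0.01` is also set; illustrative only, not freqtrade's internal implementation):
``` python
stoploss = -0.05
trailing_stop_positive = 0.01
trailing_stop_positive_offset = 0.03

open_price = 100.0
highest_price = open_price

for price in [100.0, 101.0, 103.0, 102.0]:
    highest_price = max(highest_price, price)
    profit_from_open = highest_price / open_price - 1
    if profit_from_open >= trailing_stop_positive_offset:
        # Offset reached: trail 1% below the highest observed price.
        stop_price = highest_price * (1 - trailing_stop_positive)
    else:
        # Offset not reached: keep the static 5% stoploss from the open price.
        stop_price = open_price * (1 + stoploss)
    print(f"price={price:6.2f}  stop={stop_price:6.2f}")
```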
## Changing stoploss on open trades ## Changing stoploss on open trades

View File

@ -117,6 +117,37 @@ def populate_indicators(self, dataframe: DataFrame, metadata: dict) -> DataFrame
Look into the [user_data/strategies/sample_strategy.py](https://github.com/freqtrade/freqtrade/blob/develop/user_data/strategies/sample_strategy.py). Look into the [user_data/strategies/sample_strategy.py](https://github.com/freqtrade/freqtrade/blob/develop/user_data/strategies/sample_strategy.py).
Then uncomment indicators you need. Then uncomment indicators you need.
### Strategy startup period
Most indicators have an unstable startup period, in which they are either not available, or the calculation is incorrect. This can lead to inconsistencies, since Freqtrade does not know how long this unstable period should be.
To account for this, the strategy can be assigned the `startup_candle_count` attribute.
This should be set to the maximum number of candles that the strategy requires to calculate stable indicators.
In this example strategy, this should be set to 100 (`startup_candle_count = 100`), since the longest needed history is 100 candles.
``` python
dataframe['ema100'] = ta.EMA(dataframe, timeperiod=100)
```
By letting the bot know how much history is needed, backtest trades can start at the specified timerange during backtesting and hyperopt.
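A minimal sketch of how this looks inside a strategy class (the `startup_candle_count` attribute is introduced in this changeset; the class itself is illustrative and omits the other required attributes and methods):
``` python
import talib.abstract as ta
from pandas import DataFrame

from freqtrade.strategy.interface import IStrategy


class Ema100Strategy(IStrategy):
    # The longest indicator below needs 100 candles of history.
    startup_candle_count: int = 100

    # (minimal_roi, stoploss, buy/sell trend methods etc. omitted for brevity)

    def populate_indicators(self, dataframe: DataFrame, metadata: dict) -> DataFrame:
        dataframe['ema100'] = ta.EMA(dataframe, timeperiod=100)
        return dataframe
```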
!!! Warning
`startup_candle_count` should be below `ohlcv_candle_limit` (which is 500 for most exchanges) - since only this amount of candles will be available during Dry-Run/Live Trade operations.
#### Example
Let's try to backtest 1 month (January 2019) of 5m candles using an example strategy with EMA100, as above.
``` bash
freqtrade backtesting --timerange 20190101-20190201 --ticker-interval 5m
```
Assuming `startup_candle_count` is set to 100, backtesting knows it needs 100 candles to generate valid buy signals. It will load data from `20190101 - (100 * 5m)` - which is ~2018-12-31 15:40:00.
If this data is available, indicators will be calculated with this extended timerange. The unstable startup period (up to 2019-01-01 00:00:00) will then be removed before starting backtesting.
!!! Note
If data for the startup period is not available, then the timerange will be adjusted to account for this startup period - so Backtesting would start at 2019-01-01 08:20:00.
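A quick way to verify the arithmetic from this example (plain Python, no freqtrade imports needed):
``` python
from datetime import datetime, timedelta

startup_candle_count = 100
candle = timedelta(minutes=5)
backtest_start = datetime(2019, 1, 1)

# Data loading is extended backwards by 100 candles ...
print(backtest_start - startup_candle_count * candle)  # 2018-12-31 15:40:00

# ... and if that data is not available, the start is pushed forward instead.
print(backtest_start + startup_candle_count * candle)  # 2019-01-01 08:20:00
```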
### Buy signal rules ### Buy signal rules
Edit the method `populate_buy_trend()` in your strategy file to update your buy strategy. Edit the method `populate_buy_trend()` in your strategy file to update your buy strategy.
@ -267,10 +298,10 @@ class Awesomestrategy(IStrategy):
``` ```
!!! Warning !!! Warning
The data is not persisted after a bot-restart (or config-reload). Also, the amount of data should be kept smallish (no DataFrames and such), otherwise the bot will start to consume a lot of memory and eventually run out of memory and crash. The data is not persisted after a bot-restart (or config-reload). Also, the amount of data should be kept smallish (no DataFrames and such), otherwise the bot will start to consume a lot of memory and eventually run out of memory and crash.
!!! Note !!! Note
If the data is pair-specific, make sure to use pair as one of the keys in the dictionary. If the data is pair-specific, make sure to use pair as one of the keys in the dictionary.
### Additional data (DataProvider) ### Additional data (DataProvider)

View File

@ -1,11 +1,14 @@
""" """
This module contains the argument manager class This module contains the argument manager class
""" """
import logging
import re import re
from typing import Optional from typing import Optional
import arrow import arrow
logger = logging.getLogger(__name__)
class TimeRange: class TimeRange:
""" """
@ -27,6 +30,34 @@ class TimeRange:
return (self.starttype == other.starttype and self.stoptype == other.stoptype return (self.starttype == other.starttype and self.stoptype == other.stoptype
and self.startts == other.startts and self.stopts == other.stopts) and self.startts == other.startts and self.stopts == other.stopts)
def subtract_start(self, seconds) -> None:
"""
Subtracts <seconds> from startts if startts is set.
:param seconds: Seconds to subtract from starttime
:return: None (Modifies the object in place)
"""
if self.startts:
self.startts = self.startts - seconds
def adjust_start_if_necessary(self, ticker_interval_secs: int, startup_candles: int,
min_date: arrow.Arrow) -> None:
"""
Adjust startts by <startup_candles> candles.
Applies only if no startup-candles have been available.
:param ticker_interval_secs: Ticker interval in seconds e.g. `timeframe_to_seconds('5m')`
:param startup_candles: Number of candles to move start-date forward
:param min_date: Minimum data date loaded. Key criterion to decide if start-time
has to be moved
:return: None (Modifies the object in place)
"""
if (not self.starttype or (startup_candles
and min_date.timestamp >= self.startts)):
# If no startts was defined, or backtest-data starts at the defined backtest-date
logger.warning("Moving start-date by %s candles to account for startup time.",
startup_candles)
self.startts = (min_date.timestamp + ticker_interval_secs * startup_candles)
self.starttype = 'date'
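A short usage sketch of the two new helpers, with values mirroring the unit tests further below:
``` python
import arrow

from freqtrade.configuration import TimeRange

five_min_secs = 300
startup_candles = 20

# Extend the range to load 20 extra 5m candles before the requested start:
load_range = TimeRange('date', 'date', 1510694100, 1510780500)
load_range.subtract_start(five_min_secs * startup_candles)
assert load_range.startts == 1510694100 - 20 * 300

# If the data on disk only begins at the requested start date,
# the backtest start is pushed forward by the startup period instead:
timerange = TimeRange('date', 'date', 1510694100, 1510780500)
min_date = arrow.get(1510694100)
timerange.adjust_start_if_necessary(five_min_secs, startup_candles, min_date)
assert timerange.startts == 1510694100 + 20 * 300
```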
@staticmethod @staticmethod
def parse_timerange(text: Optional[str]): def parse_timerange(text: Optional[str]):
""" """

View File

@ -150,15 +150,21 @@ def combine_tickers_with_mean(tickers: Dict[str, pd.DataFrame], column: str = "c
return df_comb return df_comb
def create_cum_profit(df: pd.DataFrame, trades: pd.DataFrame, col_name: str) -> pd.DataFrame: def create_cum_profit(df: pd.DataFrame, trades: pd.DataFrame, col_name: str,
timeframe: str) -> pd.DataFrame:
""" """
Adds a column `col_name` with the cumulative profit for the given trades array. Adds a column `col_name` with the cumulative profit for the given trades array.
:param df: DataFrame with date index :param df: DataFrame with date index
:param trades: DataFrame containing trades (requires columns close_time and profitperc) :param trades: DataFrame containing trades (requires columns close_time and profitperc)
:param col_name: Column name that will be assigned the results
:param timeframe: Timeframe used during the operations
:return: Returns df with one additional column, col_name, containing the cumulative profit. :return: Returns df with one additional column, col_name, containing the cumulative profit.
""" """
# Use groupby/sum().cumsum() to avoid errors when multiple trades sold at the same candle. from freqtrade.exchange import timeframe_to_minutes
df[col_name] = trades.groupby('close_time')['profitperc'].sum().cumsum() ticker_minutes = timeframe_to_minutes(timeframe)
# Resample to ticker_interval to make sure trades match candles
_trades_sum = trades.resample(f'{ticker_minutes}min', on='close_time')[['profitperc']].sum()
df.loc[:, col_name] = _trades_sum.cumsum()
# Set first value to 0 # Set first value to 0
df.loc[df.iloc[0].name, col_name] = 0 df.loc[df.iloc[0].name, col_name] = 0
# FFill to get continuous # FFill to get continuous
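A minimal pandas sketch (not freqtrade code) of what the resampling does: trade close times that fall inside a candle are bucketed onto the candle's start, so the summed profit can be joined against the 5m OHLCV index.
``` python
import pandas as pd

trades = pd.DataFrame({
    'close_time': pd.to_datetime(['2018-01-10 00:02:20', '2018-01-10 00:07:40']),
    'profitperc': [0.01, 0.02],
})
print(trades.resample('5min', on='close_time')[['profitperc']].sum())
#                      profitperc
# close_time
# 2018-01-10 00:00:00        0.01
# 2018-01-10 00:05:00        0.02
```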

View File

@ -8,17 +8,19 @@ Includes:
import logging import logging
import operator import operator
from copy import deepcopy
from datetime import datetime from datetime import datetime
from pathlib import Path from pathlib import Path
from typing import Any, Dict, List, Optional, Tuple from typing import Any, Dict, List, Optional, Tuple
import arrow import arrow
import pytz
from pandas import DataFrame from pandas import DataFrame
from freqtrade import OperationalException, misc from freqtrade import OperationalException, misc
from freqtrade.configuration import TimeRange from freqtrade.configuration import TimeRange
from freqtrade.data.converter import parse_ticker_dataframe, trades_to_ohlcv from freqtrade.data.converter import parse_ticker_dataframe, trades_to_ohlcv
from freqtrade.exchange import Exchange, timeframe_to_minutes from freqtrade.exchange import Exchange, timeframe_to_minutes, timeframe_to_seconds
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -49,6 +51,19 @@ def trim_tickerlist(tickerlist: List[Dict], timerange: TimeRange) -> List[Dict]:
return tickerlist[start_index:stop_index] return tickerlist[start_index:stop_index]
def trim_dataframe(df: DataFrame, timerange: TimeRange) -> DataFrame:
"""
Trim dataframe based on given timerange
"""
if timerange.starttype == 'date':
start = datetime.fromtimestamp(timerange.startts, tz=pytz.utc)
df = df.loc[df['date'] >= start, :]
if timerange.stoptype == 'date':
stop = datetime.fromtimestamp(timerange.stopts, tz=pytz.utc)
df = df.loc[df['date'] <= stop, :]
return df
def load_tickerdata_file(datadir: Path, pair: str, ticker_interval: str, def load_tickerdata_file(datadir: Path, pair: str, ticker_interval: str,
timerange: Optional[TimeRange] = None) -> Optional[list]: timerange: Optional[TimeRange] = None) -> Optional[list]:
""" """
@ -113,7 +128,8 @@ def load_pair_history(pair: str,
refresh_pairs: bool = False, refresh_pairs: bool = False,
exchange: Optional[Exchange] = None, exchange: Optional[Exchange] = None,
fill_up_missing: bool = True, fill_up_missing: bool = True,
drop_incomplete: bool = True drop_incomplete: bool = True,
startup_candles: int = 0,
) -> DataFrame: ) -> DataFrame:
""" """
Loads cached ticker history for the given pair. Loads cached ticker history for the given pair.
@ -126,9 +142,15 @@ def load_pair_history(pair: str,
:param exchange: Exchange object (needed when using "refresh_pairs") :param exchange: Exchange object (needed when using "refresh_pairs")
:param fill_up_missing: Fill missing values with "No action"-candles :param fill_up_missing: Fill missing values with "No action"-candles
:param drop_incomplete: Drop last candle assuming it may be incomplete. :param drop_incomplete: Drop last candle assuming it may be incomplete.
:param startup_candles: Additional candles to load at the start of the period
:return: DataFrame with ohlcv data :return: DataFrame with ohlcv data
""" """
timerange_startup = deepcopy(timerange)
if startup_candles > 0 and timerange_startup:
logger.info('Using indicator startup period: %s ...', startup_candles)
timerange_startup.subtract_start(timeframe_to_seconds(ticker_interval) * startup_candles)
# The user forced the refresh of pairs # The user forced the refresh of pairs
if refresh_pairs: if refresh_pairs:
download_pair_history(datadir=datadir, download_pair_history(datadir=datadir,
@ -137,11 +159,11 @@ def load_pair_history(pair: str,
ticker_interval=ticker_interval, ticker_interval=ticker_interval,
timerange=timerange) timerange=timerange)
pairdata = load_tickerdata_file(datadir, pair, ticker_interval, timerange=timerange) pairdata = load_tickerdata_file(datadir, pair, ticker_interval, timerange=timerange_startup)
if pairdata: if pairdata:
if timerange: if timerange_startup:
_validate_pairdata(pair, pairdata, timerange) _validate_pairdata(pair, pairdata, timerange_startup)
return parse_ticker_dataframe(pairdata, ticker_interval, pair=pair, return parse_ticker_dataframe(pairdata, ticker_interval, pair=pair,
fill_missing=fill_up_missing, fill_missing=fill_up_missing,
drop_incomplete=drop_incomplete) drop_incomplete=drop_incomplete)
@ -160,10 +182,22 @@ def load_data(datadir: Path,
exchange: Optional[Exchange] = None, exchange: Optional[Exchange] = None,
timerange: Optional[TimeRange] = None, timerange: Optional[TimeRange] = None,
fill_up_missing: bool = True, fill_up_missing: bool = True,
startup_candles: int = 0,
fail_without_data: bool = False
) -> Dict[str, DataFrame]: ) -> Dict[str, DataFrame]:
""" """
Loads ticker history data for a list of pairs Loads ticker history data for a list of pairs
:return: dict(<pair>:<tickerlist>) :param datadir: Path to the data storage location.
:param ticker_interval: Ticker-interval (e.g. "5m")
:param pairs: List of pairs to load
:param refresh_pairs: Refresh pairs from exchange.
(Note: Requires exchange to be passed as well.)
:param exchange: Exchange object (needed when using "refresh_pairs")
:param timerange: Limit data to be loaded to this timerange
:param fill_up_missing: Fill missing values with "No action"-candles
:param startup_candles: Additional candles to load at the start of the period
:param fail_without_data: Raise OperationalException if no data is found.
:return: dict(<pair>:<Dataframe>)
TODO: refresh_pairs is still used by edge to keep the data uptodate. TODO: refresh_pairs is still used by edge to keep the data uptodate.
This should be replaced in the future. Instead, writing the current candles to disk This should be replaced in the future. Instead, writing the current candles to disk
from dataprovider should be implemented, as this would avoid loading ohlcv data twice. from dataprovider should be implemented, as this would avoid loading ohlcv data twice.
@ -176,9 +210,13 @@ def load_data(datadir: Path,
datadir=datadir, timerange=timerange, datadir=datadir, timerange=timerange,
refresh_pairs=refresh_pairs, refresh_pairs=refresh_pairs,
exchange=exchange, exchange=exchange,
fill_up_missing=fill_up_missing) fill_up_missing=fill_up_missing,
startup_candles=startup_candles)
if hist is not None: if hist is not None:
result[pair] = hist result[pair] = hist
if fail_without_data and not result:
raise OperationalException("No data found. Terminating.")
return result return result

View File

@ -100,7 +100,8 @@ class Edge:
ticker_interval=self.strategy.ticker_interval, ticker_interval=self.strategy.ticker_interval,
refresh_pairs=self._refresh_pairs, refresh_pairs=self._refresh_pairs,
exchange=self.exchange, exchange=self.exchange,
timerange=self._timerange timerange=self._timerange,
startup_candles=self.strategy.startup_candle_count,
) )
if not data: if not data:

View File

@ -228,6 +228,7 @@ class Exchange:
self.validate_pairs(config['exchange']['pair_whitelist']) self.validate_pairs(config['exchange']['pair_whitelist'])
self.validate_ordertypes(config.get('order_types', {})) self.validate_ordertypes(config.get('order_types', {}))
self.validate_order_time_in_force(config.get('order_time_in_force', {})) self.validate_order_time_in_force(config.get('order_time_in_force', {}))
self.validate_required_startup_candles(config.get('startup_candle_count', 0))
# Converts the interval provided in minutes in config to seconds # Converts the interval provided in minutes in config to seconds
self.markets_refresh_interval: int = exchange_config.get( self.markets_refresh_interval: int = exchange_config.get(
@ -443,6 +444,16 @@ class Exchange:
raise OperationalException( raise OperationalException(
f'Time in force policies are not supported for {self.name} yet.') f'Time in force policies are not supported for {self.name} yet.')
def validate_required_startup_candles(self, startup_candles) -> None:
"""
Checks if required startup_candles is more than ohlcv_candle_limit.
Requires a grace-period of 5 candles - so a startup-period up to 494 is allowed by default.
"""
if startup_candles + 5 > self._ft_has['ohlcv_candle_limit']:
raise OperationalException(
f"This strategy requires {startup_candles} candles to start. "
f"{self.name} only provides {self._ft_has['ohlcv_candle_limit']}.")
def exchange_has(self, endpoint: str) -> bool: def exchange_has(self, endpoint: str) -> bool:
""" """
Checks if exchange implements a specific API endpoint. Checks if exchange implements a specific API endpoint.

View File

@ -15,7 +15,7 @@ from freqtrade import OperationalException
from freqtrade.configuration import TimeRange from freqtrade.configuration import TimeRange
from freqtrade.data import history from freqtrade.data import history
from freqtrade.data.dataprovider import DataProvider from freqtrade.data.dataprovider import DataProvider
from freqtrade.exchange import timeframe_to_minutes from freqtrade.exchange import timeframe_to_minutes, timeframe_to_seconds
from freqtrade.misc import file_dump_json from freqtrade.misc import file_dump_json
from freqtrade.persistence import Trade from freqtrade.persistence import Trade
from freqtrade.resolvers import ExchangeResolver, StrategyResolver from freqtrade.resolvers import ExchangeResolver, StrategyResolver
@ -90,6 +90,8 @@ class Backtesting:
self.ticker_interval = str(self.config.get('ticker_interval')) self.ticker_interval = str(self.config.get('ticker_interval'))
self.ticker_interval_mins = timeframe_to_minutes(self.ticker_interval) self.ticker_interval_mins = timeframe_to_minutes(self.ticker_interval)
# Get maximum required startup period
self.required_startup = max([strat.startup_candle_count for strat in self.strategylist])
# Load one (first) strategy # Load one (first) strategy
self._set_strategy(self.strategylist[0]) self._set_strategy(self.strategylist[0])
@ -103,6 +105,31 @@ class Backtesting:
# And the regular "stoploss" function would not apply to that case # And the regular "stoploss" function would not apply to that case
self.strategy.order_types['stoploss_on_exchange'] = False self.strategy.order_types['stoploss_on_exchange'] = False
def load_bt_data(self):
timerange = TimeRange.parse_timerange(None if self.config.get(
'timerange') is None else str(self.config.get('timerange')))
data = history.load_data(
datadir=Path(self.config['datadir']),
pairs=self.config['exchange']['pair_whitelist'],
ticker_interval=self.ticker_interval,
timerange=timerange,
startup_candles=self.required_startup,
fail_without_data=True,
)
min_date, max_date = history.get_timeframe(data)
logger.info(
'Loading data from %s up to %s (%s days)..',
min_date.isoformat(), max_date.isoformat(), (max_date - min_date).days
)
# Adjust startts forward if not enough data is available
timerange.adjust_start_if_necessary(timeframe_to_seconds(self.ticker_interval),
self.required_startup, min_date)
return data, timerange
def _generate_text_table(self, data: Dict[str, Dict], results: DataFrame, def _generate_text_table(self, data: Dict[str, Dict], results: DataFrame,
skip_nan: bool = False) -> str: skip_nan: bool = False) -> str:
""" """
@ -412,39 +439,18 @@ class Backtesting:
:return: None :return: None
""" """
data: Dict[str, Any] = {} data: Dict[str, Any] = {}
pairs = self.config['exchange']['pair_whitelist']
logger.info('Using stake_currency: %s ...', self.config['stake_currency']) logger.info('Using stake_currency: %s ...', self.config['stake_currency'])
logger.info('Using stake_amount: %s ...', self.config['stake_amount']) logger.info('Using stake_amount: %s ...', self.config['stake_amount'])
timerange = TimeRange.parse_timerange(None if self.config.get(
'timerange') is None else str(self.config.get('timerange')))
data = history.load_data(
datadir=Path(self.config['datadir']),
pairs=pairs,
ticker_interval=self.ticker_interval,
timerange=timerange,
)
if not data:
logger.critical("No data found. Terminating.")
return
# Use max_open_trades in backtesting, except --disable-max-market-positions is set # Use max_open_trades in backtesting, except --disable-max-market-positions is set
if self.config.get('use_max_market_positions', True): if self.config.get('use_max_market_positions', True):
max_open_trades = self.config['max_open_trades'] max_open_trades = self.config['max_open_trades']
else: else:
logger.info('Ignoring max_open_trades (--disable-max-market-positions was used) ...') logger.info('Ignoring max_open_trades (--disable-max-market-positions was used) ...')
max_open_trades = 0 max_open_trades = 0
data, timerange = self.load_bt_data()
all_results = {} all_results = {}
min_date, max_date = history.get_timeframe(data)
logger.info(
'Backtesting with data from %s up to %s (%s days)..',
min_date.isoformat(),
max_date.isoformat(),
(max_date - min_date).days
)
for strat in self.strategylist: for strat in self.strategylist:
logger.info("Running backtesting for Strategy %s", strat.get_strategy_name()) logger.info("Running backtesting for Strategy %s", strat.get_strategy_name())
self._set_strategy(strat) self._set_strategy(strat)
@ -452,6 +458,15 @@ class Backtesting:
# need to reprocess data every time to populate signals # need to reprocess data every time to populate signals
preprocessed = self.strategy.tickerdata_to_dataframe(data) preprocessed = self.strategy.tickerdata_to_dataframe(data)
# Trim startup period from analyzed dataframe
for pair, df in preprocessed.items():
preprocessed[pair] = history.trim_dataframe(df, timerange)
min_date, max_date = history.get_timeframe(preprocessed)
logger.info(
'Backtesting with data from %s up to %s (%s days)..',
min_date.isoformat(), max_date.isoformat(), (max_date - min_date).days
)
# Execute backtest and print results # Execute backtest and print results
all_results[self.strategy.get_strategy_name()] = self.backtest( all_results[self.strategy.get_strategy_name()] = self.backtest(
{ {

View File

@ -22,8 +22,7 @@ from pandas import DataFrame
from skopt import Optimizer from skopt import Optimizer
from skopt.space import Dimension from skopt.space import Dimension
from freqtrade.configuration import TimeRange from freqtrade.data.history import get_timeframe, trim_dataframe
from freqtrade.data.history import load_data, get_timeframe
from freqtrade.misc import round_dict from freqtrade.misc import round_dict
from freqtrade.optimize.backtesting import Backtesting from freqtrade.optimize.backtesting import Backtesting
# Import IHyperOpt and IHyperOptLoss to allow unpickling classes from these modules # Import IHyperOpt and IHyperOptLoss to allow unpickling classes from these modules
@ -379,30 +378,19 @@ class Hyperopt:
) )
def start(self) -> None: def start(self) -> None:
timerange = TimeRange.parse_timerange(None if self.config.get( data, timerange = self.backtesting.load_bt_data()
'timerange') is None else str(self.config.get('timerange')))
data = load_data(
datadir=Path(self.config['datadir']),
pairs=self.config['exchange']['pair_whitelist'],
ticker_interval=self.backtesting.ticker_interval,
timerange=timerange
)
if not data: preprocessed = self.backtesting.strategy.tickerdata_to_dataframe(data)
logger.critical("No data found. Terminating.")
return
# Trim startup period from analyzed dataframe
for pair, df in preprocessed.items():
preprocessed[pair] = trim_dataframe(df, timerange)
min_date, max_date = get_timeframe(data) min_date, max_date = get_timeframe(data)
logger.info( logger.info(
'Hyperopting with data from %s up to %s (%s days)..', 'Hyperopting with data from %s up to %s (%s days)..',
min_date.isoformat(), min_date.isoformat(), max_date.isoformat(), (max_date - min_date).days
max_date.isoformat(),
(max_date - min_date).days
) )
preprocessed = self.backtesting.strategy.tickerdata_to_dataframe(data)
dump(preprocessed, self.tickerdata_pickle) dump(preprocessed, self.tickerdata_pickle)
# We don't need exchange instance anymore while running hyperopt # We don't need exchange instance anymore while running hyperopt

View File

@ -264,12 +264,12 @@ def generate_candlestick_graph(pair: str, data: pd.DataFrame, trades: pd.DataFra
def generate_profit_graph(pairs: str, tickers: Dict[str, pd.DataFrame], def generate_profit_graph(pairs: str, tickers: Dict[str, pd.DataFrame],
trades: pd.DataFrame) -> go.Figure: trades: pd.DataFrame, timeframe: str) -> go.Figure:
# Combine close-values for all pairs, rename columns to "pair" # Combine close-values for all pairs, rename columns to "pair"
df_comb = combine_tickers_with_mean(tickers, "close") df_comb = combine_tickers_with_mean(tickers, "close")
# Add combined cumulative profit # Add combined cumulative profit
df_comb = create_cum_profit(df_comb, trades, 'cum_profit') df_comb = create_cum_profit(df_comb, trades, 'cum_profit', timeframe)
# Plot the pairs average close prices, and total profit growth # Plot the pairs average close prices, and total profit growth
avgclose = go.Scatter( avgclose = go.Scatter(
@ -293,7 +293,7 @@ def generate_profit_graph(pairs: str, tickers: Dict[str, pd.DataFrame],
for pair in pairs: for pair in pairs:
profit_col = f'cum_profit_{pair}' profit_col = f'cum_profit_{pair}'
df_comb = create_cum_profit(df_comb, trades[trades['pair'] == pair], profit_col) df_comb = create_cum_profit(df_comb, trades[trades['pair'] == pair], profit_col, timeframe)
fig = add_profit(fig, 3, df_comb, profit_col, f"Profit {pair}") fig = add_profit(fig, 3, df_comb, profit_col, f"Profit {pair}")
@ -382,9 +382,9 @@ def plot_profit(config: Dict[str, Any]) -> None:
) )
# Filter trades to relevant pairs # Filter trades to relevant pairs
trades = trades[trades['pair'].isin(plot_elements["pairs"])] trades = trades[trades['pair'].isin(plot_elements["pairs"])]
# Create an average close price of all the pairs that were involved. # Create an average close price of all the pairs that were involved.
# this could be useful to gauge the overall market trend # this could be useful to gauge the overall market trend
fig = generate_profit_graph(plot_elements["pairs"], plot_elements["tickers"], trades) fig = generate_profit_graph(plot_elements["pairs"], plot_elements["tickers"],
trades, config.get('ticker_interval', '5m'))
store_plot_file(fig, filename='freqtrade-profit-plot.html', store_plot_file(fig, filename='freqtrade-profit-plot.html',
directory=config['user_data_dir'] / "plot", auto_open=True) directory=config['user_data_dir'] / "plot", auto_open=True)

View File

@ -57,6 +57,7 @@ class StrategyResolver(IResolver):
("order_time_in_force", None, False), ("order_time_in_force", None, False),
("stake_currency", None, False), ("stake_currency", None, False),
("stake_amount", None, False), ("stake_amount", None, False),
("startup_candle_count", None, False),
("use_sell_signal", True, True), ("use_sell_signal", True, True),
("sell_profit_only", False, True), ("sell_profit_only", False, True),
("ignore_roi_if_buy_signal", False, True), ("ignore_roi_if_buy_signal", False, True),

View File

@ -39,6 +39,9 @@ class DefaultStrategy(IStrategy):
'stoploss_on_exchange': False 'stoploss_on_exchange': False
} }
# Number of candles the strategy requires before producing valid signals
startup_candle_count: int = 20
# Optional time in force for orders # Optional time in force for orders
order_time_in_force = { order_time_in_force = {
'buy': 'gtc', 'buy': 'gtc',
@ -105,9 +108,6 @@ class DefaultStrategy(IStrategy):
# EMA - Exponential Moving Average # EMA - Exponential Moving Average
dataframe['ema10'] = ta.EMA(dataframe, timeperiod=10) dataframe['ema10'] = ta.EMA(dataframe, timeperiod=10)
# SMA - Simple Moving Average
dataframe['sma'] = ta.SMA(dataframe, timeperiod=40)
return dataframe return dataframe
def populate_buy_trend(self, dataframe: DataFrame, metadata: dict) -> DataFrame: def populate_buy_trend(self, dataframe: DataFrame, metadata: dict) -> DataFrame:

View File

@ -103,6 +103,9 @@ class IStrategy(ABC):
# run "populate_indicators" only for new candle # run "populate_indicators" only for new candle
process_only_new_candles: bool = False process_only_new_candles: bool = False
# Count of candles the strategy requires before producing valid signals
startup_candle_count: int = 0
# Class level variables (intentional) containing # Class level variables (intentional) containing
# the dataprovider (dp) (access to other candles, historic data, ...) # the dataprovider (dp) (access to other candles, historic data, ...)
# and wallets - access to the current balance. # and wallets - access to the current balance.
@ -421,6 +424,7 @@ class IStrategy(ABC):
def tickerdata_to_dataframe(self, tickerdata: Dict[str, List]) -> Dict[str, DataFrame]: def tickerdata_to_dataframe(self, tickerdata: Dict[str, List]) -> Dict[str, DataFrame]:
""" """
Creates a dataframe and populates indicators for given ticker data Creates a dataframe and populates indicators for given ticker data
Used by optimize operations only, not during dry / live runs.
""" """
return {pair: self.advise_indicators(pair_data, {'pair': pair}) return {pair: self.advise_indicators(pair_data, {'pair': pair})
for pair, pair_data in tickerdata.items()} for pair, pair_data in tickerdata.items()}

View File

@ -78,7 +78,7 @@
"ZEC/BTC", "ZEC/BTC",
"XLM/BTC", "XLM/BTC",
"NXT/BTC", "NXT/BTC",
"POWR/BTC", "TRX/BTC",
"ADA/BTC", "ADA/BTC",
"XMR/BTC" "XMR/BTC"
], ],

View File

@ -2,7 +2,7 @@ from unittest.mock import MagicMock
import pytest import pytest
from arrow import Arrow from arrow import Arrow
from pandas import DataFrame, to_datetime from pandas import DataFrame, DateOffset, to_datetime
from freqtrade.configuration import TimeRange from freqtrade.configuration import TimeRange
from freqtrade.data.btanalysis import (BT_DATA_COLUMNS, from freqtrade.data.btanalysis import (BT_DATA_COLUMNS,
@ -125,12 +125,30 @@ def test_create_cum_profit(testdatadir):
bt_data = load_backtest_data(filename) bt_data = load_backtest_data(filename)
timerange = TimeRange.parse_timerange("20180110-20180112") timerange = TimeRange.parse_timerange("20180110-20180112")
df = load_pair_history(pair="POWR/BTC", ticker_interval='5m', df = load_pair_history(pair="TRX/BTC", ticker_interval='5m',
datadir=testdatadir, timerange=timerange) datadir=testdatadir, timerange=timerange)
cum_profits = create_cum_profit(df.set_index('date'), cum_profits = create_cum_profit(df.set_index('date'),
bt_data[bt_data["pair"] == 'POWR/BTC'], bt_data[bt_data["pair"] == 'TRX/BTC'],
"cum_profits") "cum_profits", timeframe="5m")
assert "cum_profits" in cum_profits.columns
assert cum_profits.iloc[0]['cum_profits'] == 0
assert cum_profits.iloc[-1]['cum_profits'] == 0.0798005
def test_create_cum_profit1(testdatadir):
filename = testdatadir / "backtest-result_test.json"
bt_data = load_backtest_data(filename)
# Move close-time to "off" the candle, to make sure the logic still works
bt_data.loc[:, 'close_time'] = bt_data.loc[:, 'close_time'] + DateOffset(seconds=20)
timerange = TimeRange.parse_timerange("20180110-20180112")
df = load_pair_history(pair="TRX/BTC", ticker_interval='5m',
datadir=testdatadir, timerange=timerange)
cum_profits = create_cum_profit(df.set_index('date'),
bt_data[bt_data["pair"] == 'TRX/BTC'],
"cum_profits", timeframe="5m")
assert "cum_profits" in cum_profits.columns assert "cum_profits" in cum_profits.columns
assert cum_profits.iloc[0]['cum_profits'] == 0 assert cum_profits.iloc[0]['cum_profits'] == 0
assert cum_profits.iloc[-1]['cum_profits'] == 0.0798005 assert cum_profits.iloc[-1]['cum_profits'] == 0.0798005

View File

@ -95,6 +95,23 @@ def test_load_data_1min_ticker(ticker_history, mocker, caplog, testdatadir) -> N
_clean_test_file(file) _clean_test_file(file)
def test_load_data_startup_candles(mocker, caplog, default_conf, testdatadir) -> None:
ltfmock = mocker.patch('freqtrade.data.history.load_tickerdata_file',
MagicMock(return_value=None))
timerange = TimeRange('date', None, 1510639620, 0)
history.load_pair_history(pair='UNITTEST/BTC', ticker_interval='1m',
datadir=testdatadir, timerange=timerange,
startup_candles=20,
)
assert log_has(
'Using indicator startup period: 20 ...', caplog
)
assert ltfmock.call_count == 1
assert ltfmock.call_args_list[0][1]['timerange'] != timerange
# startts is 20 minutes earlier
assert ltfmock.call_args_list[0][1]['timerange'].startts == timerange.startts - 20 * 60
def test_load_data_with_new_pair_1min(ticker_history_list, mocker, caplog, def test_load_data_with_new_pair_1min(ticker_history_list, mocker, caplog,
default_conf, testdatadir) -> None: default_conf, testdatadir) -> None:
""" """
@ -427,6 +444,46 @@ def test_trim_tickerlist(testdatadir) -> None:
assert not ticker assert not ticker
def test_trim_dataframe(testdatadir) -> None:
data = history.load_data(
datadir=testdatadir,
ticker_interval='1m',
pairs=['UNITTEST/BTC']
)['UNITTEST/BTC']
min_date = int(data.iloc[0]['date'].timestamp())
max_date = int(data.iloc[-1]['date'].timestamp())
data_modify = data.copy()
# Remove first 30 minutes (1800 s)
tr = TimeRange('date', None, min_date + 1800, 0)
data_modify = history.trim_dataframe(data_modify, tr)
assert not data_modify.equals(data)
assert len(data_modify) < len(data)
assert len(data_modify) == len(data) - 30
assert all(data_modify.iloc[-1] == data.iloc[-1])
assert all(data_modify.iloc[0] == data.iloc[30])
data_modify = data.copy()
# Remove last 30 minutes (1800 s)
tr = TimeRange(None, 'date', 0, max_date - 1800)
data_modify = history.trim_dataframe(data_modify, tr)
assert not data_modify.equals(data)
assert len(data_modify) < len(data)
assert len(data_modify) == len(data) - 30
assert all(data_modify.iloc[0] == data.iloc[0])
assert all(data_modify.iloc[-1] == data.iloc[-31])
data_modify = data.copy()
# Remove first 25 and last 30 minutes (1800 s)
tr = TimeRange('date', 'date', min_date + 1500, max_date - 1800)
data_modify = history.trim_dataframe(data_modify, tr)
assert not data_modify.equals(data)
assert len(data_modify) < len(data)
assert len(data_modify) == len(data) - 55
# first row matches 25th original row
assert all(data_modify.iloc[0] == data.iloc[25])
def test_file_dump_json_tofile(testdatadir) -> None: def test_file_dump_json_tofile(testdatadir) -> None:
file = testdatadir / 'test_{id}.json'.format(id=str(uuid.uuid4())) file = testdatadir / 'test_{id}.json'.format(id=str(uuid.uuid4()))
data = {'bar': 'foo'} data = {'bar': 'foo'}

View File

@ -256,7 +256,7 @@ def test_edge_heartbeat_calculate(mocker, edge_conf):
def mocked_load_data(datadir, pairs=[], ticker_interval='0m', refresh_pairs=False, def mocked_load_data(datadir, pairs=[], ticker_interval='0m', refresh_pairs=False,
timerange=None, exchange=None): timerange=None, exchange=None, *args, **kwargs):
hz = 0.1 hz = 0.1
base = 0.001 base = 0.001

View File

@ -523,6 +523,24 @@ def test_validate_order_types_not_in_config(default_conf, mocker):
Exchange(conf) Exchange(conf)
def test_validate_required_startup_candles(default_conf, mocker, caplog):
api_mock = MagicMock()
mocker.patch('freqtrade.exchange.Exchange.name', PropertyMock(return_value='Binance'))
mocker.patch('freqtrade.exchange.Exchange._init_ccxt', api_mock)
mocker.patch('freqtrade.exchange.Exchange.validate_timeframes', MagicMock())
mocker.patch('freqtrade.exchange.Exchange._load_async_markets', MagicMock())
mocker.patch('freqtrade.exchange.Exchange.validate_pairs', MagicMock())
default_conf['startup_candle_count'] = 20
ex = Exchange(default_conf)
assert ex
default_conf['startup_candle_count'] = 600
with pytest.raises(OperationalException, match=r'This strategy requires 600.*'):
Exchange(default_conf)
def test_exchange_has(default_conf, mocker): def test_exchange_has(default_conf, mocker):
exchange = get_patched_exchange(mocker, default_conf) exchange = get_patched_exchange(mocker, default_conf)
assert not exchange.exchange_has('ASDFASDF') assert not exchange.exchange_has('ASDFASDF')

View File

@ -117,7 +117,7 @@ def simple_backtest(config, contour, num_results, mocker, testdatadir) -> None:
def mocked_load_data(datadir, pairs=[], ticker_interval='0m', refresh_pairs=False, def mocked_load_data(datadir, pairs=[], ticker_interval='0m', refresh_pairs=False,
timerange=None, exchange=None, live=False): timerange=None, exchange=None, live=False, *args, **kwargs):
tickerdata = history.load_tickerdata_file(datadir, 'UNITTEST/BTC', '1m', timerange=timerange) tickerdata = history.load_tickerdata_file(datadir, 'UNITTEST/BTC', '1m', timerange=timerange)
pairdata = {'UNITTEST/BTC': parse_ticker_dataframe(tickerdata, '1m', pair="UNITTEST/BTC", pairdata = {'UNITTEST/BTC': parse_ticker_dataframe(tickerdata, '1m', pair="UNITTEST/BTC",
fill_missing=True)} fill_missing=True)}
@ -494,7 +494,7 @@ def test_backtesting_start_no_data(default_conf, mocker, caplog, testdatadir) ->
def get_timeframe(input1): def get_timeframe(input1):
return Arrow(2017, 11, 14, 21, 17), Arrow(2017, 11, 14, 22, 59) return Arrow(2017, 11, 14, 21, 17), Arrow(2017, 11, 14, 22, 59)
mocker.patch('freqtrade.data.history.load_data', MagicMock(return_value={})) mocker.patch('freqtrade.data.history.load_pair_history', MagicMock(return_value=None))
mocker.patch('freqtrade.data.history.get_timeframe', get_timeframe) mocker.patch('freqtrade.data.history.get_timeframe', get_timeframe)
mocker.patch('freqtrade.exchange.Exchange.refresh_latest_ohlcv', MagicMock()) mocker.patch('freqtrade.exchange.Exchange.refresh_latest_ohlcv', MagicMock())
patch_exchange(mocker) patch_exchange(mocker)
@ -511,10 +511,8 @@ def test_backtesting_start_no_data(default_conf, mocker, caplog, testdatadir) ->
default_conf['timerange'] = '20180101-20180102' default_conf['timerange'] = '20180101-20180102'
backtesting = Backtesting(default_conf) backtesting = Backtesting(default_conf)
backtesting.start() with pytest.raises(OperationalException, match='No data found. Terminating.'):
# check the logs, that will contain the backtest result backtesting.start()
assert log_has('No data found. Terminating.', caplog)
def test_backtest(default_conf, fee, mocker, testdatadir) -> None: def test_backtest(default_conf, fee, mocker, testdatadir) -> None:
@ -838,6 +836,8 @@ def test_backtest_start_timerange(default_conf, mocker, caplog, testdatadir):
f'Using data directory: {testdatadir} ...', f'Using data directory: {testdatadir} ...',
'Using stake_currency: BTC ...', 'Using stake_currency: BTC ...',
'Using stake_amount: 0.001 ...', 'Using stake_amount: 0.001 ...',
'Loading data from 2017-11-14T20:57:00+00:00 '
'up to 2017-11-14T22:58:00+00:00 (0 days)..',
'Backtesting with data from 2017-11-14T21:17:00+00:00 ' 'Backtesting with data from 2017-11-14T21:17:00+00:00 '
'up to 2017-11-14T22:58:00+00:00 (0 days)..', 'up to 2017-11-14T22:58:00+00:00 (0 days)..',
'Parameter --enable-position-stacking detected ...' 'Parameter --enable-position-stacking detected ...'
@ -892,6 +892,8 @@ def test_backtest_start_multi_strat(default_conf, mocker, caplog, testdatadir):
f'Using data directory: {testdatadir} ...', f'Using data directory: {testdatadir} ...',
'Using stake_currency: BTC ...', 'Using stake_currency: BTC ...',
'Using stake_amount: 0.001 ...', 'Using stake_amount: 0.001 ...',
'Loading data from 2017-11-14T20:57:00+00:00 '
'up to 2017-11-14T22:58:00+00:00 (0 days)..',
'Backtesting with data from 2017-11-14T21:17:00+00:00 ' 'Backtesting with data from 2017-11-14T21:17:00+00:00 '
'up to 2017-11-14T22:58:00+00:00 (0 days)..', 'up to 2017-11-14T22:58:00+00:00 (0 days)..',
'Parameter --enable-position-stacking detected ...', 'Parameter --enable-position-stacking detected ...',

View File

@ -228,7 +228,7 @@ def test_start(mocker, default_conf, caplog) -> None:
def test_start_no_data(mocker, default_conf, caplog) -> None: def test_start_no_data(mocker, default_conf, caplog) -> None:
patched_configuration_load_config_file(mocker, default_conf) patched_configuration_load_config_file(mocker, default_conf)
mocker.patch('freqtrade.optimize.hyperopt.load_data', MagicMock(return_value={})) mocker.patch('freqtrade.data.history.load_pair_history', MagicMock(return_value=None))
mocker.patch( mocker.patch(
'freqtrade.optimize.hyperopt.get_timeframe', 'freqtrade.optimize.hyperopt.get_timeframe',
MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13))) MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13)))
@ -242,9 +242,8 @@ def test_start_no_data(mocker, default_conf, caplog) -> None:
'--epochs', '5' '--epochs', '5'
] ]
args = get_args(args) args = get_args(args)
start_hyperopt(args) with pytest.raises(OperationalException, match='No data found. Terminating.'):
start_hyperopt(args)
assert log_has('No data found. Terminating.', caplog)
def test_start_filelock(mocker, default_conf, caplog) -> None: def test_start_filelock(mocker, default_conf, caplog) -> None:
@ -393,7 +392,8 @@ def test_roi_table_generation(hyperopt) -> None:
def test_start_calls_optimizer(mocker, default_conf, caplog, capsys) -> None: def test_start_calls_optimizer(mocker, default_conf, caplog, capsys) -> None:
dumper = mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock()) dumper = mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock())
mocker.patch('freqtrade.optimize.hyperopt.load_data', MagicMock()) mocker.patch('freqtrade.optimize.backtesting.Backtesting.load_bt_data',
MagicMock(return_value=(MagicMock(), None)))
mocker.patch( mocker.patch(
'freqtrade.optimize.hyperopt.get_timeframe', 'freqtrade.optimize.hyperopt.get_timeframe',
MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13))) MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13)))
@ -516,7 +516,7 @@ def test_generate_optimizer(mocker, default_conf) -> None:
default_conf.update({'hyperopt_min_trades': 1}) default_conf.update({'hyperopt_min_trades': 1})
trades = [ trades = [
('POWR/BTC', 0.023117, 0.000233, 100) ('TRX/BTC', 0.023117, 0.000233, 100)
] ]
labels = ['currency', 'profit_percent', 'profit_abs', 'trade_duration'] labels = ['currency', 'profit_percent', 'profit_abs', 'trade_duration']
backtest_result = pd.DataFrame.from_records(trades, columns=labels) backtest_result = pd.DataFrame.from_records(trades, columns=labels)
@ -608,7 +608,8 @@ def test_continue_hyperopt(mocker, default_conf, caplog):
def test_print_json_spaces_all(mocker, default_conf, caplog, capsys) -> None: def test_print_json_spaces_all(mocker, default_conf, caplog, capsys) -> None:
dumper = mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock()) dumper = mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock())
mocker.patch('freqtrade.optimize.hyperopt.load_data', MagicMock()) mocker.patch('freqtrade.optimize.backtesting.Backtesting.load_bt_data',
MagicMock(return_value=(MagicMock(), None)))
mocker.patch( mocker.patch(
'freqtrade.optimize.hyperopt.get_timeframe', 'freqtrade.optimize.hyperopt.get_timeframe',
MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13))) MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13)))
@ -645,7 +646,8 @@ def test_print_json_spaces_all(mocker, default_conf, caplog, capsys) -> None:
def test_print_json_spaces_roi_stoploss(mocker, default_conf, caplog, capsys) -> None: def test_print_json_spaces_roi_stoploss(mocker, default_conf, caplog, capsys) -> None:
dumper = mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock()) dumper = mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock())
mocker.patch('freqtrade.optimize.hyperopt.load_data', MagicMock()) mocker.patch('freqtrade.optimize.backtesting.Backtesting.load_bt_data',
MagicMock(return_value=(MagicMock(), None)))
mocker.patch( mocker.patch(
'freqtrade.optimize.hyperopt.get_timeframe', 'freqtrade.optimize.hyperopt.get_timeframe',
MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13))) MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13)))
@ -682,7 +684,8 @@ def test_print_json_spaces_roi_stoploss(mocker, default_conf, caplog, capsys) ->
def test_simplified_interface_roi_stoploss(mocker, default_conf, caplog, capsys) -> None: def test_simplified_interface_roi_stoploss(mocker, default_conf, caplog, capsys) -> None:
dumper = mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock()) dumper = mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock())
mocker.patch('freqtrade.optimize.hyperopt.load_data', MagicMock()) mocker.patch('freqtrade.optimize.backtesting.Backtesting.load_bt_data',
MagicMock(return_value=(MagicMock(), None)))
mocker.patch( mocker.patch(
'freqtrade.optimize.hyperopt.get_timeframe', 'freqtrade.optimize.hyperopt.get_timeframe',
MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13))) MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13)))
@ -728,7 +731,8 @@ def test_simplified_interface_roi_stoploss(mocker, default_conf, caplog, capsys)
def test_simplified_interface_all_failed(mocker, default_conf, caplog, capsys) -> None: def test_simplified_interface_all_failed(mocker, default_conf, caplog, capsys) -> None:
mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock()) mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock())
mocker.patch('freqtrade.optimize.hyperopt.load_data', MagicMock()) mocker.patch('freqtrade.optimize.backtesting.Backtesting.load_bt_data',
MagicMock(return_value=(MagicMock(), None)))
mocker.patch( mocker.patch(
'freqtrade.optimize.hyperopt.get_timeframe', 'freqtrade.optimize.hyperopt.get_timeframe',
MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13))) MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13)))
@ -757,7 +761,8 @@ def test_simplified_interface_all_failed(mocker, default_conf, caplog, capsys) -
def test_simplified_interface_buy(mocker, default_conf, caplog, capsys) -> None: def test_simplified_interface_buy(mocker, default_conf, caplog, capsys) -> None:
dumper = mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock()) dumper = mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock())
mocker.patch('freqtrade.optimize.hyperopt.load_data', MagicMock()) mocker.patch('freqtrade.optimize.backtesting.Backtesting.load_bt_data',
MagicMock(return_value=(MagicMock(), None)))
mocker.patch( mocker.patch(
'freqtrade.optimize.hyperopt.get_timeframe', 'freqtrade.optimize.hyperopt.get_timeframe',
MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13))) MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13)))
@ -802,7 +807,8 @@ def test_simplified_interface_buy(mocker, default_conf, caplog, capsys) -> None:
def test_simplified_interface_sell(mocker, default_conf, caplog, capsys) -> None: def test_simplified_interface_sell(mocker, default_conf, caplog, capsys) -> None:
dumper = mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock()) dumper = mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock())
mocker.patch('freqtrade.optimize.hyperopt.load_data', MagicMock()) mocker.patch('freqtrade.optimize.backtesting.Backtesting.load_bt_data',
MagicMock(return_value=(MagicMock(), None)))
mocker.patch( mocker.patch(
'freqtrade.optimize.hyperopt.get_timeframe', 'freqtrade.optimize.hyperopt.get_timeframe',
MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13))) MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13)))
@ -853,7 +859,8 @@ def test_simplified_interface_sell(mocker, default_conf, caplog, capsys) -> None
]) ])
def test_simplified_interface_failed(mocker, default_conf, caplog, capsys, method, space) -> None: def test_simplified_interface_failed(mocker, default_conf, caplog, capsys, method, space) -> None:
mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock()) mocker.patch('freqtrade.optimize.hyperopt.dump', MagicMock())
mocker.patch('freqtrade.optimize.hyperopt.load_data', MagicMock()) mocker.patch('freqtrade.optimize.backtesting.Backtesting.load_bt_data',
MagicMock(return_value=(MagicMock(), None)))
mocker.patch( mocker.patch(
'freqtrade.optimize.hyperopt.get_timeframe', 'freqtrade.optimize.hyperopt.get_timeframe',
MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13))) MagicMock(return_value=(datetime(2017, 12, 10), datetime(2017, 12, 13)))

View File

@ -53,10 +53,10 @@ def test_init_plotscript(default_conf, mocker, testdatadir):
assert "trades" in ret assert "trades" in ret
assert "pairs" in ret assert "pairs" in ret
default_conf['pairs'] = ["POWR/BTC", "ADA/BTC"] default_conf['pairs'] = ["TRX/BTC", "ADA/BTC"]
ret = init_plotscript(default_conf) ret = init_plotscript(default_conf)
assert "tickers" in ret assert "tickers" in ret
assert "POWR/BTC" in ret["tickers"] assert "TRX/BTC" in ret["tickers"]
assert "ADA/BTC" in ret["tickers"] assert "ADA/BTC" in ret["tickers"]
@ -228,13 +228,13 @@ def test_add_profit(testdatadir):
bt_data = load_backtest_data(filename) bt_data = load_backtest_data(filename)
timerange = TimeRange.parse_timerange("20180110-20180112") timerange = TimeRange.parse_timerange("20180110-20180112")
df = history.load_pair_history(pair="POWR/BTC", ticker_interval='5m', df = history.load_pair_history(pair="TRX/BTC", ticker_interval='5m',
datadir=testdatadir, timerange=timerange) datadir=testdatadir, timerange=timerange)
fig = generate_empty_figure() fig = generate_empty_figure()
cum_profits = create_cum_profit(df.set_index('date'), cum_profits = create_cum_profit(df.set_index('date'),
bt_data[bt_data["pair"] == 'POWR/BTC'], bt_data[bt_data["pair"] == 'TRX/BTC'],
"cum_profits") "cum_profits", timeframe="5m")
fig1 = add_profit(fig, row=2, data=cum_profits, column='cum_profits', name='Profits') fig1 = add_profit(fig, row=2, data=cum_profits, column='cum_profits', name='Profits')
figure = fig1.layout.figure figure = fig1.layout.figure
@ -247,7 +247,7 @@ def test_generate_profit_graph(testdatadir):
filename = testdatadir / "backtest-result_test.json" filename = testdatadir / "backtest-result_test.json"
trades = load_backtest_data(filename) trades = load_backtest_data(filename)
timerange = TimeRange.parse_timerange("20180110-20180112") timerange = TimeRange.parse_timerange("20180110-20180112")
pairs = ["POWR/BTC", "ADA/BTC"] pairs = ["TRX/BTC", "ADA/BTC"]
tickers = history.load_data(datadir=testdatadir, tickers = history.load_data(datadir=testdatadir,
pairs=pairs, pairs=pairs,
@ -256,7 +256,7 @@ def test_generate_profit_graph(testdatadir):
) )
trades = trades[trades['pair'].isin(pairs)] trades = trades[trades['pair'].isin(pairs)]
fig = generate_profit_graph(pairs, tickers, trades) fig = generate_profit_graph(pairs, tickers, trades, timeframe="5m")
assert isinstance(fig, go.Figure) assert isinstance(fig, go.Figure)
assert fig.layout.title.text == "Freqtrade Profit plot" assert fig.layout.title.text == "Freqtrade Profit plot"

View File

@ -1,10 +1,11 @@
# pragma pylint: disable=missing-docstring, C0103 # pragma pylint: disable=missing-docstring, C0103
import arrow
import pytest import pytest
from freqtrade.configuration import TimeRange from freqtrade.configuration import TimeRange
def test_parse_timerange_incorrect() -> None: def test_parse_timerange_incorrect():
assert TimeRange('date', None, 1274486400, 0) == TimeRange.parse_timerange('20100522-') assert TimeRange('date', None, 1274486400, 0) == TimeRange.parse_timerange('20100522-')
assert TimeRange(None, 'date', 0, 1274486400) == TimeRange.parse_timerange('-20100522') assert TimeRange(None, 'date', 0, 1274486400) == TimeRange.parse_timerange('-20100522')
@ -28,3 +29,37 @@ def test_parse_timerange_incorrect() -> None:
with pytest.raises(Exception, match=r'Incorrect syntax.*'): with pytest.raises(Exception, match=r'Incorrect syntax.*'):
TimeRange.parse_timerange('-') TimeRange.parse_timerange('-')
def test_subtract_start():
x = TimeRange('date', 'date', 1274486400, 1438214400)
x.subtract_start(300)
assert x.startts == 1274486400 - 300
# Do nothing if no startdate exists
x = TimeRange(None, 'date', 0, 1438214400)
x.subtract_start(300)
assert not x.startts
x = TimeRange('date', None, 1274486400, 0)
x.subtract_start(300)
assert x.startts == 1274486400 - 300
def test_adjust_start_if_necessary():
min_date = arrow.Arrow(2017, 11, 14, 21, 15, 00)
x = TimeRange('date', 'date', 1510694100, 1510780500)
# Adjust by 20 candles - min_date == startts
x.adjust_start_if_necessary(300, 20, min_date)
assert x.startts == 1510694100 + (20 * 300)
x = TimeRange('date', 'date', 1510700100, 1510780500)
# Do nothing, startup is set and different min_date
x.adjust_start_if_necessary(300, 20, min_date)
assert x.startts == 1510694100 + (20 * 300)
x = TimeRange(None, 'date', 0, 1510780500)
# Adjust by 20 candles = 20 * 5m
x.adjust_start_if_necessary(300, 20, min_date)
assert x.startts == 1510694100 + (20 * 300)

File diff suppressed because one or more lines are too long

View File

@ -9,7 +9,7 @@
"LTC/BTC", "LTC/BTC",
"NEO/BTC", "NEO/BTC",
"NXT/BTC", "NXT/BTC",
"POWR/BTC", "TRX/BTC",
"STORJ/BTC", "STORJ/BTC",
"QTUM/BTC", "QTUM/BTC",
"WAVES/BTC", "WAVES/BTC",

View File

@ -59,6 +59,9 @@ class SampleStrategy(IStrategy):
sell_profit_only = False sell_profit_only = False
ignore_roi_if_buy_signal = False ignore_roi_if_buy_signal = False
# Number of candles the strategy requires before producing valid signals
startup_candle_count: int = 20
# Optional order type mapping. # Optional order type mapping.
order_types = { order_types = {
'buy': 'limit', 'buy': 'limit',