Merge branch 'fixHyperoptFreqai' of https://github.com/wagnercosta/freqtrade into fixHyperoptFreqai
commit e0490b3efc
Binary file not shown. Before Size: 191 KiB | After Size: 185 KiB
@@ -112,15 +112,15 @@ Mandatory parameters are marked as **Required**, which means that they are requi
| `DI_threshold` | Activates the Dissimilarity Index for outlier detection when > 0. See details about how it works [here](#removing-outliers-with-the-dissimilarity-index). <br> **Datatype:** Positive float (typically < 1).
| `use_SVM_to_remove_outliers` | Train a support vector machine to detect and remove outliers from the training data set, as well as from incoming data points. See details about how it works [here](#removing-outliers-using-a-support-vector-machine-svm). <br> **Datatype:** Boolean.
| `svm_params` | All parameters available in Sklearn's `SGDOneClassSVM()`. See details about some select parameters [here](#removing-outliers-using-a-support-vector-machine-svm). <br> **Datatype:** Dictionary.
| `use_DBSCAN_to_remove_outliers` | Cluster data using DBSCAN to identify and remove outliers from training and prediction data. See details about how it works [here](#removing-outliers-with-dbscan). <br> **Datatype:** Boolean.
-| `outlier_protection_percentage` | If more than `outlier_protection_percentage` fraction of points are removed as outliers, FreqAI will log a warning message and ignore outlier detection while keeping the original dataset intact. <br> **Datatype:** float. Default: `30`
+| `outlier_protection_percentage` | If more than `outlier_protection_percentage` % of points are detected as outliers by the SVM or DBSCAN, FreqAI will log a warning message and ignore outlier detection while keeping the original dataset intact. If the outlier protection is triggered, no predictions will be made based on the training data. <br> **Datatype:** Float. Default: `30`
-| `reverse_train_test_order` | If true, FreqAI will train on the latest data split and test on historical split of the data. This allows the model to be trained up to the most recent data point, while avoiding overfitting. However, users should be careful to understand unorthodox nature of this parameter before employing it. <br> **Datatype:** bool. Default: False
+| `reverse_train_test_order` | If true, FreqAI will train on the latest data split and test on the historical split of the data. This allows the model to be trained up to the most recent data point, while avoiding overfitting. However, users should be careful to understand the unorthodox nature of this parameter before employing it. <br> **Datatype:** Boolean. Default: False
| | **Data split parameters**
| `data_split_parameters` | Include any additional parameters available from Scikit-learn `train_test_split()`, which are shown [here](https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.train_test_split.html) (external website). <br> **Datatype:** Dictionary.
| `test_size` | Fraction of data that should be used for testing instead of training. <br> **Datatype:** Positive float < 1.
-| `shuffle` | Shuffle the training data points during training. Typically, for time-series forecasting, this is set to `False`. <br>
+| `shuffle` | Shuffle the training data points during training. Typically, for time-series forecasting, this is set to `False`. <br> **Datatype:** Boolean.
| | **Model training parameters**
-| `model_training_parameters` | A flexible dictionary that includes all parameters available by the user selected model library. For example, if the user uses `LightGBMRegressor`, this dictionary can contain any parameter available by the `LightGBMRegressor` [here](https://lightgbm.readthedocs.io/en/latest/pythonapi/lightgbm.LGBMRegressor.html) (external website). If the user selects a different model, this dictionary can contain any parameter from that model. <br> **Datatype:** Dictionary.**Datatype:** Boolean.
+| `model_training_parameters` | A flexible dictionary that includes all parameters available to the user-selected model library. For example, if the user uses `LightGBMRegressor`, this dictionary can contain any parameter available to the `LightGBMRegressor` [here](https://lightgbm.readthedocs.io/en/latest/pythonapi/lightgbm.LGBMRegressor.html) (external website). If the user selects a different model, this dictionary can contain any parameter from that model. <br> **Datatype:** Dictionary.
| `n_estimators` | The number of boosted trees to fit in regression. <br> **Datatype:** Integer.
| `learning_rate` | Boosting learning rate during regression. <br> **Datatype:** Float.
| `n_jobs`, `thread_count`, `task_type` | Set the number of threads for parallel processing and the `task_type` (`gpu` or `cpu`). Different model libraries use different parameter names. <br> **Datatype:** Float.

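Not part of the diff above: a minimal sketch of how the documented parameters combine into the `freqai` section of the configuration, written as a Python dict mirroring the JSON config. All values are placeholders chosen for illustration, not taken from this commit.

```python
# Illustrative only: placeholder values, structured after the parameter table above.
freqai_config = {
    "freqai": {
        "feature_parameters": {
            "DI_threshold": 0.9,                  # enable Dissimilarity Index filtering
            "use_SVM_to_remove_outliers": True,
            "svm_params": {"shuffle": False, "nu": 0.1},
            "use_DBSCAN_to_remove_outliers": False,
            "outlier_protection_percentage": 30,  # skip outlier removal if > 30% flagged
        },
        "data_split_parameters": {
            "test_size": 0.25,                    # passed through to train_test_split()
            "shuffle": False,                     # keep time ordering for forecasting
        },
        "model_training_parameters": {            # forwarded to e.g. LightGBMRegressor
            "n_estimators": 800,
            "learning_rate": 0.02,
            "n_jobs": 4,
        },
    }
}
```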
@@ -749,7 +749,7 @@ Given a number of data points $N$, and a distance $\varepsilon$, DBSCAN clusters

![dbscan](assets/freqai_dbscan.jpg)

-FreqAI uses `sklearn.cluster.DBSCAN` (details are available on scikit-learn's webpage [here](#https://scikit-learn.org/stable/modules/generated/sklearn.cluster.DBSCAN.html)) with `min_samples` ($N$) taken as double the no. of user-defined features, and `eps` ($\varepsilon$) taken as the longest distance in the *k-distance graph* computed from the nearest neighbors in the pairwise distances of all data points in the feature set.
+FreqAI uses `sklearn.cluster.DBSCAN` (details are available on scikit-learn's webpage [here](https://scikit-learn.org/stable/modules/generated/sklearn.cluster.DBSCAN.html)) with `min_samples` ($N$) taken as 1/4 of the number of time points in the feature set, and `eps` ($\varepsilon$) taken as the elbow point in the *k-distance graph* computed from the nearest neighbors in the pairwise distances of all data points in the feature set.

## Additional information

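As an editorial aside (not FreqAI's internal code): the parameter selection described in the changed paragraph can be reproduced with plain scikit-learn. The helper name and the second-difference "elbow" heuristic below are assumptions for the sketch.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.neighbors import NearestNeighbors


def drop_outliers_with_dbscan(features: np.ndarray) -> np.ndarray:
    """Remove DBSCAN outliers from a 2-D feature array (rows = time points)."""
    min_samples = max(2, len(features) // 4)  # ~1/4 of the time points, per the docs
    # k-distance graph: distance to the min_samples-th nearest neighbour for every point
    neighbors = NearestNeighbors(n_neighbors=min_samples).fit(features)
    distances, _ = neighbors.kneighbors(features)
    k_distances = np.sort(distances[:, -1])
    # crude elbow estimate: index of the largest second difference of the sorted curve
    elbow_idx = int(np.argmax(np.diff(k_distances, n=2))) + 1
    eps = float(k_distances[elbow_idx])
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(features)
    return features[labels != -1]  # label -1 marks outliers
```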
@@ -774,5 +774,5 @@ Code review, software architecture brainstorming:
@xmatthias

Beta testing and bug reporting:
-@bloodhunter4rc, Salah Lamkadem @ikonx, @ken11o2, @longyu, @paranoidandy, @smidelis, @smarm
+@bloodhunter4rc, Salah Lamkadem @ikonx, @ken11o2, @longyu, @paranoidandy, @smidelis, @smarm,
Juha Nykänen @suikula, Wagner Costa @wagnercosta

@@ -824,6 +824,8 @@ Options:
- Merge the dataframe without lookahead bias
- Forward-fill (optional)

+For a full sample, please refer to the [complete data provider example](#complete-data-provider-sample) below.
+
All columns of the informative dataframe will be available on the returning dataframe in a renamed fashion:

!!! Example "Column renaming"

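For context on the renaming this hunk refers to, here is a minimal strategy fragment (not part of the commit). The pair handling, timeframes, and derived column are assumed for illustration; `merge_informative_pair` is freqtrade's helper for the lookahead-safe merge mentioned above.

```python
# Sketch of an informative-pair merge inside a strategy's populate_indicators().
# Assumes a 5m base timeframe and a 1h informative timeframe.
from freqtrade.strategy import merge_informative_pair


def populate_indicators(self, dataframe, metadata):
    informative = self.dp.get_pair_dataframe(pair=metadata['pair'], timeframe='1h')
    # Merge without lookahead bias; ffill=True forward-fills the 5m rows.
    dataframe = merge_informative_pair(dataframe, informative, self.timeframe, '1h', ffill=True)
    # Informative columns arrive renamed with the timeframe suffix, e.g. `close_1h`, `date_1h`.
    dataframe['close_above_1h'] = dataframe['close'] > dataframe['close_1h']
    return dataframe
```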
@@ -142,17 +142,20 @@ class FreqtradeBot(LoggingMixin):
        :return: None
        """
        logger.info('Cleaning up modules ...')
-
-        if self.config['cancel_open_orders_on_exit']:
-            self.cancel_all_open_orders()
-
-        self.check_for_open_trades()
-
-        self.strategy.ft_bot_cleanup()
-
-        self.rpc.cleanup()
-        Trade.commit()
-        self.exchange.close()
+        try:
+            # Wrap db activities in shutdown to avoid problems if database is gone,
+            # and raises further exceptions.
+            if self.config['cancel_open_orders_on_exit']:
+                self.cancel_all_open_orders()
+
+            self.check_for_open_trades()
+
+        finally:
+            self.strategy.ft_bot_cleanup()
+
+            self.rpc.cleanup()
+            Trade.commit()
+            self.exchange.close()

    def startup(self) -> None:
        """

@@ -283,7 +286,7 @@ class FreqtradeBot(LoggingMixin):
                    pair=trade.pair,
                    amount=trade.amount,
                    is_short=trade.is_short,
-                    open_date=trade.open_date_utc
+                    open_date=trade.date_last_filled_utc
                )
                trade.funding_fees = funding_fees
            else:

@@ -728,10 +731,11 @@ class FreqtradeBot(LoggingMixin):
        fee = self.exchange.get_fee(symbol=pair, taker_or_maker='maker')
        base_currency = self.exchange.get_pair_base_currency(pair)
        open_date = datetime.now(timezone.utc)
-        funding_fees = self.exchange.get_funding_fees(
-            pair=pair, amount=amount, is_short=is_short, open_date=open_date)
+
        # This is a new trade
        if trade is None:
+            funding_fees = self.exchange.get_funding_fees(
+                pair=pair, amount=amount, is_short=is_short, open_date=open_date)
            trade = Trade(
                pair=pair,
                base_currency=base_currency,

@@ -1486,7 +1490,7 @@ class FreqtradeBot(LoggingMixin):
            pair=trade.pair,
            amount=trade.amount,
            is_short=trade.is_short,
-            open_date=trade.open_date_utc,
+            open_date=trade.date_last_filled_utc,
        )
        exit_type = 'exit'
        exit_reason = exit_tag or exit_check.exit_reason

@@ -686,7 +686,7 @@ class Backtesting:
                self.futures_data[trade.pair],
                amount=trade.amount,
                is_short=trade.is_short,
-                open_date=trade.open_date_utc,
+                open_date=trade.date_last_filled_utc,
                close_date=exit_candle_time,
            )

@@ -212,17 +212,18 @@ def migrate_orders_table(engine, table_back_name: str, cols_order: List):
    ft_fee_base = get_column_def(cols_order, 'ft_fee_base', 'null')
    average = get_column_def(cols_order, 'average', 'null')
    stop_price = get_column_def(cols_order, 'stop_price', 'null')
+    funding_fee = get_column_def(cols_order, 'funding_fee', '0.0')

    # sqlite does not support literals for booleans
    with engine.begin() as connection:
        connection.execute(text(f"""
            insert into orders (id, ft_trade_id, ft_order_side, ft_pair, ft_is_open, order_id,
            status, symbol, order_type, side, price, amount, filled, average, remaining, cost,
-            stop_price, order_date, order_filled_date, order_update_date, ft_fee_base)
+            stop_price, order_date, order_filled_date, order_update_date, ft_fee_base, funding_fee)
            select id, ft_trade_id, ft_order_side, ft_pair, ft_is_open, order_id,
            status, symbol, order_type, side, price, amount, filled, {average} average, remaining,
            cost, {stop_price} stop_price, order_date, order_filled_date,
-            order_update_date, {ft_fee_base} ft_fee_base
+            order_update_date, {ft_fee_base} ft_fee_base, {funding_fee} funding_fee
            from {table_back_name}
            """))

@@ -307,9 +308,10 @@ def check_migrate(engine, decl_base, previous_tables) -> None:
    # Check if migration necessary
    # Migrates both trades and orders table!
    # if ('orders' not in previous_tables
-    #     or not has_column(cols_orders, 'stop_price')):
+    #     or not has_column(cols_orders, 'funding_fee')):
    migrating = False
-    if not has_column(cols_trades, 'contract_size'):
+    # if not has_column(cols_trades, 'contract_size'):
+    if not has_column(cols_orders, 'funding_fee'):
        migrating = True
        logger.info(f"Running database migration for trades - "
                    f"backup: {table_back_name}, {order_table_bak_name}")

@@ -65,6 +65,8 @@ class Order(_DECL_BASE):
    order_filled_date = Column(DateTime, nullable=True)
    order_update_date = Column(DateTime, nullable=True)

+    funding_fee = Column(Float, nullable=True)
+
    ft_fee_base = Column(Float, nullable=True)

    @property

@@ -72,6 +74,13 @@ class Order(_DECL_BASE):
        """ Order-date with UTC timezoneinfo"""
        return self.order_date.replace(tzinfo=timezone.utc)

+    @property
+    def order_filled_utc(self) -> Optional[datetime]:
+        """ last order-date with UTC timezoneinfo"""
+        return (
+            self.order_filled_date.replace(tzinfo=timezone.utc) if self.order_filled_date else None
+        )
+
    @property
    def safe_price(self) -> float:
        return self.average or self.price

@@ -119,6 +128,10 @@ class Order(_DECL_BASE):
        self.ft_is_open = True
        if self.status in NON_OPEN_EXCHANGE_STATES:
            self.ft_is_open = False
+            if self.trade:
+                # Assign funding fee up to this point
+                # (represents the funding fee since the last order)
+                self.funding_fee = self.trade.funding_fees
            if (order.get('filled', 0.0) or 0.0) > 0:
                self.order_filled_date = datetime.now(timezone.utc)
        self.order_update_date = datetime.now(timezone.utc)

@@ -179,6 +192,10 @@ class Order(_DECL_BASE):
        self.remaining = 0
        self.status = 'closed'
        self.ft_is_open = False
+        # Assign funding fees to Order.
+        # Assumes backtesting will use date_last_filled_utc to calculate future funding fees.
+        self.funding_fee = trade.funding_fees
+
        if (self.ft_order_side == trade.entry_side):
            trade.open_rate = self.price
            trade.recalc_trade_from_orders()

@@ -346,6 +363,15 @@ class LocalTrade():
        else:
            return self.amount

+    @property
+    def date_last_filled_utc(self) -> datetime:
+        """ Date of the last filled order"""
+        orders = self.select_filled_orders()
+        if not orders:
+            return self.open_date_utc
+        return max([self.open_date_utc,
+                    max(o.order_filled_utc for o in orders if o.order_filled_utc)])
+
    @property
    def open_date_utc(self):
        return self.open_date.replace(tzinfo=timezone.utc)

@@ -843,10 +869,14 @@ class LocalTrade():
        close_profit = 0.0
        close_profit_abs = 0.0
        profit = None
-        for o in self.orders:
+        # Reset funding fees
+        self.funding_fees = 0.0
+        funding_fees = 0.0
+        ordercount = len(self.orders) - 1
+        for i, o in enumerate(self.orders):
            if o.ft_is_open or not o.filled:
                continue
+            funding_fees += (o.funding_fee or 0.0)
            tmp_amount = FtPrecise(o.safe_amount_after_fee)
            tmp_price = FtPrecise(o.safe_price)

@@ -861,7 +891,11 @@ class LocalTrade():
            avg_price = current_stake / current_amount

            if is_exit:
-                # Process partial exits
+                # Process exits
+                if i == ordercount and is_closing:
+                    # Apply funding fees only to the last closing order
+                    self.funding_fees = funding_fees
+
                exit_rate = o.safe_price
                exit_amount = o.safe_amount_after_fee
                profit = self.calc_profit(rate=exit_rate, amount=exit_amount,

@@ -871,6 +905,7 @@ class LocalTrade():
                    exit_rate, amount=exit_amount, open_rate=avg_price)
            else:
                total_stake = total_stake + self._calc_open_trade_value(tmp_amount, price)
+                self.funding_fees = funding_fees

        if close_profit:
            self.close_profit = close_profit

@@ -261,11 +261,15 @@ class RPC:
                        profit_str += f" ({fiat_profit:.2f})"
                        fiat_profit_sum = fiat_profit if isnan(fiat_profit_sum) \
                            else fiat_profit_sum + fiat_profit
+                open_order = (trade.select_order_by_order_id(
+                    trade.open_order_id) if trade.open_order_id else None)
+
                detail_trade = [
                    f'{trade.id} {direction_str}',
-                    trade.pair + ('*' if (trade.open_order_id is not None
-                                          and trade.close_rate_requested is None) else '')
-                    + ('**' if (trade.close_rate_requested is not None) else ''),
+                    trade.pair + ('*' if (open_order
+                                          and open_order.ft_order_side == trade.entry_side) else '')
+                    + ('**' if (open_order and
+                                open_order.ft_order_side == trade.exit_side is not None) else ''),
                    shorten_date(arrow.get(trade.open_date).humanize(only_distance=True)),
                    profit_str
                ]

@@ -615,21 +615,25 @@ def test_calc_open_close_trade_price(
        is_short=is_short,
        leverage=lev,
        trading_mode=trading_mode,
-        funding_fees=funding_fees
    )
    entry_order = limit_order[trade.entry_side]
    exit_order = limit_order[trade.exit_side]
    trade.open_order_id = f'something-{is_short}-{lev}-{exchange}'

    oobj = Order.parse_from_ccxt_object(entry_order, 'ADA/USDT', trade.entry_side)
-    trade.orders.append(oobj)
+    oobj.trade = trade
+    oobj.update_from_ccxt_object(entry_order)
    trade.update_trade(oobj)

+    trade.funding_fees = funding_fees
+
    oobj = Order.parse_from_ccxt_object(exit_order, 'ADA/USDT', trade.exit_side)
-    trade.orders.append(oobj)
+    oobj.trade = trade
+    oobj.update_from_ccxt_object(exit_order)
    trade.update_trade(oobj)

    assert trade.is_open is False
+    assert trade.funding_fees == funding_fees

    assert pytest.approx(trade._calc_open_trade_value(trade.amount, trade.open_rate)) == open_value
    assert pytest.approx(trade.calc_close_trade_value(trade.close_rate)) == close_value