Evolving a neural network trading bot #711
languagelawyer started this conversation in Show and tell
I played with direct ANN weight encoding a bit to evolve an NN trading bot.

Currently there are 3 objectives; one of them is the number of positions (to maximize), which encourages the bot to do something.

2025-05-13 UPD: the objectives now are `[returns in each subperiod] ++ [total return]`.
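For concreteness, here is a minimal sketch of how a setup like this can be wired together in `pymoo` (this is not the repo's actual code: `backtest_returns`, the weight-vector length, the bounds and the choice of NSGA-II are all placeholder assumptions). Since `pymoo` minimizes objectives, the returns are negated.

```python
import numpy as np
from pymoo.core.problem import Problem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

N_WEIGHTS = 2048     # hypothetical: total number of NN parameters (direct encoding)
N_SUBPERIODS = 4     # hypothetical: number of backtest subperiods


def backtest_returns(weights: np.ndarray) -> np.ndarray:
    """Hypothetical hook: load `weights` into the NN, run the backtester and
    return [return in subperiod 1, ..., return in subperiod N, total return]."""
    raise NotImplementedError


class TradingBotProblem(Problem):
    def __init__(self):
        # Each decision variable is an NN weight (direct weight encoding).
        super().__init__(n_var=N_WEIGHTS, n_obj=N_SUBPERIODS + 1, xl=-1.0, xu=1.0)

    def _evaluate(self, X, out, *args, **kwargs):
        # pymoo minimizes, so negate the returns to maximize them.
        out["F"] = np.array([-backtest_returns(x) for x in X])


res = minimize(TradingBotProblem(), NSGA2(pop_size=100), ("n_gen", 50), verbose=True)
```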
At first, I had a pure Python implementation of the backtesting engine, with PyTorch in `inference_mode` for the NN. This was slow as hell. I'm backtesting on 1s Binance Spot historical data (candles), and it was taking 30-60 seconds to backtest 100 individuals over just 40-60 minutes of data (using `multiprocessing.Pool`).
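For illustration, a rough sketch of the shape of that initial setup (hypothetical names, not the actual engine): one tiny forward pass per 1-second candle inside a Python loop, which is exactly the part that crawls.

```python
import torch


def backtest_individual(model: torch.nn.Module, candles) -> float:
    """Pure-Python backtest loop: one small forward pass per 1s candle.

    `model` is assumed to return (action, new_hidden_state); the actual
    position/PnL bookkeeping is elided to keep the structure visible.
    """
    equity = 1.0
    hidden = None  # recurrent state, fresh at the start of the run
    with torch.inference_mode():
        for candle in candles:
            features = torch.as_tensor(candle, dtype=torch.float32).unsqueeze(0)
            action, hidden = model(features, hidden)  # tiny forward pass per candle
            # ... open/close positions based on `action`, update `equity` ...
    return equity - 1.0  # return over the period
```

Each generation repeats this loop for every individual, hence the `multiprocessing.Pool` fan-out.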
I decided to rewrite the backtesting code and the NN implementation in C++ (I only need the forward pass, no autograd/backprop, so it is very little code) and leave Python just for the `pymoo` algorithms. This turned out to be more than 1000 times faster: it takes ~40 seconds to backtest 100 individuals over a month of data (using `multiprocessing.pool.ThreadPool`).
Initially I had terrible overfitting: starting the backtest just 1 second (1 candle) later than before would turn almost all "profitable" individuals into unprofitable ones. Now I split the backtesting period into a few subperiods and try to make the model profitable in EVERY subperiod (I backtest each subperiod independently, i.e. starting with zero RNN state etc.). This seems to help, at least to some degree.
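A sketch of that per-subperiod evaluation (again with hypothetical helpers, not the repo's code): the candle series is chopped into equal chunks and each chunk is backtested from scratch, so no recurrent state or accumulated profit leaks between subperiods.

```python
import numpy as np


def subperiod_objectives(backtest_one, candles: np.ndarray, n_subperiods: int) -> np.ndarray:
    """`backtest_one(chunk)` is assumed to run one independent backtest
    (zeroed RNN state, fresh account) and return that chunk's return."""
    chunks = np.array_split(candles, n_subperiods)
    per_period = np.array([backtest_one(chunk) for chunk in chunks])
    # One possible reading of the [total return] objective: a separate
    # whole-period run; compounding the per-period returns is another option.
    total = backtest_one(candles)
    return np.concatenate([per_period, [total]])
```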
See https://github.com/languagelawyer/genetic_trade
Replies: 1 comment

very cool! thanks for posting this here!