Hi! Thank you so much for making deap! I noticed that gp.cxSemantic includes the whole genome of both parents in each of the two offspring.
This more than doubles offspring size every generation, rather like a software model of tetra-/hexa-/octoploidy. That is cool in itself, but under a strict size or depth limit it becomes a problem: the operator can get stuck producing only offspring that exceed the limit(s).
Would it be possible to invent some kind of meiosis function for GP trees? If we had a way to divide each parent in half, we could apply cxSemantic indefinitely while still respecting the size and depth limits. It's not a big deal, since other crossover operators are available, but it could be an interesting project because it aligns so well with the fundamentals of cell biology. I promise to share if I figure it out. Thank you for taking the time to read this.
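To make the idea concrete, here is a minimal, self-contained sketch of what such a "meiosis" operator might look like. It does not use deap's API; instead it represents a GP tree in prefix order as a list of `(name, arity)` pairs, with a `subtree_slice` helper in the spirit of `PrimitiveTree.searchSubtree`. The `meiosis` function below is entirely hypothetical: it just returns the largest subtree whose size is at most half of the parent, falling back to a single leaf when no such subtree exists.

```python
import random

# A GP tree in prefix (preorder) notation: each node is (name, arity).
# Example: add(mul(x, y), x) -> [("add",2), ("mul",2), ("x",0), ("y",0), ("x",0)]

def subtree_slice(tree, begin):
    """Return the slice covering the subtree rooted at index `begin`
    (same idea as deap's PrimitiveTree.searchSubtree)."""
    end = begin + 1
    remaining = tree[begin][1]            # children still to consume
    while remaining > 0:
        remaining += tree[end][1] - 1     # node fills one slot, adds its arity
        end += 1
    return slice(begin, end)

def meiosis(tree, rng=random):
    """Hypothetical 'meiosis' operator: return a valid subtree whose size
    is at most half of the parent's, so repeated semantic crossover could
    stay under a size limit.  Falls back to a random leaf if the tree is
    too small to split."""
    half = len(tree) // 2
    candidates = []
    for i in range(len(tree)):
        s = subtree_slice(tree, i)
        size = s.stop - s.start
        if 0 < size <= half:
            candidates.append(s)
    if not candidates:                    # tree too small; return a leaf
        leaves = [i for i, (_, arity) in enumerate(tree) if arity == 0]
        i = rng.choice(leaves)
        return tree[i:i + 1]
    best = max(candidates, key=lambda s: s.stop - s.start)
    return tree[best]

tree = [("add", 2), ("mul", 2), ("x", 0), ("y", 0), ("x", 0)]
child = meiosis(tree)
```

One open question with this sketch is that dropping half the genome also drops half the semantics, so whether the result still behaves well under semantic crossover would need experimenting; a smarter variant might pick the subtree whose output correlates best with the full tree's.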