The ML algorithms haven't really advanced much at all from the training algorithms that popped up in the '70s and '80s. Really the only difference between old "neural nets" and "deep neural nets" is more hidden layers, which became practical thanks to advances in hardware, not in the training algorithms themselves.
I know what you mean. I'm very inspired by
Wobbledogs and I'm working my way through the
genetic algorithm tutorial linked from a post on
how Wobbledogs does it. Unfortunately, classic GA/EA has a piece missing - the brain. I'm afraid that once I get GA into Cave Confectioner, nobody will care, because all it can do is optimize where the cavemen go to get food and whatnot, rather than evolving behaviors. If I put a neural network in the chromosome struct, it would be limited to a fixed size and would probably need gated units (LSTM, GRU, etc.) to keep itself sane. It would require more code, and I don't know how to use neural networks yet.
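For what it's worth, the fixed-size version is less code than it sounds. Here's a minimal sketch of the idea: a feedforward net whose weights live in a flat list, so the whole brain *is* the chromosome and mutation is just jiggling numbers. All the names and layer sizes here are made up for illustration, not anything from Cave Confectioner:

```python
import random

# Hypothetical layer sizes for illustration only.
N_IN, N_HID, N_OUT = 3, 4, 2
GENOME_LEN = N_IN * N_HID + N_HID * N_OUT

def random_genome():
    # The chromosome is just a flat list of weights.
    return [random.uniform(-1.0, 1.0) for _ in range(GENOME_LEN)]

def forward(genome, inputs):
    # First slice of the genome: input -> hidden weights.
    hidden = []
    for h in range(N_HID):
        s = sum(genome[h * N_IN + i] * inputs[i] for i in range(N_IN))
        hidden.append(max(0.0, s))  # ReLU, to keep the sketch simple
    # Second slice: hidden -> output weights.
    off = N_IN * N_HID
    return [sum(genome[off + o * N_HID + h] * hidden[h] for h in range(N_HID))
            for o in range(N_OUT)]

def mutate(genome, rate=0.1, scale=0.5):
    # GA mutation: perturb each weight with some probability.
    return [w + random.gauss(0.0, scale) if random.random() < rate else w
            for w in genome]
```

No backprop anywhere; the GA's selection pressure does all the "learning," which is exactly why the fixed topology becomes the bottleneck.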
Ideally I would use NEAT (NeuroEvolution of Augmenting Topologies), which puts a neural network inside a genetic algorithm, except the network's topology evolves along with its weights, so it grows to the size the problem needs. That's the state of the art for this kind of neuroevolution.
As for nav mesh, I've actually never really used one myself. I'm assuming the general idea is to triangulate and then simplify your mesh, and once done, run some other path-finding algorithm? Or in other words, use mesh topology and geometry algorithms to generate "nice" data for path-finding to consume?
I haven't used one either, but I do know that the pathfinding step is classic A*, and in the case of Recast it does pretty much what you just described, except that the library simplifies all the geometry automatically so a human doesn't have to fiddle with it. It's not classic, orthodox NavMesh; it's NavMesh++.
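To make the "mesh topology feeds A*" idea concrete: once the mesh is simplified into polygons, the search is plain A* over the polygon adjacency graph. This is a hedged toy sketch, not Recast's actual API; the polygon centers and adjacency table below are made-up data:

```python
import heapq
import math

# Toy "navmesh": polygon id -> center point, and which polygons share an edge.
centers = {0: (0, 0), 1: (2, 0), 2: (4, 0), 3: (2, 2)}
adjacent = {0: [1], 1: [0, 2, 3], 2: [1], 3: [1]}

def dist(a, b):
    (ax, ay), (bx, by) = centers[a], centers[b]
    return math.hypot(ax - bx, ay - by)

def astar(start, goal):
    """A* over the polygon graph; straight-line distance is the heuristic."""
    frontier = [(0.0, start)]
    came_from = {start: None}
    cost = {start: 0.0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for nxt in adjacent[cur]:
            new_cost = cost[cur] + dist(cur, nxt)
            if nxt not in cost or new_cost < cost[nxt]:
                cost[nxt] = new_cost
                came_from[nxt] = cur
                heapq.heappush(frontier, (new_cost + dist(nxt, goal), nxt))
    return None  # goal unreachable

# astar(0, 3) walks polygon 0 -> 1 -> 3.
```

A real navmesh pathfinder then runs a string-pulling ("funnel") pass to turn the polygon sequence into an actual smooth path, but the graph search itself is just this.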