Thx for this. "Optimizing a problem by maintaining a population of candidate solutions and creating new candidate solutions by combining existing ones according to its simple formulae, and then keeping whichever candidate solution has the best score or fitness on the optimization problem": that's exactly how humans think. Create lots of potential solutions, true/false/crazy, then go through all of them and evaluate which one is best. We do that automatically every ms of every day. In fact, that's also how evolution works.
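For what it's worth, here's that loop as a tiny Python sketch, differential-evolution style (the fitness function, mutation factor, and population sizes are all just made-up placeholders, not anything from the article):

```python
import random

def evolve(fitness, dim=2, pop_size=20, generations=100):
    # Population of candidate solutions (random vectors to start)
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # Create a new candidate by combining existing ones
            # with a simple formula: a + F * (b - c)
            a, b, c = random.sample(pop, 3)
            trial = [ai + 0.8 * (bi - ci) for ai, bi, ci in zip(a, b, c)]
            # Keep whichever candidate has the better fitness
            if fitness(trial) > fitness(pop[i]):
                pop[i] = trial
    return max(pop, key=fitness)

# Toy fitness: maximize -(x^2 + y^2), so the best solution is near (0, 0)
best = evolve(lambda v: -sum(x * x for x in v))
print(best)
```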
I've always said that what's missing in NNs is a memory system and a feedback system. I don't remember the name of the theorem (the universal approximation theorem, I think), but it says you can approximate any function with 2 or 3 layers (not sure if they counted the output layer). But that's what they are: function approximators. They can be very, very complex functions, but they still have the limitations of a function.
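To make the "function approximator" point concrete, here's a minimal sketch: one hidden layer of random tanh units fitting sin(x), with only the output layer solved by least squares. It's not how you'd actually train a net, but it shows a couple of layers nailing an arbitrary function:

```python
import numpy as np

# One hidden layer approximating sin(x) on [-pi, pi]
rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x).ravel()

W1 = rng.normal(scale=3.0, size=(1, 50))   # hidden-layer weights (random, fixed)
b1 = rng.normal(scale=3.0, size=50)
H = np.tanh(x @ W1 + b1)                    # hidden activations

w2, *_ = np.linalg.lstsq(H, y, rcond=None)  # output layer fitted by least squares
pred = H @ w2

print(np.abs(pred - y).max())               # max error over the grid -- should be tiny
```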
Now I gotta learn this!