Generally speaking, evolutionary algorithms aren’t opposed to deep learning.
An evolutionary algorithm is just one kind of search over a parameter space.
A deep learning algorithm can use an evolutionary algorithm for parameter optimization,
by using evolutionary search over network weights. Gradient descent is most often used,
but other algorithms work as well, such as simulated annealing.
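To make that concrete, here's a minimal sketch (my own toy example, not any particular paper's method) of evolutionary search over a weight vector: keep a population, select the fittest half, and refill with mutated copies. The quadratic "loss" is a hypothetical stand-in for a real network loss.

```python
import random

random.seed(0)

# Hypothetical target weights; loss() stands in for a real network loss.
TARGET = [0.5, -1.2, 3.0]

def loss(weights):
    return sum((w - t) ** 2 for w, t in zip(weights, TARGET))

def evolve(pop_size=20, generations=300, sigma=0.1):
    # Start with a random population of weight vectors.
    population = [[random.uniform(-5, 5) for _ in TARGET]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Select the fittest half, then refill with mutated copies.
        population.sort(key=loss)
        survivors = population[: pop_size // 2]
        population = survivors + [
            [w + random.gauss(0, sigma) for w in random.choice(survivors)]
            for _ in range(pop_size - len(survivors))
        ]
    return min(population, key=loss)

best = evolve()
```

No gradients anywhere: selection plus mutation alone drives the weights toward the target.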
In this case, the cited paper uses something called Cartesian Genetic Programming
which creates a dynamic network of functions, determined by its “genes”.
The nodes in the graph process input using prebuilt functions determined by their genes.
The prebuilt functions include OpenCV image processing functions and some standard vector and scalar functions.
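Roughly, CGP works like this (a heavily simplified sketch of my own; the paper's actual function set is OpenCV-based, whereas I use toy scalar primitives): each node's genes pick a primitive function and two earlier nodes or inputs as its arguments, so the genome encodes a feed-forward graph.

```python
import operator
import random

# Toy primitive set; a hypothetical stand-in for the paper's OpenCV functions.
PRIMITIVES = [operator.add, operator.sub, operator.mul,
              lambda a, b: max(a, b)]

def random_genome(n_inputs, n_nodes):
    # Each node's genes: (function index, argument index, argument index).
    genome = []
    for i in range(n_nodes):
        f = random.randrange(len(PRIMITIVES))
        a = random.randrange(n_inputs + i)  # connect only to earlier nodes
        b = random.randrange(n_inputs + i)
        genome.append((f, a, b))
    out = random.randrange(n_inputs + n_nodes)  # which node is the output
    return genome, out

def evaluate(genome, out, inputs):
    # Evaluate nodes in order; each sees the inputs and all earlier nodes.
    values = list(inputs)
    for f, a, b in genome:
        values.append(PRIMITIVES[f](values[a], values[b]))
    return values[out]

genome, out = random_genome(n_inputs=2, n_nodes=5)
result = evaluate(genome, out, [1.0, 2.0])
```

Mutating the integer genes rewires the graph or swaps functions, which is how the topology stays dynamic during search.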
In this model, the genes are really just a point in a search space.
Other search algorithms could be used for this besides evolutionary algorithms, for example simulated annealing.
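For instance, since the genome is just a point in a search space, simulated annealing can walk that space directly. A minimal sketch over a real-valued vector with a toy objective (a hypothetical fitness, not the paper's):

```python
import math
import random

random.seed(1)

def objective(x):
    # Toy fitness to minimize; stands in for any genome-scoring function.
    return sum(v * v for v in x)

def anneal(start, steps=5000, t0=1.0):
    current, best = list(start), list(start)
    for step in range(steps):
        temp = t0 * (1 - step / steps) + 1e-9  # linear cooling schedule
        candidate = [v + random.gauss(0, 0.1) for v in current]
        delta = objective(candidate) - objective(current)
        # Always accept improvements; accept worse moves with a
        # probability that shrinks as the temperature falls.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = candidate
        if objective(current) < objective(best):
            best = list(current)
    return best

best = anneal([3.0, -2.0])
```

Same idea as the evolutionary version, but with a single candidate and a temperature-controlled acceptance rule instead of a population.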
What I find interesting about the algorithm used here isn't so much
the evolutionary search side as the dynamic topology of prebuilt functions,
which, strictly speaking, isn't tied to evolutionary search.
Good to see an old favorite making a comeback, challenging the huge bandwagon that is deep learning. Anyone interested in results at or above human performance should look at the Humies.