Artificial Neural Networks | Interpolation vs. Extrapolation

Artificial Neural Networks (ANNs) are powerful inference tools. They can be trained to fit complex functions and then used to make predictions on new (unseen) data outside their training set. Fitting the training data is relatively easy for ANNs because of their Universal Approximation capability. However, that does not mean ANNs can learn the rules as we…
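
As a toy illustration of this interpolation/extrapolation gap (not taken from the original text), the sketch below fits a small MLP to sin(x) on a limited interval and then evaluates it inside and outside that interval. The use of scikit-learn's MLPRegressor and the specific hyperparameters are assumptions chosen only for illustration.

```python
# Minimal sketch (assumes NumPy and scikit-learn are available): fit a small
# MLP to sin(x) on [-pi, pi], then evaluate it inside the training interval
# (interpolation) and outside it (extrapolation).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Training data: x restricted to [-pi, pi]
X_train = rng.uniform(-np.pi, np.pi, size=(2000, 1))
y_train = np.sin(X_train).ravel()

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

# Interpolation: points inside the training range are fitted well
X_in = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
err_in = np.mean((model.predict(X_in) - np.sin(X_in).ravel()) ** 2)

# Extrapolation: points outside the training range are typically fitted poorly,
# even though the underlying rule (sin) has not changed
X_out = np.linspace(2 * np.pi, 3 * np.pi, 200).reshape(-1, 1)
err_out = np.mean((model.predict(X_out) - np.sin(X_out).ravel()) ** 2)

print(f"interpolation MSE:  {err_in:.4f}")
print(f"extrapolation MSE: {err_out:.4f}")
```

The network fits the training interval closely but has no reason to recover the periodic rule beyond it, so the extrapolation error is typically orders of magnitude larger.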

Genetic Algorithm vs. Stochastic Gradient Descent

Genetic Algorithm (GA) and Stochastic Gradient Descent (SGD) are well-known optimization methods, and both are used for learning in Neural Networks. There are various implementations of GA; however, most of them (e.g., NEAT) are not directly comparable to SGD because these GA methods use point/localized mutations in their connections/weights. Geoffrey Hinton, in one of his videos…
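
To make the "point/localized mutations" idea concrete, here is a minimal, hypothetical GA sketch in NumPy that evolves the flattened weights of a tiny network by perturbing a few randomly chosen weights per offspring, in contrast to SGD, which updates all weights along a gradient. The network size, selection scheme, and hyperparameters are illustrative assumptions, not a reimplementation of NEAT or of any method discussed in the video.

```python
# Minimal sketch (NumPy only): a GA that evolves the weights of a tiny
# one-hidden-layer network using point mutations on a toy regression task.
import numpy as np

rng = np.random.default_rng(0)

# Toy task: learn y = sin(x) on [-pi, pi]
X = rng.uniform(-np.pi, np.pi, size=(256, 1))
y = np.sin(X)

N_HIDDEN = 16
N_WEIGHTS = N_HIDDEN + N_HIDDEN + N_HIDDEN + 1  # W1, b1, W2, b2 flattened

def forward(w, x):
    """Run the tiny network given a flat weight vector w."""
    W1 = w[:N_HIDDEN].reshape(1, N_HIDDEN)
    b1 = w[N_HIDDEN:2 * N_HIDDEN]
    W2 = w[2 * N_HIDDEN:3 * N_HIDDEN].reshape(N_HIDDEN, 1)
    b2 = w[-1]
    return np.tanh(x @ W1 + b1) @ W2 + b2

def fitness(w):
    """Higher is better: negative mean squared error on the toy task."""
    return -np.mean((forward(w, X) - y) ** 2)

def point_mutate(w, n_points=3, scale=0.1):
    """Point/localized mutation: perturb only a few randomly chosen weights."""
    child = w.copy()
    idx = rng.choice(w.size, size=n_points, replace=False)
    child[idx] += rng.normal(0.0, scale, size=n_points)
    return child

# Evolve a population of flat weight vectors via truncation selection
population = [rng.normal(0.0, 0.5, N_WEIGHTS) for _ in range(50)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    offspring = [point_mutate(parents[rng.integers(len(parents))])
                 for _ in range(len(population) - len(parents))]
    population = parents + offspring

best = max(population, key=fitness)
print(f"best fitness (negative MSE): {fitness(best):.4f}")
```

Because each offspring changes only a handful of connections at a time, progress depends on random search over the weight space rather than on gradient information, which is why such GA variants are hard to compare head-to-head with SGD.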