The most challenging phase in a supervised Machine Learning pipeline is parameter tuning. There are many parameters, each with a range of possible values. The so-called grid search is a brute-force approach that tries every combination of parameter values, looking for the combination that gives the smallest test error.
Most supervised Machine Learning algorithms involve a significant number of parameters, and grid search is often not practical. To put things in perspective, if there are 10 parameters and each can take on 5 possible values, there will be 5^10 (roughly 9.8 million) possible combinations to try.
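The combinatorial explosion above is easy to verify. A minimal sketch, with a hypothetical grid of 10 parameters, each having 5 candidate values:

```python
# Hypothetical parameter grid: 10 parameters, 5 candidate values each.
grid = {f"param_{i}": [1, 2, 3, 4, 5] for i in range(10)}

# Grid search must evaluate the full Cartesian product of values.
n_combinations = 1
for values in grid.values():
    n_combinations *= len(values)

print(n_combinations)  # 5**10 = 9765625
```

If a single model fit takes even one second, exhaustively evaluating all of these combinations would take months.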
In this post we demonstrate that, with a stochastic optimization technique called Simulated Annealing, a near-optimal solution can be found in significantly fewer iterations.
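To make the idea concrete, here is a minimal sketch of Simulated Annealing over a parameter grid. This is not the post's actual implementation; the parameter names, grid values, and cost function are illustrative assumptions. The `cost_fn` would normally be a cross-validated test error.

```python
import math
import random

# Hypothetical parameter grid; names and values are illustrative only.
PARAM_GRID = {
    "learning_rate": [0.001, 0.01, 0.1, 0.5, 1.0],
    "max_depth": [2, 4, 6, 8, 10],
    "min_samples": [1, 5, 10, 20, 50],
}

def random_config(grid):
    """Pick a random starting point in the grid."""
    return {k: random.choice(v) for k, v in grid.items()}

def neighbor(config, grid):
    """Perturb one randomly chosen parameter to an adjacent grid value."""
    new = dict(config)
    key = random.choice(list(grid))
    values = grid[key]
    i = values.index(new[key])
    j = min(max(i + random.choice([-1, 1]), 0), len(values) - 1)
    new[key] = values[j]
    return new

def simulated_annealing(cost_fn, grid, n_iter=200, t0=1.0, cooling=0.95):
    current = random_config(grid)
    current_cost = cost_fn(current)
    best, best_cost = current, current_cost
    temp = t0
    for _ in range(n_iter):
        cand = neighbor(current, grid)
        cand_cost = cost_fn(cand)
        delta = cand_cost - current_cost
        # Always accept improvements; accept worse moves with
        # probability exp(-delta / T), which shrinks as T cools.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current, current_cost = cand, cand_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        temp *= cooling  # geometric cooling schedule
    return best, best_cost
```

With `n_iter=200`, the search evaluates only a few hundred configurations instead of the full Cartesian product, yet the occasional uphill moves let it escape local minima that a greedy search would get stuck in.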