Projects

Final Project

In this project, we will apply probability to some optimization problems. Gradient descent is a well-known method for minimizing a function. However, if the function is non-convex, it may have multiple local minimizers, and gradient descent only leads to a local minimizer, which need not be optimal. A local minimizer is like a trap that stops us from moving toward the global minimizer. By introducing randomness into the process, we can kick the point out of the trap so that it evolves toward the global minimizer. For details, see Final Project.
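
The project itself spells out the method, but as a rough illustration of the idea, here is a minimal sketch comparing plain gradient descent with simulated annealing on a toy tilted double-well function f(x) = (x^2 - 1)^2 + 0.3x, which has a local minimizer near x = +1 and a global minimizer near x = -1. The function, starting point, noise schedule, and parameters are all illustrative assumptions, not the project's prescribed setup.

```python
import math
import random

# Toy non-convex objective: a tilted double well with a local minimum
# near x = +1 and the global minimum near x = -1 (illustrative choice).
def f(x):
    return (x**2 - 1)**2 + 0.3 * x

def grad_f(x):
    return 4 * x * (x**2 - 1) + 0.3

def gradient_descent(x0, steps=2000, lr=0.01):
    # Plain gradient descent converges to whichever local minimizer's
    # basin contains the starting point -- the "trap" in the text above.
    x = x0
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

def simulated_annealing(x0, steps=2000, step_size=0.2,
                        t0=1.0, cooling=0.995, seed=1):
    # Metropolis updates with a geometrically cooled "temperature" T:
    # uphill moves are accepted with probability exp(-increase / T),
    # so early on the randomness can kick the point out of a local basin,
    # while late iterations settle into a minimum as T shrinks.
    rng = random.Random(seed)
    x, best = x0, x0
    temp = t0
    for _ in range(steps):
        candidate = x + rng.gauss(0.0, step_size)
        delta = f(candidate) - f(x)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = candidate
            if f(x) < f(best):
                best = x  # track the best point visited so far
        temp *= cooling
    return best

if __name__ == "__main__":
    x0 = 1.2  # starts inside the basin of the suboptimal local minimizer
    print("gradient descent:    x =", round(gradient_descent(x0), 3))
    print("simulated annealing: x =", round(simulated_annealing(x0), 3))
```

Started at x = 1.2, plain gradient descent settles near the local minimizer at x close to +1, while the annealed run typically ends near the global minimizer at x close to -1: the high early temperature buys exploration across the barrier, and the cooling schedule trades that exploration for convergence later.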