Hacker News
Learning Without a Gradient (Part 2) (pchiusano.github.io)
2 points by dragontamer on Nov 5, 2020 | hide | past | favorite | 1 comment



For most "deep learning" algorithms, the typical approach is "gradient ascent": following the slope of the loss surface to find incrementally better-and-better results.

Genetic Algorithms, however, do not necessarily need a gradient to find a better solution, though there are similarities between GAs and hill-climbing. This blog post explores a conceptual similarity between traditional gradient ascent / backpropagation (currently a favored technique for neural nets) and genetic algorithms.
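The contrast can be sketched in a few lines. This is a minimal illustration, not code from the linked post: all names (`f`, `grad_f`, the step sizes) are made up for the example. Gradient ascent follows the analytic slope; the hill climber uses only random mutation and accept-if-better, the selection loop at the heart of simple GAs.

```python
import random

def f(x):
    """Objective to maximize: a smooth bump peaked at x = 2."""
    return -(x - 2.0) ** 2

def grad_f(x):
    """Analytic gradient of f, available only if f is differentiable."""
    return -2.0 * (x - 2.0)

def gradient_ascent(x, steps=200, lr=0.1):
    # Follow the slope directly; each step moves uphill along grad_f.
    for _ in range(steps):
        x += lr * grad_f(x)
    return x

def hill_climb(x, steps=200, sigma=0.5, seed=0):
    # No gradient needed: propose a random mutation, keep it
    # only if it improves f. This is selection without a slope.
    rng = random.Random(seed)
    for _ in range(steps):
        candidate = x + rng.gauss(0.0, sigma)
        if f(candidate) > f(x):
            x = candidate
    return x

print(gradient_ascent(-5.0))  # converges near the peak at 2.0
print(hill_climb(-5.0))       # also approaches 2.0, without using grad_f
```

Both methods climb the same hill here; the difference is that the mutate-and-select loop still works when no gradient exists, which is the connection the post draws between backpropagation and genetic algorithms.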





