I remember sitting in a lecture on natural language processing (NLP) with neural networks. The great instructor showed us graphs of the parameter counts of the latest NLP models: they ranged into the billions!
He then said: "Oh, the graph's already outdated. Models are larger now."
Wow! What a time to witness such advancements: from millions to billions to … trillions of parameters. Somewhere around that time, I got hooked on all things deep learning, taking every available class on AI, plus some online courses (Coursera, MIT Deep Learning lectures).
The more I learned about machine learning and deep learning, the more I realized: there will always be more to discover. Six years later, I might know slightly more than my past machine-learning self, but I am still an absolute beginner.