I am sure we are all familiar with “AI” in our day-to-day lives by now. Although what we have today is really just machine learning, there is no doubt that these advanced statistical models have had quite an effect on society.
While I do love using machine learning in the context of data science, finding patterns one might otherwise overlook, I am far more interested in the idea of what AI could really be.
By that standard, we have by no means reached “AI”, as all we really have are statistical models at a grand scale. These models are interesting in their own right, and data science is certainly an interest of mine, but they are just that: statistics and data.
I believe that finding the “real” sources of intelligence within our own human brains is the key to any digital implementation. To dive into this, my focus has been on taking the time to understand the stochastic nature of the brain through differential equations, e.g. the Hodgkin-Huxley model. From this point onwards, my learning journey will lean more into the computational neuroscience side of things, but I believe these fields will become much more intertwined in the future.
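To make the differential-equation side concrete, here is a minimal sketch of the Hodgkin-Huxley model integrated with forward Euler. It uses the standard squid-axon parameters from the original paper; the step size, simulation length, and input current are arbitrary choices for illustration.

```python
import numpy as np

# Standard Hodgkin-Huxley squid-axon parameters:
# voltages in mV, conductances in mS/cm^2, capacitance in uF/cm^2, time in ms.
C_M = 1.0
G_NA, G_K, G_L = 120.0, 36.0, 0.3
E_NA, E_K, E_L = 50.0, -77.0, -54.387

# Voltage-dependent opening/closing rates for the m, h, n gating variables
def alpha_m(v): return 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
def beta_m(v):  return 4.0 * np.exp(-(v + 65.0) / 18.0)
def alpha_h(v): return 0.07 * np.exp(-(v + 65.0) / 20.0)
def beta_h(v):  return 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
def alpha_n(v): return 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
def beta_n(v):  return 0.125 * np.exp(-(v + 65.0) / 80.0)

def simulate(i_ext=10.0, t_max=50.0, dt=0.01):
    """Forward-Euler integration; returns the membrane-voltage trace."""
    v, m, h, n = -65.0, 0.05, 0.6, 0.32  # approximate resting-state values
    trace = []
    for _ in range(int(t_max / dt)):
        # Ionic currents through sodium, potassium, and leak channels
        i_na = G_NA * m**3 * h * (v - E_NA)
        i_k  = G_K * n**4 * (v - E_K)
        i_l  = G_L * (v - E_L)
        v += dt * (i_ext - i_na - i_k - i_l) / C_M
        m += dt * (alpha_m(v) * (1.0 - m) - beta_m(v) * m)
        h += dt * (alpha_h(v) * (1.0 - h) - beta_h(v) * h)
        n += dt * (alpha_n(v) * (1.0 - n) - beta_n(v) * n)
        trace.append(v)
    return np.array(trace)

trace = simulate()
print(f"peak membrane potential: {trace.max():.1f} mV")
```

With a sustained 10 uA/cm^2 input the model fires repetitive action potentials, which is exactly the kind of behavior that is hard to capture with a simple static model: the spiking emerges from the coupled dynamics of the four state variables, not from any one equation.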
Transformers
Transformers are essentially what led to machine learning as we know it becoming mainstream. While attempts to create language models had been made before (LSTMs and other recurrent architectures), they all faced a scaling problem: it very quickly became infeasible to train these models at larger sizes.
Transformers changed this because a large portion of their training and inference computation can be parallelized, i.e. accelerated on GPUs.
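A minimal NumPy sketch of scaled dot-product attention (the core of the transformer, here single-head and without the full multi-head machinery) shows why this parallelizes so well: the whole computation is a couple of dense matrix products, which map directly onto GPU hardware. The sequence length, model dimension, and random projection matrices below are toy stand-ins for learned weights.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d)  # one big matrix multiply
    weights = softmax(scores)                     # row-wise over key positions
    return weights @ v                            # another matrix multiply

rng = np.random.default_rng(0)
seq_len, d_model = 8, 16
x = rng.standard_normal((seq_len, d_model))
# Toy projection matrices standing in for learned Q/K/V weights
w_q, w_k, w_v = (rng.standard_normal((d_model, d_model)) for _ in range(3))
out = attention(x @ w_q, x @ w_k, x @ w_v)
print(out.shape)  # (8, 16)
```

Unlike a recurrent network, nothing here depends on processing tokens one at a time: every position attends to every other position in the same batched matrix operations, which is what makes training at scale feasible.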
Understanding Transformers and my other notes go over the mechanisms by which this is done, from the attention blocks to the MLP layers to the nature of word-meaning embeddings in large language models.
Some AI Notes
I take machine learning notes in my free time; they can be found under Overview|the conceptual notes. I plan to write more notes on 3Blue1Brown’s videos about how transformers work, but those are still to come, as noted under the Transformers heading above.