Machine learning and artificial intelligence are a pair of buzzwords most people have heard in gimmicky commercials over the past few years, but those commercials never really explain what the terms mean or where they came from. The concepts have been around for decades, with prominent figures such as Alan Turing theorizing about the potential of a computer to exceed human intelligence. In 1957, Frank Rosenblatt designed the perceptron, the first neural network, which simulated processes of the human brain. A decade later, the nearest neighbor algorithm was introduced, and it remains extremely relevant to modern classification problems.
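The nearest neighbor idea is simple enough to sketch in a few lines: classify a new point by the label of the closest example you've already seen. Here is a minimal, illustrative 1-nearest-neighbor classifier in plain Python (the function name and toy data are my own, not from any particular library):

```python
import math

def nearest_neighbor(train, query):
    """Classify `query` by the label of its closest training point
    (Euclidean distance), i.e. 1-nearest-neighbor classification."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    # Find the (point, label) pair whose point is nearest to the query.
    point, label = min(train, key=lambda pl: dist(pl[0], query))
    return label

# Toy data: two clusters, labeled "a" and "b".
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((5.0, 5.0), "b"), ((5.1, 4.9), "b")]
print(nearest_neighbor(train, (0.2, 0.1)))  # → a
print(nearest_neighbor(train, (4.8, 5.2)))  # → b
```

Modern variants vote among the k closest points instead of just one, but the core idea has not changed since the 1960s.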
By now, it’s no secret that the current AI and ML boom we are experiencing is directly linked to massive data sets. We can attribute the Big Data revolution, at a high level, to the ubiquity of the internet. The engine that lets data scientists churn through those large quantities of data is the modern, powerful GPU (I can hear the data scientists reading this article muttering the phrase “overfitting” as they read that last sentence). How we arrived at the powerful hardware we use today is one part of the equation in the super catchy title of this write-up. Nvidia GPUs are the most common among machine learning rigs because Nvidia produces some of the most powerful consumer-facing video cards on the market (a random 14-year-old kid playing Fortnite at 180 FPS can tell you all about it if you need to know). Do you like those self-driving cars? Well, even if you don’t, there is still Nvidia hardware in there powering the tech that keeps you safe from dangerous human drivers.
So what has funded the massive advance in GPU technology over the past two decades? That 14-year-old and all of their friends. You can almost certainly attribute Nvidia’s increased demand during the first two decades of its life to the growth of the gaming industry. Shortly after the release of what is now considered the most popular MMORPG on the planet, World of Warcraft, Nvidia’s stock more than quadrupled over the next three years. They, like many other companies, took a hit from the 2008 recession, but there is a direct parallel between Activision Blizzard’s stock and theirs during the recovery.
2012: the year the Mayans prophesied as the end of the human race, or so I’ve been told. I apologize in advance for setting off the conspiracy theory alarmists who live next door, but 2012 is the year many point to as the beginning of AI as we know it, the year we saw a machine do something with greater speed and accuracy than humans had ever achieved. The trend became abundantly clear: AI will outperform human beings on any task, eventually. After seeing this, data scientists such as myself embraced the emerging machine learning frameworks like TensorFlow, Caffe, and a few others, and we saw the AI winter turn to spring. Investments started to pour in from VCs and Wall Street, tech companies started to adopt AI into their strategy, and we witnessed the start of the fourth age of human history. Who was there to usher all this in on the hardware side? The folks at Nvidia made it so that you can ONLY use their GPUs with the most popular frameworks like TensorFlow. Genius.
Around this same time, a frenzy filled with FOMO and fantasy rekindled the spirit of the San Francisco gold rush, except this time our modern alchemists were manufacturing the precious resource in that part of the US rather than plucking it from the earth. Bitcoin mining rigs became a hot commodity, and unlike a gaming PC, which needs just one GPU, these machines were stripped bare so they could cram in as many GPUs as possible. In the span of two years, Nvidia’s stock went from around $35 to $250, and its R&D budget expanded to more than $2 billion! Cloud tech companies like Amazon’s AWS and Microsoft’s Azure began to add GPU computing to their offerings, so data scientists globally could access what was previously available only in supercomputing labs. Now it’s the standard to train these computationally heavy ML models in the cloud, and if you don’t, you need an Nvidia GPU.
To summarize, pizza-fueled gaming binges and a modern gold rush are how we arrived at the tech age, and the Mayans actually stopped their calendar in 2012 because the predictive model they used determined that the culmination of globally aggregated technology would arrive in that exact year as a manifestation of modern A.I.