Without a doubt, 2016 was an amazing year for Machine Learning (ML) and Artificial Intelligence (AI). During the year, we saw nearly every high-tech CEO claim the mantle of becoming an “AI Company”. However, only a few companies were actually able to monetize their significant investments in AI, notably Amazon, Baidu, Facebook, Google, IBM, Microsoft, Tesla Motors and NVIDIA. But 2016 was nonetheless a year of many firsts. As a poster child for the potential of ML, Google DeepMind mastered the subtle and infinitely complex game of Go, soundly beating the reigning world champion. And more than a few cool products were introduced that incorporated Machine Learning, from the first autonomous vehicles to new “intelligent” household assistants such as Google Home and Amazon Echo. But will 2017 finally usher in the long-promised age of Artificial Intelligence?
Let’s start by distinguishing two terms that are not interchangeable: AI and Machine Learning. Machine Learning, a fundamentally different way to program a computer by training it with a massive ocean of sample data rather than with explicit instructions, is real and is here to stay. General Artificial Intelligence remains a distant goal, perhaps 5-20 years away depending on the specific domain of the “intelligence” being learned. To be sure, computers trained using Machine Learning hold tremendous promise, as well as the potential for massive disruption in the workplace. But these systems remain a far cry from genuine intelligence. Just ask Apple’s Siri, and you will see what I mean. The hype around AI, and the confusion over what the term actually means, will inevitably lead to some disillusionment as the limitations of this technology become apparent.
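To make “programming by training” concrete, here is a minimal illustrative sketch in plain Python with NumPy (my own toy example, not code from any of the products mentioned): rather than writing the rule for logical OR explicitly, we hand the machine labeled samples and let a classic perceptron infer the rule from the data.

```python
import numpy as np

# Labeled samples: inputs and desired outputs for logical OR.
# Instead of coding the rule "a or b", we let the machine infer it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])

w = np.zeros(2)   # weights, learned from data
b = 0.0           # bias, learned from data

# Classic perceptron training loop: nudge the weights toward each
# misclassified sample until every prediction is correct.
for _ in range(20):
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        w += 0.1 * (target - pred) * xi
        b += 0.1 * (target - pred)

def predict(xi):
    return 1 if xi @ w + b > 0 else 0

print([predict(xi) for xi in X])  # → [0, 1, 1, 1], matching OR
```

The same idea, scaled up from two weights to millions and from four samples to that “ocean” of data, is what powers the deep neural networks behind today’s speech and image systems.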
With that context in mind, here’s what I expect for the coming year for Machine Learning and AI.
1. Hardware accelerators for Machine Learning will proliferate.
Today, nearly all training of deep neural networks (DNNs) is performed using NVIDIA GPUs. Conversely, DNN inference, or the actual use of a trained network, can be done efficiently on CPUs, GPUs, FPGAs, or even specialized ASICs such as the Google TPU, depending on the type of data being analyzed. Both the training and inference markets will be hotly contested in 2017, as Advanced Micro Devices’ GPUs, Intel’s newly acquired Nervana chips, NVIDIA, Xilinx and several startups all launch accelerators specifically targeting this lucrative market. If you would like a deeper dive into the various semiconductor alternatives for AI, please see my companion article on this subject here.
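The training/inference split described above ultimately comes down to arithmetic: inference is a single forward pass through the network, while training adds a backward (gradient) pass and a weight update, repeated over many epochs. The following NumPy sketch of a one-hidden-layer network (purely illustrative; real workloads run on frameworks and accelerator libraries, and the toy task here is my own assumption) shows the two workloads side by side.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny one-hidden-layer network: 4 inputs -> 8 hidden units -> 1 output.
W1 = rng.normal(scale=0.5, size=(4, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))

def forward(x):
    """Inference: a single forward pass -- just matrix multiplies and
    element-wise ops, which map readily onto CPUs, FPGAs, or ASICs."""
    h = np.maximum(x @ W1, 0)        # ReLU hidden layer
    return h @ W2, h

def train_step(x, target, lr=0.05):
    """Training: a forward pass PLUS gradient computation and a weight
    update -- the extra, repeated work that favors large GPUs."""
    global W1, W2
    out, h = forward(x)
    err = out - target               # d(loss)/d(out) for squared error
    n = len(x)
    grad_W2 = h.T @ err / n
    grad_h = err @ W2.T
    grad_h[h <= 0] = 0               # ReLU gradient mask
    grad_W1 = x.T @ grad_h / n
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2
    return float((err ** 2).mean())

# Fit a toy target (the sum of the inputs), then use the trained net.
x = rng.normal(size=(32, 4))
target = x.sum(axis=1, keepdims=True)
losses = [train_step(x, target) for _ in range(500)]
pred, _ = forward(x)                 # deployment-time inference
print(losses[0], losses[-1])         # loss decreases over training
```

Note the asymmetry: `forward` is two matrix multiplies, while `train_step` calls `forward` and then roughly doubles the work with gradients and updates, thousands of times over. That asymmetry is why a network trained on racks of GPUs can be served from a modest CPU, FPGA, or ASIC.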
2. Select application domains will leverage Machine Learning to improve efficiency of mission-critical processes.
If you are trying to find the killer AI app, the increasingly pervasive nature of the technology will make one difficult to identify. However, Machine Learning has begun to deliver spectacular results in very specific niches where its pattern-recognition capabilities can be exploited, and this trend will continue to expand into new markets in 2017.