insideHPC Readers: Weigh in on Why AI is Taking Off Now

This may indeed be the year of artificial intelligence, when the technology comes into its own for mainstream businesses.

Many well-known names have committed to adding AI solutions to their product mix: General Electric is pushing its AI business called Predix, IBM runs ads featuring its Watson technology talking with Bob Dylan, and just recently CRM giant Salesforce announced it would be adding AI to its products. Its system, called Einstein, promises to provide insights into which sales leads to follow and what products to make next. These moves represent years of development and billions in investment, and there are big pushes for AI in manufacturing, agriculture, healthcare and many other industry sectors.

But will other companies recognize whether AI has value for them? Perhaps the better question is “Why now?” That question gets at both the size of the opportunity and why so many companies are afraid of missing out.

Please give us your opinion: does your company have any plans to become an AI Enterprise?

Much of the reason why AI, machine learning and deep learning are crushing it today goes back to 2006, when Amazon Web Services started providing low-cost computing in the cloud. Ten years later, computing is cheaper than ever, allowing companies to avoid expensive infrastructure costs and opening the floodgates for enormous volumes of data streaming in from an ever-increasing number of sensors. Google recently climbed aboard the bandwagon with Google Cloud, further validating the trend.

The other big reason AI is on its upward trajectory is the rise of relatively inexpensive compute resources. Machine learning techniques like artificial neural networks were widely used in the 1980s and early 1990s, but their popularity diminished in the late 1990s, in part because a neural network is a computationally expensive algorithm. More recently, neural networks have had a major resurgence: computers have become fast enough to run large-scale networks, and since 2006 advanced neural networks have been used to realize the methods referred to as deep learning. GPUs (graphics processing units) are the hardware innovation that has provided much of this compute power. The cloud and GPUs are merging as well, with AWS now offering GPU access in the cloud.
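For readers who want a concrete picture of what that compute shift looks like in practice, here is a minimal sketch of a neural-network training loop that runs on a CPU or moves to a GPU when one is available. PyTorch is assumed here (the article names no particular framework), and the network sizes and data are made up for illustration.

```python
# Minimal sketch: the same training loop runs on a CPU or, when available,
# on a GPU such as those now offered by cloud providers.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A small fully connected network -- the kind of model whose cost grows
# quickly with width and depth, which is why cheap GPU compute matters.
model = nn.Sequential(
    nn.Linear(1024, 512), nn.ReLU(),
    nn.Linear(512, 10),
).to(device)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Synthetic batch, just to exercise the forward/backward passes.
x = torch.randn(256, 1024, device=device)
y = torch.randint(0, 10, (256,), device=device)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()   # backpropagation: the computationally expensive part
    optimizer.step()
```

On a cloud GPU instance the only change is that the GPU is detected; the model and data are simply moved to that device, and training that would crawl on a CPU becomes practical.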


Comments

  1. Sean O'Connor says

    Object recognition is the key to robotics and a lot of other applications. Done.
    Deep neural nets are very expensive to train, both in hardware and in your electricity bill. To actually use them after training, you only need minimal hardware. You could envision a robotics company training a deep neural net for a month on a supercomputer (maybe at a cost of $1,000,000) and then putting the trained net into 1,000,000 robots. You could end up with a marketplace for nets that have been trained for this thing or that.
    The other thing about robotics is that very low-cost actuators are likely to become available.
    Cents-per-actuator robotics, if you like: http://www.pnas.org/content/early/2016/09/21/1605273113.abstract
    Actually that is as big a deal as AI.
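A minimal sketch of the train-once, deploy-cheaply pattern this comment describes might look like the following. PyTorch is assumed, and the network, file name and “sensor” input are purely illustrative.

```python
# Illustrative sketch: train on expensive hardware once, then ship the
# trained weights to many devices that only need cheap CPU inference.
import torch
import torch.nn as nn

def build_net() -> nn.Module:
    # Stand-in for whatever network the robot would actually use.
    return nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 4))

# --- Training side: runs on a GPU cluster or supercomputer ---
net = build_net().to("cuda" if torch.cuda.is_available() else "cpu")
# ... expensive training loop omitted ...
torch.save(net.state_dict(), "trained_net.pt")  # this file is what gets shipped

# --- Deployment side: each robot loads the weights and runs on a CPU ---
robot_net = build_net()
robot_net.load_state_dict(torch.load("trained_net.pt", map_location="cpu"))
robot_net.eval()
with torch.no_grad():
    sensor_input = torch.randn(1, 64)        # hypothetical sensor reading
    action_scores = robot_net(sensor_input)  # cheap forward pass only
```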