The term “artificial intelligence” was coined at the 1956 Dartmouth Conference, and until the past few years it was largely considered science fiction. From R2-D2 and C-3PO to The Terminator, people have long wondered what it would be like if machines could think, learn, reason, and behave like humans. AI made for great entertainment, but it never seemed realistic.
Now that far more compute power is available, AI is no longer far-fetched. Graphics processing units (GPUs), which perform the massively parallel calculations behind AI workloads much faster than traditional central processing units (CPUs), have helped enable the rapid growth of AI applications. Demand for AI hardware is high — TrendForce predicts that about 1.2 million AI servers will ship in 2023, representing almost 9 percent of total server shipments.
Today, investing in AI technology is within the reach of almost any business. Before making those investments, however, organizations need to understand what AI is, what it is not, and how it can be used.