Deep learning, an artificial intelligence-based technology that requires increased computer power and speed to support the latest algorithms, is driving a host of startups to develop AI-specific chips.
According to a New York Times report, venture capitalists have shifted their focus to chip startups, betting that these companies can open up a new revenue stream and become dominant players in this emerging segment.
The NYT report said that close to 45 startups were developing chips to handle tasks ranging from speech recognition to meeting the hardware demands of self-driving cars. While the startups may not be in a position to challenge the dominance of chip majors such as Intel, Qualcomm and Nvidia, their focus will be on finding a niche that makes their businesses profitable.
In 2017, venture capital firms invested over $1.5 billion in such startups, with five of them raising more than $100 million each. That was nearly double the investment seen in 2015.
“Machine learning and AI has reopened questions around how to build computers,” Bill Coughran, who helped oversee the global infrastructure at Google for several years, and is now a partner at Sequoia, the Silicon Valley venture capital firm, told the NYT.
The emergence of chip startups has coincided with efforts by Google, Microsoft and other internet giants to develop services for face recognition, voice commands and other AI-driven responses, which require systems to be fed enormous amounts of data.
Intel, which entered the game late, recently acquired Nervana, a 50-employee Silicon Valley firm developing an AI chip from scratch, for $400 million. Soon after, another Silicon Valley startup, Cerebras, hired five Nervana engineers as it was also looking to develop an AI chip.
According to a Forbes report, Cerebras had raised over $100 million by early 2018. Other firms to have raised similar amounts include UK-based Graphcore and Silicon Valley's Wave Computing, besides two Chinese government-backed companies, Horizon Robotics and Cambricon.
However, the battle does not seem to be limited to neural-network processing alone. Graphcore, for instance, is designing chips with more on-board memory so that less data has to be shuttled on and off the chip. Other firms are working on ways to speed up data exchange by increasing the flow of data between chips and the networks that feed them.
“This is not just about building chips, but looking at how these chips are connected together and how they talk to the rest of the system,” said Coughran.