The funding climate for AI chip startups, once as sunny as a mid-July day, is starting to cloud over as Nvidia asserts its dominance.
According to a recent report, U.S. chip firms raised just $881 million from January 2023 to September 2023 — down from $1.79 billion in the first three quarters of 2022. AI chip company Mythic ran out of money in 2022 and was nearly forced to halt operations, while Graphcore, a once-well-capitalized rival, now faces mounting losses.
But one startup appears to have found success within the ultra-competitive — and increasingly crowded — AI chip space.
Hailo, co-founded in 2017 by Orr Danon and Avi Baum, previously CTO for wireless connectivity at the microprocessor outfit Texas Instruments, designs specialized chips to run AI workloads on edge devices. Hailo’s chips execute AI tasks with lower memory usage and power consumption than a typical processor, making them a strong candidate for compact, offline and battery-powered devices like cars, smart cameras and robots.
“I co-founded Hailo with the mission to make high-performance AI available at scale outside the realm of data centers,” Danon told TechCrunch. “Our processors are used for tasks such as object detection, semantic segmentation and so on, as well as for AI-powered image and video enhancement. More recently, they’ve been used to run large language models (LLMs) on edge devices including personal computers, infotainment electronic control units and more.”
Many AI chip startups have yet to land one major contract, let alone dozens or hundreds. But Hailo has over 300 customers today, Danon claims, in industries such as automotive, security, retail, industrial automation, medical devices and defense.
In a bet on Hailo’s future prospects, a cohort of financial backers including Israeli businessman Alfred Akirov, automotive importer Delek Motors and the VC platform OurCrowd invested $120 million in Hailo this week, an extension to the company’s Series C. Danon said that the new capital will “enable Hailo to leverage all opportunities in the pipeline” while “setting the stage for long-term growth.”
“We’re strategically positioned to bring AI to edge devices in ways that will significantly expand the reach and impact of this remarkable new technology,” Danon said.
Now, you may be wondering, does a startup like Hailo really stand a chance against chip giants like Nvidia, and to a lesser extent Arm, Intel and AMD? One expert, Christos Kozyrakis, Stanford professor of electrical engineering and computer science, thinks so — he believes accelerator chips like Hailo’s will become “absolutely vital” as AI proliferates.
“The energy efficiency gap between CPUs and accelerators is too large to ignore,” Kozyrakis told TechCrunch. “You use the accelerators for efficiency with key tasks (e.g., AI) and have a processor or two on the side for programmability.”
Kozyrakis does see longevity presenting a challenge to Hailo’s leadership — for instance, if the AI model architectures its chips are designed to run efficiently fall out of vogue. Software support, too, could be an issue, Kozyrakis says, if a critical mass of developers aren’t willing to learn to use the tooling built around Hailo’s chips.
“Many of the challenges when it comes to custom chips are in the software ecosystem,” Kozyrakis said. “This is where Nvidia, for instance, has a huge advantage over other companies in AI, as they’ve been investing in software for their architectures for 15-plus years.”
But, with $340 million in the bank and a workforce numbering around 250, Danon’s feeling confident about Hailo’s path forward — at least in the short term. He sees the startup’s technology addressing many of the challenges companies encounter with cloud-based AI inference, particularly latency, cost and scalability.
“Traditional AI models rely on cloud-based infrastructure, often suffering from latency issues and other challenges,” Danon said. “They’re incapable of real-time insights and alerts, their dependency on networks jeopardizes reliability, and integration with the cloud poses data privacy concerns. Hailo is addressing these challenges by offering solutions that operate independently of the cloud, making them capable of handling much higher amounts of AI processing.”
Curious about Danon’s perspective, I asked about generative AI and its heavy dependence on the cloud and remote data centers. Surely Hailo sees the current top-down, cloud-centric model (e.g., OpenAI’s modus operandi) as an existential threat?
Danon said that, on the contrary, generative AI is driving new demand for Hailo’s hardware.
“In recent years, we’ve seen a surge in demand for edge AI applications in most industries, ranging from airport security to food packaging,” he said. “The new surge in generative AI is further boosting this demand, as we’re seeing requests to process LLMs locally by customers not only in the compute and automotive industries, but also in industrial automation, security and others.”
How about that.