Dropbox, Figma CEOs back Lamini, a startup building a generative AI platform for enterprises


Lamini, a Palo Alto-based startup building a platform to help enterprises deploy generative AI tech, has raised $25 million from investors including Stanford computer science professor Andrew Ng.

Lamini, co-founded several years ago by Sharon Zhou and Greg Diamos, has an interesting sales pitch.

Many generative AI platforms are far too general-purpose, Zhou and Diamos argue, and don’t have solutions and infrastructure geared to meet the needs of companies. In contrast, Lamini was built from the ground up with enterprises in mind, and is focused on delivering high generative AI accuracy and scalability.

“The highest priority of nearly every CEO, CIO and CTO is to make the most of generative AI inside their organization with maximal ROI,” Zhou, Lamini’s CEO, told TechCrunch. “But while it’s easy to get a working demo on a laptop for an individual developer, the path to production is strewn with failures left and right.”

To Zhou’s point, many companies have expressed frustration with the hurdles to meaningfully embracing generative AI across their business functions.

According to a March poll from MIT Insights, only 9% of organizations have widely adopted generative AI despite 75% having experimented with it. Top hurdles run the gamut from a lack of IT infrastructure and capabilities to poor governance structures, insufficient skills and high implementation costs. Security is a significant factor, too: in a recent survey by Insight Enterprises, 38% of companies said security was impacting their ability to leverage generative AI tech.

So what’s Lamini’s answer?

Zhou says that “each piece” of Lamini’s tech stack has been optimized for enterprise-scale generative AI workloads, from the hardware to the software, including the engines used to support model orchestration, fine-tuning, running and training. “Optimized” is a vague word, granted, but Lamini is pioneering one step that Zhou calls “memory tuning,” a technique to train a model on data such that it recalls parts of that data exactly.

Memory tuning can potentially reduce hallucinations, Zhou claims, or instances when a model makes up facts in response to a request.

“Memory tuning is a training paradigm — as efficient as fine-tuning, but goes beyond it — to train a model on proprietary data that includes key facts, numbers and figures so that the model has high precision,” Nina Wei, an AI designer at Lamini, told me via email, “and can memorize and recall the exact match of any key information instead of generalizing or hallucinating.”

I’m not sure I buy that. “Memory tuning” appears to be more a marketing term than an academic one; there are no research papers about it, at least none that I managed to turn up. I’ll leave it to Lamini to show evidence that its “memory tuning” is better than the other hallucination-reducing techniques that are being, or have been, attempted.
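For what it’s worth, the behavior Wei describes — exact recall of stored facts, generalization otherwise — can be caricatured in a toy sketch. To be clear, this is entirely my own illustration, not Lamini’s method or code: a real implementation would involve fine-tuning model weights, not a lookup table.

```python
# Toy caricature of the claimed "memory tuning" behavior (my sketch, not
# Lamini's code): facts from proprietary data are stored for exact recall,
# and anything outside that store falls back to a generic, potentially
# wrong completion -- the analog of a hallucination.

GENERIC_ANSWER = "approximately $10M"  # stand-in for a plausible-sounding guess


class ToyMemoryTunedModel:
    def __init__(self):
        self.memory = {}  # exact-recall store built from key facts

    def memory_tune(self, facts):
        """'Train' on (question, answer) pairs so answers can be recalled verbatim."""
        for question, answer in facts:
            self.memory[question.lower().strip()] = answer

    def generate(self, prompt):
        # Exact recall wins; otherwise the model "hallucinates" a generic answer.
        return self.memory.get(prompt.lower().strip(), GENERIC_ANSWER)


model = ToyMemoryTunedModel()
model.memory_tune([("What was Q3 revenue?", "$12.4M")])
print(model.generate("What was Q3 revenue?"))  # -> $12.4M (exact recall)
print(model.generate("What was Q4 revenue?"))  # -> approximately $10M (made up)
```

The interesting (and unverified) part of Lamini’s claim is achieving this exact-recall property inside the weights of a fine-tuned model rather than via an external store like the one above.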

Fortunately for Lamini, memory tuning isn’t its only differentiator.

Zhou says the platform can operate in highly secured environments, including air-gapped ones. Lamini lets companies run, fine-tune, and train models on a range of configurations, from on-premises data centers to public and private clouds. And it scales workloads “elastically,” reaching over 1,000 GPUs if the application or use case demands it, Zhou says.

“Incentives are currently misaligned in the market with closed source models,” Zhou said. “We aim to put control back into the hands of more people, not just a few, starting with enterprises who care most about control and have the most to lose from their proprietary data being owned by someone else.”

Lamini’s co-founders are, for what it’s worth, quite accomplished in the AI space. They’ve also individually brushed shoulders with Ng, which no doubt explains his investment.

Zhou was previously faculty at Stanford, where she headed a group researching generative AI. Prior to receiving her doctorate in computer science under Ng, she was a machine learning product manager at Google Cloud.

Diamos, for his part, co-founded MLCommons, the engineering consortium dedicated to creating standard benchmarks for AI models and hardware, as well as the MLCommons benchmarking suite, MLPerf. He also led AI research at Baidu, where he worked with Ng while the latter was chief scientist there. Diamos was also a software architect on Nvidia’s CUDA team.

The co-founders’ industry connections appear to have given Lamini a leg up on the fundraising front. Along with Ng, Figma CEO Dylan Field, Dropbox CEO Drew Houston, OpenAI co-founder Andrej Karpathy, and — strangely enough — Bernard Arnault, the CEO of luxury goods giant LVMH, have all invested in Lamini.

AMD Ventures is also an investor (a bit ironic considering Diamos’ Nvidia roots), as are First Round Capital and Amplify Partners. AMD got involved early, supplying Lamini with data center hardware, and today, Lamini runs many of its models on AMD Instinct GPUs, bucking the industry trend.

Lamini makes the lofty claim that its model training and running performance is on par with equivalent Nvidia GPUs, depending on the workload. Since we’re not equipped to test that claim, we’ll leave it to third parties.

To date, Lamini has raised $25 million across seed and Series A rounds (Amplify led the Series A). Zhou says the money is being put toward tripling the company’s 10-person team, expanding its compute infrastructure, and kicking off development of “deeper technical optimizations.”

There are many enterprise-oriented generative AI vendors that could compete with aspects of Lamini’s platform, including tech giants like Google, AWS and Microsoft (via its OpenAI partnership). Google, AWS and OpenAI, in particular, have been aggressively courting the enterprise in recent months, introducing features like streamlined fine-tuning, private fine-tuning on private data, and more.

I asked Zhou about Lamini’s customers, revenue and overall go-to-market momentum. She wasn’t willing to disclose much at this somewhat early juncture, but said that AMD (via the AMD Ventures tie-in), AngelList and NordicTrack are among Lamini’s early (paying) users, along with several undisclosed government agencies.

“We’re growing quickly,” she added. “The main challenge is serving customers. We’ve only handled inbound demand because we’ve been inundated. Given the interest in generative AI, we’re an exception to the general tech slowdown; unlike our peers in the hyped AI world, we have gross margins and burn that look more like a regular tech company.”

Amplify general partner Mike Dauber said, “We believe there’s a massive opportunity for generative AI in enterprises. While there are many AI infrastructure companies, Lamini is the first one I’ve seen that’s taking the problems of the enterprise seriously and creating a solution that helps enterprises unlock the tremendous value of their private data while satisfying even the most stringent compliance and security requirements.”
