
Meta Shares Hard Numbers on Llama’s Enterprise AI Value

Even though generative AI has been on the scene for less than two years, we are already seeing benefits for consumers, businesses, and society. As in every tech gold rush, startups and big tech companies are doing everything they can to grab their unfair share of the spoils. Large language models (LLMs) are among the most important tools these tech companies are building to stay ahead of their competitors.

There seem to be two camps for LLM development: closed and open. Closed LLMs include OpenAI’s GPT models, Google’s Gemini, and Anthropic’s Claude. The most widely used open LLM is the Llama model family from Meta. Enterprises, researchers, and all but the largest consumer companies can access it for free, making Llama a disruptive and valuable force, and aligning it with some other areas of enterprise IT where open source tools are the norm.

Today, Meta shared some insights into the dynamics of Llama’s development, especially the recently released Llama 3.1, and, more interestingly for me, details on how enterprises and their technology partners are using Llama to solve measurable business problems and create opportunities.

As an analyst, I always want more of that hard data, especially real-world statistics on how technology is affecting business in areas like generative AI, where there is so much noise to wade through. So kudos to Meta for providing a dose of hard data, supplemented by plenty of examples of how Llama is driving business value for enterprises.

Llama usage statistics are killer

Before we get to those specific examples, let’s look at the raw download and usage numbers, which are off the charts. Llama models have been downloaded on Hugging Face almost 350 million times (a 10x increase over the past 12 months), including 20 million times in the past month alone. Meta claims that this makes Llama “the leading open-source model family,” citing data from the Artificial Analysis AI benchmarking site.
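
For readers who want to see what one of those Hugging Face downloads looks like in practice, here is a minimal sketch using the Hugging Face transformers library. The repo ID is my assumption for the Llama 3.1 8B Instruct checkpoint, and the gated Llama repos also require accepting Meta’s license terms and supplying an access token.

```python
# Minimal sketch, assuming the transformers and accelerate libraries are installed
# and that you have accepted Meta's license for the gated repo on Hugging Face.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3.1-8B-Instruct"  # assumed repo ID for illustration

# Pull the tokenizer and weights from Hugging Face; this is the kind of
# download the statistics above are counting.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Run a simple prompt through the model.
inputs = tokenizer("List three enterprise uses for an open LLM.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=120)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```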

What if we measured in tokens instead? Meta says that hosted Llama usage across the large cloud providers it works with (AWS, Microsoft Azure, and Google Cloud) more than doubled between May and July 2024. From January to July of this year, monthly Llama usage increased 10x “across some of our largest cloud providers.” (Hmm. I’d love to see a breakdown of these stats by CSP, though I don’t expect Meta to provide one; I suspect the bulk of that usage sits on Azure and AWS.)
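
To make the hosted-usage point concrete, here is a hypothetical sketch of calling Llama as a managed service on one of those clouds, in this case via Amazon Bedrock’s runtime API with boto3. The model ID and the request and response fields are my assumptions for illustration; check the provider’s current Llama 3.1 documentation before relying on them.

```python
# Hypothetical sketch of consuming hosted Llama 3.1 through Amazon Bedrock.
import json

import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="meta.llama3-1-8b-instruct-v1:0",  # assumed Bedrock model ID for Llama 3.1 8B
    body=json.dumps(
        {
            "prompt": "Draft a two-sentence status update for the support team.",
            "max_gen_len": 200,   # assumed request fields for Llama on Bedrock
            "temperature": 0.5,
        }
    ),
)

# The "generation" field name is an assumption about the response payload.
print(json.loads(response["body"].read())["generation"])
```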

Meanwhile, the number of partners in the Llama Early Access program has increased 5x with the release of Llama 3.1 last month. Click through to the Meta release if you want to read glowing reviews from Nvidia’s Jensen Huang and other partner executives from AWS, Databricks, and Groq. Llama’s long list of partners also includes Dell Technologies, IBM, Scale AI, and Snowflake, among others. You’ll notice that these partners include cloud providers, infrastructure providers, and data platforms.

Realistic business results for Llama

The enterprise AI use cases Meta shares also come from a variety of business areas. This is where theory meets practice, showing why businesses should embrace AI.

  • For content analytics and authoring, Accenture is using Llama 3.1 as the foundation for a customized LLM that the consulting firm expects will streamline ESG reporting—delivering a 20% to 30% improvement in quality and a 70% increase in productivity.
  • In customer service, AT&T uses enhanced Llama models to improve customer service search responses by 33%, while also reducing costs and speeding up issue resolution.
  • How about making developers’ lives easier? DoorDash uses Llama to answer complex queries from its internal knowledge base and automate everyday tasks like creating pull requests for reviews, while Goldman Sachs uses Llama to help its engineers extract information from documents.
  • Japanese financial services giant Nomura uses Llama on AWS for a variety of purposes, from code generation and log analysis to document processing.
  • Shopify uses the open source LLaVA model, built on the Llama platform, to handle 40 million to 60 million product-metadata requests per day.
  • Zoom uses a range of native and third-party models, including Llama, to power its generative AI assistant. My colleague Melody Brue has written extensively about Zoom’s approach to this, but suffice it to say that the AI is a game-changer for Zoom.

These results are consistent with the kinds of advances I hear about when I talk to enterprise customers who are implementing AI in a serious way to get results. I hope Meta shares even more details about these and other use cases in the future.

Competitive differentiation with open source AI

Meta’s main competitors in this space are OpenAI, Anthropic, and Google with Gemini. One of Meta’s key differentiators is that, unlike Gemini and (ironically) OpenAI, Llama is open source. This is clearly the technical approach, and the operating philosophy, that Meta has taken. Meta’s announcement states it bluntly: “Llama’s success is made possible by the power of open source.”

As you’d expect from any open source product, Llama is free to use, except for companies with giant user bases such as Apple (the license requires organizations with more than 700 million monthly active users to obtain a separate license from Meta). Meta doesn’t generate any revenue from Llama, but like the Open Compute Project and PyTorch open source initiatives it also supports, Llama is both a major disruptor and an enabler of business models: it disrupts closed LLM providers while enabling researchers, enterprises, and startups to build the next generation of services.

OpenAI, the open/closed distinction notwithstanding, is causing its own stir with its wildly popular consumer SaaS applications and enterprise PaaS services. Just today, rumors surfaced that OpenAI was seeking another round of funding at a valuation of $100 billion. Not all companies, of course, think that being closed is “bad.”

Meta’s commitment to open source was outlined in an open letter Mark Zuckerberg published last month, “Open Source AI Is the Path Forward.” In that letter, he examined Linux’s history of overtaking proprietary Unix and detailed why the open source approach is better for customers (customization of models, performance, affordability, avoiding vendor lock-in, and so on), better for Meta, and better for the world. Returning to that theme, he stated bluntly, “The path Llama needs to take to become an industry standard is to be consistently competitive, efficient, and open generation after generation.”

Net-net, Llama looks like an enterprise heavyweight, delivering measurable value to large enterprises across multiple verticals. Of course, I always want more customer examples backed by hard statistics. Today’s release is a great start, and I hope Meta reveals more in the coming months, while also continuing to deliver even better AI models. It will need to do all of these things to solidify its position as a serious enterprise AI contender.