What enterprise leaders need to know about generative AI: 8 key takeaways from VB Transform

Head over to our on-demand library to view sessions from VB Transform 2023. Register Here


It has been two weeks since VB Transform, the first major independent event focused on the impact of generative AI on the enterprise.

Generative AI is widely seen as the most powerful technology force since the internet, and enterprise companies are eager to leverage it.

To help them navigate this new frontier, VB Transform brought together experts and speakers from various industries to share their insights and best practices. See our coverage of the event. But here are my eight key takeaways for enterprise leaders:

1. It’s all about your data layer

This may seem like an unsexy takeaway, but it is the most important one. Most enterprise companies face huge challenges in getting their data in order, and if they ignore or avoid this, they will miss out on the benefits of generative AI. Data is the fuel for the large language models (LLMs) that power generative AI, and without clean, reliable and secure data, LLMs will not perform well, or may even cause harm. One of our roundtables mapped out a best-practice playbook on how to get started preparing your data for LLMs. But if you want to take it to the ultimate level, you need to rewire your entire organization around data. Intuit provides a good example here, building a new operating system for generative AI. It’s one reason Intuit’s chief data officer Ashok Srivastava told me at VB Transform that he’s sleeping well, which takes me to the next point.


2. Large language models are going to be the default interface for all computing

LLMs will power every interaction we have, much as the phone, the touchscreen and the graphical user interface did before them. Nick Frosst, cofounder of LLM-building company Cohere, spelled this out most clearly in his talk with VentureBeat’s Sharon Goldman. The Google-led search paradigm will be over. Instead, we will be able to ask natural language questions and get natural language answers from any source of information. This will create new opportunities and challenges for user experience design, personalization and privacy. And this area of user experience is the one that Intuit’s Srivastava says keeps him up at night.

3. Fear and anxiety (and excitement) pervade

I came into the event knowing this space is new, and that battle-scarred veterans were already saying they are in a state of peak excitement and fear, but this was palpable at the event. The race around generative AI may feel like a sprint, and some enterprise leaders may feel they are already behind, but in reality we’re very early in this race. Some key forces have slowed the move toward this interface (see above), including companies’ need to avoid sharing their customers’ personally identifiable information with key providers of foundational LLMs, such as Microsoft and OpenAI. Still, responses to our AI survey show companies are experimenting like crazy with generative AI. (Take the AI Survey yourself, and get a copy of the results for free.)

4. Choice abounds: You can build, borrow or piggyback

A lot of decision-makers are getting their heads around the best way to build chatbots and other LLM-driven applications. There are several ways to go, depending on how fast you need to get to market, and how proprietary your data is.

If you’re a very big company with resources and clean data, building your own foundational model — or partnering with a company such as MosaicML to do this — might make sense, so that the model is trained on your own data. Then there’s the slightly easier version of this: taking one of the open-source foundation models (LLaMA is now a big one, but there are many others) and fine-tuning its weights and biases to meet your own needs. And you’re not destined to be a laggard here. You can build your own model for about $200,000.

At the other extreme, if you’re doing something lightweight and you don’t need to worry about having a chatbot or other LLM-based application accessing your data, you can just use ChatGPT’s raw API. A slightly more customized version of that is using ChatGPT and then chaining it (via a framework like LangChain) to a vector database so that you can query your own data. (See this good overview by Laura Reeder of Sequoia Capital.)
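To make the chaining pattern concrete, here is a minimal, hypothetical sketch of the flow. A real deployment would use an embedding model, a vector database and the ChatGPT API via a framework like LangChain; here, bag-of-words vectors and a stub prompt-builder stand in for those pieces (the sample documents and function names are invented for illustration), so the retrieval-then-prompt shape is visible without any network calls.

```python
# Toy sketch of the "chain an LLM to a vector store" pattern:
# retrieve the most relevant internal documents, then stuff them
# into the prompt that would be sent to the LLM.
from collections import Counter
from math import sqrt

DOCS = [
    "Q2 revenue grew 14% year over year, driven by subscription renewals.",
    "The support team resolved 92% of tickets within 24 hours in June.",
    "Headcount in engineering rose to 240 after the March hiring push.",
]

def embed(text: str) -> Counter:
    """Stand-in embedding: lowercase bag-of-words counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, k: int = 1) -> list[str]:
    """The 'vector database' step: return the k most similar documents."""
    q = embed(question)
    ranked = sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def answer(question: str) -> str:
    """The 'chain' step: assemble the prompt an LLM would receive."""
    context = "\n".join(retrieve(question))
    # A real chain would send this prompt to the ChatGPT API here.
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

print(answer("How fast did revenue grow?"))
```

The point of the sketch is the division of labor: the vector store answers "which of my documents are relevant?", and the LLM only ever sees the retrieved snippets, which is what keeps your proprietary data out of the model itself.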

5. Multiple use cases. But consider aiming for the stars

I counted about seven distinct buckets of use cases for LLMs for enterprise companies, articulated during this high-powered conversation with Amazon AWS VP of product Matt Wood and Google VP of data and analytics Gerrit Kazmaier. Briefly, they are: Generation, including not only content but new software; ranking, personalization and relevancy apps; apps to allow experts and others to learn more efficiently about new fields; collaborative problem-solving through automated decision-support; new customer experiences; building entirely new products; and building new companies. Just as the internet spawned new companies like Amazon, Netflix and Airbnb to redefine products with better experiences, the same will happen over the next six months to three years, said Wood. And this will happen faster than during the internet boom, because LLMs are much more accessible to more people.

6. Conversational business intelligence is a thing

You should learn how to build LLM-based chatbots to query your corporate data. ChatGPT is not doing a great job of meeting specific enterprise chatbot needs because it is generally intelligent, but not specifically intelligent. Increasingly, there’s a realization that you need to help your users with prompts so they know how to access the data they need, and there are many ways to do this. iGenius, a sponsor of VB Transform, is one company helping enterprise companies build these experiences in a customized way.
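One common way to give a generally intelligent model the specific intelligence described above is prompt scaffolding: wrapping the user's question with the data schema, and offering suggested prompts so users know what they can ask. The sketch below illustrates that idea; the table names, columns and suggestions are invented for this example, not from any particular product.

```python
# Hypothetical conversational-BI scaffolding: inject the schema into the
# system prompt, and suggest phrasings so users know what to ask.
SCHEMA = {
    "orders": ["order_id", "customer_id", "order_date", "total_usd"],
    "customers": ["customer_id", "region", "segment"],
}

SUGGESTED_PROMPTS = [
    "Total order value by region last quarter",
    "Top 10 customers by revenue in the enterprise segment",
]

def build_bi_prompt(user_question: str) -> str:
    """Wrap a natural-language question with schema context for the LLM."""
    schema_lines = "\n".join(
        f"- {table}({', '.join(cols)})" for table, cols in SCHEMA.items()
    )
    return (
        "You translate business questions into SQL over these tables:\n"
        f"{schema_lines}\n"
        "Answer with a single SQL query.\n\n"
        f"Question: {user_question}"
    )

def suggest(prefix: str) -> list[str]:
    """Help users discover what they can ask, as the takeaway describes."""
    return [p for p in SUGGESTED_PROMPTS if p.lower().startswith(prefix.lower())]

print(build_bi_prompt("Which region grew fastest?"))
```

The design choice here is that the model never has to guess your table layout: every question arrives pre-packaged with the schema, which is much of what separates a "specifically intelligent" enterprise chatbot from raw ChatGPT.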

7. Unlike crypto, real revenue is being made in generative AI

LLMs are still largely about saving costs through productivity gains. But are there any revenue use cases? It’s not clear yet. So far, most of the money being made in gen AI is by startups selling cost-saving generative AI solutions, or by companies like Nvidia, which is selling GPUs to run LLMs. Tim Tully, an investor at Mayfield, weighed in during a roundtable session at Transform, saying this revenue generation is the big difference between crypto and LLMs: “I’ve been in tech for like 25 years. I’ve never seen anything like this. Companies go from like zero to 30 in like three months … It’s just incredible.” He said companies are getting contracts because they’re creating value, citing L’Oreal, Pepsi, Coca-Cola, Honda and Michelin as all buying generative AI contracts.

Aside from startups, big companies may also be able to use LLMs to generate revenue directly: As McKinsey points out, some incremental revenue can come from boosting sales by using LLMs to do better customization and personalization.

8. Some dreamier ambitions for LLMs may be overrated

In one intriguing roundtable session hosted by NTT, a sponsor of VB Transform, some participants talked about ways LLMs could be used for more radical breakthroughs. There was speculation, for example, about speeding up the learning process of new employees by ingesting corporate Slack and email conversations into LLMs to help summarize communications and other corporate processes that otherwise take years to learn. Other enthusiasts hope LLM apps will be able to predict trends in stock prices, or better reorganize factory floors. Matt Wood of Amazon talked of the flywheel effect that can be created with LLMs, using automated decision support systems. In his session, though, Cohere’s Nick Frosst threw cold water on applications that stray too far from where LLMs really excel, which is text generation and summarization. In other words, they are good at letting you pose questions about content and getting answers as output.

Looking forward: VB Data Summit in SF

VentureBeat looks forward to tracking these trends as they unfold over the coming weeks and months. We’re going to be biting off the first of these takeaways — how to clean up your data layer — at our upcoming VentureBeat Data Summit 2023 on November 15 at the Terra Gallery in San Francisco. If you’re an enterprise decision-maker wanting to leverage LLMs, this would be a great place to network with your peers, get insights and make decisions. Join us for some peer networking, and pre-register now to get a 50% discount.

VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.


