“Generative AI” is having a moment. Entrepreneurs are flocking to language- and image-generation models like GPT-3 and Stable Diffusion and building a whole lot of companies to take advantage of those technologies. Investors are writing very (very) large checks to the folks who do it well.
And they should! The generative capabilities of these models are remarkable, and will revolutionize whole industries. They’ll fundamentally change how many of us interact with the world. This technology is amazing!
That said — generative AI sells the capabilities of these models short. I’ve seen several founders, investors, and established companies get stuck in the “generative” rut, focused narrowly on products that allow users to generate content.
Powerful language models (and to some extent image synthesis models) are so much bigger than that. The modern large language model (LLM) may be one of the most impressive systems ever built. These models encode a remarkable amount of information (they’re trained on a huge swath of the public internet) and can also perform human-like reasoning and deduction.
Let’s dive into how some teams are thinking bigger than just “generative.”
Building applications with LLMs
In my mind, there are two ways to use generative AI to build products:
- Expose the ability to generate to end users
- Use the ability to generate to build applications or experiences that serve users
Since everyone is talking about the first way, it’s worth diving into the second — I think the market is even bigger for the companies that figure it out. Let’s dive into some ways folks are succeeding in building apps powered by LLMs. This list isn’t comprehensive, but it should help directionally.
1. Search
Modern language models are exceptional at the semantic understanding of text across a broad set of domains. Search is a natural application of this capability — most searches are semantic, and most search engines seek to optimize (in part) for this.
LLMs dramatically simplify building semantic search. Some awesome teams are out there building on top of this capability:
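At its core, LLM-powered semantic search is embedding-and-rank: embed the query and every document into vectors, then return documents by similarity. Here is a minimal sketch where `embed` is a toy bag-of-words stand-in — a real system would call an embedding model (e.g., an LLM embedding API) at that step instead:

```python
import math

def embed(text):
    # Toy stand-in for a real embedding model: a bag-of-words vector over a
    # tiny fixed vocabulary. A production system would call an LLM embedding
    # endpoint here and get back a dense semantic vector.
    vocab = ["refund", "invoice", "login", "password", "shipping"]
    words = [w.strip(".,?!") for w in text.lower().split()]
    return [words.count(w) for w in vocab]

def cosine(a, b):
    # Standard cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def search(query, documents):
    # Rank documents by similarity of their embedding to the query embedding.
    q = embed(query)
    return sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)

docs = [
    "How do I reset my password after a failed login?",
    "Where is my shipping confirmation?",
    "I need a refund for a duplicate invoice.",
]
results = search("refund invoice problem", docs)
```

With a real embedding model in place of the toy `embed`, the same ranking loop matches queries to documents by meaning rather than keyword overlap — which is exactly what makes this so much simpler to build than classic search infrastructure.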
There is a ton of room for innovation, too — check out this awesome demo of on-page search:
Beyond search-first applications, a lot of companies should probably look to improve existing search capabilities in light of the easy availability of LLMs. Some quick examples:
- Customer success platforms: Finding and resolving tickets can be challenging and domain-specific work. LLMs generalize exceptionally (especially with some fine-tuning), and can probably make it a lot easier to both find relevant tickets and resolve new tickets.
- Sales: In combination with strong meeting transcription and integrations with internal documents, search tooling should make it easier than ever to find relevant sections of notes and quickly respond to the needs of prospects.
2. Task automation
One of the most significant challenges in generic task automation is simply understanding the operating environment. LLMs are very effective at “understanding” environments that can be parsed as language, and intelligent use of that understanding can automate a wide range of complex applications.
Adept does an awesome job showing this off with their Action Transformer model:
Although Adept is approaching this horizontally, there are endless possibilities within applications to make things “easier” with language models. If your computer can fundamentally understand the interface of an application, how can you simplify the navigation of that app?
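One common pattern here is an action loop: describe the goal and the actions taken so far to the model, and let it pick the next step. The sketch below illustrates the shape of that loop — `call_llm` is a stub that fakes replies for one hard-coded goal, and the `CLICK`/`DONE` action format is purely illustrative, not any real product's API:

```python
def call_llm(prompt):
    # Stub standing in for a real model call. A real system would send the
    # prompt (goal + interface state + action history) to an LLM and parse
    # its reply into an action. Faked here so the loop below is runnable.
    if "inbox" not in prompt:
        return "CLICK inbox"
    if "compose" not in prompt:
        return "CLICK compose"
    return "DONE"

def automate(goal, max_steps=5):
    # Ask the model for one action at a time until it declares the goal done.
    history = []
    for _ in range(max_steps):
        prompt = f"Goal: {goal}\nActions so far: {history}\nNext action?"
        action = call_llm(prompt)
        if action == "DONE":
            break
        history.append(action)
    return history
```

The interesting engineering is in what replaces the stub: serializing the application's interface into the prompt, and constraining the model's output to valid actions.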
3. Data interfaces
Finally, let’s push the boundaries of “generative” slightly. In the data space, several teams have been exploring using LLMs to generate SQL or visualizations. Classically, this is still “generative” — using natural language to generate the interfaces to data assets. This demo by Mihail Eric and Andrew Mauboussin is a great example:
This is a great example where “generative” is all in the abstraction you provide. Concretely, the interface of something like GitHub Copilot makes it very clear that you’re working with something generative — you describe the code you want, and Copilot generates it.
For a range of data applications, there’s a lot of room to play with the interface you provide. Even if behind the scenes a model is generating SQL, depending on the application you may be able to expose outputs to users directly:
- Answer a data question directly (e.g., “What region has experienced the largest growth in sales this quarter?”)
- Generate visualizations directly (like AutoPlot)
- Summarize available data assets (“How fresh is our current NA sales data?”)
There may be debate on whether generated SQL can be relied upon for business-critical applications, but so many data questions are not business-critical! There’s room for a new range of applications that make it easier and faster to answer ad-hoc questions about data.
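The plumbing for answering a data question directly is short: have the model turn the question into SQL, run it, and show the user only the answer. In this sketch, `nl_to_sql` is a hard-coded stub where the generative step would go (a real app would prompt an LLM with the schema and the question), and the schema and data are made up for illustration:

```python
import sqlite3

def nl_to_sql(question):
    # Stub for the generative step: a real application would prompt an LLM
    # with the table schema and the question and have it write the SQL.
    # Hard-coded here so the surrounding plumbing is runnable. (A real
    # "growth" query would compare periods; this toy just totals sales.)
    if "largest growth" in question.lower():
        return ("SELECT region FROM sales GROUP BY region "
                "ORDER BY SUM(amount) DESC LIMIT 1")
    raise ValueError("question not supported by this toy stub")

def answer(question, conn):
    # The user sees an answer, not SQL — the generative part is hidden
    # behind the interface.
    sql = nl_to_sql(question)
    return conn.execute(sql).fetchone()[0]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 120.0), ("NA", 340.0), ("APAC", 90.0)])
result = answer("What region has experienced the largest growth "
                "in sales this quarter?", conn)
```

Note that the same backend could instead surface the generated SQL for review — the abstraction you expose is a product choice, not a technical constraint.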
Conclusion
This isn’t close to an exhaustive list of possibilities for applying LLMs — it’s barely the tip of the iceberg.
At Unusual, we can’t wait to see the creative, transformational ways startups will apply LLMs to build “magical” experiences. This technology can disrupt nearly every market, not just creative work. Unusual can be your day-zero partner to help bring that technology to market.
David Hershey is an investor at Unusual Ventures, where he invests in machine learning and data infrastructure. David started his career at Ford Motor Company, where he started their ML infrastructure team. Recently, he worked at Tecton and Determined AI, helping MLOps teams adopt those technologies. If you’re building a data or ML infrastructure company, reach out to David on LinkedIn.