OpenAI’s powerful large language model has accelerated the path toward AI integration and changed the parameters for getting there
Late last year, when ChatGPT first showed up in everyone’s news feeds, conversations about the business applications for AI instantly intensified. At the very least, every organization had to rethink the aspects of its operations that would be affected by this powerful large language model (and others). We’ve been using GPT for over two years in the enterprise solutions built on the OneReach.ai platform. Still, as someone who has spent the last two decades working toward democratizing conversational AI, I found it deeply gratifying to watch the public react to and begin experimenting with ChatGPT. The recent release of GPT-4 has brought some important realizations about this technology into even sharper focus.
1: GPT-4 Makes Traditional NLU Obsolete
The tedious, time-consuming process of training a language model has been vastly simplified by GPT-4, which can be trained conversationally. In doing so, GPT-4 delivers a knockout punch to traditional NLU platforms like IBM Watson, Salesforce Einstein, ServiceNow, and Microsoft LUIS. GPT-4 also takes entity extraction to new heights and reigns supreme as a classifier with only few-shot training. By making it easier to extract and share data across an organization, GPT-4 is also a game changer for knowledge management.
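To make the few-shot classification point concrete, here is a minimal sketch of what replacing a traditional NLU intent classifier with GPT-4 can look like. The intent labels and example utterances are hypothetical, and the API call itself is shown only as a comment; the sketch assumes the OpenAI Python SDK and an available `gpt-4` model.

```python
# Hypothetical few-shot intent classification via a GPT-4 chat prompt.
# A handful of labeled examples stand in for what a traditional NLU
# platform would need hundreds of training utterances to learn.

FEW_SHOT_EXAMPLES = [
    ("I can't log into my account", "account_access"),
    ("When will my order arrive?", "order_status"),
    ("I'd like a refund for my last purchase", "billing"),
]

def build_messages(utterance: str) -> list[dict]:
    """Assemble a chat prompt: a system instruction, a few labeled
    examples as user/assistant turns, then the utterance to classify."""
    messages = [{
        "role": "system",
        "content": (
            "Classify the user's message into exactly one intent: "
            "account_access, order_status, or billing. "
            "Reply with only the intent label."
        ),
    }]
    for text, label in FEW_SHOT_EXAMPLES:
        messages.append({"role": "user", "content": text})
        messages.append({"role": "assistant", "content": label})
    messages.append({"role": "user", "content": utterance})
    return messages

# With the openai package installed and an API key configured,
# the call would look roughly like this (not executed here):
#
#   from openai import OpenAI
#   client = OpenAI()
#   resp = client.chat.completions.create(
#       model="gpt-4",
#       messages=build_messages("Why was I charged twice?"),
#   )
#   intent = resp.choices[0].message.content.strip()
```

The same pattern extends to entity extraction: instead of an intent label, the system prompt asks for, say, a JSON object of extracted fields, and the few-shot turns demonstrate the desired output shape.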
2: GPT-4 Dropped Remaining Barriers to Democratizing AI
OpenAI isn’t revealing all of GPT-4’s secrets, but its few-shot capabilities open pathways to democratizing AI. Researchers at Stanford fine-tuned Meta’s LLaMA large language model on training data generated with OpenAI’s API and built a GPT-style clone called Alpaca for under $600. The demo was quickly pulled down due to inadequate content filtering, but the point had been made: organizations eager to build solutions with generative AI will no longer need traditional NLU training teams or highly trained linguists.
3: No Code + GPT-4 = NEXT LEVEL
Every enterprise needs a plan for utilizing generative AI. The clearest path to integration relies on low/no-code development tools. When the people within an organization can take part in building automations, they can find novel ways to strip the tedious tasks out of their work lives and evolve the process automations that work for them. Less tedium means more time for creative problem solving, which creates a symbiotic relationship where machines are helping humans and humans are helping machines. In this scenario people can internalize the benefits of working with AI and advance their technological skills in meaningful ways. Large language models are easy to train and can supercharge interactions with customers and employees alike.
Using our platform, we recently built a test IVR using generative AI that maintained a 100% containment rate during user testing (the industry average is 75%). Our IVR also earned an NPS of +70, putting us far ahead of the average of +32. Beyond the stellar containment rate, the major takeaway is that our test solution vastly outperformed traditional call-center experiences in customer satisfaction. People enjoyed interacting with a fine-tuned experience that leverages GPT.
We built our platform around no-/low-code design tools to make it easy to stand up experiences and iterate on them at a rapid clip. This more-agile-than-Agile approach is a critical component of enterprise AI adoption.
4: GPT-4 Shortens the Road to Intelligent Digital Workers
“Chatbot” has always seemed like an inadequate term to me. I prefer intelligent digital worker (IDW). Intelligent digital workers can do more than chat: they can work across pieces of software and data stores to automate increasingly sophisticated processes. The on-ramp to creating IDWs got a lot shorter thanks to GPT-4. Business leaders would be wise to learn what GPT-4 can (and can’t) do and find applicable use cases within their organizations. There simply aren’t any excuses for not getting started right away.