“Getting answers to questions like that isn’t easy, right?” Cook said. “Often, you’ve got service reps that weren’t even born when those products were issued.
“Quite often today, they’ll have to put the client on hold or call them back,” Cook continued. “It’s not very efficient and not a great experience for the customer. The intent is for these generative AI tools to allow that to be a one-and-done type conversation with rapid response.”
And Ford Pro, a business that provides telematics services on top of fleet vehicles and EV chargers, is building an LLM-powered chatbot to provide internal teams with faster, more accurate access to documentation.
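Ford Pro hasn’t detailed its architecture, but chatbots of this kind typically follow a retrieval-augmented pattern: embed the documentation, pull back the chunks closest to a question, and hand them to an LLM as grounding context. The Python sketch below illustrates that flow only; the documentation snippets, embedding model, and prompt wording are placeholders assumed for the example, not anything from Ford Pro.

```python
# Illustrative sketch of a retrieval-augmented documentation assistant.
# All specifics here (doc snippets, model choice, prompt) are assumed for the example.
from sentence_transformers import SentenceTransformer, util

# Stand-in documentation chunks; a real system would index far more material.
docs = [
    "To pair the telematics modem, hold the fleet button for five seconds.",
    "EV charger fault code E42 indicates a ground-fault interruption.",
    "Warranty claims for depot chargers must include the unit serial number.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")   # small open-source embedding model
doc_vectors = embedder.encode(docs, convert_to_tensor=True)

def build_grounded_prompt(question: str, top_k: int = 2) -> str:
    """Retrieve the most relevant chunks and assemble a prompt for an LLM."""
    q_vector = embedder.encode(question, convert_to_tensor=True)
    scores = util.cos_sim(q_vector, doc_vectors)[0]   # cosine similarity to each chunk
    best = scores.topk(top_k).indices                 # indices of the closest chunks
    context = "\n".join(docs[int(i)] for i in best)
    # The assembled prompt is what would be sent to whichever LLM the team uses.
    return (
        "Answer using only the documentation below. If it isn't covered, say so.\n\n"
        f"Documentation:\n{context}\n\nQuestion: {question}"
    )

print(build_grounded_prompt("What does charger fault code E42 mean?"))
```

Grounding the answer in retrieved text is what makes this kind of assistant faster and more accurate for internal teams: the model summarizes documentation it was handed rather than recalling details from memory.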
The end-state for no-code
Democratizing AI has been the rallying cry of no-code/low-code tools for some years now, and generative AI is taking the concept to another level. With a chatbot backed by generative AI, insights once confined to data scientists and business analysts are now within reach of anyone granted access.
Clearsense, Ford Pro, and New York Life are all building out that capability. With good reason.
What’s most exciting to Cloudera’s Venkatesh is not just that more people can pursue answers from the data; it’s that more data is accessible than ever before.
“The hardest type of data to make sense of has been unstructured,” Venkatesh said. “But it’s critical. What was the customer’s experience? Did we solve the problem? How many times did they have to call? And why did we not see this sooner? So much of that is hidden away in the chat history, not all the rows and columns of structured data.
“You don’t have to teach models anymore that loan and mortgage can be used interchangeably in some contexts. LLMs pick that up on their own. So the cost of extracting semantic meaning is probably 100 times less expensive than it was even 18 months ago. That’s huge.”
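A quick way to see what Venkatesh is describing: off-the-shelf embedding models already place loan- and mortgage-style phrasings close together in vector space, with no custom training to tell them the terms are related. The sketch below is purely illustrative; the model and sentences are arbitrary choices, not anything from Cloudera.

```python
# Illustrative only: an off-the-shelf embedding model already treats
# "loan" and "mortgage" phrasings as near-synonyms in context.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "How do I refinance my mortgage?",
    "What are the terms of my home loan?",
    "What's the weather forecast for tomorrow?",
]
vectors = model.encode(sentences, convert_to_tensor=True)

# The two lending questions score far higher with each other than either
# does with the unrelated sentence.
print(util.cos_sim(vectors[0], vectors[1]).item())  # mortgage vs. loan: high
print(util.cos_sim(vectors[0], vectors[2]).item())  # mortgage vs. weather: low
```

That same property is what makes chat histories and other unstructured text searchable without the hand-built synonym lists Venkatesh alludes to.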
Ain’t representin’
For now, at least, most see generative AI as a tool to speed human decision-making and interaction. If anyone’s developing gen AI-infused applications now to make decisions or speak directly with customers, I haven’t found them. Surprisingly, though, folks are pretty divided on whether that day will come at all.
Ford’s Musser and Cook from New York Life believe that likely will happen for some applications, once guardrails are locked down and the possibility of hallucinations is all but eliminated. Boicey from Clearsense isn’t so sure.
“We don’t fully understand how, cognitively speaking, humans come up with responses,” he said. “So we’ll invariably miss out on some types of inputs and conclusions. Call it intuition. Call it experience. Whatever causes a human to contemplate the gen AI response and say, ‘Yeah, I get that, but it’s not the right call for this patient right here.’”
AWS’ Nandi agrees, saying that the potential cost of handing over the keys outweighs the benefits.
“Particularly for a regulated industry, you’ll need to have really good guardrails and controls in place,” Nandi said. “And by the way, one of the easiest guardrails you can implement? A smart person.”