Businesses Grapple with AI’s Limitations and Unreliability

Businesses are becoming disillusioned with generative AI as they discover its limitations and unreliability. Despite the hype, the technology often falls short of expectations: large language models like ChatGPT are prone to hallucinating and spreading misinformation, and have been accused of plagiarizing writers and artists. The hardware that generative AI runs on also consumes enormous amounts of energy, with significant environmental implications.

In practice, companies are finding that the technology cannot be depended on. As Gary Marcus, a cognitive scientist and prominent AI researcher, points out, many businesses have reported that while the technology is impressive, it is not reliable enough to roll out to customers. This is evident in cases where chatbots have been disabled after swearing at customers or offering unrealistic deals.

The core problem is that generative AI models are not information retrieval systems but synthesis systems: they cannot reliably distinguish fact from fabrication in the data they were trained on unless significant guardrails are put in place. This has led some experts to warn that AI is a bubble, much like crypto or the dot-com boom. Projections that AI will become a trillion-dollar industry within the next decade may be overly optimistic, and the technology may instead undergo a lengthy period of stagnation. Investors who have poured billions of dollars into the industry expecting a lucrative return may not have the patience to hold out.

read more > futurism.com

NIMBUS27