Irrational Exuberance in AI


In 1996, Federal Reserve Chairman Alan Greenspan asked, “How do we know when irrational exuberance has unduly escalated asset values, which then become subject to unexpected and prolonged contractions … ? And how do we factor that assessment into monetary policy?”

Investopedia summarizes the term this way: “Irrational exuberance is unfounded market optimism that lacks a real foundation of fundamental valuation, but instead rests on psychological factors.”

Both are talking about irrational exuberance in financial markets. But I believe the term applies equally well to the current excitement about Generative AI and Large Language Models. Everyone and their dog is jumping on the LLM bandwagon.

Don’t misunderstand me. I am very excited about the possibilities enabled by LLMs, which are truly a revolutionary and fundamental technical advance. But I’m also very concerned about some of the unfounded use cases I hear people promoting, and about how quickly and easily they believe they can achieve success.

They are ignoring a key element of any use case: risk.

The European Union recently passed its AI Act. The AI Act partitions AI use cases based on their risk: Minimal Risk, Limited Risk, High Risk, and Unacceptable Risk. Higher risk categories require increasingly stringent testing and validation. Some use cases are so risky that the AI Act labels them Unacceptable Risk and prohibits their deployment. There are many opposing opinions on the overall benefits and problems of the AI Act. But regardless of those opinions, and regardless of the AI Act’s specific risk categorizations, I believe the AI Act is generally correct: AI implementations must thoroughly evaluate their risk.

Never forget: LLMs hallucinate.

For low-risk AI use cases, I have few concerns about using an LLM, because LLM hallucinations have minimal costs to the user or the business. But as the risk level increases, too many people simply fail to factor in what hallucinations cost their users and their business: reputational damage, financial losses, and legal liability.

Developers who cavalierly talk about quickly deploying LLMs for High Risk, and even some Limited Risk, use cases are displaying irrational exuberance.
