The term “AI” is often used loosely to describe machine learning (ML) projects, inflating expectations and distracting from the real, practical ways ML can improve business operations. This confusion raises project failure rates and blurs the line between ML and Artificial General Intelligence (AGI), creating an unrealistic narrative that can lead to a phase of AI disillusionment, or “AI Winter”.
- The primary function of machine learning (ML) is to make actionable predictions, which can bring tangible value to businesses by driving millions of operational decisions. For example, it can predict customer attrition or flag fraudulent credit card transactions.
- The term “AI” is widely misapplied to ML tools, setting unrealistic expectations. It invokes the notion of Artificial General Intelligence (AGI) – software that can perform any intellectual task a human can – something most ML projects cannot deliver.
- The public tends to confuse “ML” with “AI”, which can lead to overselling ML business deployments and promising more than can be delivered. It is crucial for ML projects to focus on their real value – making business processes more effective – to achieve their objectives.
- The term “AI” is an ambiguous catch-all that does not consistently refer to any particular method or value proposition. It often encompasses both AGI and narrow AI, sowing confusion in common rhetoric and software sales materials.
- There is a growing call to differentiate ML from AI in order to shield the industry from the next AI Winter. Doing so requires resisting hype and maintaining a realistic view of ML’s true value proposition, ensuring it is not discarded when the exaggerated promises of AI are debunked.