With artificial intelligence reshaping every sector imaginable, healthcare included, smoking cessation support is no exception. But can true tobacco harm reduction be achieved without human empathy and intuition?

Artificial intelligence (AI) is increasingly hailed as a transformative tool in healthcare, and tobacco control is no different. From monitoring tobacco marketing trends to delivering personalised quit support, AI-driven systems are beginning to influence how public health organisations and individuals approach smoking cessation. Yet while AI excels at pattern recognition and data synthesis, it still lacks the qualities that define successful human interaction: genuine empathy and intuition. This limitation matters deeply in areas like tobacco harm reduction, where emotional connection and trust often determine success or failure.

Can AI help smokers quit?
A 2025 scoping review covering over 1,100 studies found that 57 met the criteria for examining AI applications in tobacco control. These ranged from predictive models using machine learning algorithms like XGBoost and SVM to anticipate relapse risks, to natural language processing tools that track emerging vaping trends on social media. Around 45% of studies aligned with WHO FCTC Article 14 — cessation services — demonstrating AI’s growing role in supporting smokers who want to quit. Yet, the same review highlighted serious ethical and practical concerns: algorithmic bias, data privacy, interpretability challenges, and even tobacco industry manipulation of AI tools to target vulnerable populations.
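To make the idea of "predictive models anticipating relapse risk" concrete, here is a minimal sketch of how such a model might work. The features, synthetic data, and weights are entirely hypothetical, and a hand-rolled logistic regression stands in for the XGBoost and SVM models the review actually examined; this illustrates the general approach, not any specific published model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical features per smoker, e.g. cigarettes/day, years smoking,
# prior quit attempts, days since last craving (all standardised).
n = 500
X = rng.normal(size=(n, 4))

# Synthetic "ground truth": some features raise relapse risk, others lower it.
true_w = np.array([1.5, 0.8, -0.6, -1.0])
p = 1 / (1 + np.exp(-(X @ true_w)))
y = rng.random(n) < p  # 1 = relapsed during follow-up

# Fit a logistic regression by gradient descent (a simple stand-in for
# gradient-boosted trees or SVMs trained on the same kind of data).
w = np.zeros(4)
for _ in range(2000):
    pred = 1 / (1 + np.exp(-(X @ w)))
    w -= 0.1 * X.T @ (pred - y) / n

# Score a new (hypothetical) smoker and estimate their relapse risk.
new_smoker = np.array([2.0, 1.0, -0.5, -1.5])
risk = 1 / (1 + np.exp(-(new_smoker @ w)))
print(f"estimated relapse risk: {risk:.2f}")
```

In practice, a risk score like this would trigger a timely intervention, such as an extra coaching session or a supportive message, rather than replace human judgment.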

The global public health community has also started experimenting with conversational AI — chatbots, avatars, and virtual coaches that aim to mimic human counselling. The World Health Organization (WHO) launched “Florence,” a virtual health worker powered by AI, to help the world’s 1.3 billion tobacco users quit. Florence uses natural language processing to debunk myths about smoking, guide users through evidence-based quit plans, and refer them to national quit lines or support apps. For many people, particularly those without access to live counsellors, Florence offers a convenient and stigma-free way to begin the journey toward quitting.

A 2023 meta-analysis of five randomised controlled trials involving nearly 59,000 participants found that users engaging with conversational AI interventions were significantly more likely to quit smoking compared with those who received no digital support (RR = 1.29, 95% CI 1.13–1.46). These results echo other studies showing that AI-enhanced interventions can increase accessibility and engagement in cessation efforts, particularly among tech-savvy or remote populations.
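For readers unfamiliar with the notation, a risk ratio (RR) is simply the quit rate in the intervention arm divided by the quit rate in the control arm, with the confidence interval computed on the log scale. The sketch below uses invented arm counts chosen for illustration only; they are not the meta-analysis data.

```python
import math

# Hypothetical counts (NOT the meta-analysis data): quitters and
# totals in the conversational-AI arm and the no-digital-support arm.
quit_ai, n_ai = 450, 3000
quit_ctrl, n_ctrl = 350, 3000

# Risk ratio: quit rate with AI support over quit rate without it.
rr = (quit_ai / n_ai) / (quit_ctrl / n_ctrl)

# 95% confidence interval, computed on the log scale as is standard
# for ratio measures.
se_log_rr = math.sqrt(1/quit_ai - 1/n_ai + 1/quit_ctrl - 1/n_ctrl)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

An RR of 1.29 means participants using the AI intervention were 29% more likely to quit than controls; because the whole confidence interval sits above 1.0, the effect is statistically significant.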

However, researchers are quick to point out the limitations. The included studies displayed high heterogeneity and substantial risk of bias, and dropout rates were considerable. Moreover, while AI can simulate understanding, it does not feel or intuit human emotion. It cannot read subtle cues — a trembling voice, a certain facial expression, a moment of hesitation, or a sigh of frustration — that a trained cessation counsellor would notice instantly. Quitting smoking is not just a data problem; it’s a deeply personal journey rooted in emotion, habit, and identity. AI can assist, but it cannot replace the nuanced empathy that a human coach or clinician provides.

Craving the human touch
This distinction mirrors another dynamic in the harm reduction world: the preference of many vapers for buying their products from brick-and-mortar vape shops. While online stores and automated kiosks offer convenience, physical vape shops provide something technology cannot — human connection. Experienced vape retailers often act as informal cessation advisors, guiding smokers toward safer nicotine alternatives, offering device education, and fostering a supportive environment that encourages long-term switching. This “human element” is key to successful harm reduction. It ensures that people feel heard, supported, and understood — something AI, for all its sophistication, cannot yet replicate.

Indeed, even in its most advanced form, AI operates without self-awareness. It recognises patterns but lacks the lived experience and emotional resonance that drive behavioural change. As behavioural scientists have long observed, successful smoking cessation often hinges on trust, empathy, and accountability — qualities rooted in human interaction. Mohr’s “Model of Supportive Accountability” underscores this: digital interventions become far more effective when paired with a human element, such as a caring and competent coach.

AI can support, but not replace
AI undoubtedly has a powerful role to play in expanding access to cessation support and enhancing global tobacco control. It can analyse complex data sets, identify at-risk populations, and deliver timely interventions at scale. But its future success will depend on how ethically, transparently, and collaboratively it is implemented — and whether it complements, rather than replaces, the human touch that remains central to harm reduction.

As the tobacco control landscape continues to evolve, a hybrid model seems most promising: one that blends the precision and scalability of AI with the empathy and lived experience of human support. Florence and other AI-driven tools represent the beginning of this journey, not its culmination. The true tobacco harm reduction endgame will not be achieved by algorithms alone, but through technology that empowers humans — counsellors, vape retailers, and peers — to do what they do best: connect, understand, and care.