When misinformation becomes a health threat

Editorial

Last week, Healthy.mt co-founder and content strategist Dace Skadina attended a conference on identifying disinformation, organised by the European Parliament Liaison Office in Malta in collaboration with the 3CL Foundation, the University of Malta, and MCAST.

As representatives of the European Parliament aptly noted, we are immersed more than ever in an attention economy that is both ruthlessly competitive and easily weaponised.

As the forum for public discourse has shifted to profit-driven social media platforms, the cycle reinforces itself. Breaking it requires stronger media literacy, education, and smart regulation. This was the main thrust of the discussion at the conference.

Here are some of the key reasons why misinformation matters for health—both at the individual level and across the wider health landscape.

What if you were led to believe that conditions such as vision loss, cancer, or diabetes could be cured with a so-called miracle remedy trending on social media, or through a private “healing” session? A compelling promise, perhaps—but one that can quickly unravel into serious harm.

Pseudoscience and unverified claims circulate across platforms, promoting detox teas, extreme fasting, or “biohacking” without evidence. As a result, people may adopt unsafe regimens, and misinformation spreads more rapidly than medical guidance. Trust in trained professionals and evidence-based care is undermined. Patients may even abandon proven treatments such as chemotherapy or surgery, while others profit from their vulnerability.

Mental health misinformation often shows up in subtle but damaging ways—from the idea that depression is simply a sign of weakness to the belief that therapy is ineffective. These narratives can quietly discourage people from seeking support, intensify existing struggles, and, in more severe cases, heighten the risk of self-harm or suicide.

Self-diagnosis, fuelled by late-night symptom searches and endless scrolling, has become an increasingly familiar reflex. The internet and AI sometimes replace the waiting room, and search bars stand in for medical advice. Yet what begins as reassurance can easily slip into confusion, delaying proper care and, in some cases, allowing serious conditions to quietly worsen. No algorithm, however sophisticated, can truly replace the nuance of a clinician’s trained eye and human judgment.

That said, we fully support the role of artificial intelligence and innovation in medicine; however, it is essential to recognise that these tools are not a substitute for professional medical judgment.

“Weight loss in 10 days”-style diet and nutrition myths promise quick fixes and instant transformation, often encouraging extreme approaches such as cutting out entire food groups without medical justification or relying on so-called “detox” regimens. Frequently wrapped in fat-shaming narratives and the idealisation of unattainable body standards, these messages can quietly distort people’s relationship with food. The consequences can be serious—ranging from nutritional deficiencies and disordered eating patterns to longer-term metabolic disruption.

Persistent misinformation surrounding vaccine safety and claims of “overloading” children’s immune systems continues to circulate, despite scientific consensus to the contrary, resulting in reduced vaccination uptake, disease outbreaks, and increased risks to children’s health.

Throughout our 30-year professional career, we have observed a wide range of such examples. However, we will not share them here: as was emphasised at the conference, repeating false information, even in a critical context, can inadvertently contribute to its further spread.

Taken together, these behaviours can allow diseases to progress unchecked, leading to more complex and costly interventions for individuals and healthcare systems alike, as well as to preventable loss of life and other serious consequences.

Finally, here are perspectives from representatives of the European Parliament that are of particular importance in the context of health.

“As European lawmakers, we set clear rules for the Big Tech platforms to stop the spread of disinformation, but in times of Transatlantic tension the enforcement of the Digital Services Act is stalling. We need to be serious about our rules, because those that want to provide services in the EU need to respect and protect our democracy.”

MEP Alex Agius Saliba

“We must fight misinformation first and foremost on the ground in Malta. At European level, we are working on concrete measures to improve transparency and accountability in the digital space. But awareness and education locally remain key—initiatives like this play an important role in empowering citizens. From the institutional point of view, we also need to ensure the capacity to enforce the rules we enact,” said MEP Peter Agius.

“In a world that is increasingly interconnected, where information spreads fast and there is so much of it that it is difficult to digest, it is easy to be misinformed, disinformed and even manipulated by what we see and read—which is why the European Parliament is acting through stronger legislation, greater transparency for online platforms, and support for media literacy and fact-checking to help citizens regain control over the information they consume.”

Dr Mario Sammut, Head of the European Parliament Office in Malta, contextualising the discussion

27.04.2026.
