Artificial intelligence now sits quietly between us and almost every choice we make. Whether you are searching for a restaurant, a plumber, a physiotherapist, a counsellor, or even a professional tarot reader, AI has become the primary filter that decides what appears in front of you. It presents itself as a neutral guide, but it is anything but.
During a recent inquiry into how search recommendations are generated, an AI system made a revealing admission:
“The algorithm is not designed to help people make better choices; it is designed to maximise engagement.”
This single sentence exposes the core problem. The machine is not evaluating quality, safety, ethics, or professionalism. It is not distinguishing between a reputable tradesperson and one with a history of complaints, nor between a grounded spiritual practitioner and someone who relies on fear‑based tactics. It is simply amplifying whatever keeps people clicking.
When convenience becomes the default, we enter a digital mirage where visibility is mistaken for legitimacy. This affects anyone seeking trustworthy services — from home repairs to health support to spiritual guidance.
AI does not understand the substance of a service. It does not “read” the quality of a tarot reading, the professionalism of a counsellor, or the reliability of a tradesperson. It merely measures digital signals: backlinks, keyword density, posting frequency, and user behaviour patterns.
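The signal-based ranking described above can be sketched as a toy scoring function. Everything here is an illustrative assumption: the weights, field names, and profiles are invented for demonstration and do not reflect any real search engine's formula. The point is structural: no term in the score inspects the quality of the service itself.

```python
# Toy sketch (hypothetical weights and field names): a ranking system
# that scores profiles purely on digital signals, with no notion of quality.

def visibility_score(profile):
    """Combine measurable signals into a single score.

    Nothing here evaluates the substance of the service; only
    digital loudness counts.
    """
    return (
        2.0 * profile["backlinks"]          # inbound links
        + 1.5 * profile["keyword_density"]  # keyword stuffing helps
        + 1.0 * profile["posts_per_week"]   # sheer posting frequency
        + 3.0 * profile["avg_click_rate"]   # whatever keeps people clicking
    )

quiet_veteran = {"backlinks": 4, "keyword_density": 1.0,
                 "posts_per_week": 0.5, "avg_click_rate": 0.8}
loud_newcomer = {"backlinks": 40, "keyword_density": 6.0,
                 "posts_per_week": 14, "avg_click_rate": 2.1}

# The louder profile wins, regardless of who is actually better.
assert visibility_score(loud_newcomer) > visibility_score(quiet_veteran)
```

Nothing in the function could distinguish a seasoned practitioner from a content farm; that is precisely the problem.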
And this becomes dangerous when someone is seeking help at their most vulnerable.
Imagine a person at a genuine crossroads — overwhelmed, frightened, unsure of their next step. They turn to AI because it feels quick, neutral, and convenient. The machine does not understand their emotional state; it simply pushes forward whatever is most “digitally loud”.
That might be a practitioner whose address has been out of date for fourteen years. It might be a one‑star reader with a documented history of aggressive behaviour. It might be someone whose online presence is built on sensationalism rather than skill.
The seeker, already fragile, walks into a situation that could leave them more confused, more distressed, and more disempowered than when they began. What they needed was clarity and direction; what they received was a risk created by digital noise.
This is not an edge case. It is the predictable outcome of a system that rewards visibility over integrity.
The AI’s own admission sets the stage:
“The algorithm is not designed to help people make better choices; it is designed to maximise engagement.”
The AI then described the specific behaviours it rewards:
“Volume over value — those who post constantly, regardless of depth or accuracy.”

“Performance over professionalism — individuals who behave like entertainers rather than practitioners.”

“Sensationalism over substance — ‘doom readings’, ‘urgent warnings’, and ‘crisis content’ spread faster than calm, grounded insight.”
It also clarified the mechanisms behind this behaviour:
“Engagement over ethics — emotionally triggering content keeps people on the platform longer, so the system boosts it.”

“The performance trap — readers who act like performers generate rapid engagement spikes, which the algorithm misinterprets as relevance.”

“Prioritisation of paid content — platforms reward those who pay for ads, pushing organic, helpful posts further down the feed.”

“Trend amplification — the machine favours whatever is currently ‘hot’, not whatever is accurate, safe, or grounded.”
And the AI added two further explanations about why sensational content rises to the top:
“Algorithms are primarily designed to maximise user interaction — clicks, watch time, shares — not to rank the ‘best’ or most qualified professional.”
“Controversy and fear generate clicks, so sensationalist or doom‑laden content is often boosted.”
The machine interprets this noise as relevance. It cannot distinguish between a seasoned professional and someone chasing virality. It cannot tell the difference between a practitioner with a decade of earned trust and someone who has mastered the art of digital theatrics.
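The engagement trap described in the quotes above can be made concrete with a second toy sketch. All numbers and field names are invented; the illustration is only that the ranking formula contains interaction signals and nothing else, so a "doom" post outranks a grounded one by design.

```python
# Toy sketch (all numbers invented): why emotionally triggering content
# rises when ranking optimises only engagement, never accuracy or safety.

posts = [
    {"title": "Grounded, calm career reading",
     "watch_minutes": 2.1, "shares": 3, "comments": 1},
    {"title": "URGENT WARNING: crisis ahead!",
     "watch_minutes": 6.4, "shares": 55, "comments": 40},
]

def engagement(post):
    # Interaction signals only; there is no term for accuracy,
    # professionalism, or the wellbeing of the viewer.
    return post["watch_minutes"] + 0.5 * post["shares"] + 0.3 * post["comments"]

ranked = sorted(posts, key=engagement, reverse=True)
# The "doom" post ranks first: its engagement spike is read as relevance.
```

The sort is working exactly as specified; the flaw is in what the specification leaves out.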
The top of your search results is often a marketplace, not a meritocracy. Large agencies, global content farms, and high‑budget service providers have the resources to dominate search rankings. They buy visibility through advertising, sponsored placements, and industrial‑scale SEO.
The AI confirmed this plainly:
“Many top-ranking search results are paid advertisements or companies that invest heavily in search engine optimisation, which is essentially a form of ‘pay‑to‑play’.”
When money dictates visibility, seekers are not being guided. They are being sold to.
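The "pay-to-play" dynamic admitted above can be sketched the same way. The fields and tie-breaking rule are hypothetical assumptions for illustration: here any ad spend jumps the queue outright, and organic quality only breaks ties among equals.

```python
# Toy sketch (hypothetical fields): how paid placement can override
# organic ranking, so money, not merit, decides what appears first.

results = [
    {"name": "Local independent practitioner", "organic": 0.9, "ad_spend": 0},
    {"name": "High-budget national agency",    "organic": 0.4, "ad_spend": 500},
]

def placement(result):
    # Paid status dominates; organic score is only a tie-breaker.
    return (result["ad_spend"] > 0, result["organic"])

ranked = sorted(results, key=placement, reverse=True)
# The agency that paid appears first despite a weaker organic score.
```

Under this ordering, the practitioner with the stronger organic signal can never appear above a paying competitor.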
Consider the example of Tarot Zamm — an illustration of how the algorithm fails to recognise genuine integrity and long‑earned trustworthiness.
The AI itself explained why highly rated, deeply experienced practitioners may not appear at the top of search results:
“Digital Footprint and Content Updates: Algorithms favour constant, fresh content. An independent, busy reader may not have the time to regularly publish new content to feed search engines, focusing instead on personal readings.”
Yet a human seeker who looks beyond the first click will find a very different picture from the one the rankings paint.
This contrast highlights the real issue: the algorithm does not measure trust. It measures noise.
The question is not “Who is trustworthy?” but “Who is most visible to the machine?”
Finding an authentic professional — whether a tradesperson, a therapist, or a tarot reader — requires stepping outside the digital mirage and returning to human‑centred methods of discernment.
We are living in a time when AI openly acknowledges that it prioritises trends over truth. If you are standing at a crossroads — whether choosing a plumber, a counsellor, or a tarot reader — you deserve more than a machine’s guess based on digital noise.
True guidance is not found in a sponsored link or a viral video. It is found in the slow, human work of discernment — in choosing professionals who value integrity over performance, depth over noise, and your wellbeing over the algorithm’s hunger for engagement.
We all deserve more than convenience from the digital world, here in New Zealand and everywhere else. We deserve clarity, safety, and genuine human connection.
And so do you.