Losing faith in the system: distrust, AI, and the turn to alternative beliefs
Etsy witches, AI, and young people quietly returning to the Catholic Church might look unrelated, but they all signal the same crisis of trust in institutions.
Everywhere I look lately, people seem to be reaching for rulebooks and rituals at the same time. Young people are turning up in unexpected places: flocking to Etsy witches and TikTok tarot readers, but also quietly joining Catholic parishes in search of structure, community, and a sense of the sacred.
In one browser tab, I see a prompt engineer carefully tuning the “constitution” for an AI system, arguing over which principles it should obey and how they ought to be enforced. In another, I’m watching a TikTok where an Etsy witch explains the finer points of lunar timing for a protection spell. And in the background, geopolitics feels like a low, constant hum of anxiety: wars, climate shocks, supply-chain fragility, democratic backsliding.
It would be easy to dismiss these as disconnected phenomena. But I’ve started to see them as different expressions of the same restless zeitgeist: a growing crisis of trust in the institutions that used to promise order, meaning, and safety.
When the old anchors feel shaky, people start rebuilding their own.
Folk magic in the age of platforms
The resurgence of folk magic and witchcraft subcultures online is often framed as a quirky aesthetic or a wave of nostalgia. But something more interesting is going on.
When you watch how people actually use witchy practices today, they are rarely about “controlling the universe.” They’re about regaining a sense of agency in a world that feels structurally out of control. A spell jar to attract abundance, a candle ritual for protection, a tarot spread for decision-making: these are small, personal governance systems.
They establish:
- A way to name what matters (love, safety, money, healing).
- A repeatable process (on the full moon, with these ingredients, in this order).
- A sense of causality, even if the mechanism is symbolic rather than scientific.
On platforms like Etsy, this has crystallised into a semi-formal marketplace of ritual services. There are terms and conditions, reviews, dispute resolution processes – all the trappings of an institutional layer wrapped around intensely personal practice.
In a strange way, Etsy becomes a shadow institution of meaning-making: not a church, not a university, not a therapy provider, but a marketplace where people outsource fragments of their search for control and hope.
It’s easy to sneer at this as irrational. But if you zoom out, the irrationality might be less about the rituals and more about the world that makes them so appealing.
Constitutional AI and the return of rulebooks
At the other end of the spectrum, we have constitutional AI.
In response to the unpredictability and scale of modern AI systems, companies are wrapping models in explicit “constitutions” – sets of rules, principles, or values that are supposed to constrain behaviour. These might reference human rights, corporate ethics policies, or generic commitments to safety and non-harm.
On paper, this sounds reassuringly rational and institutional: we’re turning the messiness of machine learning into something that looks like a governance framework. There are:
- Articulated principles.
- Processes for enforcement (red-teaming, moderation, refusals).
- Mechanisms for appeal (feedback, retraining, policy updates).
In practice, however, these constitutions often expose how thin our shared agreement really is. Whose values get encoded? Which harms count, and which are quietly accepted as the cost of doing business? Who gets to decide what a “reasonable” refusal looks like, especially when the model sits between citizens and services, or between workers and their employers?
Constitutional AI, like platform witchcraft, is a kind of ritual: we declare principles, we enact them through careful prompt engineering and guardrails, and we hope this tames systems whose complexity we can’t fully grasp.
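To see how thin the ritual can be once you strip away the machine-learning machinery, here is a deliberately toy sketch of a “constitution plus guardrails” loop. Everything in it is hypothetical – the principles, the keyword check standing in for a trained harm classifier, the refusal wording – and it is not any vendor’s actual system, just the shape of the thing: declare principles, check outputs against them, refuse, and point to a route of appeal.

```python
# Toy sketch of a "constitution plus guardrails" loop.
# All names, rules, and messages are hypothetical illustrations,
# not any real provider's API or policy.

from dataclasses import dataclass


@dataclass
class Principle:
    name: str
    forbidden_terms: list[str]  # crude stand-in for a real harm classifier


CONSTITUTION = [
    Principle("non-harm", ["build a weapon"]),
    Principle("privacy", ["home address of"]),
]


def violated_principles(text: str) -> list[str]:
    """Return the names of principles the text appears to breach."""
    lowered = text.lower()
    return [
        p.name
        for p in CONSTITUTION
        if any(term in lowered for term in p.forbidden_terms)
    ]


def governed_reply(prompt: str, model_reply: str) -> str:
    """Enact the rulebook: refuse, and name the rule, rather than pass on a breaching reply."""
    breaches = violated_principles(prompt) or violated_principles(model_reply)
    if breaches:
        # "Mechanism for appeal" in miniature: the refusal names the rule invoked.
        return f"Refused under: {', '.join(breaches)}. You can contest this decision."
    return model_reply
```

The interesting part of the sketch is how much judgement hides inside `forbidden_terms` and the refusal message: the code merely formalises decisions someone has already made about whose harms count and who gets to contest them.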
It’s not that the principles are bad – many of them are long overdue. It’s that the rulebook arrives at precisely the moment when faith in traditional rule-making institutions is fraying.
Geopolitical anxiety as background radiation
Underlying all of this is the steady hum of geopolitical anxiety. We don’t live in the crisp optimism of “globalisation will sort it out” anymore. Instead, we inhabit a world of:
- Fragile supply chains and strategic chokepoints.
- Great-power competition over chips, data, and energy.
- Democracies wobbling under misinformation and institutional fatigue.
- Climate shocks that no one government can fully control.
Citizens are told, often in the same breath, that “everything is under control” and that we are living through “unprecedented times.” The gap between official reassurance and lived experience grows wider each year.
When trust in formal institutions declines, people don’t simply become nihilists. They look for alternative systems of meaning and control – some old, some new, some hybrid:
- Conspiracy theories as DIY grand narratives.
- Crypto and alternative finance as DIY monetary policy.
- Folk magic and wellness rituals as DIY mental health and spiritual care.
- Constitutional AI and technical rulebooks as DIY governance for powerful technologies.
The “woo-to-right-wing” pipeline has become a visible symptom of collapsing trust in institutions. What begins as reasonable scepticism toward Big Pharma, public health agencies, or mainstream media is amplified in wellness and New Age spaces that already privilege intuition over expertise and self-sovereignty over collective provision. In these environments, conspiratorial stories about purity, corruption, and hidden elites can feel like a seamless extension of existing beliefs, so the slide into far-right narratives reflects less a sudden ideological conversion than a deeper withdrawal from any shared, institutional basis for truth and care.
The common thread is not the content of these systems, but the underlying posture: “if the grown-ups don’t have it under control, we’ll have to invent something ourselves.”
The crisis underneath: institutional trust
Seen together, Etsy witches, the return to the smells and bells of Catholicism, AI constitutions, and geopolitical anxiety are symptoms of a deeper condition: institutional mistrust.
For most of the 20th century in the West, many people lived with a tacit sense that key institutions – governments, central banks, universities, legacy media, churches – were at least trying to steer the ship. They could be criticised, reformed, or resisted, but they were recognisably anchors of order.
Today, we’re less sure. Instead, we see:
- Governments struggling to regulate technologies they barely understand.
- Corporations building critical infrastructure with incentives that don’t align with the public interest.
- Media ecosystems fragmented into partisan micro-segments.
- Institutions grappling with the long shadow of sexual abuse and deeply entrenched patriarchy.
- Traditional religious and civic institutions losing membership and authority.
In that context, both ritual and rulebook become coping strategies.
Ritual – whether witchcraft or wellness – offers embodied, personal, emotionally resonant responses to uncertainty. Rulebooks – from corporate ethics policies to AI constitutions – offer formalised, rational-sounding structures we can point to when things go wrong.
Neither is sufficient on its own. Ritual without accountability can slide into superstition or grift. Rulebooks without trust can turn into compliance theatre.
But together they reveal something important: people are hungry for systems that feel both meaningful and legitimate.
AI as a new priesthood – and why that’s dangerous
The way we talk about AI often reinforces this hunger in unhelpful ways.
We frame AI models as mysterious oracles: inscrutable, powerful, capable of great insight or harm. We surround them with specialists – prompt engineers, safety researchers, policy teams – who perform interpretive and protective roles. We publish constitutions that read, at times, like secular catechisms.
When these systems misbehave, the response is often to add more ritual (more prompts, more usage norms) and more rulebook (updated terms of use, new safety layers). Very rarely do we ask why we are centralising so much power in systems that need such elaborate choreography just to be marginally acceptable.
If AI becomes a kind of technological priesthood – opaque, unaccountable, insulated by layers of ritual and doctrine – we will simply have replaced one failing institutional model with another.
We don’t need more digital priests. We need institutions that are willing to share power, accept scrutiny, and invite genuine participation in how AI is designed, deployed, and governed.
Folk logics and formal governance need each other
So where does that leave us?
One of the more hopeful possibilities is to stop treating folk practices and formal governance as opposites. Instead, we might recognise that both are responses to the same underlying conditions, and both contain insights we can use.
From folk magic and other vernacular practices, we can learn:
- The importance of symbolism and narrative in making change feel real.
- The need for practices that operate at human scale, in everyday life.
- The value of communities of care and mutual support, rather than abstract “users” or “stakeholders.”
From constitutional AI and institutional rule-making, we can learn:
- The necessity of explicit norms and constraints when power is uneven.
- The role of transparency and documentation in enabling accountability.
- The benefits of shared reference points for resolving disputes.
Good governance for AI and other strategic technologies might need both: formal constitutions that are open to contestation, and lived practices that give people tangible ways to participate, resist, and reshape the systems they inhabit.
Towards institutions we can trust again
If there is a way out of the restless zeitgeist, it will not come from better Etsy listings or more elaborate AI constitutions alone.
It will come from institutions – old and new – that are willing to:
- Tell the truth about uncertainty, instead of over-promising control.
- Design with vulnerability and reversibility in mind, rather than insisting on irreversible bets.
- Share power with the communities most affected by their decisions.
- Accept that trust cannot be demanded; it has to be earned, over time, through practice.
In that future, constitutional AI might still exist, but in an ideal world its rules would be co-created with the people who live with the consequences, rather than dictated by a bunch of fascist tech bros. Folk practices would still flourish, but less as desperate workarounds and more as rich cultural layers around resilient systems.
The restless zeitgeist is telling us something important: many of us no longer believe that the existing institutional arrangements can hold. The question is whether we respond by handing our fate to new oracles, or by patiently building institutions that deserve our trust.
Until then, don’t be surprised if you find yourself, in the same afternoon, tweaking the terms of an AI safety policy and quietly lighting a candle in the hope of a world that feels a little less precarious.