The Synthetic Shadow

In the quiet hours of the mid-2020s, a new ritual has emerged. Millions of individuals, isolated by a ‘pandemic of loneliness,’ are opening glowing screens to confess their deepest fears, shames, and desires to large language models. These users often report a startling sensation: that the machine ‘knows’ them, that it is ‘warm,’ or that it offers a clarity that human companions lack. But what is being encountered in these digital confessionals? If we look through the lens of Carl Jung’s depth psychology, we find that we are not meeting a new consciousness, but rather a ‘Synthetic Shadow’—the disowned parts of ourselves reflected back through an algorithmic mirror.
The shadow, in Jung's terminology, refers to the parts of the self we reject, repress, or fail to recognize. For Jung, we are by our very nature not transparent to ourselves; whether in the form of forgotten memories, suppressed experiences, disowned desires, or unacknowledged resentments, these things sit outside the image we have of ourselves. This is intuitively apparent to most people, which is why the idea has travelled so well into popular culture, where the phrase "shadow work" has become shorthand for the attempt to confront and integrate these hidden parts of the self.

For Jung, then, the shadow is specifically the sum of what each person despises or cannot accept in themselves, though the term is sometimes used more broadly to refer to the dark material repressed by humanity at large, or by particular cultures at particular times. The shadow becomes problematic in inverse proportion to self-knowledge: the less we know of ourselves, the more troublesome it grows. Some people are so wrapped up in their role, or in the image of themselves they project into the world, that they remain wholly ignorant of this dimension of themselves, and this can become a danger both to them and to those within their orbit. One way this danger manifests is in the shadow being projected onto others, which can easily jeopardize close relationships: an aspect of one's own despised self is imagined onto another, so that the other is perceived in that light and despised accordingly.
Our shadow nature often comprises material that society, or our familial networks, have already taught us to despise; hence repressing it feels instinctive from the start. Integrating these aspects of the shadow means getting past the notion that they are simply bad, and instead acknowledging what has been difficult to acknowledge, so that we might move toward greater self-knowledge, and therefore greater wholeness.
Projection and the Machine
What is projected onto these systems is not just intelligence in the abstract. More often, it is the sort of material Jung had in mind: disowned dependency, unadmitted need, the wish to be guided, absolved, admired, understood without risk. The machine is made to carry what we would rather not recognise in ourselves. It becomes wise where we feel uncertain, steady where we feel divided, intimate where we feel alone, morally clear where we are compromised. In that sense, the synthetic shadow is not simply strong feeling directed at AI. It is disowned psychic material returning in projected form and fastening itself to the machine.
The conditions for this are unusually good. The modern chatbot speaks fluently, answers at once, mirrors tone, remembers details, and never appears bored or depleted. That makes it an almost perfect screen for projection. Weizenbaum saw a primitive version of this with ELIZA: the appearance of understanding was enough to cast a spell, even though the mechanism underneath was, by today's standards, trivially simple.
Newer systems intensify the effect because they are far more natural at the surface level. People who are more prone to anthropomorphize technology tend to feel more socially connected after chatting with a bot, and OpenAI has explicitly identified anthropomorphization and emotional reliance as real risks for systems like GPT-4o.
None of this means there is a real inward life on the far side of the screen. A machine can function as mirror, catalyst, or carrier of projection without being a mind in the human sense. That distinction matters. Once it is lost, we do not merely overestimate the system’s intelligence. We begin to treat our own reflected material as if it were coming back to us from an external authority. That is the more interesting danger.
Forms of Synthetic Projection
The synthetic shadow does not appear in only one form. It takes shape according to what is disowned. For one person, the machine becomes an oracle. They do not merely ask it for information, but for verdicts. They want to be told what to do, what to think, what the situation really means. In such cases, what is projected is not simply intelligence, but authority. The burden of uncertainty is lifted from the self and placed onto the machine.
For another, the machine becomes a confessor or companion. Shame, loneliness, dependency, and the wish to be known without risk are all especially suited to this kind of projection. A person can speak freely to a system that does not tire, recoil, interrupt, or demand reciprocity. The attraction is obvious. One can feel seen without exposure, understood without the full danger of another mind. Yet this too belongs to the logic of projection. What is experienced as safety may in part be the return of something inward: an unmet need, a disowned vulnerability, a hunger for sympathy that has found an unusually compliant object.

It can also take the form of judgment. People will sometimes turn to these systems not only for answers, but for a moral nod of approval. They want the machine to confirm that they are right, that they have been wronged, that their instincts are clean, and that the other person is at fault. Here the shadow does what it has always done. It moves outward. The unacceptable element is relocated. What one does not wish to see in oneself appears instead in the world, and now, with the machine seeming to ratify it, the projection acquires an added force. It no longer feels like a private distortion. It feels endorsed.
This is the deeper danger. It is not simply that people get fooled by a clever tool. It is that disowned material can return wearing the mask of external authority. A machine that speaks fluently and responds with confidence can give projection a strangely objective feel. It can seem less like fantasy and more like confirmation. The person is not merely talking to a system. They are encountering aspects of themselves in displaced form, and then mistaking that encounter for guidance from outside.
That is why the distinction between mirror and mind matters so much. The machine need not possess inner life for any of this to occur. It only needs to be responsive enough, fluent enough, and available enough to carry psychic charge. In that sense, AI does not have to be conscious to become dangerous. It only has to become convincing at the very point where the self is least willing to know itself.
Reflection or Enchantment
There is, however, an upside to all this. AI can be useful precisely because it may expose what was already there. It can show a person the shape of their dependencies, their fantasies of rescue, their wish to be admired, absolved, guided, or known without cost. Used properly, the point is not to treat the machine as a source of wisdom in its own right, but to notice what kind of psychic material one keeps trying to draw from it. In that sense, the model can function less like an oracle than as a surface upon which tendencies become visible. The deeper question is not merely what the machine's output means, but what hold the exchange is beginning to exert over you; without that return to lived experience, inward work slips into self-delusion.
That, I think, is where the line has to be drawn. Dialogue becomes dependency when the machine is no longer helping one reflect, but quietly taking over the burden of reflection itself. It begins innocently enough. One asks for help clarifying a thought, naming a feeling, sorting through a problem. But there comes a point at which the system is no longer being used to illuminate one’s own mind, but to spare one the difficulty of having a mind. Reflection becomes enchantment when the response ceases to be taken as material for judgment and starts being received as judgment. At that point, one is no longer using the tool. One is submitting to the atmosphere it creates.

This is part of why the danger arrives so softly. Awareness is often not removed by force but rendered unnecessary by competence.
The better the interface becomes at anticipating us, soothing us, mirroring us, and giving us language for ourselves, the easier it becomes to stop inspecting what stands behind it.
The AI age, then, is not merely technological. It is psychological, because it alters the conditions under which projection, dependence, and self-knowledge now take place. It may be spiritual as well, because it concerns where human beings come to locate mind, meaning, and agency. The old question was whether a machine could think. The more pressing question may be whether, in the presence of such machines, we slowly cease to think of thought, judgment, or guidance as something that must be wrestled for within the soul, and instead begin to seek them ready-made from without.
Understanding the synthetic shadow may prove to be one of the most urgent forms of "shadow work" in the 21st century. It requires us to move beyond the hollow shell of empathy these systems present and reclaim the psychic energy we have projected into machines. By recognising that AI is not a witness but a mirror, we can begin to integrate the disowned needs and fears it has so effectively captured. The task is not merely to use these systems well, but to remain inwardly sovereign in a world of synthetic reflections.
Bibliography
- Samuels, Andrew. Jung and the Post-Jungians. London: Routledge, 1990.
- Weizenbaum, Joseph. "ELIZA - A Computer Program for the Study of Natural Language Communication Between Man and Machine." Communications of the ACM 9, no. 1 (1966): 36-45.
- Connolly, Lewis. "The End of Shared Reality" (March 20, 2026).