Under the oft-misused guise of compassion and technological progress, the assisted suicide movement has taken another chilling step forward. Philip Nitschke, otherwise known as “Dr Death”, the Australian euthanasia campaigner behind the notorious Sarco suicide pod, is now proposing that artificial intelligence should replace psychiatrists in deciding who is mentally fit to die. At the same time, he is developing a new, larger version of the device, designed for couples who wish to end their lives simultaneously.
Sarco, a 3D-printed capsule that kills by flooding its chamber with nitrogen gas, first came to international attention in 2024 after it was used by a 64-year-old American woman in Switzerland. The incident triggered arrests and a criminal investigation, with Swiss authorities later declaring the device incompatible with national law. Despite this, Nitschke has pressed on, undeterred by legal, ethical, or moral objections.
Central to his latest vision is the removal of doctors from the process altogether. Under existing assisted suicide regimes, assessments of mental capacity are usually carried out by psychiatrists. Nitschke dismisses this as inconsistent and subjective, arguing instead for an AI-driven system using an avatar to determine whether someone is “of sound mind”. If the algorithm is satisfied, the pod would be activated for a 24-hour period, during which the person could choose to die.
Even more disturbing is Nitschke’s explicit rejection of any medical or terminal criteria. What began decades ago as a campaign for physician-assisted dying in narrowly defined circumstances has hardened into a belief that suicide itself is a human right, regardless of illness. Sarco is not about end-of-life care; it is about normalising death as a consumer choice, stripped of relational, medical, or ethical safeguards.
The proposed “Double Dutch” Sarco pod is designed for two people to die together; it would only activate if both occupants press their buttons simultaneously. Nitschke has said couples, including one from Britain, have already expressed interest in “dying in each other’s arms.”
Supporters frame these developments in the language of dignity, control, and innovation. Yet there is nothing dignified about replacing human judgement with code, or about treating suicide as a technical problem to be solved with better design. Technology is not neutral. It reflects the values and assumptions of those who build it, and in this case those values are starkly anti-human.
SPUC’s Communications Manager, Peter Kearney, says, “That such a profound judgement could be delegated to software should alarm anyone concerned about human dignity. Mental capacity is not a box-ticking exercise, nor is it reducible to a scripted conversation with a machine. Emotional distress, depression, loneliness, and fear can all masquerade as rational consent, particularly at moments of vulnerability. Handing life-and-death decisions to an algorithm risks mistaking despair for autonomy and abandoning people precisely when they most need care, support, and protection. At a time when societies are grappling with loneliness, mental ill-health, and inadequate care for the sick and disabled, the answer cannot be faster, more efficient ways to die. The push to automate assisted suicide reveals the end point of this ideology: a world in which the vulnerable are offered death instead of help, and where responsibility is quietly outsourced to machines. Sarco is not a symbol of progress. It is a warning of how far the culture of death is willing to go, and how urgently it must be resisted.”
If you’re reading this and haven’t yet donated to SPUC, please consider helping now. Thank you!