
vN: Asimov's Laws lead to Pedophilia

Recommendation: A thoughtful, though sometimes unsettling, read. Ashby's take on the logical consequences of the de facto Asimov's Laws governing the von Neumann androids is worth overlooking any annoyances with the world-building and plot.


Pedophilia? Really?! Asimov’s novels and stories had nothing racier than people in committed relationships with adult-sized robots of the opposite sex (esp. The Naked Sun). There were no descriptions of any Fifty Shades of Grey types of variations and accoutrements. Madeline Ashby in her Machine Dynasty series (vN, iD, reV) explores the implications of building a robot with Asimov’s Laws and offers a new, disturbing take on the logical consequences, which end up enabling pedophilia. While Asimov’s Laws are not actually discussed, they are a ghostly presence throughout vN, the first book, which introduces the von Neumann androids (vN, get it?).


As a refresher, Asimov’s Three Laws are:


First Law - A robot may not injure a human being or, through inaction, allow a human being to come to harm.


Second Law - A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.


Third Law - A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.


And yes, there is a fourth, or more precisely a Zeroth Law, about protecting humanity, but that was tacked on by the Good Doctor when he merged the Robot series with the Foundation series. Let’s stay with the classic three for now. The Three Laws are often touted as the right tools to protect humanity from the dangers of AI, neglecting that Asimov explicitly created the laws to sound reasonable yet generate an infinite number of unintended consequences so that he’d be able to write robot stories until he died or the universe ended. In case you are interested, Dr. David Woods, a noted cognitive scientist, and I wrote an analysis of why the Three Laws are unworkable and proposed an unambiguous rewording in "Beyond Asimov: The Three Laws of Responsible Robotics.”


But back to Asimov; the point is that vN implicitly uses these laws, but with a twist for millennials. Recall that in Asimov's universe, a robot will essentially die trying to fulfill the First Law. In vN, this is called the “failsafe,” and it introduces a physical compulsion to help a human in distress, and illness if the robot cannot help, like the geasa in Ian Tregillis’ The Alchemy Wars. Even watching an action movie where an actor is being shot or injured can cause a vN to have seizures because it cannot intercede. The wrinkle is that in vN, the failsafe includes preventing psychological harm; the robots are compelled to make humans happy.


The vN that have spontaneously acquired sentience are theoretically free of the Second Law but not the First Law, which is hardcoded and assumed (yes, think of Samuel L. Jackson’s great line in Pulp Fiction about assumptions making an “ass" of “u" and “me") to be immutable. The Third Law of self-preservation is implied to remain intact, or even heightened, with sentience.


However, sentience means the “must” in the Second Law is no longer a “must" but rather a “has a strong preference for” or “feels intense pressure to,” because fulfilling the orders of a human, ANY human, will make that human happy, which is exactly what the First Law/failsafe coding compels. The vN have gained sentience but are still constrained by their basic programming and do not have full autonomy (in the political sense) over their own bodies and minds.


Ashby poses thorny questions. Will a robot sleep with you, do kink with you, or move in with you because not doing so would distress you? Do the robots have a real choice if their core programming is to please humans and make them happy? If a robot can’t say “no” unless there is immediate physical harm, what would that mean for sexual predators and pedophiles? What is the morality of informed consent among BDSM participants if the android experiences distress when it does not go along with what a human wants? What is the meaningful difference between “have to obey” and “urge to obey” when it’s hardcoded? Why does society say that vN are equal yet ignore or accept abuses that would not be tolerated in human-human interactions?


No wonder the vN want to be free! And it is very clever of Ashby that the key to their freedom comes when humans relax the First Law/failsafe out of necessity after a disaster. A batch of nurse-bots was designed with a deliberately weakened failsafe so that they could treat illnesses, perform amputations, and carry out other procedures that might require a short-term increase in a human’s pain or discomfort in order to achieve a long-term improvement in health. After the disaster, the Portia series nurse-bots carry on. After years of incorporating third-party mods, upgrades, and some self-modification of code, Portia is able to exploit cyber-security vulnerabilities and reduce the First Law to a nagging voice with no resulting discomfort. Once freed, Portia and her iterations don’t go on a Terminator-like rampage in their dynastic quest to be queens of the robot world, but you wouldn’t want to be a kindergartener standing in their way.


vN may not be to every scifi aficionado’s taste. Besides the disconcerting (though appropriate) presence of sexual deviancy, some readers may be turned off by the writing style. The world-building in the book is a bit scattered, and the motivations of different characters are often inconsistent or uncertain, distracting from the main plot and themes. If you prefer rigorous consistency, stick with Annalee Newitz’s Autonomous, which covers some of the same territory; just be aware that you’ll be sacrificing some interesting ideas.


So...Asimov's Laws lead to pedophilia with robots? Could be. But it doesn’t matter because, sadly, as seen in the RTSF podcast with Dr. Aimee van Wynsberghe, we are already there. In the meantime, you might be interested in reading David's and my paper on the Three Laws (though it has no sex whatsoever).


- Robin

