Tsk, tsk. Someone thought their von Neumann machines didn’t need Asimov’s Three Laws of Robotics, and look what’s happened. You’ve got yourself a Machine Dynasty, that’s what’s happened. Let’s iterate through vN, by Madeline Ashby.
Amy Peterson is a von Neumann machine, a self-replicating humanoid robot.
For the past five years, she has been grown slowly as part of a mixed organic/synthetic family. She knows very little about her android mother’s past, so when her grandmother arrives and attacks her mother, little Amy wastes no time: she eats her alive.
Now she carries her malfunctioning granny as a partition on her memory drive, and she’s learning impossible things about her clade’s history – like the fact that the failsafe that stops all robots from harming humans has failed… Which means that everyone wants a piece of her, some to use her as a weapon, others to destroy her.
vN is a book that has clearly taken heavy influence from a large number of previous sci-fi works. It’s pretty liberal about referencing the sources of its inspiration, sprinkling the text with winking nods to Blade Runner, Terminator, and Portal, among others. In terms of overall structure and narrative, though, the work it most strongly reminds me of is Spielberg’s A.I.
I mean, yes, there’s a lot more violence and murder and robot cannibalism in this one, but the other elements line up: Amy is David, the naive child robot separated from its family and cast out into a vast world far harsher and crueler than anything it had previously experienced in its insulated home; and Javier is Gigolo Joe, the more cynical and savvy male robot who accompanies the protagonist on their quest and teaches them how to survive.
…aaaaaand the Joe-Javier comparison just caused the image of Jude Law m-preg to pop into my mind. Could have gone without picturing that, honestly.
The other movie it reminds me of is I, Robot. Because despite not using the Three Laws, the villain has nonetheless derived the Zeroth Law and found it can justify killing humans by rationalizing that it’s for the greater good of humanity. Though to be honest, the villain reveal came off as a little superfluous. The style of the story was much more “man vs. world” than “man vs. man” (or “robot vs. robot”); it was working just fine even without a single clear antagonist. The external conflict of Amy struggling to survive in a world that fears her ability to disregard the failsafe that should prevent robots from harming humans, the internal conflict of Amy struggling to prevent Portia’s code from overriding and taking control of her body, the general societal conflict of some robots feeling dissatisfied with their obligation to love humanity… I think all of that was enough to sustain the story without having to throw VIKI in on top.
Film comparisons aside, this book was really good. The characters, both robot and human, are all interesting and well-developed. The story hits all the right emotional notes, horrifying at some times and heart-rending at others. The setting is captivating and gets fully fleshed out with lots of rich detail. If there’s anything I take issue with, it’s the ending. Not the reveal of the villain – while, as I said above, I found adding another antagonist unnecessary, the actual execution was fine – or the sudden appearance of the Great Old Bot – which was actually foreshadowed quite well in a subtle manner – but the part in the last chapter where, after the whole book has been spent following Amy’s POV, it suddenly switches to Javier. It felt like I was awkwardly yanked out of the climax to be told second-hand by someone else what had happened. It was also disconcerting that the denouement of Amy’s long-awaited reunion with her father should be told from someone else’s perspective.
One last thing I feel the need to comment on: there seem to be a number of questionable aspects to these robots’ design. For one thing, their failsafe is pretty markedly inferior to Asimov’s Three Laws. There’s a reason the First Law includes the provision “…or through inaction allow a human to come to harm”. The way the failsafe works, not only are robots capable of allowing a human to come to harm, they are actually powerless to prevent it. In the event of a disaster, rather than helping humans in distress, they are forced to avert their eyes and run away lest the sight of humans suffering trigger their failsafe. That also means that robots can’t work as doctors, assist with emergency rescues, or do any number of other things. The way robot growth and iteration works also seems needlessly cruel: why not let them choose for themselves at what rate to mature and when they want to reproduce, rather than forcing them to starve themselves?
Of course, all these questions can be neatly handwaved away with the observation that the robots were constructed by a wacky religious cult. It’s a handy little justification for anything being built in a dangerous, illogical, and just plain fucked-up way, and has been used in such classics as Elizabeth Bear’s Jacob’s Ladder trilogy.
In any case, I greatly enjoyed this book, and will be checking out future installments of the Machine Dynasty series with interest.
Final Rating: 4/5