The Big Idea: Ferrett Steinmetz
Posted on July 28, 2020 by John Scalzi. 9 Comments.
What does your cell phone or Xbox have to do with AI-laden weapon systems? Ferrett Steinmetz has an inkling, and in this Big Idea for his new novel Automatic Reload, he lays it out for you.
You’re already too slow for technology. You knew that the first time you used a calculator; it could multiply 6,317 x 256 before you could pick up a pencil.
Some day a computer will be able to shoot you before your brain can tell your trigger finger to fire.
That’s right; militaries everywhere are working on targeted weaponry as we speak, which is a significant problem, but not an unsolvable one. Our troops are endangered on patrols because they don’t see the dangers surrounding them; wouldn’t it be nice to have automated weaponry seated on their shoulders that scans for incoming threats and neutralizes them before our boys can get shot?
And while it’s nice to think we’d never use such automated systems in combat, well, war has never rewarded ethics as thoroughly as it should. Which means eventually some guns will be shooting themselves.
Which leaves one big question:
Who’s programming those targeting systems?
In my book Automatic Reload, the answer is “Our hero, Mat” – a man who’s a walking weapons platform, having replaced his arms and legs with auto-targeting armed prosthetics. He gets hired as a freelance mercenary to do jobs that computerized weaponry can’t do on its own. Because as it turns out, enemy combatants are very creative, and the one thing computers aren’t very good at is reacting to new strategies. So he stomps in to rescue a hostage, his fragile human torso shielded by bulletproof armor and surrounded by four lethal limbs, reprogramming his reaction packages on the fly to shoot the bad guys, and no one else.
Mat’s also breaking down under that pressure. Because he knows what happens when his auto-fire routines aren’t tuned correctly. For him, combat is like being in a car wreck – he’s patrolling when suddenly his limbs catapult him around to shield him, airbags deploying as his guns fire, and if things go wrong he’ll be dead before he knows what happened.
Or, worse, a bystander will be dead. It’s happened before. He had lax input restrictions on the drone he was flying for the US Air Force, and the routines he programmed generated an automated approval to fire upon a group of suspected terrorists.
Problem was, one of those suspected terrorists had his kid with him. The kid wasn’t a suspect. Mat got to watch that kid blown to smithereens through hi-def streaming visuals. That was why Mat quit the Drone Corps, and why he’s been determined to program perfectly safe target-capturing routines ever since, but…
Technology is hard.
The programs that run those armed prosthetics work a lot like your cell phone does – which is to say they’re occasionally glitchy, working well until they don’t. And as much as people would like to believe that programmers know it all, unfortunately, we don’t. The programs we’re building rely on libraries and interpreters and drivers built by other very flawed and very human programmers, so each of those potentially harbors bugs or misconfigurations.
The scary thing about the Internet, or your smartphone, or the apps on your television, is that nobody fully understands them. You’re perfectly justified in getting mad when nobody at tech support knows how to fix your bricked Xbox, but the ugly truth is there are so many layers of programs involved that it’s often impossible to tell what precisely broke down.
At this point, “stuff bugs out sometimes and we don’t know why” is pretty much the fundamental core of all our technology – which is not reassuring when your auto-targeting glitches can accidentally headshot a teenager riding a bike.
And so Mat is reacting to his televisually-induced PTSD by trying to achieve perfection. He’s one of the top prosthetic armament technicians on the planet. He spends sleepless nights replaying old missions, running tests, fine-tuning his weaponry, determined to make sure that his routines never take the wrong shot.
Except he’s entering war zones.
Things go very wrong in war.
Automatic Reload is about a lot of things, and it’s not quite as heavy as I’m making it out to be here… mainly because at its core, Automatic Reload is actually a romance. Mat finds a woman – a woman who happens to be a genetically engineered killing machine, but a woman – who he falls in love with, because she also understands the stress of combat. They talk about old movies, they dance, they destroy the cop cars that are chasing them, it’s strangely sweet.
But Automatic Reload is also about the morality of unplanned technology. We’re charging ahead into an uncertain future, where we’re deploying computer-targeted mass surveillance using data taken from cameras to try to automatically catch terrorists, and not thinking about how all our old biases are ingrained as assumptions into our AI routines. We’re using social media to deliver personalized experiences to trusting users, without thinking how those personalized news stories can warp someone’s perceptions – or how evil people can hack those news stories to propagate misinformation.
Automatic Reload is very personal. Because Mat is on a mission to rescue this assassin (who he kinda maybe loves), and he’s willing to go to great lengths to keep her safe. And yet every battle piles on more stress as Mat is forced to bring the war straight to New Jersey, where one buggy line of code could cause an auto-generated massacre.
Yet Automatic Reload is also about the ethics of programming in a world where technology is fundamentally unreliable. Your cell phone’s far too useful to give up, so you’ll forgive it when occasionally the screen goes black.
Likewise, when it comes – if it comes – the power of automated weaponry will be such an advantage that nobody will want to give it up. But it won’t be perfect, and it won’t be bloodless.
What cost will we be willing to pay?
Automatic Reload’s got a few suggestions for you.
Automatic Reload: Amazon|Barnes & Noble|Indiebound|Powell’s
Read an excerpt. Visit the author’s site. Follow him on Twitter.
For a number of years now, U.S. military personnel have sat in air-conditioned control rooms and murdered people (including a wedding party) by remote control. We can’t blame technology for our being cold-blooded. There’ll always be moral decisions; even refusing to make one is making one.
The book sounds like fun with a lot of thought and depth; I look forward to it.
What will happen to the “action” and “military” genre when it’s just machines blowing up other machines?
I read the excerpt and I liked it; I’ll probably read it all some day. But I was shocked by how little thought was given to the “thugs” the main character kills at the beginning, especially as these seem to be desperate people living in poverty who probably had few other options. I wonder if that is something the MC later evolves on, or if the author simply considers them acceptable “faceless mooks” to kill, or if I’m becoming too much of a bleeding heart to even read this kind of book anymore.
If it helps, literally the next chapter deals with this to some extent (though his humanization is a part of the book’s plot), specifically saying:
“I realize that I have a future if I walk away—whereas this kid’s bosses will shoot him in the head twenty minutes from now if he hesitates. This kid is an unwise equation, the sum of every bad decision he’s made compounded by wretched starting circumstances. Though he clearly regrets his decision, his outs have narrowed to ‘hope he gets lucky sticking with the side he’s already picked.’”
Whoa, ask about a book and get an answer from the author in less than an hour. What a time to be alive.
It certainly helps, thanks :)
Have you read Murderbot?
TheHorror: In addition to Murderbot, there’s Ken MacLeod’s Corporation Wars. And Charles Stross’ touches on it (but it’s not the focus) in Saturn’s Children and Neptune’s Brood.
Ferrett: it’s a wild read so far. Looking forward to Silva’s story.
And I’ve never thought of home inspector. Why couldn’t I have thought of this when I was having panic attacks?
I think I will buy the book, I like first person, and I won’t criticize the world building.
Maybe someday we’ll live in a world where pistols can stun and starships can target the engines only, or phaser banks only. I don’t mind such fantasies, any more than I mind TV shows about crime families, as long as people keep in mind that in real life you don’t want your sister marrying into the mob, and you don’t want your country declaring war.
Sometimes I think the best reason for democracy is to avoid the long chain of events, starting with a leader targeting minorities, or corporations without oversight, that leads to war.
As you know, the UN’s main purpose is not to feed the children or protect heritage sites: It’s to avoid war. Not a nice thought, but there you are.
The people who “guard us while we sleep” (Orwell) will always be cold blooded, or as Orwell noted, will “inevitably be less civilized.” The trick is for good people, as my ex-military buddies do, to avoid believing in war. I am writing this in a North American donut shop where I am sure the locals believe in peace.
Will definitely read this; I love the idea, especially in asymmetrical warfare scenarios. What will the enemy do to defeat the automated drones and weapons? If they are as glitchy as a smartphone, it might not be that hard.