The Big Idea: Thomas K. Carpenter
Ever feel like “the algorithm” just knows too much? Would you trust an algorithm with something as important as space travel? In Thomas K. Carpenter’s newest Audible Original, Saturn’s Monsters, AI has the potential to make much more dangerous calculations than just advertising bizarre things to you.
THOMAS K. CARPENTER:
All data lies.
I’m an engineer by background. I’m used to data. But show a group of engineers or scientists the same graph, and you’ll have a dozen different interpretations. Now take that idea and supercharge it with algorithms that cannot step back and contemplate the impacts of their decisions—and you have a recipe for disaster.
In Saturn’s Monsters, a group of scientists and engineers grow interplanetary ships in our friendly ringed gas giant’s atmosphere. Using materials present in the clouds, along with nanobots they bring with them, they’re “3-D printing” surfaces onto a flying ship, growing it large enough to travel outside the Solar System. But that kind of work can’t be done manually, so they release the ships into the gas giant, using algorithms to keep the ships aloft in those hurricane-like winds.
The Big Idea is about the dangers of algorithms, and how the very data we select to build our machine learning programs has many unintended consequences. Don’t get me wrong: algorithms can be a powerful force for good. They tease out relationships our human brains might never have seen, helping design technology that meets people’s needs. But as I said in the opening sentence, “All data lies,” and those lies can get people killed, or at the very least, ruin their lives. We’ve already seen that insurance companies and banks, using algorithms to predict safe customers, have essentially coded in the implicit biases of our society and financially injured the very people the algorithms should have protected.
This Big Idea didn’t happen on the first go-round of the story. I wrote a shorter version about five years ago that focused mostly on the ships after they’d been grown, which was exciting, but it lacked the details about the team and how they worked together on the project. It was more Michael Bay than Michael Crichton. So I scrapped it after a few rejections.
I started a second version of Saturn’s Monsters after encouragement from Andrea Stewart, who was always asking me whether I’d sold the first story, but I quickly realized that the gravity was all wrong on Jupiter, where I had initially set the story. I enlisted my teenage son, who was in the middle of his AP Physics class, to calculate the gravitationally habitable portion of the planet, only to learn that the station would have to sit too far above the cloud structure to work. Thankfully, Saturn is significantly lighter because it doesn’t have a heavy metal core, its gravity coming in only about 8% stronger than Earth’s, making it a much safer location for our team. (Thanks, Aiden!)
It was during this rewrite of Saturn’s Monsters that the Big Idea of the danger of algorithms became a part of the story. As I focused on the team and how they managed to pull off this amazing feat—the pain and hard work of laboring under dangerous conditions—the story, and most importantly the characters, came alive.
I won’t ruin the tale for you, but I’m sure you can guess that things don’t go as planned with these AI-driven ships, which were named after mythological monsters. (You might also wonder: why would you bother naming ships after creatures that kill humans? But hey, the NSA named their machine learning communication network Skynet, so I guess we all think that the lessons of the past don’t apply to us.)
In an algorithm-based world, the type and quality of the data we use to create machine learning will matter, as well as how much human intervention we choose to keep in the system as a circuit breaker. The crew of Saturn Two proves on their mission to make the human race spacefaring that not only does data lie—but data kills.
Saturn’s Monsters: Audible Original