"There's a great big beautiful tomorrow/Shining at the end of every day."
--- Disneyland’s Carousel of Progress, 1967.
I was thinking about our big beautiful tomorrow recently when I watched a video produced by a Google subsidiary called Boston Dynamics.
In it, they introduced their latest invention, a robotic dog called Spot, which weighs in at 160 pounds and can run, jump and climb hills and stairs with the best of them.
The company has produced several other “animals” that run faster and jump higher than their human overlords. They share one other trait: they are terrifying.
Spot is no Golden Retriever. He is a headless, tailless, menacing machine that looks like it was rejected by “Star Wars” as too evil-looking.
During the video presentation, a Boston Dynamics employee stepped into camera range and delivered a swift kick to Spot’s midsection. The “dog” staggered briefly, legs flailing, then regained his balance. You could almost hear the growls.
When they figure out how to pack a brain into one of these contraptions, Spot and his buddies, remembering that kick, may someday gather in packs and chase us off a cliff.
Or, as one wag remarked, “An artificially intelligent elevator will ask him, ‘Are you the guy who kicked the robo-dog?’ just as the doors are closing.” Fade to black.
We’ve been assured that we have nothing to fear from robots, even nightmarish creatures like Spot. And being a nation that embraces technology, we believe it.
Then we read this recent news dispatch:
“When a South Korean woman invested in a robot vacuum cleaner, the idea was to leave her trustworthy gadget to do its work while she took a break from household chores.
“Instead, the 52-year-old resident of Changwon city ended up being the victim of what many believe is a peek into a dystopian future in which supposedly benign robots turn against their human masters.
“The woman, whose name is being withheld, was taking a nap on the floor at home when the vacuum cleaner locked on to her hair and sucked it up, apparently mistaking it for dust.
“Unable to free herself, she called the fire department with a ‘desperate rescue plea’ and was separated from the robot’s clutches by paramedics, according to a South Korean newspaper.”
Then, there was this:
“A Swedish company has been fined 25,000 kronor ($3,000) after a malfunctioning robot attacked and almost killed one of its workers at a factory north of Stockholm.
“The incident took place when an industrial worker was trying to carry out maintenance on a defective machine generally used to lift heavy rocks. Thinking he had cut off the power supply, the man approached the robot with no sense of trepidation.
“But the robot suddenly came to life and grabbed a tight hold of the victim's head. The man succeeded in defending himself but not before suffering serious injuries.”
OK, so things go wrong sometimes. But what happens when things go wrong with something more deadly than a vacuum cleaner? Think of Spot with a heat-seeking missile strapped to his back.
The day of the Killer Bots is not that far away.
Gen. Robert Cone, the chief of the Army’s Training and Doctrine Command, was quoted in a published report as saying there’s a chance the military’s brigade combat teams will shrink by a quarter in the coming years, from 4,000 total troops down to 3,000.
Picking up the slack, he said, could be a fleet of robotic killing machines akin to the ground versions of the unmanned aerial vehicles, or drones, increasingly used by the world’s armies.
We are already beginning to develop robots that can coordinate autonomously—that is, with no human input—to complete team objectives. Just last August, researchers at Harvard’s School of Engineering and Applied Sciences demonstrated a robotic swarm of more than 1,000 small robots that cooperated to form shapes.
So here we stand at the threshold of a great big beautiful tomorrow populated by robotic killing machines that can think for themselves.
No less a visionary than Stephen Hawking, the preeminent physicist, has warned that success in creating artificial intelligence “would be the biggest event in human history, [but] unfortunately, it might also be the last.”
It’s serious enough that in Geneva this past year, 118 nations present at a UN conference agreed about the need to tackle the future threat of robotic killing systems, according to Human Rights Watch.
Abandon the research and development of robotics? No, but let us proceed with caution.
Let’s hope this is one case where the human race doesn’t learn a lesson through trial and error.
Robert Rector is a veteran of 50 years in print journalism. He has worked at the San Francisco Examiner, Los Angeles Herald Examiner, Valley News, Los Angeles Times and Pasadena Star-News. He can be reached at nulede@aol.com.