The futurist: More on a workerless world
It’s easy to start viewing automation as a silent killer, much like an Ebola virus for jobs. But in the end, there are always humans directing the effort, and humans benefiting from the destruction.
Once we cut through the sleight-of-hand misdirection, we begin to see the real wizard behind the curtain.
Knowing that it’s nothing more than a human vs. human game, we can begin to see the limitations of our own actions. For this reason I’ve created the “Three Laws of Automation Parity” to help guide our thinking about this future threat.
NOTE: Since this is my first discussion on this topic, I’m very likely missing key points. I would invite you to let me know where I’m off base and add to this conversation below.
1.) The Law of Human-Automation Equilibrium
As we move into a future dominated by automation and technology, it’s important to understand that people still drive the economy. If people become unemployed and lose their income, they also lose their purchasing power. And when large numbers of consumers lose their ability to consume, the whole economy suffers.
What’s bad for the economy is also bad for the controllers of automation.
Whenever the proper balance between humans and automation drifts too far into the automation camp, an economic backlash will occur.
Automation is a tool of the power elite, and the number of people who are controlled by it is a key ingredient of the power formula.
As an example, the people who control the cellphone industry are far more powerful if a billion people are using their devices than if only a million are. Consequently, when people can no longer afford their phones, or don’t like the devices, it directly diminishes that power.
Yes, certain people are willing to win at any cost. For them, the carnage and destruction that follows is easily dismissed with comments like “I can’t help it if they were too stupid to hang on to their job.”
However, even the most ruthless have empathetic family members. And one of their greatest fears is often having people despise them after they’re dead. Their legacy is hugely important, and even though they want desperately to win, they want to leave on a positive note.
2.) The Law of Diminishing Returns
Humans are still capable of making a wide range of complicated decisions on an ongoing basis. Even though we are able to automate tasks up to a certain level, it becomes prohibitively complex and expensive to automate beyond it.
The simple task of cleaning involves tens of thousands of nuanced decisions to formulate an appropriate response. As an example, when you walk into your grandparents’ attic to clean it, every object carries an emotional value that you use to sort, organize, and discard the items in front of you.
The complexity of this type of decision-making is not easily transmitted to an emotionless machine. Even if this technology could be developed, it would likely not be used because it interferes with a critical component of our humanity.
Another important example is in the field of healthcare. Human-to-human touch is not easily replicated. We like being around others, and when someone is hurt or injured, the need for human interaction increases.
Yes, we will automate many aspects of the field of healthcare. But we will find it prohibitively expensive and complex to automate past a certain point.
The Law of Diminishing Returns is the barrier we, as humans, will naturally resist crossing for reasons we can’t always explain.
3.) The Law of Overestimating Capabilities
Seven years after the Wright brothers’ inaugural flight in 1903, Waldo Waterman built the first flying car. It was a logical extension of the airplane, and people could instantly see the efficiencies that could be gained with a flying car. Now, 102 years later, we have little more than museum pieces to show for our flying car efforts.
In 1947 Dennis Gabor invented holography, a technology for which he would later receive the Nobel Prize in 1971. We are now celebrating the 65th anniversary of a technology that never materialized in the way he imagined it.
In 1950, computer visionary Alan Turing imagined a world where computers could think and respond like humans. Now, 62 years later, we have yet to pass his famous “Turing Test.”
According to Amara’s Law, “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” And the long run is oftentimes very, very long.
It’s easy for us to see a new technology in a movie and extrapolate the speed of adoption and the impact it will have. But one of our biggest mistakes is oversimplifying the process of getting there.
Every problem creates an opportunity, and as the number of unemployed rises, this too becomes another entrepreneurial opportunity.
We are entering an unprecedented era where all of the rules are about to change. We won’t be able to trust our instincts or many of the things we’ve traditionally been able to count on.
Economists will all be scratching their collective heads wondering why our economy is acting so weird. But then again, they scratch their heads when everything is normal and wonder why our economy is acting so normal.
Over the coming decades we will indeed see many jobs go away, and it will be up to us to devise better systems for rapid job creation.
Sometimes it takes reaching a higher pain threshold before we are willing to make the necessary changes. Look for many of these pain thresholds to be crossed in the near future as we dip our toes into the next era of humanity. That’s when things will get very interesting.