Civilization creates a dilemma for all of us. It’s a dilemma we seldom notice – except in the extreme. It is a dilemma created by living in a world of machines.
The Chinese are planning on creating one such machine for their society. In the planned Chinese “Social Credit” system every citizen will be rated based on their credit history, purchasing history, and more. The hope is that this grand societal machine will make China better – a more trustworthy place. It will create for every Chinese citizen a terrible dilemma.
An episode of the TV series Black Mirror played with this kind of social machine: a world where every time you so much as bump into a stranger you mutually rate each other out of five stars – the resulting score controlling your access to all the good things in life. Black Mirror painted a predictably dystopian world – cheery trivialized bliss floating atop an ocean of mentally intolerable, manipulative anxiety.
Watching that show, watching how the characters deformed themselves in order to adapt to their absurd society, it occurred to me that this is more than just a sci-fi problem. This is more than just a Chinese Communist problem. This is more than just a social media problem. This is a fundamental problem of civilization.
We must adapt to the machine.
We humans have always adapted to our environment. We change our behavior to fit in, to succeed. For a hunter-gatherer, adapting to your environment meant adapting to nature. For a citizen of the modern world, adapting to your environment means adapting to civilization.
Now nature can be harsh – deserts, and ice, and lions – yet the ways of nature are always at their core suited to human nature. More accurately, human nature evolved to suit the ways of nature. In contrast, while civilization can be comfortable – sofas, and chocolates, and painkillers – the ways of civilization are a mixed lot. Some of it we created because it helps us. Some of it we created because when you create one thing you have to go on creating other things. And some of it… who knows why we made that?
Today more than ever, our environment is dominated by the products of human minds – technology and institutions. Machines built from matter, and machines built from people. Success in the modern world means adapting yourself to this world of machines.
Here comes the dilemma.
The things you must do in order to get by in an artificial world often conflict with human nature. It is impossible to adapt to such a world without also harming yourself. Self-destructive adaptation. A lose-win or win-lose situation. One can be a functional member of society, or one can be a functional human being. But one can only do both in Utopia, or the wilderness.
All institutions and technologies have biases, tendencies, and needs that have nothing to do with why we created them. These machine-needs are born out of the nature of the machine itself. If you want to use the machine, then you must satisfy the demands of the machine. Those demands can be neutral, a bonus, or harmful – in the extreme they may even undermine the very reason the machine was created.
A technological example: Computers dislike rain, bright sun, and being jiggled around. So we use them indoors, sitting down, in the dark. We adapt ourselves to the needs of the computer. Use them too much and you grow sick from a lack of sun and exercise.
An institutional example: Schools have too few teachers to adapt themselves to the needs of individual children. Instead the children must adapt to the needs of the school. They must be made manageable: silent, still, passive – the very opposite of a healthy child.
Or take these very words as an example: to be ideally adapted to the internet, I would make this 140 characters long, give it a click-bait title, front it with a picture of cleavage and a kitten, then sit there counting views as a measure of success. But I would not have learned anything, and neither would you.
These self-destructive adaptations are everywhere. One major form is where we mistake a measure that matters to the machine for a measure that matters to us human beings. For example:
- Step counts on an exercise monitor vs. being healthy
- Facebook likes vs. friendship
- Qualifications vs. expertise
- Awards vs. quality
- Triple-A ratings vs. low risk
- Number of citations vs. truth
- GDP growth vs. national well-being
- A good social credit score vs. a trustworthy citizen
The machines are dumb. They can only understand the world through these simple metrics (typically things you can put a number on, use to fill up spreadsheets, and present as evidence at an end-of-year review). Therefore these metrics are what the machine rewards and punishes. Whenever you interact with a complicated technology or institution you will be called upon to live your life by these metrics.
Sometimes those metrics align with what we want. Sometimes they really really don’t.
Imagine a class with two notable students. Samantha always gets an A+ on every test. Harriet does poorly on tests, and typically brings class to a screeching halt by starting arguments with the teacher. Obviously society will reward Samantha. She’ll get prizes at the end of the year. Her parents will buy her celebratory meals. Universities will happily accept her. Likewise society will punish Harriet. She’ll get sent to detention. Stern letters will be written to her parents. She won’t meet entry requirements for university.
Perhaps Samantha really is smarter than Harriet. But it is also possible that Samantha is an unimaginative idiot with a knack for uncritically memorizing garbage. It is also possible that Harriet is an actual genius who has figured out that the school curriculum is flawed and that her teacher is a moron. But the machine cannot handle this nuance. It is a machine and machines are dumb – reliant on simple metrics of success and failure. Therefore the education-machine produces an idiot who thinks herself a genius, and a genius who thinks herself an idiot. Both adapted to the machine.
The mismatch between human needs and techno-institutional needs is perhaps greatest with those things that are big and complicated, created top-down, or based on a grand theory.
Compare the difference between using a chair and using an airplane. The chair is simple. Only one concession is required of you – put it on a flat surface. The airplane is monstrously complicated. Preflight checks, security screenings, weight checks, runways, fuel depots, air safety regulations… All of society must adjust itself to the demands of the airplane.
Likewise with institutions. Compare a private tutor with a university. The tutor is hardly an institution at all, more a personal relationship. They can adapt to your exact requirements. A university, however, is an impersonal giant. It must facelessly classify and order its students by schedules and GPAs. It cannot adapt to the students; instead the students must conform themselves to the demands of an institution too dumb to truly understand them as human beings.
Now consider bottom-up versus top-down. Bottom-up creations are made by their users. They want a creation that works for them, not the other way around. But with a top-down plan? Here the designer is free to disregard the user entirely. Kings don’t have to live under laws made for peasants.
The worst nightmares result when planners create grand complicated top-down schemes based on some flawed theory. The machine they create may demand outright insanity from its users. Mao’s Great Leap Forward and other such lunacies spring to mind.
So what are we to do?
First, better design. Downsizing the downside. Fitting designs better to human needs. Using better metrics of success. Bringing users into the design process. Using the simplest effective technique rather than the biggest, most bloated one. The machines will never match us perfectly, but we can do a lot better.
Second, be aware. Take note of how many common behaviors in society are actually adaptations to the machine, rather than anything inherently human. Be aware of when you are acting naturally, and when you are adapting to the machine. You may have no choice but to continue, but at least you’ll know, and at least you’ll have a chance to adapt more healthily.
Be especially aware of metrics – indicators of success and failure. Whenever you see a metric, ask, “Does this actually measure what I care about?” If it doesn’t then ignore it, or find something that does.
Also be aware of the dangers of envy and what passes as success. In a world of machines, it is possible to succeed as a member of society by failing as a human being. Samantha is not always better than Harriet.
Third, know when to adapt and when to rebel. If the cost of adapting is low and the gains are large, then by all means adapt. Likewise, if the cost of rebelling is low, and the cost of adapting is high, then… go on – be a rebel. Go out there and stand on the street corner, tear off that business tie, throw your smartphone on the ground, and shout, “I’m not your prisoner anymore!” You’ll probably get arrested, but for a little while, just a little while, you might feel more human. And that’s gotta count as a win.
Deeper Down the Rabbit-hole:
China’s social credit system, on Wikipedia here.
An outline of the Black Mirror episode, on Wikipedia here. (Caution: spoilers! You should probably just watch the show. It’s a good show!)
The mother of all adapting-to-the-machine dilemmas. Go along with it and your country starves to death. Refuse to go along with it and you risk getting shot. Mao’s Great Leap Forward, on Wikipedia here.