Adapting to the Machine

Civilization creates a dilemma for all of us. It’s a dilemma we seldom notice – except in the extreme. It is a dilemma created by living in a world of machines.

The Chinese government is planning to create one such machine for its society. In the planned Chinese “Social Credit” system every citizen will be rated based on their credit history, purchasing history, and more. The hope is that this grand societal machine will make China better – a more trustworthy place. It will also create a terrible dilemma for every Chinese citizen.

An episode of the TV series Black Mirror played with this kind of social machine: a world where every time you so much as bump into a stranger you mutually rate each other out of five stars – the resulting score controlling your access to all the good things in life. Black Mirror painted a predictably dystopian world – cheery trivialized bliss floating atop an ocean of mentally intolerable manipulative anxiety.

Watching that show, watching how the characters deformed themselves in order to adapt to their absurd society, it occurred to me that this is more than just a sci-fi problem. This is more than just a Chinese Communist problem. This is more than just a social media problem. This is a fundamental problem of civilization.

We must adapt to the machine.

We humans have always adapted to our environment. We change our behavior to fit in, to succeed. For a hunter-gatherer, adapting to your environment meant adapting to nature. For a citizen of the modern world, adapting to your environment means adapting to civilization.

Now nature can be harsh – deserts, and ice, and lions – yet the ways of nature are always at their core suited to human nature. More accurately, human nature evolved to suit the ways of nature. In contrast, while civilization can be comfortable – sofas, and chocolates, and painkillers – the ways of civilization are a mixed lot. Some of it we created because it helps us. Some of it we created because when you create one thing you have to go on creating other things. And some of it… who knows why we made that?

Today more than ever, our environment is dominated by the products of human minds – technology and institutions. Machines built from matter, and machines built from people. Success in the modern world means adapting yourself to this world of machines.

Here comes the dilemma.

The things you must do in order to get by in an artificial world often conflict with human nature. It is impossible to adapt to such a world without also harming yourself. Self-destructive adaptation. A win-lose situation either way. One can be a functional member of society, or one can be a functional human being. But one can only do both in Utopia, or the wilderness.

All institutions and technologies have biases, tendencies, and needs that have nothing to do with why we created them. These machine-needs are born out of the nature of the machine itself. If you want to use the machine, then you must satisfy the demands of the machine. Those demands can be neutral, a bonus, or harmful – in the extreme they may even undermine the very reason the machine was created.

A technological example: Computers dislike rain, bright sun, and being jiggled around. So we use them indoors, sitting down, in the dark. We adapt ourselves to the needs of the computer. Use them too much and you grow sick from a lack of sun and exercise.

An institutional example: Schools have too few teachers to adapt themselves to the needs of individual children. Instead the children must adapt to the needs of the school. They must be made manageable: silent, still, passive – the very opposite of a healthy child.

Or take these very words as an example: to be ideally adapted to the internet, I would make this 140 characters long, give it a click-bait title, front it with a picture of cleavage and a kitten, then sit there counting views as a measure of success. But I would not have learned anything, and neither would you.

These self-destructive adaptations are everywhere. One major form is where we mistake some measure important to the machine for a measure that is important to us human beings. For example:

  • Step counts on an exercise monitor vs. being healthy
  • Facebook likes vs. friendship
  • Qualifications vs. expertise
  • Awards vs. quality
  • Triple A ratings vs. low risk
  • Number of citations vs. truth
  • GDP growth vs. national well-being
  • A good social credit score vs. a trustworthy citizen

The machines are dumb. They can only understand the world through these simple metrics (typically things you can put a number on, use to fill up spreadsheets, and present as evidence at an end of year review). Therefore these metrics are what the machine rewards and punishes. Whenever you interact with a complicated technology or institution you will be called to live your life based on these metrics.

Sometimes those metrics align with what we want. Sometimes they really really don’t.

Imagine a class with two notable students. Samantha always gets an A+ on every test. Harriet does poorly on tests, and typically brings class to a screeching halt by starting arguments with the teacher. Obviously society will reward Samantha. She’ll get prizes at the end of the year. Her parents will buy her celebratory meals. Universities will happily accept her. Likewise society will punish Harriet. She’ll get sent to detention. Stern letters will be written to her parents. She won’t meet the entry requirements for university.

Perhaps Samantha really is smarter than Harriet. But it is also possible that Samantha is an unimaginative idiot with a knack for uncritically memorizing garbage. It is also possible that Harriet is an actual genius who has figured out that the school curriculum is flawed and that her teacher is a moron. But the machine cannot handle this nuance. It is a machine and machines are dumb – reliant on simple metrics of success and failure. Therefore the education-machine produces an idiot who thinks herself a genius, and a genius who thinks herself an idiot. Both adapted to the machine.

The mismatch between human needs and techno-institutional needs is perhaps greatest with those things that are big and complicated, created top-down, or based on a grand theory.

Compare the difference between using a chair and using an airplane. The chair is simple. Only one concession is required of you – put it on a flat surface. The airplane is monstrously complicated. Preflight checks, security screenings, weight checks, runways, fuel depots, air safety regulations… All of society must adjust itself to the demands of the airplane.

Likewise with institutions. Compare a private tutor with a university. The tutor is hardly an institution at all, more a personal relationship. They can adapt to your exact requirements. A university, however, is an impersonal giant. It cannot adapt to each student; it must facelessly classify and order its students by schedules and GPAs. The students must conform themselves to the demands of an institution too dumb to truly understand them as human beings.

Now consider bottom-up versus top-down. Bottom-up creations are made by their users. They want a creation that works for them, not the other way around. But with a top-down plan? Here the designer is free to disregard the user entirely. Kings don’t have to live under laws made for peasants.

The worst nightmares result when planners create grand complicated top-down schemes based on some flawed theory. The machine they create may demand outright insanity from its users. Mao’s Great Leap Forward and other such lunacies spring to mind.

So what are we to do?

First, better design. Downsizing the downside. Fitting designs better to human needs. Using better metrics of success. Bringing users into the design process. Using the simplest effective technique rather than the biggest, most bloated one. The machines will never match us perfectly, but we can do a lot better.

Second, be aware. Take note of how many common behaviors in society are actually adaptations to the machine, rather than something inherently human. Be aware of when you are acting naturally, and when you are adapting to the machine. You may have no choice but to continue, but at least you’ll know, and at least you’ll have a chance to adapt more healthily.

Be especially aware of metrics – indicators of success and failure. Whenever you see a metric, ask, “Does this actually measure what I care about?” If it doesn’t then ignore it, or find something that does.

Also be aware of the dangers of envy and what passes as success. In a world of machines, it is possible to succeed as a member of society by failing as a human being. Samantha is not always better than Harriet.

Third, know when to adapt and when to rebel. If the cost of adapting is low and the gains are large, then by all means adapt. Likewise, if the cost of rebelling is low, and the cost of adapting is high, then… go on – be a rebel. Go out there and stand on the street corner, tear off that business tie, throw your smartphone on the ground, and shout, “I’m not your prisoner anymore!” You’ll probably get arrested, but for a little while, just a little while, you might feel more human. And that’s gotta count as a win.

 

~

 

Deeper Down the Rabbit-hole:

China’s social credit system, on Wikipedia.

An outline of the Black Mirror episode, on Wikipedia. (Caution: spoilers! You should probably just watch the show. It’s a good show!)

The mother of all adapting-to-the-machine dilemmas: go along with it and your country starves to death; refuse to go along and you risk getting shot. Mao’s Great Leap Forward, on Wikipedia.

 

 


The Paradox of Progress: why does making things better make things worse?

One more labor-saving device and I think I’m going to crack.

A strange truth shadows modern society: the better things get, the closer we all drift towards collectively admitting ourselves to the psych-ward. We are the richest miserable people to ever exist.

Yet it’s not just us. This enigma haunts civilization – each jump up in technology has resulted in humanity face-planting into some awaiting tree branch we didn’t see coming. Agriculture gave us food. Great! Then it gave us cholera. Not so great. The industrial revolution gave us rapid fire consumer goods. Then rapid fire machine guns. Then Auschwitz, and nukes, and climate change, and the Great Pacific Garbage Patch. We stand today at the peak of our powers, the precipice of our annihilation, and in desperate need of a pick-me-up.

Why is progress so problematic?

One answer is we’re all just whiners. Buck up. Be happy. Everything we measure is getting better, you whiny wimps.

Unless… we’re measuring the wrong things.

The global economy is very good at meeting material needs. Too good. Like a supercharged robot gone rogue, the global economy is so committed to making stuff that it is on a suicidal quest to convert the entire mass of the solar system into a spinning disc of Buy-One-Get-One-Free deals. Meanwhile all the other things we humans need are being destroyed: community, meaning, stability, equality, nature….

Coupled with all this is a pill-poppingly depressing narrative. We are told we live in a meritocracy. Rags to riches – anyone can do it. We all end up where we deserve. Therefore, if you fail then you are a pathetic worthless loser. And by the way, we’re all engaged in ruthless selfish competition. Don’t bother asking for help. Just die. Worm.

Slather on top the 24hr tragedy news-stream and it can’t help but feel like the world is ending. What’s worse, we might not be wrong. Climate change, nukes, mass extinction….

Be happy?

Go suck a foot.

The rate of change alone is enough to do us in. We go from snappy youngsters with all the latest tech to confused eighty-year-olds stumped by doors. Our progress is progressing too fast.

Technology has another big problem – The Law of Unintended Consequences. Each new techno-power cuts the red ribbon to a new district of possibility. Sadly, many of those new neighborhoods turn out to include crime-infested ghettos of horror. Sometimes, it’s all ghetto. We’re looking at you, leaded petrol.

These muck-ups aren’t all the fault of incompetent inventors. The nature of the system invites surprises. Solving problems creates new problems.

Imagine you are a butt-naked farmer. It’s you, the dirt, and some beans. A three-component system. As simple as it gets.

But you aren’t growing enough. To boost production you make a digging stick. It’s a stick. You dig with it. Great! One wrinkle – now you have to cut up a tree. Now it’s five components: you, the dirt, the beans, the stick, and the tree.

Wood is hard. Snapping it by hand is borderline impossible. The time you spend tugging at that tree could’ve gone into growing beans. So you make a stone-axe. Now the system has seven components: you, the dirt, the beans, the stick, the tree, the axe, and the stones.

Each round of problem solving – progress – lifts the complexity of the system exponentially. Every new component requires resources, maintenance, and managing. They all interact. Each problem solved creates a myriad of new problems to solve. Keep this process going and soon you’ll be needing deep-shaft mines, trade caravans, governments, and ten-thousand years later, the entire global economy. Billions upon billions of components.
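The farmer story can be sketched in a few lines of code. A minimal illustration (the component counts are from the story above; treating “they all interact” as every possible pairing is my own simplifying assumption):

```python
# Components grow by a couple per round of problem-solving,
# but the possible pairwise interactions grow much faster.
def interactions(components: int) -> int:
    """Number of possible pairwise interactions among n components."""
    return components * (components - 1) // 2

# Component counts from the farmer story.
stages = {"bare hands": 3, "digging stick": 5, "stone-axe": 7}

for stage, n in stages.items():
    print(f"{stage}: {n} components, {interactions(n)} possible interactions")
```

Going from 3 to 7 components takes the possible interactions from 3 to 21 – the web of things that can go wrong outpaces the list of things you built.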

The Law of Unintended Consequences rules supreme. Components conflict. Energy requirements grow. Resource depletion sets in. Interactions spin off in unexpected directions.

Problematically, all this complexity is subject to the law of diminishing returns. Going from bare hands to a digging stick is a big win. But adding the stone-axe is only useful insofar as it allows more digging sticks. We have now reached a point where we are considering vast geo-engineering projects whose sole purpose would be to save our bean farming from all the things we’ve done to improve our bean farming.

Progress becomes regress.

We end up in a dizzying world which paradoxically winds down the more it winds up. We feel like we’re losing our minds. Thankfully we solved that problem by inventing anti-depressants. What could possibly go wrong?

~

Deeper Down the Rabbit-hole:

Just in case you weren’t already feeling depressed: according to Dr Joseph Tainter, diminishing marginal returns on complexity are the sign of a civilization about to collapse. Watch on YouTube.