Are we humans still ‘top dog’ in this brave new world of massive IT?

On 5 October 2021, I gave the closing keynote at the Digital Architecture Design Day 2021, titled Master or Servant? Are we humans still ‘top dog’ in this brave new world of massive IT?

Quick summary in bullet points:

  • What makes digital technology special? Answer: it is machinery to perform pure logic. Simple as that. Pure logic is special: it is free from time and space. It is dimensionless. Reality isn’t.
  • What are key aspects of logic that play a role?
    • Certainty
    • Brittleness
    • Its freedom from space and time: it is ‘unreal’ (and there are thus fundamental barriers when it is applied to the real world).
  • What is the information revolution?
    • The industrial revolution is the addition of machines performing large amounts of physical behaviour under human intent; the information revolution is likewise the addition of machines performing large amounts of logical behaviour under human intent.
    • Before the industrial revolution there was already a lot of physical behaviour in the world: all of nature; before the information revolution there was only a very tiny amount of logical behaviour in the world (only by humans) — it is a more fundamental revolution.
    • The pure volume of interconnected machine logic has an effect in itself:
      • The enormous amount of interconnected machine logic becomes harder and harder to change; that is why large IT-heavy changes become so hard to do (adding something new is easier than changing what already exists).
      • That resistance to change is a form of inertia: all that logic expresses something that looks like mass.
      • Because that logic becomes harder to change (through sheer volume), we far more flexible people start to adapt to it (e.g. through Agile and DevOps methods).
      • The fundamental barriers are becoming noticeable; that is why people are interested in computers that are not digital, such as quantum computers, but those have limitations of their own.
  • How does the information revolution affect us humans as a species?
    • We are ‘the logical animal’, but we are actually very bad at logic, we’re better at frisbee. We overrate the importance and strength of the logical side of behaviour.
    • All living species are defined by their behaviour (physical, chemical, etc.), tool use is part of our human behaviour.
    • Creating cities and freeways is as much natural behaviour (for humans) as creating termite mounds is (for termites).
    • All that logic we have added to the world is the expression of human intent.
    • The enormous amount of machine logic has thus extended us in terms of how we can interface and with whom. (Never mind, Elon, we do not need to become physical cyborgs, we have already become mental cyborgs.)
    • The enormous amount of machine logic itself becomes a large set of actors (algorithms fulfilling human intent) we interact with. We interact with algorithms and they interact with us. We influence them, they influence us.
  • How does the information revolution affect the foundations of human societies? There are a couple of properties of human psychology and intelligence that are important, especially with regard to trust:
    • We humans build (trust)-relations based on exchanging information. ‘Idle talk’ and ‘gossip’ are key elements. We check this talk against what we can experience. This creates a stability-providing ‘negative feedback’ mechanism within the group.
    • Human intelligence developed because of the advantage of working in larger groups than the 20-30 maximum of other primates. Our intelligence evolved to handle groups of at most 150-200 (‘Dunbar’s Number’).
    • Note again: our intelligence is not very logical: even your world-class professor in analytic philosophy is in essence still a ‘gossipy primate’ with a decent ‘estimation machine’ between the ears.
    • All larger social structures are built on ‘shared convictions’: a belief in the rule of law, in the importance of individual freedom, in equal access to x, y, or z, religion, etc.
    • There are good and bad ways for convictions to become shared. Lying propaganda (e.g. tabloid press, alternate reality talk radio) is a bad way. Truth-finding (e.g. quality press, science) is a good way.
    • On social media we have overnight become able to spread/receive ‘gossip’ to/from millions, while the stabilising negative feedback is hardly there and ineffective. Even worse: it is the perfect tool for lying propaganda. And what is more: the profit-driven algorithms of social media create destabilising ‘positive feedback’. Engineers know: negative feedback stabilises systems; positive feedback derails them. The positive feedback of social media derails the system ‘human society’. E.g. people end up in extreme rabbit holes of disinformation (even smart people, because, remember, we are actually not that smart). Society fragments.
    • There is thus a conflict between ‘value’ (profit) and ‘values’ (shared convictions). And because ‘value’ exploits our deeply rooted natural tendencies, it is winning. All societal mechanisms that favour ‘truthful convictions’ (science, independent judiciary, quality journalism) are under siege.
  • So where does that leave us and what should we do?
    • We tend to think of ourselves as ‘intelligent’, but though we’re the most intelligent species on earth, we have to confront the fact that we’re not really very intelligent at all. We estimate and we copy. We are easily fooled. All of us. Repeat: even that professor in analytic philosophy is a ‘gossipy primate’. Read Kahneman. Read Ariely. We need to come to terms with the fact that our intelligence is severely limited and organise society based on that fact. E.g.:
      • Limiting freedom of expression. Note: this is not new; there are many such limitations, and even in the US lies are not protected by the First Amendment (except lies about the government). We should criminalise the spreading of untruth when it damages society, while making sure that regulation does not destroy the individual freedom even to tell untruths. This is hard.
    • We need to do something about the way for-profit positive feedback (amplification) of untruth destroys the ‘sharedness’ of our convictions, because without shared convictions, no civilisation.
      • Algorithms need to be regulated such that people and society are protected against their evil effects, without destroying the freedom a good society requires. This is even harder, because we cannot say people may not be manipulated. After all: all communication is a manipulation of the other’s mind. You don’t want to forbid advertising for instance.
      • Regulation on the criminalisation of ‘amplifying untruth’ is an option (Facebook and Google as they stand now are criminal in that respect; their positive feedback destroys society).
    • People working in IT should take a ‘Hippocratic oath’ not to work on systems that have the effect of damaging shared-conviction-based civilisation (we should probably start here).
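
The negative-versus-positive feedback point above can be sketched numerically. A toy simulation (the gain values and the ‘deviation’ variable are purely illustrative assumptions, not a model of any real platform) shows why engineers say negative feedback stabilises a system while positive feedback derails it:

```python
# Toy illustration: negative feedback stabilises, positive feedback derails.
# The gain values below are illustrative assumptions only.

def step(x, gain):
    # Each step feeds a fraction of the current deviation back into the signal.
    return x + gain * x

def run(x0, gain, steps=20):
    x = x0
    for _ in range(steps):
        x = step(x, gain)
    return x

deviation = 1.0  # some initial deviation from a stable baseline

# Negative feedback (gain < 0): the deviation shrinks back toward zero.
stabilised = run(deviation, gain=-0.3)

# Positive feedback (gain > 0): the very same deviation is amplified every step.
amplified = run(deviation, gain=0.3)

print(f"negative feedback after 20 steps: {stabilised:.4f}")
print(f"positive feedback after 20 steps: {amplified:.2f}")
```

After twenty steps the damped signal has all but vanished, while the amplified one has grown by two orders of magnitude: a small falsehood in a negative-feedback environment dies out, the same falsehood under engagement-driven amplification takes over.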

PS. “What do you think of human intelligence?” Paraphrasing Gandhi: “It would be a great idea”.
