Why is Data So Unreliable?

This post is about the information that is used by a supercomputer we all carry around with us (no, dummy, not your mobile phone. Not quite yet). It is about the idea that we all suffer from innumerable distortions and agendas. Even those who claim to have no agenda…well, that is the agenda. To pretend they are above it or haven’t got one is disingenuous and indicates a worrying lack of self-awareness.

This supercomputer is an organic machine, part of a larger organism with a wide range of sensors for input. It requires very careful maintenance, is easily damaged and easily corrupted, benefits from being made to regularly solve complex problems, has seemingly limitless storage capacity and is carried around on your shoulders.

In our rush to worship at the altar of AI, data, quantum computers, and machine learning we step neatly over the weirdness and utter unpredictability that this supercomputer (for ease of typing it shall henceforth be known as the brain) brings.

[Image: optical illusion – two straight lines, bent by the brain.]

Sure, you can reduce all the external inputs into measurable things: the number of photons that hit the retina, the loss caused by an aged lens, the sensitivity of our fingers, our sense of balance, the speed at which we solve problems and so on. Despite possessing a brain in all its judgemental and unpredictable glory, we seem desperate to quantify and measure everything possible. The thinking goes: with the surety that comes from turning every conceivable bit of input into a number, surely it must be within our ken to calculate the output? And if we don’t get it right, we can re-examine the computational processing algorithms and refine them until the output comes as close as possible to what the programmer(s) expect. You see, there will be some parameters set for an acceptable/realistic/likely outcome somewhere in the brief, and usually the aim is that the output matches the expectation.

This is where the brain starts to gallop ahead, and I suppose it is what intrigues the scientists trying to create AI: the capacity for random social variables and the filters they create. The ability of the brain to make the weirdest associations from two apparently random bits of data always astonishes me.

For example: 30 years ago I shared a flat with Oliver Reed’s nephew, who apparently strove to exceed or at least replicate the lifestyle of his uncle. I was persuaded to take a tab of acid (LSD), which I tore in half because it scared me but, hey, peer pressure. This had a startling effect on me and I had to sit alone as it felt as if snooker balls of thoughts were cannoning into one another and going off at funny angles. To this day, if I see a snooker game I am reminded of that moment. I imagine it will be a very, very long time before there is a computer that can make those sorts of weird cognitive leaps.

Data can be as much social and experiential as pure numbers fed into STATA, R or SPSS and manipulated in various approved ways. The data coming from the brain’s sensors is reliably distorted through one or several social lenses. Are you rich, poor, foreign, insecure, angry, a victim of something, in a wheelchair, aspirational, impaired, with a neurological condition (I have MS – have had it for 26 years)? Perhaps you are clinging to the notion that you are utterly impartial and free from an agenda and thus right? That is a powerful filter, often producing feelings of self-righteous indignation that can’t always be adequately expressed in 280 characters.

[Image: Trump tweet.]
It doesn’t stop this fellow from trying though.

When you are designing research, analysing research, presenting research, or having research presented to you, try to remember that your brain and the recipient’s brain have different filters. They may seem externally similar, but at some point that same information will hit a unique filter, and the gap between intent and understanding soon becomes apparent.

The human desire to avoid cognitive dissonance is strong, and when mismanaged it leads people to do terrible things to try and ‘fix’ it. Thankfully, we have a great ability to bend data to fit our pre-conceived notions of what feels right, or to fill voids with made-up data. We ALL do it. I believe the most we can do is open up the analysis to others who manifestly do not share our agenda – people who are as independent as possible and trained to look for inconsistency, whether in the accrual of the data or in the motives of those who handled it before you see or hear it.

So-called facts in newspapers are a good place to start asking why and how. In a commercial environment, people bandy around poor data and try to cover it with force of personality or seniority. BBC Radio 4 has an excellent series (available for download) called Thought Cages that deals with the vagaries of the human brain in an amusing and engaging way.

We all lie and deceive all the time. It is in our nature. Sometimes you need a different variety of deceiver to look into your world and help you identify the deceits. I can help you, so contact me via LinkedIn.


What Are You Really Worth?

I do mean really: can you really put a value on your being, your presence on this planet? For starters, define value. I imagine it is quite a different figure from the one you are paid, or the increase you are going to carefully negotiate for, cometh the pay/performance review. What have you added to your company’s bottom line? Really, how can you express your actual hours of toil to yourself, your family, the shareholders and so on?

Instead, try this: what would a stranger pay for you? This throws up all sorts of quite deep questions. Perhaps I mean the value of your life, a binary live/die scenario. What value does your life hold to a stranger? Why should they invest their money in your preservation? What is the bottom line for a stranger if they do not have an emotional investment in your continued existence? Perhaps that stranger is just a middle-man, and your worth to them can only be expressed in what another third party will pay for you, regardless of what the final owner of you does with/to you. The more I write, the more it sounds like a people-trafficking scenario being described in an article on a site devoted to understanding data.

You are worth nothing. Your personal data is worth everything. You are not the customer, you are the product.

Nevertheless, my friend Nick brought the following to my attention:

[Image: ‘Future value of data’ bar chart. Image credit: PwC (a publication of some sort), 2019.]

It turns out that ‘experts’ have predicted the estimated (now there is a get-out-of-jail-free word when used in stats/studies) value, not of a life per se, but of a person’s worth – a worth that only some people/organisations will value.

Another shock: headline-grabbing bar charts with bold colours and zero bloody context around them. I get told off occasionally for worrying about trivia like this. My reply is that it isn’t trivia, it is e v e r y t h i n g. This is clearly a graphic designed to demonstrate thought leadership of some description and thereby imbue the reader with a warm feeling that they are in the hands of ‘experts’ who ‘get’ this kind of stuff, and that said experts are the ones to choose to help shape your organisational vision for the next millennium. You’ll be at the bleeding edge of thought and stand to leapfrog all your competitors in a trick where you simultaneously disappear in a puff of smoke and hit the ground running towards a new and lucrative market, enjoying an unassailable lead. If, of course, you employ the genii at said group of thought leaders proffering such a compelling image of the future.
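To make the point about context-free charts concrete, here is a toy sketch (hypothetical numbers, nothing to do with the PwC chart itself) of one of the oldest tricks: truncating the axis. The drawn height of a bar depends on where the axis starts, so a modest 10% gap can be made to look threefold.

```python
def visual_ratio(low, high, axis_start=0.0):
    """Ratio of drawn bar heights when the y-axis starts at axis_start.

    With axis_start=0 the bars are drawn in honest proportion to the
    data. A truncated axis subtracts the same amount from both bars,
    which shrinks the smaller one proportionally more and so
    exaggerates the apparent gap.
    """
    if axis_start >= low:
        raise ValueError("axis must start below the smallest value")
    return (high - axis_start) / (low - axis_start)

honest = visual_ratio(100, 110)                # 1.10 – an honest 10% gap
hyped = visual_ratio(100, 110, axis_start=95)  # 3.00 – looks threefold
```

Same data, same chart type; only the axis baseline changed. Which is exactly why a chart with no axes, units or methodology attached deserves suspicion, not a warm feeling.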

Wouldn’t it be interesting to know how these figures were arrived at? Why is a US citizen worth three times that of their European cousins? What was measured, what was controlled for, what was the working hypothesis (apart from baffle the punters with smoke, mirrors and a pretty chart?), when was the analysis conducted, what was excluded and why, how was it analysed, can we have the raw data ourselves please, what data, how is value computed, what markets will pay that, will some pay more or less? And so on…

Here is a little test you can run yourself. Call up a software/hardware firm or management consultancy and see if, in a ten-minute chat, they can refrain from using the words: Big Data, Blockchain (a new one getting traction), AI, Algorithm, paradigm (falling out of favour these days – I guess the era of New Paradigms has come and gone) or cloud. My guess is at least four of the six will crop up. Just saying.

This raises the question: how can we harness this apparent worth and charge for it? Perhaps some charitable models could be developed around it?