Episode 1

Published on: 1st Jun 2021

Ethical Algorithms - an introduction

Algorithms touch almost every part of our lives - but what are they? Do we really need to care what they are or how they work?

In this episode John Wyatt and I provide some simple answers to these questions, and then explore, by way of a case study, the Public Examination fiasco of 2020 in England. In doing this we raise a number of questions about the use of algorithms, and suggest some possible ways that we could better protect people from the unintended consequences of getting things wrong.

(this episode was originally recorded at the end of 2020)

Transcript
Jonathan Ebsworth:

Hello, and welcome to the TechHuman Podcast. I am Jonathan Ebsworth, one of the founders of the TechHuman.org website. In this podcast, I'll be talking with guests about the impact of different aspects of technology on human life. Today, we're beginning the first in an occasional series exploring aspects of algorithms. What are they? How are they being used? And why do they matter? We'll hear more about these themes in coming months. My friend and TechHuman.org co-founder, Professor John Wyatt, is joining me to see if we can begin to shed some light on what these seemingly obscure algorithms are, and consider how we should respond to their growing use in our daily lives. John, hello.

John Wyatt:

Hi, Jonathan. It's good to be here.

Jonathan Ebsworth:

It would be great to talk in person, but the wonders of technology have enabled us to do this almost as well, remotely.

John Wyatt:

I think that podcasts are one of those routes of being able to keep a conversation going and reach out to other people.

Jonathan Ebsworth:

Yes, I'm amazed actually how well technology has served us through this protracted period of upheaval. John, perhaps you'd introduce yourself and explain how we began to talk together and why we're having this conversation.

John Wyatt:

Sure. My background is as a medic. I spent many years working as a paediatrician, a specialist in the care of newborn babies at a big intensive care unit in central London at UCL. And it was really through my work there that I became increasingly interested in medical ethics, but also in the way that technology changes our understanding of the world, and in particular how it changes our understanding of what it means to be human. I've always been interested in computers and artificial intelligence and so on, and seeing this whole explosion of computer-based technology has made me think that this is the next big issue which we as a human race are going to face. And so for the last three or four years, I've really been focusing on artificial intelligence and the questions and challenges that it's raising. So what about you, what's your background?

Jonathan Ebsworth:

So I came from almost the opposite end of that spectrum. I am a technologist; I've spent my whole working life building systems, mostly for business. And in the last few years, I too have been wrestling with the impact of the technology that I'm putting together on really all of human life. As we enter a period which the World Economic Forum has labelled the Fourth Industrial Revolution, the digital revolution, which is powered to a large degree by artificial intelligence, I can see the whole of life being transformed. And as a technologist, I want to take some responsibility for the impact of the stuff that I and my profession get up to. It was really at a chance meeting we had, talking to some teachers about the impact of technology, that you and I met, and really found that we had a shared interest, particularly in understanding how all of this would change human life, and how our faith and orthodox Christianity might help us get some insight into the world of technology, and how we as human beings, as creative beings, could live better rather than just being swept along by this tidal wave.

John Wyatt:

Yes. And so we struck up a friendship, and that led to creating a new website called TechHuman.org, which is still very much a work in progress. But we see this as an opportunity for what we've called 'hosting the conversation about technology, and particularly artificial intelligence'. We're not starting with a very strong predetermined view, either very much in favour or very much against, but we do see the importance of these issues and the need to have a place where different perspectives and different opinions can be offered, particularly in the context of the Christian faith.

Jonathan Ebsworth:

Yes, I certainly found connecting my faith to my work very difficult. And our conversations actually have been very helpful to me, in helping me recognise that technology isn't just a neutral thing, where it simply depends whether you use it for good or bad. Actually it comes with a load of baggage that has direct consequences, and understanding some of that baggage is very important in terms of making sense of things.

John Wyatt:

Yes, an analogy I quite often like to use is the saying that if you want to understand what water is, then don't ask a fish. And, you know, that's the problem: we're so immersed in technology, it's so pervasive, that it's almost invisible to us. We don't see the way that it's changing, and even distorting, our understanding of the world and our understanding of ourselves. So I think this is quite a challenge. You know, it's not an easy topic; I find trying to get my head around some of these issues very, very challenging and complex. But I'm utterly convinced it's really important for this time in world history, this time in our own futures.

Jonathan Ebsworth:

I think there's a tendency, because it's complicated, that we go and ask the fish, the technologists, to explain it all, and to explain the consequences. And whilst we may understand how it interacts and how the pieces fit together, as technologists we're ill-equipped to comment on how it impacts human life. Which is why we need people like you, John, I think.

John Wyatt:

Well, I think we need everybody, don't we? In a similar vein, I've often said about medical ethics and matters of life and death that these things are far too important to leave to the medics. And I think artificial intelligence is too important to leave to the technologists. Everybody has a perspective, everybody's coming from somewhere, but we need a place where we can debate some of these issues. And I think particularly the challenge for Christians, of course, is that many of the issues we're facing are genuinely new. You know, we haven't had to face some of these issues at all. Christian history, going back 2,000 years, has faced many challenges, but the challenges being raised by artificial intelligence are, to some extent, completely new. And therefore, I think all of us who are Christians are, in some sense, scrambling around trying to find a way to develop a Christian response.

Jonathan Ebsworth:

I think in order to have a helpful conversation, the first thing we need to do is to get ourselves to the same sort of place, where we have a shared understanding of what it is we're talking about. And that's part of what this podcast series is going to try to do: perhaps help us all to get to a good starting point, so we can have a meaningful conversation about how this fits into our lives.

John Wyatt:

Yeah, so today we're going to talk about algorithms. And I think it's a word which trips off the tongue; we hear it all the time. And yet I suspect that many of us are a bit hazy about what an algorithm really is. So let me ask you: what's your definition? How would you explain to someone what an algorithm is?

Jonathan Ebsworth:

So at its most basic, an algorithm is an algebraic expression. So x plus 2 equals y is an algorithm.
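(Editor's note: a minimal sketch of the idea being described, treating the expression y = x + 2 as a rule that a computer applies mechanically to inputs. The function name and sample values are illustrative, not from the episode.)

```python
# A minimal "algorithm" in the sense discussed: a fixed rule that
# turns an input into an output. Here the rule is y = x + 2.
def add_two(x):
    return x + 2

# Applying the rule, step by step, to a few inputs:
results = [add_two(x) for x in [1, 5, 10]]
print(results)  # [3, 7, 12]
```

Real-world algorithms, as discussed below, chain thousands of rules like this together, which is where the complexity comes from.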

John Wyatt:

I'm afraid so many people gave up algebra in their teens. You say x plus y... ah, that sounds terrible. So why should I be interested in algebra?

Jonathan Ebsworth:

The world of digital technology is reaching further and further into our lives, and almost all computer logic is driven by these sorts of expressions. They may be explicit, or they may be implicit, but they're being applied more and more to our lives, and this is why it matters. So I think most of us, even people your and my age, John, are familiar with the use of algorithms in credit scoring, to decide whether we can get a mortgage or not. But actually, I'm not so sure that people are aware that the products Amazon recommends to you are calculated by algorithms; what appears in your newsfeed on Facebook is calculated by algorithms; what Netflix offers you as things that you might be interested in is all driven by algorithms. Almost everywhere we look we find algorithms.

John Wyatt:

Part of the problem, isn't it, is that although conceptually the idea of a simple formula seems not that difficult, these algorithms can end up being extraordinarily complex, involving thousands and thousands of lines of computer code. So much so that even the people who write them are not at all clear how they work. I've heard it said that the Google search engine, which is a very sophisticated algorithm, is so fiendishly complex that even the Google engineers themselves don't really understand how it's working.

Jonathan Ebsworth:

You talked about artificial intelligence. As we get into some of the most sophisticated forms of artificial intelligence, we truly do not understand how some of it works out how it's making these decisions. It just does. So it is fiendishly complicated, and what we need to do is perhaps step back into some slightly simpler space, to try and make sense of what algorithms are, and then use that to build a more considered discussion of some of the more sensitive applications of algorithms.

John Wyatt:

So we thought we'd start off by looking at the use of algorithms in education, and in particular look at what was described as a great fiasco or debacle, which happened earlier this year in the UK, in August.

Jonathan Ebsworth:

If I look at the public exams, in actually pretty much the whole of Great Britain, there are separate systems in Scotland, Wales, Northern Ireland and England. We were locked down in March, nationally, and it was clear that the public exams could not be sat in the normal way. So very quickly, those public examinations were cancelled, and a means had to be found to award grades to the candidates, in the case of A-levels. Let's focus on A-levels, because it's a kind of finite thing we can get our arms around; there are just over 700,000 entries for A-levels. And therefore 700,000 results had to be found that were reasonable; that maintained the value of exams, so we couldn't have massive grade inflation; that were transparent, so people could understand how they'd got the grade they were being given; that were fair, reflecting as well as they could the ability of the students; and that were reached legally in terms of handling personal data.

John Wyatt:

So just to explain, to people who are not from the UK, why the A-level is so important: it's the final grade of your high school career, isn't it, and it determines, in particular, your ability to get into university or colleges of further education. It becomes an extremely important measure of how you've done and what your potential is for the future.

Jonathan Ebsworth:

Yes, and particularly for the most competitive courses, like medicine and veterinary science, failure to get the grades that you anticipated simply means that door is shut. So it has potentially lifelong consequences for young people.

John Wyatt:

And stepping aside a minute, why are you interested in this? I mean, you're a technologist. Why have you got interested in education?

Jonathan Ebsworth:

I was interested in this because, for me, it felt like a wonderful case study of the power of algorithms applied to human life, but one that was at a manageable scale. It's one thing to talk about a few billion Facebook users and what's happening with them, but it's a bit easier, for me at least, to get my head around what's happening to 720,000 A-level candidate entries, I should say, and what happened in the process to get them to grades.

John Wyatt:

Okay, so it's a case study where computer algorithms were used to overcome the problem that the actual exams could not be taken. So the answer is: let's use the clever computers, and they'll come up with a simulated, estimated grade for each pupil.

Jonathan Ebsworth:

That was the idea. And it quite reasonably started with the question: what data do we have? In England it was Ofqual, the examination qualification standards body, that drove this process, and similar processes were followed in the other parts of the United Kingdom. The data they had was any mock exams that a student had already completed, which typically would have been done before we hit lockdown; perhaps the historic performance of that student in earlier exams; and the performance of the academic institution they attended. The one other piece of data that I think Ofqual hoped was going to help them a lot was the grades that their teachers thought they were going to get. These became known as the centre assessed grades: each academic institution moderated those scores and submitted them, saying, we think our cohort of pupils is going to get this distribution of grades. And the idea, I think the hope, was that they could rely very heavily on those centre assessed grades. Unfortunately, there was a problem.

John Wyatt:

So just to clarify, were they individual grades? That every teacher was asked: how is Joe Bloggs going to do in maths, how are they going to do in English, and so on. And then they summed up all the teachers in a particular school?

Jonathan Ebsworth:

Yeah. Say they had 25 people sitting A-level maths. There would be 25 anticipated grades for that school submitted to Ofqual, saying these are the grades we think Joe, Mary and Sanjay are going to get. And this was submitted as an input into the process, along with the other data that I've described.

John Wyatt:

Okay, but they could then look at all the grades from one particular school and compare those with all the grades from another particular school, for instance?

Jonathan Ebsworth:

Indeed, and when they added up all of those centre assessed grades, they found they had a big problem. And the big problem was that in the previous year, A* and A grades were about 25% of the total entries. When they looked at the centre assessed grades at a national level, 37.7% were A* or A grades. So roughly 50% more A*s and As were being suggested as coming out of an exam cohort. That's completely unacceptable from the qualification standards body's point of view.
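(Editor's note: a quick sanity check of the figures quoted here, using only the percentages mentioned in the conversation.)

```python
# Figures quoted in the episode: A*/A grades were about 25% of
# entries in the previous year, but 37.7% of the centre assessed
# grades. How big a relative jump is that?
previous_share = 25.0   # % of entries at A* or A in the previous year
proposed_share = 37.7   # % at A* or A in the centre assessed grades

relative_increase = (proposed_share - previous_share) / previous_share * 100
print(f"About {relative_increase:.0f}% more A*/A grades than the year before")
# prints: About 51% more A*/A grades than the year before
```

Which is where the "roughly 50% more" in the conversation comes from.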

John Wyatt:

So the teachers were being much more optimistic compared to the actual grades that had previously been obtained. And have we any idea why they were being so much more optimistic?

Jonathan Ebsworth:

It is conjecture, but I suspect there are probably two things at play here. One is human optimism, a belief in the best for your students. And the other is perhaps anticipating that there was going to be some downward moderation, so we had better push things up so that there's some room to push grades down. But a 50% increase year on year in A* and A grades is beyond any statistical probability. You wouldn't ever have a genius candidate cohort coming through at that level when you've got an entry number of over 700,000. It's just not possible, not plausible.

John Wyatt:

So they realised that there was a problem: they couldn't just take the teacher grades at face value, because you'd have far too many candidates then getting the highest A-level grades, and that would swamp the universities.

Jonathan Ebsworth:

Correct. I think in political circles there was a huge worry about devaluing the currency of these, the highest level of examinations that pupils in England study.

John Wyatt:

Yeah, so because the A-level grade is used as an objective marker of people's academic ability, if the whole value of an A or an A* grade is being devalued, it becomes just much less useful and much less reliable.

Jonathan Ebsworth:

Yes, and to give Ofqual some credit, what they didn't do is go and shut themselves in a darkened room, pop out and say, here's the answer, and award the grades. They did go through a fairly extensive consultation. They had about 100 days from the point at which the country was locked down and the exams were cancelled to having to issue the A-level results in early to mid August. And in, I think it was, May, they had a four-week consultation where they said: this is what we're proposing to do, what do you think? They got a lot of responses, and much of the feedback they took on board. Not all of it, but much of it.

John Wyatt:

And yet, despite all that, when the results were actually announced, there was a terrible outcry. So why was that?

Jonathan Ebsworth:

The terrible outcry was based on the standard cry of "It's not fair!". Now, we knew this was going to happen. And the reason we knew it was going to happen is that the Scottish exam system runs about two weeks ahead of the English one, and it went through exactly the same path. Their results came out, I think, on the third or fourth of August. And I think the belief in government circles was: oh, there'll be a few outliers, we can handle those through appeal. But what it turned out to be was a very, very large number of young people who simply didn't get the grades that they expected, and, it would appear, reasonably expected that they were going to get. Which meant people who thought they were going to medical school or vet school, or on to some of the high-entry courses, were simply losing out on those opportunities. And after a day or two in Scotland, it became clear that this was not a sustainable position. So, on something like the 9th or 10th of August, Scotland said: we're going to revert to the teacher assessed grades. England said: no, we're going to go ahead with our algorithm, because we've worked this out, it's all been done very carefully, it's all fair. And we had an exact repeat of the issue. So it was very predictable. Then within a week, they had to revert to the centre assessed grades, which of course meant they had 50% more A*s and As than had been achieved in the previous year. So we ended up in the very place we hoped we weren't going to.

John Wyatt:

So what do we learn from this, apart from the fact that if you really want to muck things up, use a computer?

Jonathan Ebsworth:

Actually, I think that's a very good message. What it made me realise was that one of the things we in technology, and perhaps in education as well, aren't very good at doing is seeing the unforeseen consequences. We're quite good at looking at the macro picture, but not at the individual consequences for a human being. These exam results are fundamental to those young people's future lives. And in the interest of trying to protect the value of those A-level qualifications, what we ended up doing at a national level was sacrificing individual hopes and aspirations in the interest of maintaining the value of that currency. So I think the first thing is that we're just not good at seeing the consequences. We don't think about them; we get very blinkered.

John Wyatt:

Unexpected, unanticipated consequences seem to be a fundamental feature of this modern digital technology. I think that story is repeated so many times, and we could think of many examples of it; I'm sure it's something we'll come back to in the future. But by the very fact that these outcomes are unanticipated, you know, I think it's easy to blame the technologists. Not surprisingly, they don't have 20/20 foresight; they can't see what the unexpected impact might be. I mean, how might one, as a technologist, mitigate the situation? How might you take it into account?

Jonathan Ebsworth:

So that's the right question. But if I look at medicine as an analogue for this, we have to go through extensive testing for any new treatment: firstly, to check that it seems to be effective; and secondly, that it doesn't do any harm, or doesn't do unexpected harm, and that if there are any bad side effects, they are proportionate, appropriate, consistent with the problem we're trying to fix in the first place. And those protocols are very, very well prescribed at a global, well, international level: the FDA in the States, the MHRA here, and you know all this stuff better than I do. And I can't help wondering whether there aren't lessons to learn from the way that we take drug treatments out to market, in applying algorithms to at least large-scale, sensitive decisions.

John Wyatt:

Yeah, that's very interesting, and I think there's a lot in what you say, because basically we have had terrible disasters in the past with drugs: a whole number of terrible disasters where new drugs turned out to have unexpected side effects. The thalidomide crisis is one which many people are aware of, where a drug given to pregnant women just as a tranquilliser turned out to have catastrophic effects on the developing foetus. Because of that, and these kinds of scandals over the years, a very rigorous technique has developed of testing new drugs for safety and efficacy, to very rigorous standards, before they can be released at all. And then even once they are released, they're regarded as an investigational product; they have to have special monitoring for side effects and so on. When you compare that with the way that computer programmes are rolled out in very sensitive areas of life, like education, or the justice system, or whatever, none of that kind of certification process exists, does it, or is available?

Jonathan Ebsworth:

I know from experience that testing of computer systems is not always as reliable as it should be, so we can't always be certain it's going to do exactly what we thought it was going to do. But when it comes to the unexpected consequences, we almost never look for those in the world of technology. And one of the things that I found attractive about the drug regime was not just the disciplined protocols for getting it to market, but also the adverse event handling once it's got to market: if there are problems, there is a protocol in place that says, this is how you report it, this is how it's investigated, and if there's a problem, this is what we do about it. For IT systems, there's nothing like that.

John Wyatt:

That's absolutely right. And as a doctor, I've frequently used that method, where I've been giving a medication to a particular patient and there's been an obvious drug reaction. And then, you know, there are special forms to fill out and notifications to make, and there's a very sophisticated system of continuous monitoring, and of informing doctors of what the latest information is on drugs. So what has been discussed is having some kind of regulatory authority, and also having some kind of certification: that before a programme is released, it has to be tested and to meet certain quality standards.

Jonathan Ebsworth:

Yes, and there are some questions. Whenever I've raised this sort of suggestion, the instant response I've had from most people is that nothing would ever get to market, we would never innovate, because that process for drugs takes a ridiculously long time and costs too much money. But we've seen with the COVID vaccines how, when the pressure is on, you can move fast, and the speed at which it is possible to do something very sensitive and still get it out at scale. So I'm not sure that "it's too slow" is necessarily a valid rebuttal for something that is sensitive.

John Wyatt:

Yes. And maybe, just as in medicine, it's the awful disasters which eventually lead to positive outcomes. I wouldn't be at all surprised if, in the education world, a new algorithm designed in the future for changing and modifying examination grades would now have to go through something like that process, now that people have learned the lesson.

Jonathan Ebsworth:

Yeah. And I think one of the other challenges we've got to face is that not every algorithm is going to be as sensitive as this one. If we're talking about an algorithm that governs Candy Crush, a game on a mobile phone, that's really unimportant, and it doesn't really matter too much whether the algorithm is working exactly as intended or not. So maybe we don't have to apply this to every single algorithm, but determine those ones that are sensitive, that affect people's lives. For those ones, at least, we need to go through some better, more rigorous process of certification and testing.

John Wyatt:

Yes. And another issue that I've been aware of, certainly in the criminal justice system, is that many of the algorithms that are used are commercially sensitive and therefore protected by confidentiality agreements. That's often been a problem, hasn't it: it isn't possible even for other technologists to get access to the actual source code, or to understand what an algorithm is doing. It's very hard. And I'm not a lawyer, but I've been told that protecting algorithms under standard IP law is actually very difficult to do, so you have to simply shroud them in confidentiality.

Jonathan Ebsworth:

Which does make testing very, very difficult. But just because something's hard, I don't think that gives us an excuse to ignore the problems. And I think this is now so pervasive that we need to do something, at least around sensitive algorithms.

John Wyatt:

Yes. And certainly in the UK and in Europe, I think the obvious approach is to have a government body, a quango, or an official regulatory authority, which is given legal power to compel manufacturers and technologists to go through a regulatory process before algorithms in certain sensitive areas are released into the wild.

Jonathan Ebsworth:

Yes. And one other aspect of algorithms in general, which did come up in the context of the A-level process, was bias, or fairness: whether that's to do with discrimination against social groups or racial groups or, frankly, any group. One of the problems in the A-level algorithm was that it applied differential weighting based on the cohort size from an individual school. A school that put in perhaps only five entries had its centre-assessed grades given a heavier weighting than a very large institution with perhaps 20, 30, 40 or 50 candidates going forward for an exam, where the grades were moderated much more heavily. The view reported was that this meant smaller schools, perhaps private schools, were actually given an advantage in comparison to larger bodies, which are typically the state-funded schools. This idea of bias in algorithms is a very important one, and I think it's very socially sensitive at the moment.
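
To make the mechanism concrete: the cohort-size rule described above can be sketched as a simple blending function. This is a toy illustration only, not Ofqual's actual model; the thresholds, taper, and grade scale here are invented for the sake of the example.

```python
# Toy sketch of a cohort-size-dependent moderation rule, illustrating the
# kind of differential weighting described above. NOT the real Ofqual
# algorithm: the thresholds and linear taper are invented assumptions.

def moderated_grade(teacher_grade: float, historical_grade: float,
                    cohort_size: int) -> float:
    """Blend the centre-assessed (teacher) grade with the school's
    historical grade. Tiny cohorts trust the teacher entirely;
    large cohorts are moderated entirely by history."""
    if cohort_size <= 5:
        weight = 1.0            # tiny cohort: teacher grade stands
    elif cohort_size >= 15:
        weight = 0.0            # large cohort: fully moderated
    else:
        weight = (15 - cohort_size) / 10   # linear taper in between
    return weight * teacher_grade + (1 - weight) * historical_grade

# A small school's optimistic teacher grades survive unchanged, while a
# large school submitting the same grades is pulled to its historical
# average -- the asymmetry discussed in the episode.
small = moderated_grade(teacher_grade=8.0, historical_grade=6.0, cohort_size=5)
large = moderated_grade(teacher_grade=8.0, historical_grade=6.0, cohort_size=40)
print(small, large)  # 8.0 6.0
```

Any rule of this shape systematically favours whichever kind of institution tends to have small cohorts, which is the structural source of the reported unfairness.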

Jonathan Ebsworth:

So, John, from your point of view, are there any particular lessons we can learn from the examination fiasco, from a Christian point of view, that perhaps we can take forward into other discussions?

John Wyatt:

Well, an immediate thought which occurs to me, as you mentioned a bit earlier, is that when we receive news which we find very difficult, if it's another human being who's made the judgement, we find it easier to accept, even if we don't agree with it; you know, if our teacher says, "Well, I'm really sorry, but my conclusion is X, Y, Z." Whereas if it's a machine that's doing this, there's something about our humanity which seems to oppose it. I think from a Christian point of view, this fits with the idea that we are ultimately created for relationships with other human beings, and for a relationship with God Himself: in other words, for relationships between persons. And when we have an impersonal machine saying "the machine has decided", and it's saying something of immense significance to me, it somehow goes against my humanity, the way that I am made to engage with other human beings. There's something rather weird about getting a final decision from a machine.

Jonathan Ebsworth:

Yes, I think that's very true. It's very hard to deal with emotionally, and actually hard to challenge. The machine becomes unanswerable, doesn't it?

John Wyatt:

Yes. And I can't follow the reasoning. When it's a human being, even if I find it very painful, I can at least ask the teacher or the judge or the doctor, whoever it is: why did you make that decision? What was the basis on which it was made? And I can at least attempt to follow it myself, understand it, and see the world through this other person's eyes. When the machine just says, "The algorithm said 4.76; 4.76 is below the grade, end of story," there's no way I can follow that through, understand it, or come to terms with it emotionally. So often we're struggling to work out how on earth we respond as Christians, but I think the first thing is that we really need to understand the world we live in; we can't short-circuit the process of trying to develop a Christian response. The first thing is to understand, and then the second is to ask: how do we respond from the point of view of the Christian faith?

Jonathan Ebsworth:

John, thanks so much. I look forward to our next conversation.

John Wyatt:

Thanks a lot. I'm looking forward to it.

Jonathan Ebsworth:

That's all for our first episode. Thanks so much for joining us. In our next conversation, we'll look at some of the technology opportunities and issues that have been thrown up by the series of lockdowns that have been running here in the UK, and around much of the world, sporadically over the last twelve months or so. We will return to the world of algorithms in future episodes, looking at their use in criminal justice and in business areas like recruitment, credit control and insurance, touching almost every aspect of our daily lives. We know there are some big social issues that will come up, perhaps none more sensitive than algorithmic bias and algorithmic unreliability. In all these episodes, we're trying to understand how we, as followers of Jesus, can live well in a society that's dominated by technology. Thank you for joining us. For those of you who are interested in finding out more about these issues, please do go to www.TechHuman.org. You'll find an article on the public examination fiasco, which explains some of the issues in a little more detail. You will also find other articles, both short and long, as well as an interesting range of book reviews. If you have any subjects you'd like us to look at, please let me know at Jonathan@TechHuman.org.


About the Podcast

TechHuman
Technology Humanity and Faith
Technology is reaching ever further into our lives, our homes, our relationships, our travels and our workplaces. In this podcast, we want to explore what the technology revolution means, and how we see it enhancing, changing, threatening or damaging the world we live in. We discuss how we might seize the opportunities, minimise any threats and overcome the harms that have emerged.

Whether we love technology or are suspicious of it, it is increasingly hard to avoid. In this podcast, I will talk with some of my friends who have particular experience of, or insight into, important aspects of these changes. We want to understand what is happening and uncover some practical insights into the most significant challenges and opportunities that emerge. Two intertwined themes will be at the heart of our conversations: what does it mean to be fully human, and how can we live well in an increasingly 'digital' world?

This podcast is for anyone who is interested in understanding the impact of technology on human life and on society. It is aimed at adults and older teens who want to better understand the issues that surround technology innovation. We begin from the orthodox Christian perspective that humankind is created uniquely in the image of God, and that our humanity was designed for embodied relationships. Our conversations should be relevant to those of any faith or of none who are interested in a better appreciation of the impact of technology on our lives, today and in the future.

Professor John Wyatt, my co-founder at TechHuman.org will join me for many episodes …..

About your host


Jonathan Ebsworth