June 4, 2023

Surfing the digital wave

The best thing we can do is build surfboards and ride the wave.

- Scott David

It seems that we, as humanity, are at an inflection point, a period in human history where quite literally anything could happen!

Some, like Yuval Noah Harari, believe that unless we regulate and control the evolving artificial intelligence, it could well be the end of human history as we know it.

“What would happen once a non-human intelligence becomes better than the average human at telling stories, composing melodies, drawing images, and writing laws and scriptures?” The answer, he believes, casts a dark cloud over the future of human civilisation. We should regulate AI before it regulates us. (Yuval Noah Harari)

Others, like Scott David, believe that if we synthesise human and artificial intelligence and augment our thinking, we may finally have the tools we need to cope with the other major challenges of the 21st century.

Some, like Jaan Tallinn and those at the Future of Life Institute (FLI), believe that we need to pause the giant AI experiments in order to take time to more fully understand the risks.

Others, like Pedro Domingos, criticise this call and want to forge ahead because, as Alan Kay said:

The best way to predict the future is to invent it.

Regardless of which side one takes, what this all demonstrates is that it is not the ‘it’, the AGI, that we should be worried about; it is us, the humans, and how we are going to deal with whatever emerges.

The one good thing about the FLI letter is that it has been a catalyst for debate and has finally brought the issue of AGI to the public forum.  The reality is that, regardless of what the technologies can or can’t do, it is our social systems that will generate the real social change: consider companies like IBM pausing hiring for some 7,800 roles it expects to replace with AI, and Microsoft’s development of Copilot.  Companies are not hiring graduate developers; they are caught up in the hype around the tech, and this is causing major ripples in the labour market.

Whilst it may take a decade or more for AGI to emerge (in whatever form that may be), there is no doubt that in the short term the hype around it will impact people’s lives, and it is this that will create more risk than the potential of the machine itself.

One way of looking at the current situation is that, following the disruption of the Covid-19 pandemic, the accompanying advances in AI and the other technologies now converging have shaken up and unfrozen much within our social and economic systems.  This can be illustrated by considering two models: Kurt Lewin’s Unfreeze-Change-Refreeze (Lewin, 1947, Frontiers in Group Dynamics) and William Bridges’ Transition model.

It is the unfrozen state (the Interstice) which provides the potential and opportunity for change and renewal before the new normal is established. This is a time of excitement and energy, but it is also a time of fear and potential unrest, because change can be frightening as the old ways die and the new is not yet clear (see Kübler-Ross).

In facing any change, we humans need to feel a sense of agency in order to craft a path forward and accept the change being presented to us, and this is what we sought to explore in our recent Brave Conversations in Brussels.

We presented our participants with three case studies, each of which posed a number of questions around personal choices in response to specific situations in which Artificial Intelligence was a key determinant.  The first was based on a challenge posed by a large language model released onto the Internet; the second related to AI and a health care issue; and the third related to the development of government policy.  In each case we armed our participants with Mark Moore’s Strategic Triangle, asking three questions to determine how they as individuals could potentially respond to each case:

  • Ethos – What should we do?  What do our values, ethics and morals guide us to do?
  • Logos – What can we do?  What resources do we have?
  • Pathos – What may we do?  What authority are we acting on?

For much of the last fifty years, advances in information systems have been made by scientists, such as Geoffrey Hinton and his peers, who developed technologies because they could – they were able to do it, they could solve whatever problem they were addressing, and they charged forward.  They didn’t necessarily ask if they should – the combined outcome of the Ethos and Pathos in the triangle.

As a complement to this, investors, particularly in Silicon Valley, helped thrust these technologies into the commercial realm because they understood the value of digital information and digital disruption – Uber, Airbnb and all the companies participating in what Shoshana Zuboff has termed Surveillance Capitalism.

Funnily enough, when I met Zuboff at a signing of her book in 2019 I asked her if she had seen all of this coming when she wrote The Support Economy in 2002. She said, “Yes we did, we just hoped it wouldn’t happen.”  Sadly, it did.

The commodification of personal data for commercial gain has created a marketplace that trades exclusively in human futures (Zuboff), feeding the Social Machines we have today and exploiting our innate human need to connect with each other.  What is worth considering here is that, whilst there has been enormous focus on the issues of privacy and surveillance, what has not been much discussed is the way these platforms view the emergence of communities as a byproduct rather than the driver of their success.

The Web was created by Tim Berners-Lee as a tool to facilitate communication and information sharing between people within a community, and it was the trust within those communities that enabled the sharing to occur.  Companies then sought to commercialise the Web, which had been given to humanity for free by Tim, and as a result sought to create monopolies by closing elements of it down – the walled gardens of social media.

Just over a decade ago the Open Data Movement gained traction, particularly due to the election of the tech-savvy Barack Obama as US President in 2008 and the Parliamentary Expenses Scandal in the UK in 2009.  There was huge hope for this movement, which changed the paradigm around public sector information from being closed and hidden to being a public asset to be harnessed and exploited for public good – this resulted in the ‘open by default’ principle.  Sadly, despite the excitement and early wins achieved by Government Digital Services around the world, the truth is that they managed to pick the low-hanging fruit, but the challenges of true digital transformation have proved painstakingly difficult – governments are still talking about it in the same way that they were over a decade ago.

It seems to me, having lived through and closely observed these events, that each phase of opening up emerges from within communities that are seeking to solve real problems that affect them.  Someone then has the bright idea of commercialising it, which encourages the sharks to start circling with their growth-and-profit mindset, and the next thing you know whatever was shared and open becomes monetised and closed, no longer focused on the needs of the community but geared to exploiting ‘consumer’ behaviour to generate advertising and retail sales revenue.

We are now witnessing this once again in the AI space as the hype drives investors to scramble.  Someone like Cathie Wood, CEO of Ark Invest, sees a massive industry emerging where currently there is virtually none, and this is what happened with the Web and with Open Data.  The digital disruptors understood the affordances of digital information, and companies like Facebook and Google hoovered up whatever they could, both to ingest new technologies and to close down competition.  Because governments had absolutely no idea of how digital information works, they didn’t see what was obvious and right in front of them – just because there is no market now doesn’t mean that there won’t be one soon! This marketing myopia is responsible for the mega-corporations that dominate the online world today as the most valuable companies in all of recorded human history.

There is hope that perhaps the release of ChatGPT, and the ability of the general public to use these systems, may wake people up to their potential and build some sort of momentum towards regulation and/or antitrust action, something that people like Zephyr Teachout are fighting for (see Break ‘Em Up!).

There is also hope that there may be global communities who can use the very technologies themselves to craft some new phase of openness in partnership with governments and the Third Sector with the objective of serving humanity rather than big corporations (I won’t hold my breath for this one as they will most likely be too slow).

Finally, starting with the European Union’s AI Act, governments may not repeat the mistake of the first two digital waves – leaving regulation too late – and may listen to the chorus of voices calling for it (Sam Altman, CEO of OpenAI, for one).  I say may because thus far their track record is not good.

I think that the real kicker will be when the smart devices we wear on our bodies are embedded within our bodies – smart contact lenses being an obvious example – because then they will be required to meet the standards of medical device regulation, although by that time it will most likely be too late.

What there is no doubt about, though, is that there is a shift happening on all fronts, and I believe it is the younger generation, the Millennials and younger, who need to take the lead now in determining how humanity proceeds.  They have grown up in the digital soup and, as I have written before, I believe this is their time.  They are the ones who are now crafting careers and bringing up families, and they are the ones who will be supporting us Elders as we age. They are much more connected with their peers globally than we ever were, and they don’t seem to be as binary in how they see the world.

As I reflect on this I am brought to consider the Strauss-Howe generational theory and the idea of the Fourth Turning. My son, now in his late twenties, made an interesting comment to me a few months ago – he said, “I can feel a change coming, Mum. I’m not sure what it is, but it’s big.”

One thing is for sure: when, not if, some form of artificial sentience emerges it will shake humanity to the core – we will need to reconsider everything we think is ‘normal’ in our daily lives, from how we learn and how we work even to our concepts of God. History tells us that when humans go through major change it is often with violence and aggression, as we lash out to apportion blame or seek redress.  This is not going to do any good, as the machines won’t care, and this is why, above all:

The lesson of AI and of formidable breakthroughs to come, such as quantum computing, is that we may now be reaching the point where something most unnatural to humans is the only thing that can save us: humility. (Howard French)

One thing that surfing teaches you is to be humble, to respect the ocean, and to realise that it and its waves can swamp you at any moment.  It teaches you to read the tides and the wind and to work with the environment, not to fight against it.  Much of this comes from experience, but also from being open and having the basic skills (such as knowing how to swim).

I believe that this is where we are now, and that what we have to do is nurture, educate and empower people to harness the good in the digital realm: to learn to craft our surfboards, to learn to ride the waves well, and to use that knowledge to help future generations always focus on and remember their humanity and their part in the greater whole on this Pale Blue Dot of a planet we all inhabit.


Creative Commons CC BY-NC-SA: This license allows reusers to distribute, remix, adapt, and build upon the material in any medium or format for noncommercial purposes only, and only so long as attribution is given to the creator. If you remix, adapt, or build upon the material, you must license the modified material under identical terms.

CC BY-NC-SA includes the following elements:

  • BY – Credit must be given to the creator
  • NC – Only noncommercial uses of the work are permitted
  • SA – Adaptations must be shared under the same terms