Analogue leadership in a digital world

Socio Technical, Socio Digital, Techno Social …

We believe that unregulated generative AI is a clear and present danger to democratic
sustainability. The imminent problem is not super intelligent robots taking over the world, but the
threats to human individual and political freedoms posed by the deployment of simultaneously
exciting and yet potentially dangerous new technologies. We need to address the full range of AI
challenges, and in so doing, the public’s voice must be at the table, not only those of the already
powerful.  (Statement of the Digital Humanism Initiative 2023)

The last few months have been a bit of a whirlwind in terms of travel, meeting interesting people, exploring ideas and discovering insights.

In my previous post I talked about our Brussels Brave Conversations and some of the thoughts that came to me as I wandered around Brussels and began to explore the world that is the European Parliament.  As a complement to this I went to the Digital Humanism Summit 2023 in Vienna at the invitation of George Metakides and Hannes Werthner, where many of the Computer Science and Artificial Intelligence luminaries from Europe and the United States came together to talk about Generative Artificial Intelligence and the sustainability of democratic societies.

The explosion of Large Language Models onto humanity in 2022–2023 has suddenly propelled the conversations around these technologies into the public domain, and with this has come a sort of mild panic about existential risk, the decimation of communities and the irrelevance of human beings (Harari 2023).

The question is this: we now have within our grasp the most powerful technologies humankind has ever developed, so how can we ensure that they are used for good (the benefit of humankind and the planet) rather than evil?  And how can people feel secure about the development of technologies that are far beyond most people's ability to understand?

It is paramount that AI developers and regulators ask themselves the right questions about the potential impact of AI. Bryson suggests a greater focus on ensuring people feel secure in a world with AI, rather than trying to convince them to trust it.  (Joanna Bryson at ANMC23)

As these conversations around AI unfold I am often bemused that it has taken so long for the proverbial penny to drop.  These technologies have been around for a very long time, but as always it is the human condition not to really focus on things until they are right in front of us – we often seem to have little imagination about things that aren't already around us, which is also why Science Fiction is such an important genre for people to engage with.  It is also why we seem to get distracted by the next bright shiny thing that emerges and become somewhat derailed in our common sense and perspective.  As the Gartner® Hype Cycle™ so brilliantly illustrates, we get excited, then we get disillusioned, then things start to calm down and we begin to look at them from a more realistic perspective.  See the Gartner AI Hype Cycle 2023.

So by the time we held our 2023 London Brave Conversations at Newspeak House, people were beginning to take a more balanced approach.  Many had actually used the tools, and there were thoughtful and insightful conversations around the benefits of AI, alongside an appreciation of the need to take responsibility for how and when the tools are used, and for whose benefit.

As these conversations mature it will be wonderful to see people embrace the tools to help them and help others, and I hope people will be brave whilst also being wise.

As a species we are called Homo sapiens – the wise humans.  Now more than ever we need that to be the case.

Gartner, What’s New in Artificial Intelligence from the 2023 Gartner Hype Cycle, 17 August 2023. GARTNER and HYPE CYCLE are registered trademarks of Gartner, Inc. and/or its affiliates and are used herein with permission. All rights reserved.

September 2023