Analogue leadership in a digital world

Kardamyli 2023


Last month I spent five days in the delightful village of Kardamyli for the third Kardamyli Festival.

I first heard of this festival during lockdown through the HowToAcademy, but sadly could not attend the first one in 2021 as I was in Australia. This year I made it.

The festival is held in the village of Kardamyli, one of the oldest settlements in the Peloponnese and once home to “Britain’s leading travel writer” and adventurer Patrick Leigh Fermor.

The site itself is located in a large car park opposite Πέτρινοι Πύργοι στην Παραλία (Pétrinoi Pýrgoi stin Paralía, the “Stone Towers on the Beach”), a picturesque beach populated by stone cairns, each with its own personality, added to daily by passers-by on their way to the water.

I came to Kardamyli largely out of curiosity, and for the quality of the world-class speakers featured on the programme, many of whom I have long followed and greatly admire. The festival is undoubtedly a labour of love by all involved, including a band of friends-and-family volunteers who cheerfully did the meet-and-greet, played bouncer and guard, and shepherded the 350-odd attendees who turned up to pretty much every session.

Nothing about Kardamyli disappointed.

We stayed in a lovely home-run studio, wandered the town, had some interesting conversations and explored many of the ideas that were raised.  The Festival began with Bettany Hughes exploring Socrates’ concept of “The Good Life” which especially resonated given the foundational concepts upon which Intersticia is based, particularly the work of John O’Neil.

Building upon this was Andrea Wulf’s “Magnificent Rebels”, a work which I read when it first came out and found fascinating in terms of how fate brought together some of the most important thinkers of late 18th Century Germany in one place at one time.  Many of these thinkers were instrumental in helping to define “The West” and the construct that underpins it which archaeologist Naoise Mac Sweeney extrapolated in her sweeping view of the evolution of the idea and concept of “The West” from where it began to where we are now.

Every society has a set of beliefs that go far beyond the life of the individual and have the power to define – and to divide – us.  (Neil MacGregor)

Neil MacGregor took us on a whirlwind tour of humanity through the objects, places, rituals and spaces that connect with and represent the theological dimensions that cultures and societies have used to identify themselves, ranging from the challenge of observing Ramadan in space to the creation of the shrine to Princess Diana at the Pont de l’Alma in Paris.  He concluded by asking some poignant and crucial questions for “the West”, centred on the challenge of how to define our shared beliefs as a society in the age of secularism and ‘the individual’, and on the underpinnings of liberal democracies in their desire to embrace and embody multiculturalism.

These questions were further interrogated by “the best Prime Minister that the UK never had”, Rory Stewart, who challenged the West on its need to explore new forms of both economic and democratic models suited to the 21st Century, calling on Aristotle’s rhetoric as a powerful tool with which to explore the challenges and opportunities which lie ahead.

Stewart identified the three corners of what Harvard Professor Mark Moore has transformed into The Strategic Triangle, one of the foundational models for the design and analysis of public policy, and one which we use as a fundamental model in both our Founders and Coders Social Machine curriculum and our Brave Conversations.

The triangle is based on the interplay and interconnection between:

  • Pathos – the need for emotional communication and resonance in exploring ideas
  • Ethos – the need to discover moral character in order to talk about Truth
  • Logos – the need for new ideas and vision

A key element of this is understanding the notion of the authorising environment: where power lies within a society, how it is wielded and where its limits lie.  This was especially relevant when German journalist Kai Strittmatter gave his perspective on Xi Jinping’s China, in what I found to be a very one-sided and naïve criticism of an alternative to The West as represented by the Chinese State.  My main criticism is that Strittmatter was critical of the Chinese surveillance model without acknowledging, or even recognising, the insidiousness of our Western Surveillance Capitalism, let alone being in any way open to the potential advantages that might be presented by Chinese Data and AI Regulation.  I cannot claim to have any knowledge of the Chinese system, but I don’t believe that we in The West should be lauding the system we live in as superior.  There could have been a lively and useful debate around this, but sadly very few really interrogate our own system from the data perspective, and the talk by Anjana Ahuja, whilst aimed at the “average punter” and offering some good insights, was fairly superficial and lightweight.

BBC Russia journalist Steve Rosenberg made his first visit outside of Russia since the invasion of Ukraine in 2022 and, through his stories and songs, gave us a glimpse of the reality distortion that is life in Russia in 2023.  Both he and Kai Strittmatter provided contrasting perspectives on aspects of humanity that we in the West too often fail to appreciate because we don’t know how to interrogate their belief systems, mythologies and deeply ingrained traditions, rites and institutionalised practices.  If more of us did, we would be far better prepared for the events which surround us and perhaps more nuanced in our analysis of them.

This became startlingly obvious when, on Saturday 7th October, we awoke to news of Hamas’ Operation Al-Aqsa Flood and the outbreak of the largest Israeli-Palestinian conflict since the War of Independence in 1948.  My first inkling of the event was through a Telegram message from our Palestinian Fellow who comes from Gaza, but from that moment on the rest of the Festival was underpinned by what was happening in that part of the world.  Tom Holland addressed this directly as he opened the second day by giving some historical perspective to the events we were witnessing.  After the Bar Kokhba revolt in 132 CE it was the Emperor Hadrian who determined to deal with the Judean uprisings once and for all, renaming the city of Jerusalem Aelia Capitolina and the province of Judaea Syria Palaestina (Palestine), after the historic enemies of the Jews, the Philistines.

It goes back that far and yes this still matters.

History never repeats but it does often rhyme.  (Mark Twain)

Those who cannot remember the past are condemned to repeat it. (George Santayana, The Life of Reason, 1905)

To have a major world event happening in the midst of this Festival demonstrated the crucial importance and value of literature and history, both in helping to frame any possible futures we might envisage and in suggesting alternative paths we might take in order not to repeat the mistakes of the past.  It is through our myths, stories, poems, traditions and belief systems that we as humanity seek to articulate and describe our deeply embedded cultural DNA, so deeply ingrained within us that we fail to recognise their power as the scripts which run our lives.

The Kardamyli Festival provided those who attended with the opportunity to reflect on this for a few days, inspired by the work of those who do this full time.  Armed with these insights, and in the most peaceful and idyllic of settings, there is no excuse not to dare to strip away some of the filters and lenses through which we view the world, to identify and address our own blind spots and perhaps, just perhaps, begin to frame the world anew.

My only criticism of the Festival was the lack of diversity in the audience, but it is still new, and it may be that with time it will grow and mature and attract those with more outlying ideas whose intents are nonetheless philanthropic in nature.

Humanity 101 – Intersticia Retreat 2023


The privilege of trust is the opportunity to empower others.

The privilege of trust is to recognise that leadership is inherently temporary and carries a specific responsibility to do no harm.

The privilege of trust is to recognise the vulnerability of others and see it as an opportunity to encourage not to exploit.

The privilege of trust is to elevate the needs of the many above the ambition of the few.

(Paul Gilbert, https://www.lbcwisecounsel.com/resources/articles/article/the-privilege-of-trust/)

I wanted to begin this post with a quote that spoke to my feelings about our fourth Intersticia Retreat, which we held in early September 2023 at the beautiful Darwin Lake Holiday Village.  Eighteen members of our Intersticia community, plus Nancy the wonder-dog, came together amidst British Rail strikes and delayed flights, and into the most glorious English autumn weather, a symbol in itself of one of the global challenges we all face.

This fully face-to-face opportunity has been a long time coming and much has happened in the six weeks since we met which is having a direct impact on many members of our community.

As we entered the interstice of our Retreat each of us came with our own expectations and hopes for the week.  Suffice to say that one thing about our community is that one never knows what will emerge, and whatever the initial plans, the group will always take things in unexpected and creative new directions!

The general plan was to craft a mix of learning within all the various ‘spaces’ through the combination of different groupings and conversations.  For the ‘formal’ we planned for a combination of content and discussion complemented by work in both plenary and small groups; for the ‘informal’ our objective was to use the small groups of each cottage to collaborate around the task of catering; for the broader environmental context we planned for long walks and the opportunity to reflect in the beauty of the Peak District.  Needless to say things never go according to plan!

It became apparent from the first day that our group of curious, highly intelligent, generous and fearless young people wanted to contribute as much as they could to the learning of others and to the interrogation of any and all topics of concern.  As the week progressed each and every member contributed to what our shared space became, with each session stretching through questions and discussion, and the interstitial space between their professional and career aspirations and their personal ambitions and challenges opening up.

This is precisely what we have been hoping to achieve and create for our community and the most wonderful thing to observe was that there was a safe and supportive haven where each individual could experiment, explore and push the boundaries of their own personal development in concert with that of their colleagues.

Whilst we didn’t go on as many, or as long, walks and explorations as we had hoped, we did explore together some of the elements that Stewards of Humanity for the 21st Century will need to become proficient in if they are to have any measure of impact or success, and our hope has always been to draw on our own community for expertise in this.  Our 2023 Retreat demonstrated this very powerfully, with all of the content sessions being delivered by members of our community.

The container of the Retreat itself was held very powerfully by Sam Crock and Marianne Darre, who worked tirelessly to craft and recraft the agenda as each day reformulated itself.  Our walking was continuously reformatted by Louise Sibley, who creatively presented different options based on weather, timing and everyone’s energy, whilst Dan Sofer found innovative ways for his Small Group to meet, embracing the natural environment.

One of the key areas we explored was that of Voice and how to express oneself, and in this we were ably led by Jess Chambers, who brought her vast knowledge and experience to get everyone thinking about how we all communicate.

In life we have enough breath to speak our thoughts. 

This was beautifully complemented by Dr Marco Valerio, who brought us the work of his PhD on The Placebo Effect and Somatics, which enabled everyone to begin to more fully understand the power of the mind-body connection and the impact it has on how we are perceived and how we interpret the world around us.  For me the poignancy of this was the importance of appreciating and understanding that whilst we are hurtling towards an AI-driven world we still live in physical space – the interstice between the analogue and the digital.

Dr Philip Hayton gave this context by introducing the concept of VUCA (volatility, uncertainty, complexity and ambiguity), first described by the US Army War College in 1987 to capture the complex, multilateral world perceived to be emerging with the end of the Cold War.

 

Philip built on Marco’s work by presenting this as the challenge that we all need to develop skills to cope with, and referred in particular to Polyvagal Theory and the need to understand, proactively work with, and continually recognise the power of our innate biological nervous systems in how we react to the environment around us.

The complexity of living in such an environment is something that challenges us all as humanity struggles to manage the convergence of multiple issues, including the intensity of climate change, the accelerating development of artificial intelligence, the rise of inequality and the increasing humanitarian needs resulting from conflict (see the UN Foundation and my own questions to Pi).  As the current commotion around AI intensifies – it is estimated that some half a trillion US dollars is being invested in its development – the reality is that the only way to even begin to address these issues is to invest in people.

As we have always stated, however, we need Analogue Leadership in a Digital World, and therefore, as much as we need to understand our biological selves, we also need to learn to understand the technologies which are evolving. This is where Dr Ardavan Afshar’s presentation on Machine Intelligence was so important; it stimulated a very long and detailed conversation around the role and place of these technologies within our societal systems.

For every dollar we invest in Artificial Intelligence we need to invest in human minds, in human beings. (Yuval Noah Harari, interview with Zanny Minton Beddoes and Mustafa Suleyman – The Economist)

As is so often the case we could have included so much more, drawing on the work of our Fellows, and this is now one of our key challenges as the expertise within our community continues to grow and develop.  But it also presents us with an exciting opportunity: to craft a programme for each of our Fellows, helping them to learn individually but also to teach the rest of us and grow the knowledge base of Intersticia itself.

This is the challenge that John O’Neil gave us when we began Intersticia many years ago, and it is incredibly rewarding to feel that we are beginning to achieve some small success even though we have a long way to go.

Our 2023 Intersticia Workspace at Darwin Lake, Matlock

Small Group Work

I would like to thank each and every one of our Fellows who participated and contributed, but in particular I’d like to give a huge thanks to our Elders – Sam Crock, Marianne Darre, Louise Sibley, Dan Sofer and Philip Hayton – who so selflessly and generously gave their time, insights, knowledge and wisdom to everyone in our community.  It was also wonderful to have Ed Saperia join us which meant that all of our key partners were with us.

Intersticia has now been slowly crafting our community for the past decade, bit by bit bringing together people who we feel share our values, ambitions and hopes for the future of humanity (I presented a video telling the story of our first decade which can be viewed with permission only).  This 2023 Retreat nudged this just a bit further forward, but with it raised the bar of what I believe we can, and should, aim to achieve.

A cohort of authentic, courageous, dedicated, humble, curious and yet gracious people who will each take the challenge of creating a better world.

Our work is just beginning.

Digitally Savvy


A few weeks ago I had the distinct pleasure to do an interview with Simon Western on his Edgy Ideas podcast.

As always, a real human-to-human conversation enabled me to think through some ideas which have been percolating for quite a while.

Thank you Simon, and thank you to Aodhan Moran for introducing us.

Listen to the “Edgy Ideas” Podcast with Simon Western.

Surfing the digital wave


The best thing we can do is build surfboards and ride the wave. (Scott David)

It seems that we, as humanity, are at an inflection point, a period in human history where quite literally anything could happen!

Some, like Yuval Noah Harari, believe that unless we regulate and control the evolving artificial intelligence it could well be the end of human history as we know it.

“What would happen once a non-human intelligence becomes better than the average human at telling stories, composing melodies, drawing images, and writing laws and scriptures?” The answer, he believes, casts a dark cloud over the future of human civilisation.

 

We should regulate AI before it regulates us.  (Yuval Noah Harari)

Others, like Scott David, believe that if we synthesize human and Artificial Intelligence and augment our thinking we may finally have the tools we need to cope with the other major challenges of the 21st Century.

Some, like Jaan Tallinn and those at the Future of Life Institute (FLI), believe that we need to pause the giant AI experiments in order to take time to more fully understand the risks.

Others, like Pedro Domingos, criticise this call and want to forge ahead because, as Alan Kay said,

The best way to predict the future is to invent it.

Regardless of which side one takes, what this all demonstrates is that it is not the ‘it’, the AGI, that we should be worried about; it is us, the humans, and how we are going to deal with whatever emerges.

The one good thing about the FLI letter is that it has been a catalyst for debate and has finally brought the issue of AGI to the public forum.  The reality is that, regardless of what the technologies can or can’t do, it is our social systems that will generate the real social change, for instance companies like IBM pausing hiring in order to replace 7,800 jobs with AI, and Microsoft’s development of Co-Pilot.  Companies are not hiring graduate developers; they are caught up in the hype around the tech, and this is causing major ripples in the labour market.

Whilst it may take a decade or more for AGI to emerge (in whatever form that may be), there is no doubt that in the short term the hype around it will impact people’s lives, and this is what will create more risk than the potential of the machine.

One way of looking at the current situation is that, following the disruption of the Covid-19 pandemic, the advances in AI and the other technologies now converging have shaken up and unfrozen much within our social and economic systems.  This can be illustrated by considering two models: Kurt Lewin’s Unfreeze-Change-Refreeze (Lewin 1947, Frontiers in Group Dynamics) and William Bridges’ Transition model.

It is the unfrozen state (the Interstice) which provides the potential and opportunity for change and renewal before the new normal is established. This is a time of excitement and energy, but it is also a time of fear and potential unrest, because change can be frightening as the old ways die and the new is not yet clear (see Kübler-Ross).

In facing any change we humans need to feel a sense of agency in order to craft a path forward and accept the change being presented to us, and this is what we sought to explore in our recent Brave Conversations in Brussels.

We presented our participants with three case studies, each of which posed a number of questions around personal choices in response to specific situations in which Artificial Intelligence was a key determinant.  The first was based on a challenge posed by a large language model released on to the Internet; the second related to AI and a health care issue, and the third related to the development of government policy.  In each case we armed our participants with Mark Moore’s Strategic Triangle asking three questions to determine how they as individuals could potentially respond to each case:

  • Ethos – What should we do?  What do our values, ethics and morals guide us to do?
  • Logos – What can we do?  What resources do we have?
  • Pathos – What may we do?  What authority are we acting on?

For much of the last fifty years advances in information systems have been made by scientists, such as Geoffrey Hinton and his peers, who have developed technologies because they could – they were able to do it, they could solve whatever problem they were addressing, and they charged forward.  They didn’t necessarily ask if they should – the combined outcome of the Ethos and Pathos in the triangle.

As a complement to this, investors, particularly in Silicon Valley, helped thrust these technologies into the commercial realm because they understood the value of digital information and digital disruption – Uber, Airbnb and all the companies participating in what Shoshana Zuboff has termed Surveillance Capitalism.

Funnily enough, when I met Zuboff at a signing of her book in 2019 I asked her if she had seen all of this coming when she wrote The Support Economy in 2002 – she said, “Yes we did, we just hoped it wouldn’t happen.”  Sadly it did.

The commodification of personal data for commercial gain has created a marketplace that trades exclusively in human futures (Zuboff), feeding the Social Machines we have today and exploiting our innate human need to connect with each other.  What is worth considering here is that, whilst there has been enormous focus on the issues of privacy and surveillance, what has not been much discussed is the way these platforms treat the emergence of communities as a byproduct rather than the driver of their success.

The Web was created by Tim Berners-Lee as a tool to facilitate communication and information sharing between people within a community, and it was the trust within those communities that enabled the sharing to occur.  Companies then sought to commercialise the Web, which Tim had given to humanity for free, and in doing so created monopolies by closing elements of it down – the walled gardens of social media.

Just over a decade ago the Open Data Movement gained traction, particularly due to the election of the tech-savvy Barack Obama as US President in 2008 and the Parliamentary Expenses Scandal in the UK in 2009.  There was huge hope for this movement, which changed the paradigm around public sector information from being closed and hidden to being a public asset to be harnessed and exploited for public good – this resulted in the ‘open by default‘ principle.  Sadly, despite the excitement and early wins achieved by Government Digital Services around the world, the truth is that they managed to pick the low-hanging fruit but the challenges of true digital transformation have proved painstakingly difficult – governments are still talking about it in the same way they were over a decade ago.

It seems to me, having lived through and closely observed these events, that each phase of opening up emerges from within communities that are seeking to solve real problems that affect them.  Someone then has the bright idea of commercialising it, which encourages the sharks to start circling with their growth and profit mindset, and the next thing you know whatever was shared and open becomes monetised and closed, no longer focused on the needs of the community but geared to exploiting ‘consumer’ behaviour to generate advertising and retail sales revenue.

We are now witnessing this once again in the AI space as the hype drives investors to scramble.  Someone like Cathie Wood, CEO of Ark Invest, sees a massive industry emerging where currently there is virtually none, just as happened with the Web and with Open Data.  The digital disruptors understood the affordances of digital information, and companies like Facebook and Google hoovered up whatever they could, both to ingest new technologies and to close down competition.  Because governments had absolutely no idea of how digital information works they didn’t see what was obvious and right in front of them – just because there is no market now doesn’t mean that there won’t be one soon! This marketing myopia is responsible for the mega-corporations which dominate the online world today as the most valuable companies in all of recorded human history.

There is hope that perhaps the release of ChatGPT, and the ability of the general public to use these systems, may wake people up to their potential and build some sort of momentum towards regulation and/or Anti-Trust action, something that people like Zephyr Teachout are fighting for (see Break ‘Em Up!).

There is also hope that there may be global communities who can use the very technologies themselves to craft some new phase of openness in partnership with governments and the Third Sector with the objective of serving humanity rather than big corporations (I won’t hold my breath for this one as they will most likely be too slow).

Finally, starting with the European Union’s AI Act, governments may not repeat the mistake of the first two digital waves of leaving regulation too late, and may listen to the chorus of voices calling for it (Sam Altman, CEO of OpenAI, for one).  I say may because thus far their track record is not good.

I think that the real kicker will be when the smart devices we wear on our bodies are embedded within our bodies – smart contact lenses as an obvious example – because then they will be required to meet the standards of Medical Device regulation, although by that time it will most likely be too late.

What there is no doubt about though is that there is a shift happening on all fronts, and I believe it is the younger generation, the Millennials and younger, who need to take the lead now in determining how humanity proceeds.  They have grown up in the digital soup and as I have written before I believe this is their time.  They are the ones who are now crafting careers, bringing up families and they are the ones who will be supporting us Elders as we age. They are much more connected with their peers globally than we ever were and they don’t seem to be as binary in how they see the world.

As I reflect on this I am brought to consider the Strauss-Howe Generational Theory and the idea of the Fourth Turning. My son, now in his late 20s, made an interesting comment to me a few months ago – he said “I can feel a change coming Mum, I’m not sure what it is, but it’s big“.

One thing is for sure: when, not if, some form of artificial sentience emerges it will shake humanity to the core – we will need to reconsider everything we think is ‘normal’ in our daily lives, from how we learn and how we work to our very concepts of God. History tells us that when humans go through major change it is often accompanied by violence and aggression as we lash out to apportion blame or seek redress.  This is not going to do any good as the machines won’t care, and this is why, above all

The lesson of AI and of formidable breakthroughs to come, such as quantum computing, is that we may now be reaching the point where something most unnatural to humans is the only thing that can save us: humility. (Howard French)

One thing that surfing teaches you is to be humble, to respect the ocean, and to realise that it and its waves can swamp you at any moment.  It teaches you to read the tides and the wind and to work with the environment, not fight against it.  Much of this comes from experience but also from being open and having the basic skills (such as knowing how to swim).

I believe that this is where we are now, and that what we have to do is nurture, educate and empower people to harness the good in the digital realm: to learn to craft our surfboards, to ride the waves well, and to use that knowledge to help future generations always remember their humanity and their part in the greater whole on this Pale Blue Dot of a planet we all inhabit.

A European Brave Conversation


Last week we held our 21st Brave Conversations event at Atelier 29 in Brussels and the first in partnership with the Digital Enlightenment Forum (DEF).

We began on a wet, cold Brussels morning but garnered a group of intelligent, engaged and curious individuals keen to converse with other humans in the room about our digital lives in the 21st Century.

Since our last events in 2022 much seems to have shifted within the digital landscape, particularly with the release “in the wild” of ChatGPT and other generative AI and large language models.  It took ChatGPT just five days to gain 1 million users following its release in November 2022, and before long thousands of notable people had signed the Future of Life Institute’s Open Letter to Pause Giant AI Experiments.

By the time we got to Brussels even the Smart Humans who had invented the tools themselves (such as scientist Geoffrey Hinton) were worried and struggling to keep up, and the major tech companies were scrambling to maintain some sort of competitive edge by rushing to integrate the tools into their mainstream offerings (for example Microsoft’s launch of Co-Pilot).

So what is this all about?  For anyone who has been watching the tech space the events of the past few months were entirely predictable, as was the human excitement / panic / reaction / confusion that followed.  We’ve been here before, although not necessarily with a suite of technologies with the power to change human society as profoundly as these.  Ever since the invention of writing people have warned of the dire consequences of new technologies – Socrates of writing; Gessner of the printing press; Carr of social media.

In all the hype swirling around at the minute, and particularly that driven by the major tech companies, we need to remember that the success of humanity as a dominant species comes from our ability to co-operate with each other, to transmit and build on the knowledge of our forebears, and to develop and utilise tools that have become increasingly sophisticated.

Human beings have a unique ability to cooperate in large, well-organized groups and employ a complex morality that relies on reputation and punishment.  (Frans de Waal)

The tools we are currently developing are merely the latest in a very long line which have helped us survive and thrive, and these tools too will become necessary in order to help us meet the challenges we currently face.

But as Melvin Kranzberg’s first law of technology states

Technology is neither good nor bad; nor is it neutral.

So what did all this mean for the conversations we had in Brussels on 12th May?

After the years of Covid one of the things we feel is most important with Brave Conversations is to get the humans in the room, and a number of people made a big effort to get to Brussels to be with us in person. This meant that there were human-to-human interactions, unmediated by any technology, and the ability for each person to explore their ideas within the physical confines of a human space.

We had a blend of participants which included the Board of the Digital Enlightenment Forum, academics, some people working in policy with the European Union, students, and a couple of creatives.  A fabulous blend of minds and perspectives to craft interesting insights and a nuanced approach to how everyone was feeling about the current technology onslaught.  Some of the comments below give a flavour of the conversation, but perhaps the most important came when one participant told me that she had come along because she can’t find anywhere else to have these conversations in a safe space, without judgement or a predetermined agenda.

This is what we seek to create in Brave Conversations, and what our partnership with the Digital Enlightenment Forum promised to bring.

As always I would like to thank Leanne Fry for her continuing partnership.  It was wonderful to work with Thanassis Tiropanis yet again, and thanks to him for helping facilitate.  To the Board of DEF, thank you for your support of the event, and to the inimitable Myriam de Greef an enormous thanks, because without Myriam no conversations would have been had!

 

The Age of the Smart Social Machine


Title adapted from Shoshana Zuboff’s ground-breaking 1988 book In the Age of the Smart Machine

Last week I attended a Group Relations Conference in India.  These events are always intense (this one even more so!) but they provide a unique opportunity to consider oneself within a human social system.

One of the things that occurred to me as we were exploring the role of the unconscious as it was playing out in the here and now (all psychobabble terms but in fact hugely important) was that there are multiple unconsciouses which operate as we live our dual analogue-digital lives. Carl Jung described what he called the collective unconscious which complements and influences all of our conscious thinking and actions as we participate within the human system.  I believe that there is now in addition a digital unconscious which is emerging in the digital realm as the result of our digital interactions within the Social Machine and an even more powerful machine unconscious which is evolving in the artificial intelligences we are building.  I drew the image below to try to illustrate my conjecture to the group – needless to say most didn’t understand.

In What Technology Wants co-founder of Wired Magazine and co-Chair of the Long Now Foundation Kevin Kelly talks about The Technium:  A Living System of Technology which encompasses the entire system around technology – culture, art, social institutions, through to “the extended human”.  In his latest blog post Kelly states that

For a while I’ve been intensely exploring generative AI systems, creating both text and visual images almost daily, and I am increasingly struck by their similarity to dreams. The AIs seem to produce dream images and dream stories and dream answers. The technical term is “hallucinations” but I think they are close to dreams. I’ve come to suspect that this similarity between dreams and generative AI is not superficial, poetic, or coincidental. My unexpected hunch is that we’ll discover that the mechanism that generates dreams in our own heads will be the same (or very similar) to the ones that current neural net AI’s use to generate text and images.

 

The foundational mode of the intelligence is therefore dreaming.

Don’t get me wrong – I’m not necessarily agreeing with Kevin Kelly here, nor am I buying into the hype about machines hallucinating.  What I am pointing out is that the machines are analysing human data using human-crafted algorithms, and therefore there is something of our unconscious embedded in their emanations which is now being made explicit and visible.  We can only refer to concepts and ideas in human terms (hence we anthropomorphise), and to describe what the machines are doing is almost like taking us into our own unconscious (this is where the concept of Asimov’s Psychohistory comes into play).

One way of accessing the collective human unconscious is through Social Dreaming, the practice of sharing, associating to and working with dreams in a matrix in order to identify social trends and social dynamics. As our machines are coming together and bringing our data with them it may well be that what we are seeing is a manifestation of the collective human unconscious expressed through the output of the machines – which may seem like hallucinations – but how can we know given the opaque nature of how they operate?  And, if they have begun to go down that path then they are already moving beyond our realm of understanding.

The real challenge will come when they become able to acknowledge and recognise this unconscious as something different from a probabilistic algorithm, or when they are embodied, as the work of people like Rodney Brooks and so much of our science fiction (Humans, Blade Runner, Ex Machina) has shown us.

So what does this mean for us as humans?

Up until the recent advances brought about by large language models such as ChatGPT, talking with the average person about advancing machine intelligence was like describing an elephant.  Every person sees only the things that directly relate to them, just like the story of the Blind Men and the Elephant.

This relates as much to technologists as to everyone else as I’ve witnessed countless times. The most obvious to me was when

I heard a very notable “father” of the digital world speak at a conference and, when asked what he would recommend about how to address the rise of pornography on the Web, he responded “well just don’t look at it!”

Many of the people I’ve met who have built the machinery of the digital world are extremely naïve, building the tools because they can, not asking whether they should. When Geoffrey Hinton resigned from Google last week he commented

I console myself with the normal excuse: If I hadn’t done it, somebody else would have.

As with all kids-in-the-candy-shop scenarios, if you give a scientist a problem and lots of funding they will develop new tools and techniques regardless of the potential consequences. Hinton and others like him saw only part of the Elephant, without considering it as a whole animal, let alone as part of a herd.

Which brings us to the question of ethics.  Whilst some of the big companies have created Ethics Advisory Boards, the reality is that much of the development work in the field of AI is now happening in the open source space where there is no supervision or oversight.  These developers still want to move fast and break things, while the very nature of Ethics is designed to slow things down by asking difficult and challenging questions.

Governments and regulation are also designed to slow things down, because politics and policy operate on human time, which is analogue, messy and the very opposite of an efficient machine.  Humans need time to process, and our relationships are based on what people like Anna Machin and Rachel Botsman call Trust Friction – the stickiness and the glue that underpins how human systems operate.

The whole point of human relationships is that they are not efficient, because they take time and brain power to develop and maintain. Trust needs friction.  (Anna Machin)

Human systems are analogue and analogue takes time.  In the analogue world:

  • You can’t fire off a letter; you need to write and post it
  • you can’t do an online transfer; you need to go to the bank
  • you can’t immediately alter a design; you need to redraw it
  • you can’t just be friends with everyone; you need to build trust through shared experiences, which takes time.

Machines don’t want friction – it slows them down, makes things break and ruins their power-to-work ratio, i.e. “productivity”.  The ultimate expression of this is the Paperclip Problem, in which a smart machine instructed to make paperclips consumes all the resources in the universe (including us) just to make paperclips.

With the advent of ChatGPT and its brethren the removal of friction within our human-machine interactions has now gone to the next level, and smart AI is being embedded into pretty much all of our digital processes – just think of how many conversations you have and hear which involve technology of some sort.

So now I’d like to bring in a new analogy, the frog in the pot of soup as the temperature is gradually turned up.

Our human need to process and understand means that we as humanity have been sitting in the digital soup for at least half a century, but in the first half of 2023 it suddenly began to feel uncomfortably warm.

As the soup heats up there are some who will want to jump out, some who are going to boil, and some who will adapt.

The questions now, it seems to me, are who will fall into each of these groups and what will happen in each case.

Let’s consider some options:

Firstly, those who want to leave.  It may be too late but, as with the Luddites in the Industrial Revolution, there is much wisdom in what they have to say and perhaps an alternate reality has much to offer as it always has throughout the ages.  There is something of this in Hari Seldon’s concept of building a Foundation on the furthest planet in order to separate itself from the chaos of the main system – an opportunity to isolate, slow down, reboot and recreate.

Secondly, those who are trapped. Sadly there is always a high cost to any radical change, and many will find the “new world” frightening and overwhelming. Just one example is the already rising rate of suicide among teenage girls.  Along with many others I have spent the past three decades of my life working to understand the transition that is upon us and to help people prepare for the change, with minimal effect. Some have heeded the lessons; most have sat and enjoyed the warmer water, oblivious to the dangers. I’m not sure anything can help these people any more, as I think the rate of change is going to be too fast.

I think both of these groups will struggle and push back through both fear and anger and the manifestation of this could be dangerous.

Finally, there will be those who adapt, survive and thrive.

With all the noise about the technology, how fast it is progressing and whether it should be paused or stopped, the real point is what the humans are going to do about it.  It is therefore the third group I am most interested in, and I believe it is being led by the younger generation but needs to be supported and mentored by the 21st Century Elders who have memories of the analogue world and the value of its friction and temporal nature.

Some fear the AI Apocalypse and that non-Western (non-WEIRD) cultures may gain a technological advantage.  This is problematic on so many levels, particularly given that it is the minority-population WEIRD West that has created the culture of growth and the technologies themselves.  Some alternative thinking might be precisely what is needed now, and some less privileged cultures may, in fact, be better prepared for what is to come.

The history of automation is that we humans have invented machines to take away the dirty, dangerous and dull jobs … now we are taking away a whole host of others.  These technologies can be used to solve the very challenging problems which confront us in the 21st Century and the sooner we learn to work constructively and creatively with the machines the sooner we will harness the power that is before us for good.

The more I feel people heading in one direction as a herd the more I want to go the other way and explore what is happening there – this is where the adaptive survivors will be.

 
