The emerging Age of AI is challenging us to take some profound questions more seriously than ever before. What does it mean to be human? What is the nature of human consciousness? What qualities and capabilities might we possess that even the most advanced machines will never have?
AI 2027 provides a fascinating glimpse of some possible scenarios for how AI might advance. In one (spoiler alert), AI evolves very quickly and soon recognizes human beings as a burden and a barrier, destroying us in its quest to colonize the galaxy. In another, we manage to co-evolve with this technology and establish a mutually beneficial relationship.
What is required of us to actualize the potential of the latter?
Some Basic Premises
To begin, let’s ground our explorations in a few premises:
1) We have a clear picture of neither our own potential as human beings nor the potential of future superintelligent machines. There is much we do not see.
2) Developing toward our unknown potential as human beings will require a fundamental shift in how we broadly approach developmental work: going beyond building functional skills to include evolving human consciousness.
3) This process must begin with new levels of humility: a willingness to accept that, with perhaps very few exceptions (and I am certainly not claiming to be one), we are very far from the potential levels of human consciousness. Most of us get only occasional glimpses.
This results in a predicament: we lack the capacity to see what we do not see, which prevents us from realizing that potential.
The Rapidly Approaching End of Human Expertise?
A mentor of mine, Carol Sanford, had a unique gift for humbling people by producing what she referred to as a conscious shock. She was a master at disrupting false senses of certainty. I can attest first-hand that it is an uncomfortable experience. We enjoy our false senses of certainty very much.
Part of the challenge is that we belong to cultures that assign status to expertise. We are hard-wired to seek status and, without examining the prices we pay, will play whatever game is laid out for us to win it.
Such examination requires conscious energy. Take a moment to reflect on your own patterns. To what degree do you play the status-seeking game of being (or becoming) an expert? How attached are you to the comforts of certainty? How often does this attachment prevent you from engaging with the unknown and unseen?
Of course, this status game has been disrupted by recent developments in AI: expertise is quickly becoming a commodity. How will the “experts” fight to maintain their hard-won status? What might emerge as a replacement for expertise as the likely cultural shift in status symbols unfolds?
Emerging Roles and Human Consciousness
What new roles are we being called to in this transforming landscape? What part does consciousness play in them?
There is a lively and ongoing philosophical debate around the nature of consciousness and whether or not AI will ever truly possess it. Getting too far into the weeds here is beyond the scope of this article, but we can lay the groundwork for continued exploration with some basic definitions.
Below are several commonly cited definitions of consciousness, drawn from leading thinkers in neuroscience, philosophy, and cognitive science. Each captures a different aspect of the concept.
As you read each definition, reflect on the following:
Considering the broad consensus that no AI is currently conscious, and that we are uncertain to what degree this will be possible in the future, what light does this shine on potential future roles for human beings?
Regardless of the potential of future AI, what role does human consciousness play in the continued development of the technology, including strategic direction and alignment?
Definitions of Consciousness for Reflection
“An organism is conscious if there is something it is like to be that organism—something it is like for the organism.”
— Thomas Nagel, What Is It Like to Be a Bat? (1974)
“Consciousness is the having of perceptions, thoughts, and feelings; awareness.”
— John R. Searle, The Mystery of Consciousness (1997)
“Consciousness corresponds to the capacity of a system to integrate information.”
— Giulio Tononi, An Information Integration Theory of Consciousness (2004)
“Consciousness is the capacity to access, evaluate, and report information in a widespread, integrated manner across the brain.”
— Bernard Baars, A Cognitive Theory of Consciousness (1988)
“A mental state is conscious when there is a higher-order thought directed at that state.”
— David Rosenthal, A Theory of Consciousness (2005)
“Consciousness has two components: arousal (wakefulness) and awareness (of self and environment).”
— Laureys et al., The neural correlate of (un)awareness: lessons from the vegetative state (2005)
Developmental Questions on Human Consciousness
Holding the above definitions in mind, what is required of us to develop toward a higher level of consciousness?
How would our systems and approaches to education, learning and development need to change to facilitate this shift?
What would this enable for us in the context of rapidly advancing AI?
What new roles that add unique value are calling to us, and what new levels of consciousness will they require?
Curious to dig deeper? Check out A Brief Introduction to Developmental Leadership.