Is Shared Effort All We Really Need?


In my first reflection on AI and how to live well with AI technology, I concluded, based on the recent lecture by Stephen Fry and the new book Nexus by Yuval Harari, that there is a manifest need for a set of transparency laws to govern the interaction of AI systems with humanity, so that humans and machines can collaborate and cooperate in a way that enhances the opportunities and minimizes the risks. But there is more to this human-machine equilibrium than that: we also need to understand which elements of human behavior, and of our manifest set of abilities, are unique and cannot (for the foreseeable future) be replicated by machines, no matter how seemingly "intelligent" or conversationally proficient they become.

At the 2023 Letters of the Year event, Stephen Fry highlighted a response Nick Cave wrote on his Red Hand Files blog to a reader's question about the role of generative AI in the creation of songs. There is an undeniable resonance that I think we all feel when listening to Fry's dramatic reading of Cave's impassioned response.

For me, the most profound sentiment is that we, as humans, value struggle, both as an individual pursuit and as a collective one. Any attempt to short-cut or circumvent that process is, as Nick Cave suggests, tantamount to threatening the very essence of humanity.

Which leads me to the first factor that I think defines the human condition and cannot be replaced or even replicated by AI:

Humanity Factor 1: Humans Uniquely Value the Effort of an Endeavor

I think the origin of this sentiment is perhaps deeply rooted in our continued existence and perseverance in (and over) challenging physical environments over millions of years. In effect, the ability to "struggle" and then prosper is the critical component that drives evolutionary selection and progress in the direction of increasing "value" and therefore to continuously thrive as a species. In essence, I think we, as humans, have learned or preferentially encoded in a vestigial way the value of "effort applied" when searching for the most elegant or valid solutions and outcomes.

But there is perhaps an even more prosaic explanation of this behavior. A recent experiment at MIT evaluated the efficacy of ChatGPT and Meta's Code Llama LLM versus traditional search tools in assisting with a software coding task. It found that although AI could expedite the successful completion of the task, it massively impaired the participants' ability to learn anything: when asked to repeat the task from memory, every one of the AI-assisted students failed miserably relative to the search-assisted ones, who had to piece the answer together for themselves. This is a manifestation of the aphorism that "you get out of something what you put in"—in other words, that tangible benefits accrue from the effort applied.

Indeed, we each experience this continuously throughout our developmental stages, so that it becomes ingrained as our default behavior or modus operandi. Of course, there are those who are preternaturally gifted, for whom much less effort seems to be required for a particular task. But while this is undoubtedly true for some people in specific domains of ability, or even for everyone in some domain, in my experience most people must struggle—or engage in its close cousin, "practice"—in order to truly excel at any task.

I think this same idea is also captured by one of my favorite axioms and one often associated with unique human insight—that one must "embrace complexity to find simplicity." This can be interpreted as advocating that one must do the necessary (hard) work to achieve a deep understanding before being able to fully gain the advantage of being able to reduce this to an essential perspective or principle. Again, effort matters.

A recent article by Cassie Kozyrkov explores a similar idea by observing that what AI systems such as OpenAI's Strawberry enhancement for ChatGPT can't really do is think; they only mimic the process of thinking. Moreover, she argues that thinking is a characteristically human ability or advantage, and that it takes effort. Daniel Kahneman's "Thinking, Fast and Slow" paradigm would refine this assertion: automated System 1 thinking is effortless, an exercise in pre-trained pattern recognition that can be replicated by a machine, but it must be complemented by System 2 thinking, which is effortful, comprising multidimensional, multifaceted complex analyses integrated over our relevant experience and knowledge. And it is System 2 that we associate with our uniquely human abilities to infer, extrapolate and create something original in order to decide on a valid plan of attack for a complex problem.

So, in short, we value effort because effort leads to effective learning.

An interesting, related "learning" observation is that humans enjoy puzzles of all types, driven in part by the sense of triumph (and the concomitant dopamine release) on achieving a goal or solving a riddle that required effort. As described in a recent article, humans excel at such puzzling because we are able to rapidly context-switch—an ability no doubt born of the need to learn how to exist in our complex physical and social environments.

Conversely, the absence of this need in AI systems (to date) has driven almost the opposite focus—to locate and leverage a single most-probable context. In short, in order to navigate and collaborate in complex contexts, humans context-switch with great facility, and the effort entailed leads to a biochemically enhanced learning of new experiences and outcomes.

But I think the principle of effort extends beyond learning; it is also the basis for how we value goods. Surely the physical effort required to mine and refine natural diamonds is why we value these flawed gems over perfect synthetic diamonds? Likewise, the creative effort required to produce a piece of literary or visual art, and the direct connection to that work, is the origin of the premium value ascribed to original art over a reproduction that is, for all other intents and purposes, equivalent or even superior—e.g., in better condition, or with brighter, more vibrant colors, or more attractive framing or binding.

Similarly, I think one can argue that the value we ascribe to in-person meetings over tele-interactions is partly due to the effort required by every participant to be present at the meeting, in addition to the more resonant social experience. This latter aspect forms the basis for my second uniquely human factor:

Humanity Factor 2: Humans Uniquely Value Shared Experiences

As Nick Cave observes in his screed against generative AI: "ChatGPT rejects any notions of creative struggle, that our endeavors animate and nurture our lives giving them depth and meaning. It rejects that there is a collective, essential and unconscious human spirit underpinning our existence, connecting us all through our mutual striving." In other words, there is an inherently collective aspect to the human experience and to the appreciation of effort.

The collective basis for effort appreciation can, I think, be understood by reference to the critical role of social interaction in facilitating group or 'tribal' formation. This is one of Yuval Harari's driving theses in Sapiens: the ability to bond socially, via intimate emotional connections and via shared stories, mythologies and belief systems, has given rise to the dominance of Homo sapiens, despite our being weaker, or having smaller brains, than other species—including some of our immediate ancestors and cousins.

This same social bonding response is surely also the reason that teams, troupes, groups and extended families have such special resonances and why collective 'live' shared experiences such as concerts, sporting contests, festivals, communal or choral experiences or exhibitions are valued more than equivalent events that are experienced individually.

A recent Science op-ed summarized this phenomenon as follows: "Human intelligence is not centered on the optimization of fixed goals; instead, a person's goals are formed through complex integration of innate needs and the social and cultural environment that supports this intelligence." And, in a similar vein, Stephen Fry observed in his 7 Deadly Sins podcast, "We are a mixture of two extremes: the desire to form social communities and the desire for self-determination and sufficiency."

Now, returning to the question of AI and the ability of such systems to fully emulate human behavior, I would argue that an AI system cannot demonstrate effort in any way that a human could appreciate as human-like—surely a simple accounting of power or server cycles consumed does not pass muster in this regard. Similarly, an expression of the number of server nodes or data sources utilized does not reasonably demonstrate an appreciation of collective experience that would resonate with most humans (computing fanatics aside)!

So where does that leave us? I think it is important to continue to compile the set of attributes that are uniquely human, in order to ensure that we appropriately value and incorporate these factors into our human-machine equilibrium equation. In this way, we will not be seduced by the capabilities of an AI system to the point of questioning our own human value; we will instead be able to appreciate the value the AI system can provide in the context of a larger, richer tapestry of human value—and we will return to the rational, desirable place where humans and machines work in concert to define and advance the state of human existence. And, to use a British phrase, my 'starter for ten' is that human effort and collective human experience are good starting points for this greater calculus.

A version of this article previously appeared on the author's Medium page.

About the writer

Marcus Weldon, a Newsweek senior contributing editor, is the former president of Bell Labs and a leader with the ability to connect pioneering research to innovative product development and novel business strategies. Previously, he was the chief technology officer at Nokia and at Alcatel-Lucent and Lucent Technologies. He has a Ph.D. in physical chemistry from Harvard University, and he served as the Neil Armstrong Visiting Professor at Purdue University in 2023 and 2024.
