Rewilding our AI hubris

January 18 2024 – 11:20AM

For all its benefits, one of the drawbacks of the digital age is how easily we mistake information for knowledge. Because we can find the answer to most of our questions, we can begin to believe that we are smarter than we are. Because we can find an instructional video to help us perform any task - everything from car maintenance to, yes, even making cake - we can begin to believe we can perform these tasks. But knowing about something is different from knowing how to do it, or whether you even should.

This article reflects on an important passage from a blog post titled Algorithmic Induction by the Cynefin group. The writer's perspective resonates with a feeling I have had for the better part of a decade - that AI hubris (and the subsequent worship) is, in fact, evidence of our own (human) ineptitude, and exposes the shallowness of modern-day work and its prevailing culture.

Here is the excerpt that illustrates this point:

"In a much-reported study by Boston Consulting Group, which had a sample size of 7% of the workforce, they found that “Consultants using AI finished 12.2% more tasks on average, completed tasks 25.1% more quickly, and produced 40% higher quality results than those without”. When I first saw that my response was not, this is wonderful. Rather it proved that all consultants do these days, is cut and paste things they probably don’t understand. If an AI system can pass an MBA exam, then all that demonstrates is the poverty of an education system that rewards synthesising other peoples’ material. Rather than seeing these as illustrations of AI’s competence, we need to see that these demonstrations of its ‘capability’ expose us to examples of the increasing poverty of human inventiveness, hopefully before it is too late to do something about it. We need to find new and different ways of engaging employees, customers, academics, citizens and so on in new, more innovative forms of sense-making to rebalance or rewild the role of humans. The fictional character of Ned Ludd was still the possessor of a craft, which may have partly been automated but which is still needed. That will require us to think differently than the current hype and baffle the lemming-like dash to the cliffs."

Dave Snowden, the writer, poses an important question to reflect upon: When can the human abductive capacity be best deployed, and how can we ensure it continues to develop and evolve?

Abductive capacity refers to our natural ability to feel and listen to our gut, and to respond to our hunches intelligently.

We all recognize the importance of understanding people's knowledge and how they acquire it, especially when it comes to AI systems. It is crucial to retain certain skills and knowledge that future generations can build upon. Moreover, there is a need to differentiate between AI-generated content and human input, and to determine the contexts in which each is most effective. In a sense, it is about attaining a balance, and the term rewilding captures this intention well.

Rewilding is the process of restoring a particular ecosystem or habitat to its natural state. In the context of AI and human abductive capacity, rewilding can be read as a metaphor for restoring certain skills and knowledge to their natural state. Just as a forest must be allowed to recover after being disturbed, society needs to allow certain skills and knowledge to return to (or remain in) their natural state without interference from technology. To rewild skills, society must let them evolve naturally, without imposing artificial constraints or interfering with their development. This is how we allow our natural abductive capacity to continue to flourish.

End note:

I reference the graphic (below) when it comes to uncovering the true use of technology for a given problem. I believe the larger part of rewilding - of achieving a balance - lies in exploring the dynamism between substitution and transformation, and in finding where true end-user value lies across that spectrum.

(adapted)