Published Jun 22 2023

AI, we need to talk: The divide between humanities and objective truth

AI existence depends on humanity. We develop(ed) it, use it and evaluate its use, but humanity is beginning to question its relationship status with AI.

Pioneers such as Geoffrey Hinton, formerly of Google, are leaving their positions to speak out against AI, and Gen Zers are actively seeking out and engaging with “less smart” phones.

Could our fascination with objectivity be the Pied Piper that led us to develop a machine some of us now fear and avoid?

The bewitchment of objectivity

In many developed countries, we seem enchanted with ideas of objectivity and emotional neutrality. Often driven by noble intentions such as equality and egalitarianism, we recognise that our emotions can colour our perceptions.

We build frameworks and checklists to guide us towards impartiality and dispassionate decision-making in hiring staff, strategy development, workplace support, and personal life decisions.

Along comes AI. Many of us are enticed by AI’s capacity for detached information processing – AI outputs don’t seem to be hamstrung by sentiments or emotions.

While our perception that AI is unbiased is a mirage (because humans are at the helm), we seem to think that AI’s “indifference” can save us from ourselves. We want to believe that AI can excel where humans fail, in being truly objective.

Are arts and sciences distinct from each other?

Our educational system shows the cracks of this divide between humanities and objective truth.

In the US, a Bachelor of Arts (BA) is commonplace, no matter the field you study. My undergraduate biology honours course, for instance, included all the science courses required for a BSc, plus humanities and arts.

For me, these electives included courses on Indian religion and philosophy, and modern art.

I learnt that the concept of nothingness emerges from Buddhism.

I learnt how movement could be created through colour in my art history courses, where my teacher described how modern art was “meant to be accessible and universally understood”.

“If this is the case, why do we need an entire course focused on understanding these artistic movements?” I asked. Without missing a beat, she responded: “This is the paradox of modern art.” And with that answer, I also learnt the definition of irony.

Despite never leaving the lecture hall, each of these courses challenged my understanding and shifted my worldview. In my science courses, I learned facts and figures. In my humanities courses, I learned how these truths are situated in the natural world.

Fast-forward to my first year in Australia. I quickly learned not to discuss my BA, as I was met with unmistakeably shocked (sometimes horrified) facial expressions. “How could you possibly have a BA and a major in biology?” I was asked. My hard-earned BA degree was often met with contempt, not esteem.

Some had the misconception that I didn’t take many science units in my degree. Others revealed on which side of the age-old rivalry between STEM (science, technology, engineering, maths) and HASS (humanities, arts, social sciences) they stood, with many valuing the former over the latter.

The STEM-HASS chasm is woven through the fabric of Western societies. In Australia, it’s built into the university fees and our perceptions of “job-readiness”. When the chips are down, it seems to be the HASS degrees that are slashed.

We have a pervasive belief that STEM and HASS are somehow severable, and that objectivity (which we tend to associate with STEM) reigns supreme.

AI’s role: Can knowledge be severed from humanity?

Our society seems entranced by the idea that we can create an artificial divide between the world’s knowledge, and the people interpreting and applying this knowledge.

Positivism, which forms the foundation of most Western science, is based on the idea that truth is verifiable and objective. This leaves little room for an appreciation of the social context in which this knowledge is produced and applied.

As Monash bioethicist Peter Douglas states: “The values that underpin science as an endeavour – honesty, trust, cooperation, curiosity – can’t themselves be derived from an objective understanding of the world.”

As AI is further woven into the fabric of our lives, we risk falling into a kind of techno-solutionism that defines intelligence by what is tangible and observable.

Vulnerability and critical reflection are inseparable from social movements supporting equity and decolonisation, and are critical for feeling connected to our communities.

Yet, AI is famously (or infamously) incapable of such human skills. Googling “AI and vulnerability” yields results about “cybersecurity”, not empathy and connection.

AI represents a holder of knowledge that is entirely severed from the humanity that generates such knowledge.

As one commentary’s title puts it, “It takes a body to understand the world…”, and AI doesn’t have one. AI cannot reflect on the decisions it makes, and it can only take in information through a narrow set of inputs derived exclusively from the tangible and observable.

Humans, on the other hand, understand truth through a broad range of inputs such as embodied experience, cultural context, visual and auditory cues, and physical touch, among many others.

These inputs are integrated across our human intellect, filtering through our memories and our emotions.

In this way, can what we as humans define as truth ever be “objective”? What benefit is it to have facts and figures devoid of the context within which they occur?

In contemporary society, with our humanity on the line, I encourage us to reconsider our idolatry of objective truth, or a view that STEM and HASS are divisible.

Instead of firing AI ethicists and idolising AI’s “objectivity”, what if we valued ethics, connection, and truth as inseparable parts of the whole of humanity?

In such a world, tangible and AI-accessible outputs – such as papers, citations, talks, and artefacts – would depreciate when separated from human intangibles such as connection, creativity, and hope.

Just as currency carries no intrinsic worth without humans, truth and science are also vacant without humanity.


About the Authors

  • Michelle Lazarus

    Associate Professor, Faculty of Medicine, Nursing and Health Sciences

    Michelle is the Director of the Centre for Human Anatomy Education, and Curriculum Integration Network lead for the Monash Centre for Scholarship in Health Education. She leads an education research team focused on exploring the role that education has in preparing learners for their future careers. She is an award-winning educator, recognised with the Australian Award for University Teaching Excellence in 2021 and a Senior Fellowship of the Higher Education Academy (SFHEA), and is the author of The Uncertainty Effect: How to Survive and Thrive Through the Unexpected.