Being Human in the Age of AI

Wheaton Professors Consider the Indispensability of the Liberal Arts

Words: Jen Pollock Michel ’96

Illustration by Caroline Park ’23

If modern biography is a tale of technological innovation, I know how my own story might begin. I arrived at Wheaton College in 1992 with a Macintosh Classic and no access to email outside of trips to the computer lab. After graduation, when I secured my first full-time teaching position on Chicago’s North Shore, faculty were subjected to weekly administrative injunctions to please use email instead of voicemail. I finished my first master’s degree in 2000, and most of my thesis research was conducted inside the physical building of University Library on Northwestern’s campus.

By 2008, I was the mother of five young children. The most germane technology for my own life involved the dramatic improvement of cloth diapers. It was the same year I listened to a set of four lectures by Associate Professor of Communication Dr. Read Mercer Schuchardt, delivered at College Church and titled “Living in a World with No Off-Switch.” The first iPhone had been released, though I didn’t yet own one. To listen today to the first recording and Schuchardt’s introduction is to time-travel: “Welcome, everyone, to the second annual Culture Impact series. How many of us had enough nerve to wear our cell phones on our belts this evening? How many of us have iPods? BlackBerries?”

The technological moment is the historical moment. Gunpowder and the cotton gin; the smallpox vaccine and the gas-powered car; the calculator, Walkman, computer, and smartphone: each tool alters the known world and bookmarks our place in time. ChatGPT (Generative Pre-trained Transformer) is the most recent biographical timestamp, and though I’d read about the public launch of the text generator in the headlines at the end of 2022, I paid little heed. Then ChatGPT became a frequent topic of conversation among students in my MFA program. “We’ll all be out of our jobs,” some predicted pessimistically.

Months later, I finally opened ChatGPT for myself. I prompted it to detail “various considerations and questions regarding the appropriate use of AI.” Within seconds, 13 categories and 26 questions unfurled on the screen, as if magically handed down from heaven. Here’s one example of those categories and questions:

Category 13: Education and Literacy.

How can we educate people about AI so they understand its capabilities and limitations?
What efforts are needed to increase AI literacy among the general public?

If this was the sophisticated language, or “thought,” made possible by ChatGPT, it certainly seemed we were in trouble.

On November 30, 2022, OpenAI released an early demo of its large language model, ChatGPT. Within five days, the text generator—with startling capacity for writing emails, wedding toasts, poems, college essays, computer code—had more than one million users. Within two months, ChatGPT had reached more than 100 million users. The chatbot’s swift success has proven a striking example of German sociologist Hartmut Rosa’s concept of technological acceleration, reminding us that the world we inhabit today, with its particular set of tools and devices, becomes estranged from us in shorter and shorter intervals, as new technologies supplant the old. The past is more quickly past, as the present is increasingly compressed. Now never lasts as long as it did yesterday.

The semester after ChatGPT’s online release, college students were quickly adopting the new technology, and Wheaton professors began to suspect—and sometimes confirm—its use for their own classroom assignments, despite forewarnings. One philosophy professor grew convinced AI was now generating some of the densely distilled 500-word summaries he assigned of the central claims and arguments of Plato’s Phaedo. Another member of the department received a final paper he later confirmed was written by AI. “The prose was a little bit smoother than I expected. Not to say that it was beautiful or deeply insightful. Just smoother, grammatically, syntactically.” There was also an obvious irony: “A couple of sources were made up.”

This particular professor described the hit he took when he received the paper: as a course instructor but also as a cultivator of philosophy, this ancient love of wisdom. He couldn’t help but see an obvious “pedagogical disconnect,” given the course had sought to probe the human goods of love, responsibility, artistic creation, suffering, honesty, embodied engagement, authenticity, risk, and passion.

Professors I interviewed described the confusion many students face regarding the ethics of these technologies—and their own disappointment when students elect for such “efficiencies.” “These students are robbed of the opportunity to express themselves,” said Dr. Ryan Kemp, Associate Professor of Philosophy. In his essay published in Zeal: A Journal for the Liberal Arts, “AI and the Struggle to Think Humanly,” Kemp argues educators must take a stand for the digital student “misfit” and the “existential stakes” involved in writing.

To read his description is to realize he speaks of someone like me: “the technologically clumsy; the electronically backward and chronically under-informed; those with ink-stained fingertips and books made of paper; the ones who write poetry even though their cell phone turns a better phrase, and when you ask them why, they look at you blankly and say something ridiculous like, ‘cause I have to.’”

In May 2023, Dr. Richard Gibson, Professor of English, led the annual faculty seminar hosted by Wheaton’s Center for Applied Christian Ethics (CACE); this year’s theme—proposed to CACE Director Dr. Vincent Bacote by another member of Wheaton’s faculty, Dr. Nathaniel Thom—centered on AI and liberal arts education.

In Gibson’s estimation, the AI moment will be “decisive” for the College’s life. “There are competing education models,” Gibson explained, “and this is really a threat to what we do. There are some internal questions that must be asked: how does our model get students ready for the marketplace and for life in this kind of technologically enhanced society? What is the degree to which our alumni will support us (financially) in re-imagining what we do?”

During his doctoral studies, Gibson became interested in the history of the book and the history of media, and those preoccupying questions became a hub for the digital humanities in the early 2000s. Gibson was at the “right place, right time to fall in love with computers again,” he told me in an interview, and for the past ten years, he has been studying text generators and publishing regularly on themes of the human and the digital in places like The Hedgehog Review.

Like many of the Wheaton faculty who attended a three-day seminar on AI and liberal arts education in May 2023, I had little interest in understanding, much less experimenting with, the capabilities of large language models until Gibson explained them to me more fully. According to Gibson, the most important thing to know is that LLMs are “prediction engines.” These models have taken in what he describes as an “enormous ocean of words”: most of the internet, including Google Books and Wikipedia. Having “learned” from the diet of language with which they were fed, these models can now “statistically describe the relationships between words in the most fine-tuned way.” But if their language has qualities of the “human,” the real achievement is not sentience but mimicry.

Chatbots do not understand the words they produce or the conceptual realities of the world they name. Indeed, their fluency breaks down in situations where language demands embodied knowledge of the physical world. In effect, the language that these models manipulate has been reduced to numbers, to meaningless representations of words. “Every little molecule in the linguistic ocean has a number,” Gibson said. What the models do is “analyze the relationships between all the numbers” and assign a magnetic attraction or repulsion between words—a kind of probability score of relationship.
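Gibson’s “prediction engine” can be pictured with a toy sketch, my own illustration rather than anything Gibson described: count which words follow which in a tiny corpus, then turn the counts into probability scores. Real LLMs use neural networks trained on sub-word tokens across vast data, but the underlying statistical principle, scoring the relationships between words, is similar.

```python
from collections import Counter, defaultdict

# A tiny, made-up corpus standing in for Gibson's "ocean of words."
corpus = "the dog chased the cat and the dog barked".split()

# Count how often each word follows each other word (bigram counts).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most probable next word and its probability score."""
    counts = following[word]
    total = sum(counts.values())
    best, n = counts.most_common(1)[0]
    return best, n / total

word, prob = predict_next("the")
print(word, round(prob, 2))  # "dog" follows "the" in 2 of 3 cases
```

The model “knows” nothing about dogs or cats; it only tallies numbers, which is precisely the point Gibson makes next.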

“Our intellectual tradition didn’t prepare us to explain these things,” Gibson admitted.

For participating faculty in the May seminar on AI and liberal arts education, the pedagogical threat posed by LLMs seemed obvious. “I cannot afford to ignore ChatGPT, as a professor or as a participant in society,” concluded Dr. Aubrey Buster ’09, M.A. ’11, Associate Professor of Old Testament, who attended the seminar and wrote about her experience. Of the constituent writing tasks involved in the traditional exegetical papers Buster assigned in her classes, she could see that LLMs were broadly capable of accomplishing most: engaging in close readings of the text, comparing English translations, outlining the structure of the passage, comparing work with that of the scholarly community. “The only thing that ChatGPT cannot accomplish,” Buster concluded, “is the accurate citation of sources.” For Buster, the most urgent question raised by LLMs is this one: What does it mean to write?

Faculty participants could immediately see that many traditional assignments, such as the précis assigned for Plato’s Phaedo, would have to be revised, possibly even abandoned. And though many were loath to admit the inevitability of the change wrought by this new technology, Gibson underscored the realities sure to affect future education. “We do know word processors are going to incorporate LLMs into their software,” he said. “This will suggest to students that such tools are academically neutral, no different than spellcheck or Grammarly. We’re going to have tired students who are prone to making bad decisions.”

The seminar raised large and looming questions, questions left unresolved by participants who, no matter their degree of relative enthusiasm or suspicion, shared a collective anxiety about the potential harm of these technologies to “developing minds.” These questions were “ultimate” in nature, according to Dr. Denise Daniels ’91, Hudson T. Harrison Professor of Entrepreneurship and seminar participant. “Certainly, computers can do many specific tasks better and faster than we can, and it makes sense to offload some of our work to them,” Daniels concluded. “But at what point do our efforts at offloading tedium begin to diminish our own skillset, abilities, or character? Will using generative AI diminish our humanity?”

How then could Wheaton’s model continue to cultivate habits of deep reading, sustained intellectual attention, logical and beautiful self-expression? What academic policies would be needed for the college most broadly—and each department, more narrowly? If students were afforded the opportunity for the efficiencies LLMs could provide, when were those “shortcuts” appropriate (akin to the operations of a calculator) and when were they dishonest? Further, how might Wheaton students continue developing the habits of virtue that willingly choose struggle for the sake of academic and spiritual formation? Against which standards of success would students be measured?

These challenges are paramount, and many faculty with whom I spoke expressed renewed commitment to the model of liberal arts education, especially as compared with models interested primarily in job skills and professional competencies. “Our educational goals at Wheaton College are person formation, not just helping people become receptacles of information,” said Bacote. Dr. Adam Wood, Associate Professor of Philosophy and Department Chair, seconded Bacote: “In the Christian liberal arts tradition, we are primarily concerned with developing virtues in our students. You are freed from vices by developing virtues. I take it that’s what Wheaton College is primarily about, and I think this is really important work.”

“We are not only interested in turning out glistening products,” said Gibson. “Many of the questions we ask students are ultimate questions that have been answered before. The liberal arts tradition has always held as its principal question: What does it mean to be human? Each particular child of God must ask and answer the question for herself. We’re not expecting a novel answer. What matters to us is that our students have thought their answers through.”

As Buster reminded me, writing is a technology that is understood in “culturally contingent” ways. As she outlined in her written reflection from the seminar, what was once the province of the ancient scribe has become the purview of the contemporary author. “The concept of what writing is and its relationship to the human mind has changed before. Perhaps the large language models will change the way that we think about writing and authorship yet again.”

Although I’ve never gone full Wendell Berry and forsworn owning a computer, I don’t reflexively grant the truth that technology—its many forms, its many uses—can be a faithful expression of Christian commitment. Given my penchant for technological pessimism (and maladroitness), it was refreshing to speak with Associate Professor of Computer Science Dr. Thomas VanDrunen, to consider an alternative perspective on AI. (For technical reasons, VanDrunen prefers to speak of “machine learning” rather than “artificial intelligence.”) “I think of computer science as a creative field,” he said. “It’s about making virtual things, algorithms, creating virtual worlds that run on computational technology. With machine learning, we can create things, tools, and technologies that can benefit people. These are the things God commanded us to do when he said to take dominion of the earth.”

Bacote, whose scholarly interests have focused on theology and culture and theological anthropology, as well as faith and work, echoed VanDrunen’s theological formulation. “Humans are created in the image of God, endowed with capacities for creation,” Bacote said. “These capacities are how we get technologies. We’re making something of the world, working with what’s here, producing things. Technology is an expression of what we do as humans.” Bacote also cited the cosmic repair we often seek through technological innovation—the ways we hope to alleviate, even eliminate suffering as we can. Machine learning wasn’t simply a tool for cheating on college essays; it was learning to perform life-saving tasks, such as cancer diagnosis.

Dr. Alan Wolff, Wheaton’s Chief Information Officer, added his own moderated enthusiasm about technology, emphasizing the important functions AI performs on behalf of the College’s administrative operations and cybersecurity. “I’m a technologist,” he said. “I love tech; it’s done a lot for the world, even if it won’t be able to solve people’s fundamental problems.”

It is tempting to formulate facile responses to the harms of AI technology, but this is to ignore its many benefits as well as the ecological change wrought by technological innovation. (Cue one of the most important insights I learned from Schuchardt’s 2008 lectures—that it’s not only the pot that deserves attention for killing the frog, but the boiling water.) Even if we might forfeit the use of a tool for ourselves, we can’t reverse the expectations it creates in society at large. Whether or not we embrace the efficiencies of text generators for our own tasks, we aren’t likely to escape the appetite for quick fixes of other kinds. Because this is technology’s fundamental promise to us: more benefit with less effort and time.

Indeed, one might see that the appetite for more and more text, in less and less time (such as LLMs produce) is a further stop on the runaway media train Henry David Thoreau identified in the middle of the 19th century, which he characterized as an age of “mental dyspepsia.” “I do not know but it is too much to read one newspaper a week,” Thoreau said in a lecture titled “Life Without Principle.” His arguments about the dangers of “news-as-spectacle” can be even more easily assigned to our era of computer-generated text than to his own era of newspapers printed by steam power. The danger, in other words, isn’t separate from the efficiencies but inherent to them. It’s the danger Kemp identified, that to avail ourselves of LLMs, “We will have been more efficient. We will have saved time. But at the end of the day, we may have nothing to say.”

Still, this technological moment (and its philosophical apprehensions) may not be as unlike previous moments in history as we might have thought. As Wood mentioned, in one of Plato’s dialogues (which, ostensibly, I should have remembered from Dr. Robert O’Connor’s Philosophy 101 course at Wheaton), Phaedrus and Socrates engage in a debate about the relative virtues and vices of writing things down. If we write things down, Socrates argues, our interlocutor won’t be able to converse with us face-to-face. “Virtue won’t be developed through dialogue if we are reading dead letters,” summarized Wood, while also admitting the irony of the argument’s provenance. “Scholars of Plato disagree on what exactly to make of it since he is writing this down.”

Just as societies transitioned from oral tradition to written record, we are undergoing another communicative sea change: from human utterance to digital text generators. This is a moment for the Christian technologist and the Christian theologian, the Christian professor and pastor and parent: not simply to decide technological convictions but to recover what it means to be human, in all its limitation and glory.

To anyone who thinks the answer to this question is straightforward, Dr. Marc Cortez, Acting Dean of the Litfin School of Ministry & Theological Studies and Professor of Theology, emphasized that after a semester-long course attempting a response, students still didn’t have it resolved. We know at least this much, Cortez said: that to be human is to be “made in the image of God” but also in “need of a process of redemption.”

Most importantly, Cortez emphasized, “If Christ is the image of God, as we know from Colossians 1:15, our understanding of humanity reveals itself in him.”

Since at least 2004, when Gunther H. Knoedler Professor of Theology Dr. Daniel Treier attended a faculty seminar on technology led by then–Wheaton professor Dr. Alan Jacobs, Treier has understood the need for a robust theology of technology. Recently, he’s begun work on a monograph that examines the work of four important Christian thinkers (or groups of thinkers) on technology: Jacques Ellul, Wendell Berry, Albert Borgmann, and the Neo-Calvinists. Although the book as it is currently conceived isn’t poised to address AI specifically, the insights of each of these thinkers are meant to help us form a rubric for Christian technological discernment that might be applied to questions raised by LLMs.

“Ellul would raise, certainly, the question of power,” Treier explained. “In what ways does this new technology enhance the power of technological elites? In what ways does it enhance the power of the lowest common denominators of mass culture?” (Treier reminded me that Wheaton College has the largest North American Ellul archive, on the third floor of Billy Graham Hall, a place he spent considerable time during his sabbatical as he began this project.)

Wendell Berry, another of Treier’s interlocutors, “values embodied relationship and the significance of physical spaces.” In Treier’s words, Berry asks, “To what degree are we doing more internet-generated/enhanced work, expecting productivity to be enhanced, but not enhancing craft?” Philosopher Albert Borgmann, on the other hand, examines the formative nature of practices. “We can see that ChatGPT might have good uses, in general,” explains Treier, “but it won’t be fostering deep thinking and writing. It will be encouraging shortcutting and outsourcing that aren’t likely to be formative unless we are intentional.” Finally, according to Treier, the Neo-Calvinists raise the question of progress, how technological progress relates to providence. “There is a temptation to baptize anything that comes along,” Treier concluded.

To consider Treier’s project, in light of my own recent learning about AI, is to envision the communal conversation that an institution like Wheaton College might host. “I doubt we can discern/resist technologies unless we get into thicker forms of community,” Treier noted. Although differences of conviction and commitment will be evident in such a conversation, as in the May faculty seminar, I might hopefully imagine acts inspired by the Christian liberal arts tradition: sustained (and prayerful) probing, searching, submitting, investigating, instructing, comparing, analyzing, contrasting, attending, learning, unlearning, testing, self-expressing.

Do those activities describe the reasoning capacities that LLMs, trained by algorithms and nourished on human language, are acquiring? Apart from “prayerful,” perhaps. But as Gibson pointed out, “A computer can’t undertake its own tasks for its own pleasures and desires.” Maybe this is what the liberal arts tradition means to inculcate most—and a learning of which AI will never be capable. It’s this philosophy Kemp spoke to me about—this educated love for a range of human and divine goods, not limited to knowledge, not greater than charity itself.

At noon on an August day, I paused work on this article and opened my prayer book to the midday office. The selected psalm—Psalm 100—rang out from the page. “Know that the Lord, he is God; it is he who made us, and not we ourselves. We are his people and the sheep of his pasture.”

People was the word to jar that day, as I worried about the threatening capacities of machines. Would I, as a writer, be replaced by text generators? In our interview, I asked VanDrunen for technological prognostications, but the computer science professor resisted answering. Beyond “he will judge the living and the dead,” there isn’t much we know for certain, he said.

I suspected that the growing use of AI would drive readers to expect more and more text in less and less time, as they already did. I suspected I myself would be tempted to avoid the hardship involved in learning, the patience involved in good work. But to pray the ancient psalm was to remember my humanity and its many attendant gifts. We are the people of God, made in the image of God, for the pleasure and purpose of God. We are God’s workmanship, or poetry, as the apostle Paul put it in his letter to the Ephesians. We are his handiwork, even an expression of his artifice.

Whatever the future, this was a status we could never lose.