By Aadi Mehta, November 12, 2025
Artificial intelligence won’t replace workers, but those who understand it might, economics professor and data scientist Carsten Lange told students during the Digital Humanities Consortium’s “Understanding Generative AI” workshop Oct. 28.
Lange said that understanding artificial intelligence is essential for anyone entering the workforce. He said the goal of the workshop was to help attendees understand the inner workings of AI, especially how words and meanings are represented numerically.
“I explained how words are coded in AI systems — that’s called tokens — and how to give these tokens meaning,” Lange said. “I showed examples where you type a word, like ‘dog,’ and it gets changed into a list of numbers and how the list of numbers for ‘king’ and ‘queen’ is more similar than the one for ‘spaghetti.’”
According to a 2025 article by the AI publication “Towards AI,” AI systems cannot understand words or meaning directly, so they must work with numbers. Each word, or token, is encoded as a list of numbers that captures its relationships with other words based on patterns in text. Words like “king” and “queen” are conceptually related, so they will have similar numerical patterns, while a word like “spaghetti” will have different values because it appears in different contexts.
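The short Python sketch below illustrates the idea with made-up numbers rather than values from any real model: each word maps to a list of numbers, and a similarity score shows that the lists for “king” and “queen” are closer to each other than either is to “spaghetti.”

```python
import math

# Toy embeddings: each word becomes a short list of numbers.
# These values are invented for illustration; real systems learn
# vectors with hundreds or thousands of dimensions from training data.
embeddings = {
    "king":      [0.90, 0.80, 0.10],
    "queen":     [0.85, 0.75, 0.20],
    "spaghetti": [0.10, 0.05, 0.90],
}

def cosine_similarity(a, b):
    """Score how similar two vectors are (1.0 means identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity(embeddings["king"], embeddings["queen"]))      # high
print(cosine_similarity(embeddings["king"], embeddings["spaghetti"]))  # lower
```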
Lange also showed attendees how AI places words into the context of a sentence to give them meaning. However, he cautioned students to watch how AI carries out this process before moving on to the next step.
“Generative AI is a numerical process that estimates the next word in a sentence, and that’s all it does,” Lange said. “It creates very powerful results, but there is no personality behind it.”
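As a rough illustration of the “estimate the next word” process Lange describes, the sketch below picks a next word from a probability distribution; the probabilities here are invented, while a real model would compute them from the numerical representation of the entire preceding text.

```python
import random

# Hypothetical model output for the prompt "The dog chased the".
next_word_probs = {
    "cat": 0.55,
    "ball": 0.30,
    "mailman": 0.10,
    "spaghetti": 0.05,
}

words = list(next_word_probs)
weights = list(next_word_probs.values())

# Generative AI repeatedly samples a next word from a distribution like this,
# appends it to the text, and then estimates a new distribution for the word
# that follows.
print(random.choices(words, weights=weights, k=1)[0])
```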
Accounting student Kaylee Johnson, who is taking a course in AI ethics, said the event helped her ask better questions to make sense of generative AI content instead of relying on her own human tendencies.
“It definitely furthered my thinking into how much morality we can accept from these systems and whether or not we should be imposing morality on it, or if we should just consider them like calculators,” Johnson said.
One of the biggest concerns with AI is the need for organizations and people to develop frameworks for responsible AI use, according to a recent study published in the Journal of Strategic Information Systems.
“Learn how to work with (AI) by trial and error, and keep in mind it’s a numerical algorithm that can make errors,” Lange said. “Always verify the results. Take everything with a grain of salt.”
Lange said he believes the ongoing debate surrounding AI use in education should not be about whether to educate people about AI but about how much those who fall behind will suffer.
“We have to be ahead of the trend,” Lange said. “AI is not going to take your job; the colleague who knows AI will take your job.”
According to Rachel Tamar Van, associate professor of history and Digital Humanities Consortium co-director, Lange’s workshop is a continuation of a workshop series being piloted this semester.
“In terms of the DHC AI Literacy series, our YouTube page has the first event in the series, an introduction to AI Literacy and Ethics, and Carsten is giving the campus an opportunity for a more hands-on workshop on generative AI,” Van said.
The DHC regularly posts upcoming events on its website, where students and campus staff can reserve spots for upcoming events.
Feature image courtesy of Anais Hernandez