
 What ChatGPT means for higher education

April 26, 2023 | By Dean Sevin Yeltekin

 

ChatGPT, a chatbot released by OpenAI last November, has captured the world’s attention with its uncanny ability to compose essays, write song lyrics, play games, answer test questions, and take on other tasks traditionally associated with human intelligence.

The academic community has responded to the advent of this new technology with a mixture of fascination and concern, keenly aware of its potential to transform the way knowledge is produced, conveyed, and retained.

In the Q&A below, Mitch Lovett, senior associate dean of education and innovation at Simon Business School, weighs in on the potential impact of ChatGPT on higher education.

What do you view as ChatGPT’s primary strengths and weaknesses?

Where ChatGPT seems to excel most is in predicting text that is likely to be useful to the user, based on the information available to it. A student taking a finance class, for example, might ask it to explain and contrast specific kinds of financial tools. ChatGPT might not produce the most elegant answer, but it will produce a B+ answer in a short amount of time. On the flip side, ChatGPT will often make up sources or information while trying to predict something, and, so far, it is not particularly good at math. It is also less useful when the information does not exist, such as when someone needs to write the website copy for a brand-new product. When ChatGPT tries to venture outside the space of prediction, it becomes less effective.

Do you think it can ever fully replace human intelligence?

No, but I would be wary of underestimating technology like ChatGPT. A few years ago, we never would have expected to encounter an AI program that can create something people struggle to distinguish from human art. Over time, it will become capable of more and more. Right now, though, it is just making predictions. It knows that certain things go with other things, but it does not know truth from lies. It doesn’t have a sense of ethics. That’s an area where human judgment is indispensable.

What I think is most profound is how it is going to redefine expertise in some fields. We are going to use ChatGPT to write things for us—text, code, outlines—so that we can complete our work faster. That means some skills will become less important and others more important.

What are some ways ChatGPT can improve the quality of education at Simon?

For both students and professors, one of its primary functions will be to assist in search. Constructing literature reviews will become significantly less time-intensive when using ChatGPT to aggregate sources and summarize information on a given topic. And when we think about ways to integrate it into classroom assignments, like using it to write code or organize information to help solve a case study, it is clear that students will be able to go further in the topics they are learning about. They will be able to do more in an assignment and do it more efficiently. They may not be able to learn more information overall, because the human brain can only absorb so much in a short time, but they will learn different information and get more practice designing solutions rather than implementing them.

I can also imagine ChatGPT serving as a first-line teaching assistant (TA) or discussion-board assistant, helping students in asynchronous classes more quickly and efficiently than their professors and human TAs can. Having an outside curator of information to post questions, responses, and comments will enrich the asynchronous interactions that take place.

What are some of the downsides or pitfalls of using ChatGPT in an academic setting?

There are significant implications when it comes to academic honesty. Theoretically, a student taking an online asynchronous course could use ChatGPT to complete all their assignments and take their exams. Many faculty members at Simon already use various analytics tools to detect plagiarism—for example, by comparing student answers to see if any are unreasonably similar. But as ChatGPT produces writing that is increasingly indistinguishable from something a student would produce, it will become an arms race to detect the use of AI. Some professors will address this by switching to in-person, handwritten exams whenever possible, while also ensuring that the course content is so specific that ChatGPT becomes ineffective. These are strategies that involve trying to block ChatGPT. Others will embrace the use of ChatGPT, but will do so by adjusting assignments and exams to compensate, similar to the concept of allowing a one-page cheat sheet on an exam or holding an open book exam.

Of course, there is also the danger of placing too much trust in ChatGPT. A student taking an introductory course in accounting, for example, might ask it to answer an accounting-related question but lack the base knowledge to discern the accuracy of its answers. Without any background in the subject matter, a student may find it difficult to know whether ChatGPT is producing an A- answer or a complete fabrication. This is where our definition of expertise becomes important. Part of educating students is helping them understand how best to use AI and evaluate when the AI might be producing meaningless or less valuable responses.

How do you expect ChatGPT to change the way students learn?

I find it helpful to think about the analogy of using calculators in math. My children’s elementary school found that allowing students to use calculators from an early age weakened their ability to do simple computations in their head, a skill that is helpful in doing more advanced math. In the same way, the introduction of ChatGPT might weaken students’ ability to write. Writing is certainly something we have traditionally viewed as an important skill. But how important is it in relation to the tasks we need people to be able to do in a world with AI?

If ChatGPT is always creating a generic setup to tailor, it allows students to avoid the mundane and repetitive—but doing mundane, repetitive tasks over and over might be helping them develop intuition and judgment. One of the most important tasks that confronts us as educators is figuring out how much of this mundane work is needed to become an expert when AI is available.

On the other hand, we may be overreacting to the fact that future students will learn differently from their predecessors. For them to be considered experts on something, it will be less essential to recall facts that will always be at their fingertips and more important to apply judgment and critical thinking. There is little chance of stopping the integration of AI tools like ChatGPT into education, so our job is to decide what fundamental knowledge is necessary for mastering a subject and to train students accordingly.

What questions are we not asking about ChatGPT that we should be asking?

ChatGPT raises fundamental questions about truth that we must grapple with. This technology produces text that may or may not be accurate and presents it as fact. When asked the same question twice, it may produce contradictory answers if it incorporates new information in between searches. Will truth start becoming blurrier for the people who use it?

I also think about how search engines like Google form a layer between the user and a website. ChatGPT, on the other hand, will put a filtered version of a website in front of you. You won’t even need to visit a website directly. What are the implications when it gets things wrong? Where do we draw a boundary line between what is a search and what is not? There are plenty of legal questions to consider surrounding ownership and responsibility for the information that is presented.

Mitch Lovett


Mitch Lovett is the senior associate dean of education and innovation and a professor of marketing at Simon Business School.


Follow the Dean’s Corner blog for more expert commentary on timely topics in business, economics, policy, and management education. To view other blogs in this series, visit the Dean's Corner Main Page.

 

 
