
How AI Will Impact Teacher-Student Relationships: A Conversation With Professor Lance Cummings

by Edem Gold, August 1st, 2023

Too Long; Didn't Read

In this interview, I talk to Professor Lance Cummings, from the University of North Carolina Wilmington, about the Future of Teacher-Student Relationships in the age of AI. We talk about the impact AI will have on Students and Teachers alike as well as the effects of Large Language Models on the overall Learning Process.
“My hope is that AI will deepen our interactions with students by doing the mundane work for us, but this depends on institutions and teachers deploying AI in the right way.” -Lance Cummings


Teaching is not an easy job, but it is one of the most rewarding. It is also a genuinely complex one, and the increasing mainstream adoption of AI in academia means the relationship between teachers and students will undoubtedly change.


In this interview, I talk to Professor Lance Cummings, from the University of North Carolina Wilmington, about the Future of Teacher-Student Relationships in the age of AI.


We talk about the impact AI will have on Students and Teachers alike as well as the effects of Large Language Models on the overall Learning Process.


Also, you should check out Professor Cummings’s Substack.

Enjoy!

Edem: It’s incredible to have you as a guest. To begin, could you talk about yourself: your professional qualifications and such?

Lance: So I am an associate professor of professional writing at the University of North Carolina Wilmington, where I research and teach writing in linguistically and culturally complex environments. Instead of professional writing, I prefer to say content and information development, because I think it better represents what professional writing looks like these days.


It's not just generating text! My doctorate is in writing and rhetoric, so I tend to take a more audience-driven approach to content development.


I've been using AI since 2021, which has been more of a side project. I encountered OpenAI while researching the creator economy and thought to myself, "We are all going to be using this in 5 years."


Well, it was only two, and since then, I've been exploring how AI fits into the writing process and have started implementing AI writing tools into my classes in ways that help students write in tandem with AI (not letting AI take over).

Edem: Let’s dive right in! How might AI tools assist teachers in identifying and addressing individual learning challenges for students?

Lance: My experience is mostly in rhetoric and writing, so I focus on how AI can help students in the writing process.


When I had my students use AI in my writing classes last semester, students talked about how AI was really useful for coming up with ideas, alleviating writer's block, and enhancing weak spots in their writing.


I think students these days are more uncomfortable with a blank page than I was in my day. Most of my students said they feel much more confident as writers after integrating AI into their writing process.


But I should clarify that we were not using ChatGPT most of the time. We used Sudowrite, which is designed to write in tandem with you, not for you. I do find prompt engineering to be a way to teach students how to think and create frameworks.


To be good at prompting AI, you have to know the framework well enough to write instructions. For example, I used prompt engineering to teach students how to write good dialogue.
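For illustration only (this is a hypothetical sketch, not material from Professor Cummings's courses), a framework-driven prompt might be assembled like this in Python; the dialogue principles and the helper function are invented for the example. The learning happens in writing the framework itself: the model can only apply what the student can already articulate.

```python
# Hypothetical sketch: a student must spell out the dialogue framework
# before an AI can be asked to apply it to their scene.

DIALOGUE_FRAMEWORK = [
    "Every line should reveal character or advance the conflict.",
    "Characters in the scene should want different things.",
    "Prefer subtext: people rarely say exactly what they mean.",
    "Cut greetings and filler the reader can infer.",
]

def build_dialogue_prompt(student_scene: str) -> str:
    """Turn the student's framework into explicit instructions for a model."""
    rules = "\n".join(f"- {rule}" for rule in DIALOGUE_FRAMEWORK)
    return (
        "You are helping a student revise dialogue.\n"
        "Evaluate the scene against these principles:\n"
        f"{rules}\n\n"
        "For each principle, quote one line that follows or breaks it, "
        "and suggest one question the writer should ask themselves.\n\n"
        f"Scene:\n{student_scene}"
    )

# The finished prompt would be pasted into whatever tool the class uses
# (ChatGPT, Sudowrite, etc.); the model call itself is omitted here.
print(build_dialogue_prompt('"We need to talk," she said, not looking up.'))
```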


AI can also serve as a useful feedback tool, but I am careful to say that AI isn't analyzing our writing like a writing instructor ... it is guessing what a writing instructor might say in response to particular writing. It is an important distinction.


That said, it can make some really useful guesses, especially when trained by a writing instructor. But students should take this as feedback, not as directions ... which is how they should approach anyone's feedback, not just AI's, anyway. I wrote a little about this here.


Plugins like the code interpreter are promising, as well. I'm playing around with using it to analyze writing. But in the end, I think creating explainable AI for education will be crucial, where we teach AI how to think with knowledge graphs ... not just train it statistically.


Though one might argue that some human teachers are guessing as educators, do we really want to scale that with AI? Structured models functioning on curated data will be the next big step with AI.
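As a toy illustration of the contrast being drawn here (entirely hypothetical, not a description of any existing system), a knowledge graph encodes explicit relationships between concepts, so a tutoring tool could show the chain of reasoning behind a piece of feedback rather than only offering a statistical guess:

```python
# Toy knowledge graph: explicit concept relationships a tutoring system could
# cite when explaining feedback, instead of only producing a statistical guess.

WRITING_GRAPH = {
    "thesis": ["claims"],
    "claims": ["evidence"],
    "evidence": ["citations"],
    "citations": ["credibility"],
}

def explain_path(start: str, goal: str, path=None):
    """Depth-first search returning the chain of concepts linking start to goal."""
    path = (path or []) + [start]
    if start == goal:
        return path
    for nxt in WRITING_GRAPH.get(start, []):
        found = explain_path(nxt, goal, path)
        if found:
            return found
    return None

# "Your thesis feels unsupported" can be traced as an explicit chain of reasons:
print(" -> ".join(explain_path("thesis", "credibility")))
```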

Edem: This is great; so you make use of AI prompting as a means to get students to understand the material. Could you talk about how the integration of AI technologies in education will shape student-teacher dynamics and the overall learning experience?

Lance: I've been telling people to rethink their outcomes and assessment. We are used to thinking that taking tests and writing papers are the only or best forms of assessment. But many outcomes can be achieved just as well, and maybe better, by teaching prompt engineering as by assigning a paper or quiz.


We will have to think of teachers more as coaches and less as content distributors or gatekeepers. This is not much of a change for me because that is how I've always approached teaching. Writing instruction is more than just teaching principles and giving feedback ... there is a lot of psychological and mental work that goes into teaching writing.


AI can't coach without a human coach training and guiding it. We need to move away from assessment and towards coaching. Then AI just becomes another coaching tool in our tool bag.

Edem: This makes real sense; the role of teachers will surely evolve as they work hand in hand with AI systems. Some argue that AI-driven teaching systems have the potential to replace human teachers. What are your thoughts on this, and what irreplaceable qualities do human teachers bring to the table?

Lance: So many of the people talking about replacing teachers are either investors or selling software. Honestly, I don't know what education looks like after this ... and neither do they, really. I absolutely believe that AI will make our jobs easier.


AI already helps me do the mundane things that take up a lot of time, so I can spend more time on the important elements of teaching writing ... like talking to students and commenting deeper on assignments.


For example, there are many kinds of comments I just cut and paste because they come up all the time ... and often students never really fix them. Well, an AI can point these out continually, until the draft is good enough for a deeper analysis from me. Or instead of spending hours developing an activity or lesson plan, AI can do that, and I can focus on implementing it with students and coaching.
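As an illustration of that division of labor (a hypothetical sketch; Professor Cummings doesn't describe a specific implementation), a reusable comment bank paired with simple automated checks could surface the recurring issues before a human reads the draft. The rules and wording below are invented for the example; in practice an LLM might apply the same bank with more nuance than these crude pattern checks.

```python
import re

# Hypothetical comment bank: the kinds of notes an instructor pastes repeatedly.
COMMENT_BANK = {
    "passive_voice": "Watch for passive constructions; name the actor where you can.",
    "long_sentence": "This sentence runs long; consider splitting it.",
    "vague_this": "'This' without a noun after it is ambiguous; say 'this result', 'this choice', etc.",
}

def draft_comments(text: str) -> list[str]:
    """Return canned comments triggered by crude surface checks.

    Plain regexes stand in for the step an AI assistant would handle with
    more nuance; either way, the instructor curates the comment bank.
    """
    comments = []
    if re.search(r"\b(was|were|been|being)\s+\w+ed\b", text):
        comments.append(COMMENT_BANK["passive_voice"])
    if any(len(s.split()) > 35 for s in re.split(r"[.!?]", text)):
        comments.append(COMMENT_BANK["long_sentence"])
    if re.search(r"\bThis\s+(is|was|shows|means)\b", text):
        comments.append(COMMENT_BANK["vague_this"])
    return comments

print(draft_comments("This is important. The data was analyzed by the team."))
```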


So will AI replace teachers? I'm skeptical. But there will be no room for teachers who aren't using AI. Those are the jobs that will be lost.


Theoretically, I suppose, you could train LLM technologies to simulate empathy and coaching, but today's technology is nowhere close. LLM technologies can generate text and simulate some forms of reasoning, often not very well. We can improve these two elements, but the other aspects of teaching will need a new kind of technology or approach.


There are lots of people in education and AI who would disagree with this ... but so far, I see no evidence to think otherwise. Most people are just speculating and not grounded in the technology currently being developed.

Edem: I agree with you; let’s talk more about this. How do you foresee AI technology influencing the way teachers assign and assess student assignments? For instance, the type of questions or problem sets asked?

Lance: In general, I think there should be less focus on content and more focus on working collaboratively, solving complex problems (that have no exact solutions), and managing complex projects.


These are all things AI cannot do. For example, AI can pass the Scrum certification test ... but it can't manage a Scrum project.


Since AI works best with frameworks and fine-tuning, students will need to have an even better understanding of how theory relates to practice. Teaching students to use generative AI in real-life contexts can be an important way to do this.


I'm still thinking this through when it comes to writing. As I mentioned before, writing papers is not the only way to assess learning outcomes ... perhaps it is not even the best in many cases. That said, writing is still important. AI can only take you so far.


Here is what I am thinking when it comes to writing:


  • Fewer, but more intentional writing assignments


  • More focus on the invention (or ideation) phase of writing


  • Taking a knowledge management approach ... smaller pieces of content that can be put together


  • No more grades


  • More applied learning ... writing in real-life contexts


Well, these are all things I was doing before ChatGPT, but I'm cranking it up a notch and looking to adapt them more to AI contexts. For example, training students to use AI as an extension of their mind and not as a replacement. That will be key.

Edem: This is incredible; this way you’d be adapting your students to working with AI rather than being at odds with it. How do you think AI can positively impact the teacher-student relationship?

Lance: I hate to guess at this, but my hope is that AI will deepen our interactions with students by doing the mundane work for us ... but this depends on institutions and teachers deploying AI in the right way. Take doctors, for example.


Some in the field are saying that being a doctor will go back to what it was like in the early days: highly personal, with lots of interaction. Right now, doctors are too busy with paperwork and such.


If AI can take over those tasks, doctors will have more time to spend with patients.


It can be the same with teachers. I also think educators can "replicate" themselves with AI apps for some of this mundane work if they learn how to properly build AI agents.


So in a sense, AI becomes an extension of the teacher, giving more time for personal engagement. I'm not particularly excited by this idea of a universal tutor because you lose the specialization, but certainly, there will be a space for all kinds of different tools.
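To picture what "replicating" yourself for mundane work might look like, here is a hypothetical sketch of a course assistant that only answers from teacher-curated material and hands everything else back to the human. The FAQ entries and the routing logic are invented for the example; the point is that the teacher's curation is what makes the agent an extension of the teacher rather than a replacement.

```python
# Hypothetical sketch of a course "assistant agent": the teacher curates the
# knowledge it may answer from, so routine questions are handled automatically
# and everything else is routed back to the human instructor.

COURSE_FAQ = {
    "late policy": "Drafts may be submitted up to 48 hours late with a one-step grade reduction.",
    "office hours": "Tuesdays and Thursdays, 2-4 pm, or by appointment.",
    "citation style": "Use APA 7 for all professional writing assignments.",
}

def answer_or_escalate(question: str) -> str:
    """Answer from the teacher-curated FAQ; escalate anything outside it."""
    q = question.lower()
    for topic, answer in COURSE_FAQ.items():
        if topic in q:
            return answer
    return "I can't answer that reliably; I've forwarded your question to the instructor."

print(answer_or_escalate("What is the late policy for drafts?"))
print(answer_or_escalate("Can you give feedback on my thesis statement?"))
```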

Edem: Let’s go deeper into this rabbit hole. What adaptations or additional training do you think teachers need in order to integrate AI efficiently into their teaching methodologies?

Lance: Phew! That's a tough one because most teachers are under-resourced and already handling so many things besides anything AI-related. I think basic knowledge about how the technology works is absolutely key.


There is a lot of hype and fear-mongering that the current technology doesn't really warrant. This should also include ethics and system bias. If our teachers aren't AI literate, then students won't be.


Second, educators need to learn how to manage their own knowledge and structure their content to better utilize AI for course management and development of course materials.


This can save educators an immense amount of time so that they can either spend more time with students or more time researching what they need (like AI technologies).


This should also include basic prompt engineering, which then can be tailored to their specific needs and contexts. It is also a great teaching tool. But I wouldn't stop there. They need to explore other generative AI tools in their area.


For example, writing teachers should be looking into tools that blend generative AI with word processors, like Copyspace and Sudowrite.


There is much more, sure. But it is easy to get overwhelmed. Teachers shouldn't feel like they need to know everything about AI ... but they need to know what will be useful and essential for their jobs.

Edem: Incredibly insightful, Lance! Let’s talk about validity. How might the large-scale adoption of AI in academia impact the credibility and authority of a teacher in the eyes of the students?

Lance: I really think it comes down to how instructors use AI with their students. Frankly, AI doesn't replace expertise without a human expert shaping it with fine-tuning or prompting.


If a teacher's sole job is to deliver content and assign trite assignments, then their credibility and authority would be less. But this is probably true without AI.


But if we use AI with our students ... and show them how experts use generative AI, then our credibility and authority will remain. Students might be tempted to think they don't need teachers or classes if they can just ask ChatGPT ... but that can only take you so far.


In the end, you have to have knowledge and experience to make generative AI work in particular cases.


This means that AI does not so much pose a threat to a teacher's credibility and authority, but rather acts as a tool to enhance their teaching approach.


By integrating AI into the curriculum, teachers can provide a more interactive, personalized, and enriching educational experience for their students.


This can actually boost the teacher's authority as they demonstrate their ability to innovate and stay current with technology trends.


Teachers should make it clear that AI is not the source of wisdom or expertise, but a tool that helps apply and express that wisdom. AI is a machine learning model trained on data - it doesn't understand the context or make value judgments the way humans do.


That is the role of the teacher.


Showing students how to critically use and interact with AI, identify its limitations and potential biases, and apply it to solve complex problems in their field of study can only raise a teacher's credibility.

Edem: Fascinating! Let’s take a deeper look at how it affects students. How does the increased dependence on AI to answer questions affect the students’ critical thinking and problem-solving skills?

Lance: Honestly, I don't think it does impact critical thinking. Well, I guess it can if we use AI in all the wrong ways ... but the problem is not in the technology, but in how we use it.


In fact, I might argue that using AI can improve critical thinking skills.


Don't ask students to take tests or write about content ... ask them to solve problems. There is a difference between answering a question and solving a problem.


Answering a question often requires recalling information or applying a known solution to a defined problem. Solving a problem, on the other hand, involves more complex reasoning and creative thinking skills.


It requires understanding the problem, formulating a strategy, implementing a solution, and then reflecting on the outcome.


When used correctly, AI can guide students through this problem-solving process. I suppose it can give shallow answers to complex problems, but working past that is part of critical thinking.

Edem: I see your point there; I do want to look more into the effect it has on students. Do you think the integration of AI systems and the popularity of personalized learning experiences for students could hinder the development of the essential social skills and collaborative abilities that come from seeking help and clarification from others?

Lance: I don't really think the technology will do any of these things ... it's how we use the technology. If we decide that education is all about individual skills and content, and all we need is some AI tutors, then yes.


But if we use this opportunity to teach skills like collaborating with diverse people, solving complex problems, and managing projects ... then no, I don't think so.


The technology itself is neutral - it's up to educators, policymakers, and society to shape how it's integrated in a thoughtful way that enhances human connections and doesn't isolate students. With care and intention, AI could even help teach some of those vital interpersonal skills.


But we have to prioritize those values, not just efficiency or personalization. Ultimately it comes down to the goals we set for education and how we choose to get there.

Edem: You’re right; technology itself is neutral and therefore can’t be evil. That’s a wrap, Lance! Any closing words?

Lance: Many in education are focused on the wrong things when it comes to AI. Trying to preserve outdated methodologies or stop plagiarism misses the bigger picture.


As we head into a new school year, the conversation needs to shift to how we can thoughtfully integrate these technologies to truly enhance learning and prepare students for the future.


Rather than taking an adversarial view of AI, we should be asking: how can these tools improve access, engagement, and outcomes? How do we need to adapt our teaching methods and curriculum to leverage AI responsibly?


This requires a human-centered mindset focused on problem-solving and ethical integration, which can be called "AI Operations."


The business world is starting to get this, but education lags behind. We need more people taking an AI Ops approach - cross-disciplinary teams focused on real-world solutions and continuous improvement as these technologies evolve.


This goes beyond just creating the tech to actually delivering value in the classroom.


That's why I've been focusing on developing practical approaches in my Substack.


