Teaching Q&A: David Dwyer on moving from intimidation to experimentation

Clinical Professor David Dwyer, PhD, RN, NE-BC, is quick to say he doesn’t think he’s doing anything unique when it comes to incorporating generative AI into his teaching. What’s clear is that he is comfortable trying, evaluating what went well and what didn’t, and trying again – an approach he encourages other instructors to consider. Dwyer recently spoke with CTLM about how he moved from being intimidated by AI to jumping in and experimenting with it.


Tell us a little about your teaching context within the School of Nursing – who are you teaching?

I’m a clinical professor, so non-tenure track, instructional academic staff. I’ve been at the school since 2014, and I teach in the undergraduate program. I’m also the coordinator of the BSN at Home program, a degree completion program for registered nurses who have their associate’s degree and go back to school to earn a bachelor’s degree.

When and how did you first become interested in generative AI in teaching?

When I first read about generative AI in the newspaper, I said, “This is going to be the biggest change since the introduction of the computer.” And people were like, “I don’t know about that.” So, a couple years later, all of a sudden, everybody’s using generative AI, and I don’t know a thing about it. I’m kind of intimidated by it. I know students are using it. So finally, I just jumped in and opened up an AI tool and just kind of played with it, and it didn’t seem much different than Google to me. I kept asking it questions and refining the questions, and I learned that the more you refine, the more you can improve the results.

Hear more from David Dwyer about prompting during the Nov. 18 workshop Exploring AI in Teaching: Refining Your Prompts

As soon as I started using AI, I recognized how and why student communication and writing was changing. Students are synthesizing thoughts that come from AI, in many cases. I’d say that 50% of students are using AI in the classes that I teach. So I thought, well, we either fight against the windmill, or we can adapt. So I decided I would start letting students use AI as a tool.

The speed cycle of our lives is continuing to increase and AI is going to turbocharge that. I decided I would focus on teaching students how to live in the modern world as opposed to using a candle to light the room.

How did you begin incorporating it into your coursework?

I started by giving students the okay to use AI, and then I incorporated AI into one of the assignments. They have to use AI and then describe the difference between a traditional search – traditional being Google – and the AI search. Where were the differences, where were the similarities, and how can they use the two together in one assignment package?

It’s a 100-level course about healthcare systems, and one of the modules that we do early in the course is a career presentation. Everybody gets a career that they have to research. I thought, this is a perfect example, because there are so many different avenues of information on careers, and if you really got something wrong and this backfired, it’s not going to wreck anybody.

I tried it, and I’m revising it. Overall, it worked pretty well. The only thing that didn’t work well was my instructions. I wasn’t specific enough – I realized that students need a little more direction on how to use AI.

We’re already seeing AI change some aspects of the healthcare system, how professionals care for patients. Is your awareness of that affecting your approach in the classroom?

Yes. Studies have found that AI does a better job than doctors at responding to patients, because the computer “listens” and then provides more empathetic feedback. Clinicians are using AI to scan MRIs and x-rays. It really is the future, and I think we have to adapt to manage it.

I look at nursing and healthcare services – emergency medicine MDs are at risk because AI can move through the diagnostic pathway more quickly. Pharmacists are at risk because of the immense information on side effects. We have to embrace this technology and learn from it, because it has the potential to change our jobs. There are studies I use in my course saying that, by 2035, 20-50% of jobs will be affected in some way by AI. That says all I needed to know.

There is not yet a clear roadmap for instructors about how you should be using AI – what have you found most helpful as you’ve navigated this area?

I think that the key here is that AI has to be a tool. AI has to support the thinking process and the development of the student in order for it to be used effectively or appropriately. If you’re allowing a student to use AI to do everything, you’re not developing the student, and that’s really what they’re paying the tuition for. On the other hand, not using AI, I think, does the same thing – it doesn’t prepare the student for life in the real world. You have to use your own intelligence to determine how you can best improve the knowledge, thinking, and ability of the students that you’re working with.