“AI is here to stay—and it’s evolving fast—so much so that in some ways, it’s like teaching a moving target,” says Dr. Michael Drexler, Associate Professor in the School of Psychology. “Its potential is enormous, but the challenge is teaching students to use it responsibly while continuing to think critically and independently.”

In his Clinical Assessment & Treatment Planning and Research Methods and Design courses, Drexler helps students develop strong research habits—starting with crafting effective prompts and verifying sources. He even uses documents with intentional errors to sharpen their fact-checking skills. He emphasizes that the goal is not to replace human judgment but to strengthen it, ensuring that technology becomes a complement to—not a substitute for—critical inquiry.

AI, he says, is a powerful tool for brainstorming, but real learning happens when students interpret, reframe, and express new ideas in their own words. This reflective process, he notes, can, if set up correctly, mirror the clinical reasoning used in psychological assessment and treatment, where interpretation and synthesis are as vital as the data itself.

Looking ahead, Drexler hopes to help develop a chatbot that simulates patients with specific symptoms and conditions, an interactive tool that could support hands-on learning in treatment planning.

“We must be cautious,” he adds, “especially when it comes to patient care. This technology raises significant ethical and confidentiality concerns that require deep and thoughtful examination. That’s why we’re moving carefully and deliberately, and taking a comprehensive look at its long-term impact in the field of clinical psychology.”

An early adopter in his own work, Drexler uses AI to streamline searches, organize information, and assess existing documents for readability, but always with a critical eye. “It’s great for speed and structure,” he says, “but I always double-check the sources. Accuracy matters.”