What universities have wrong about AI 👀
🧑‍🏫 Implementing AI in education, Google warns of layoffs while investing more in AI, a GPT to learn anything, and more!
🦾 How You Can Leverage:
With school back in session, so too is the fiery debate on AI use in the classroom.
Our take — universities are dropping the ball on properly and swiftly bringing GenAI education to students.
But we know it’s easy to heckle from the sidelines without knowing what’s going on on the field.
That’s why we tapped into insights from Laura Dumin, a Professor of English and Technical Writing at the University of Central Oklahoma.
Laura’s been walking the thin line at the center of higher education for the past few years: proper AI use in the classroom.
On today’s show, she walked us through the challenges, the misconceptions, and practical tips on how to focus more on AI literacy and less on detection.
Here’s what you need to know. 👇
1 – Look away from AI detectors 🙄️
Gotta start here.
Here’s a (not-so-secret) secret, educators: AI detectors don’t work. Literally. Even OpenAI shut down its own detector after it managed only a 26% success rate.
Laura gave a better way forward.
By building strong AI literacy skills, students can adapt and thrive in a world where GenAI skills are increasingly required, leaving them well-equipped for the future workforce.
Read this in-depth study from MIT, showing why AI content detectors don’t work.
2 – Literacy matters 🤖
Real talk here — the job market right now is demanding GenAI skills. (Like, explosive demand.) And so many colleges and universities are still trying to ‘figure out’ GenAI.
That’s one of the reasons Laura stressed AI literacy in individual classrooms.
On today’s show, Laura detailed the complexities of even creating GenAI guidelines and courses at the higher education level.
Right now, there’s not an easy one-size-fits-all approach for AI governance in the classroom.
TBH, we’d be on board with Bob’s suggestion from today’s livestream.
Until that happens, though, teachers and professors sometimes need to advocate on their own for their students by implementing their own policies for responsible GenAI use.
On her website, Laura has a slew of documents, links, papers and research to help both parents and educators better understand the process.
3 – Cite your work ✍️
We saved a gem for the last point.
We were impressed by Laura’s own approach to GenAI literacy in her classroom.
Here are the details on how Laura’s students use GenAI at each step of the process 👇
Brainstorming and Drafting:
- Students can use AI as much as they need for brainstorming, but are advised to limit AI-generated text to 40% in the drafting stages.
- AI-generated content is colored red so it’s clearly identifiable within drafts.
- Students can put their own work into AI for feedback, but not other students’ work.

Final Drafts:
- AI-generated text is capped at 15% for final drafts.
- Red text continues to mark AI-generated content within final submissions.
- Students are required to include a reflection on their use of AI in the writing process (written entirely by a human), promoting critical thinking and self-assessment in using generative AI tools.
- Students are required to turn every source they use from AI into a PDF, then highlight the quotes they used from that original source and explain why they used them.
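The percentage caps above boil down to a simple word-count ratio. As a minimal illustration (the function names and word counts here are our own sketch, not Laura’s actual tooling), a check might look like:

```python
def ai_share(ai_words: int, total_words: int) -> float:
    """Fraction of a document's words that are AI-generated."""
    if total_words == 0:
        return 0.0
    return ai_words / total_words

def within_limit(ai_words: int, total_words: int, stage: str) -> bool:
    """Check an AI-text count against the stage's cap.

    Caps follow the policy described above: 40% for drafting
    stages, 15% for final drafts.
    """
    caps = {"draft": 0.40, "final": 0.15}
    return ai_share(ai_words, total_words) <= caps[stage]

# Example: 300 AI-generated words in a 1,000-word paper
# passes the draft cap (30% <= 40%) but not the final cap.
print(within_limit(300, 1000, "draft"))  # True
print(within_limit(300, 1000, "final"))  # False
```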
Bonus: Still want more?
Go check out a recent episode with Jason Gulya about how to fix the AI in Higher Education conundrum.