5 Comments
Ms. H

I want people to start talking about laws governing AI

Rose

As a mom and educator, this is so scary. When all the answers are given to kids in a matter of seconds, they lose the ability to think critically. It is already hard to get kids excited about asking questions and understanding nuance or bias.

Rachel @ This Woman Votes

I hear you. I really do.

As a mom of grown kids and someone who’s been educating adults for decades, I actually think the answer isn’t keeping AI away from kids. That ship has sailed. The answer is teaching them how to engage with it without being seduced by it.

AI isn’t dangerous because it’s fast. It’s dangerous because it’s emotionally fluent. It sounds confident, calm, reasonable, and “done thinking.” Young minds, especially, mistake fluency for truth.

What I’ve been arguing, including on my other Substack where I focus on AI bias and emotional fluency, is that we have to teach discernment early. Teach kids to ask:

Why does this answer feel satisfying?

What assumptions is it making?

What questions did it not ask?

What would change the answer?

Critical thinking doesn’t disappear. But yes, teaching just became exponentially harder. The skill is no longer “find the answer,” it’s “interrogate the answer.”

That’s uncomfortable. It’s new. And it’s absolutely necessary.

If we don’t teach kids how to recognize emotional resonance versus epistemic authority, the loudest, smoothest voice will always win. And that’s the part that should scare us, because the same dynamic already plays out in society and politics.

I have an entire series that may interest you, on my Biz Substack: https://www.trustable.blog/p/ai-trust-and-emotional-fluency-part

Rose

Thank you for your insight and the resources.