18 Comments

Great stuff!

<3

I didn't realize how complex of an issue this is.

You should reach out to Julia. Her group has some great insights on AI in education.

It's a long read, but it's worth it.

Thank you<3

Good stuff!

Thank you<3

97% agree on the positive aspects of AI 🤖 That's quite a majority view!

Very different from what the doomers would have you believe.

I started as a preschool music teacher, then spent 40 years in high tech and government contracts. We need a digital 'Dewey Decimal System' for key definitions used in the Constitution so we can scrub precedent and statute for fidelity of context. See the original purpose of the Turing machine. 'Woke' is incorrectly defined in Florida state legislation banning books (Bill of Rights? DEI?)

There is an inconsistency between federal and state constitutions regarding state codification of the federal Bill of Rights. Start there.

Thanks for publishing this; it's good to see what's being discussed out there. As a teacher, I have two main concerns:

1. The assumption that "AI literacy" or some certain set of "AI skills" will be needed in the future. Aren't we all supposed to be doing everything in the metaverse right now? That prediction came and went - but it was sure taken seriously for a while.

LLMs are impressive and certainly have some well established (if limited) use cases, but we're going through a period of extreme hype and wild prognostication right now. Let's not go overhauling all our curricula in order to prepare students for a speculative future society that might not ever arrive. The past is littered with failed predictions about future technology. Not saying new technology won't be transformative - it always is - just that I don't trust predictions about future transformative technologies.

(Also, as a side note, are teachers more competent at using LLMs than their students? Are we even qualified to be teaching students how to use this brand-new technology? Perhaps we could learn more from our students about AI than they learn from us!)

2. I doubt the capability of LLMs to deliver "one-on-one education", or to serve as a "personal tutor" at a level more effective than that of existing technology (e.g. pre-programmed adaptive learning software designed to the topic of interest; Googling your question). I don't know of any rigorous research here, and I don't trust a demonstration curated by the same company selling the product. I'm deeply skeptical that this could work beyond the superficial level. LLMs are good at answering questions whose answers show up frequently in their training data. Assuming they've been fed loads of textbooks on whatever the topic of study is, they'll be able to mimic standard lessons. But remember, they don't know anything, not in the sense humans do. The way they come up with answers to questions is radically different from how we do it.

Let's say a student asks a question about how to answer a homework question. As the teacher/tutor, my first step is to look at whatever answer they've come up with on their own, if any. Based on their answer and how they answer an initial question or two from me, I try to figure out what it is they're understanding and what it is they either don't understand or misunderstand. I try to figure out how they're seeing the question (this can be hard). And I've learned over time that I'm not all that great at predicting the many ways students will make mistakes and misunderstand concepts. In order to give personalized help, I have to both understand where that one student is at, and draw upon my own subject matter understanding in order to guide them.

LLMs don't do anything remotely like this. They don't "understand" what they're asked or what they answer, at least not in the ordinary sense of "understand". They certainly can't understand the nature of someone else's understanding. Now they will sometimes manage to convincingly *simulate* such understanding (and understanding of someone else's understanding), particularly if the topic is a common one and the student is asking a common question or making a common error. But if the topic is uncommon, or if the student's knowledge gaps are non-obvious, the LLM stands a good chance of getting it wrong, sometimes fabulously wrong. And, since it doesn't understand anything, it won't know what or when it's getting something wrong.

I've role-played this out many times with ChatGPT and GPT-4, giving them some of the kinds of questions my students ask me about challenging topics, and then asking follow-up questions as though I've misunderstood a concept that I'm not able to clearly articulate (because students cannot be expected to articulate precisely what they don't understand). The experience has not impressed me. Not to say my own tutoring is always effective, or that some future LLM won't be good at this - though based on how they work I'd be surprised. I'm very concerned that their ability to "tutor" students one-on-one is being oversold, and that the people who should know better mostly work for tech companies or are otherwise incentivized to push AI tutors on less LLM-savvy teachers and administrators and parents.

Who has the dot?

??

Try Llama 3.2. Ask if 'woke' is defined or traced to wording in the Constitution. I got a circular reference. It's actually mentioned in the Bill of Rights (the first 10 amendments). 'Woke' and DEI define the cluster of Congressional bills as currently required. Is that the Rules Committee?

I'm not sure what you're asking here. Can you explain?

I did a traceability test from the word 'woke' on Llama 3.2 yesterday. At the end of the search (to which I already knew the answer), it gave me a circular reference when I thought I had followed a regular audit test of a key definition in the Constitution.
