
Harvard University professor Alex Green thinks that AI is robbing students of their ability to think and that universities should crack down harder:
More than half the nonnative English-speaking students and a notable number of native English speakers told me that after relying on AI to draft their papers and emails, their ability to write, speak and conduct basic inquiry is slipping away. They tell me this as if they have done something wrong, never considering that it is their professors, not they, who should carry that burden. ... There is little evidence that senior university faculty are committed to tamping down the rampant overuse of AI. Instead, it is the paperweight on a pile of evidence that at an ethical level, universities are too timid or ignorant to insist that students use the core skills we are supposed to be teaching them.
What do stackers think? This is a live issue at colleges all over the country.
My stance so far has been that AI is inevitable and impossible to police effectively, which puts me de facto on the "embrace" side. I'd love it if we could somehow build an AI-free educational environment, at least until certain skills are demonstrably acquired. I just don't think that's possible. AI is going to change the way everyone thinks and works, so the best thing to do is prepare students for this brave new world.
I think the incentives around college need to be overhauled such that it's actually about learning, rather than being about the appearance of learning. That means more things like group discussions and oral exams.
There are small liberal arts schools that do this pretty well, but it doesn't seem to scale well and it doesn't work with students who aren't interested in learning.
reply
53 sats \ 5 replies \ @kepford 13h
The education system is antiquated and wasteful. I believe it needs to be replaced not reformed.
reply
I'm sure it'll be a bit of both. There's no way the industry is able to keep up with technology advances, which will make the training obsolete and of suspect quality. That'll crater the illusion that there are significant returns to college.
Still, there will be a lot of demand for education from both employers and students. So, something will get worked out.
reply
I think higher education has always been mid-tier in terms of technology catch-up. We're slower than most tech companies, but faster than some of the legacy dinosaur companies, and of course faster than government agencies.
reply
I agree with that, but it's going to be hard for professors and the bureaucracy to keep up, which will lead to students receiving obsolete training that's not desired by employers.
reply
Something that bugs me is whether colleges should educate kids to do useless work that is nevertheless compensated well by the market because of government distortions. The formerly well-paid DEI consultants are an example, but so is training kids to do political advocacy, which is mostly about fighting over surplus rather than creating it.
As a well known econ professor likes to say, "do you create surplus or do you just move it around?"
Yeah, scalability is the main issue. I'm sure if I had 1:1 time with a student I could teach them a lot, even while letting them use AI. I just can't replicate that across 50+ students.
My wife and I are actually leaning more and more towards home schooling because of these issues.
reply
I've always been very pro-homeschool, but our daughter adamantly wants to go to school and we're going to let her.
After studying education economics, I'm pretty convinced that it doesn't make much difference either way, so we're going to let her try what she wants until it seems like a problem.
reply
42 sats \ 1 reply \ @Bell_curve 5h
The fact that your daughter wants or chooses to go to school is big
reply
She keeps telling us she's nervous, but she's also adamant that she wants to go.
reply
It might not matter in the aggregate, but I do think some people may be better suited to home schooling than others. In the end, each parent has to assess what kind of environment would be best for their kids.
reply
That's exactly my feeling. If the local school is going well and that continues being what she wants to do, then we'll stick with it. If it's not working for us, then we'll try something else.
Generally speaking, this is the takeaway from Caplan's book about parenting efficacy.
reply
I like your previous idea of handwritten exams
Or maybe bust out multiple choice and number 2 pencils
reply
41 sats \ 0 replies \ @kepford 13h
Pretty much agree with you.
Thing is, I'm pretty confident most people in education don't get AI, nor do most users, so they both underestimate and overestimate it.
reply
77 sats \ 5 replies \ @Scoresby 19h
If universities are training kids to do things that can easily be done by AI, perhaps they should consider training the youth to do something else.
reply
I think the question is similar to the calculator analogy. Why do we train kids to do arithmetic when calculators can do it much faster and better? Because we think the innate ability to understand arithmetic unlocks higher order thinking that the calculators can't do.
There's probably an argument to be made here about AI as well. It's just much harder to point to exactly what skills are being lost by AI use, and how they feed downstream into higher order skills.
reply
hmm, yes, I agree with you. Yet, my expectation is that by the time a person gets to university, we're past the "innate ability unlocks" period of learning, and into the "it's time to do something useful" phase.
reply
Ideally that would be true, but in my experience a lot of students still lack basic skills by the time they get to university. It's a sad product of our education system, and of pushing higher education on everyone, even those who otherwise wouldn't be interested.
reply
0 sats \ 1 reply \ @Scoresby 19h
Do you have thoughts about how one should introduce AI to children? (I'm asking particularly in my context as a homeschooling parent.)
So far I've done a little vibe coding with my kids, focusing more on how to prompt than anything else. I've also had them use it a little for research tasks, but I found that this led to less thorough research. My kids surprised me: they didn't want to keep asking questions about a topic, but were content with the first answer they got and just tried to regurgitate it on paper.
reply
Yeah, that's the million dollar question. I don't think anyone knows exactly just yet. But I think the example of your kids just accepting the AI's first answer is what educators are afraid of. Will they learn how to think critically about AI responses if they've never searched for sources or tried to synthesize research on their own?
41 sats \ 0 replies \ @gmd 18h
For the professors, ignorance is probably bliss. They don't want to have to police the students or deal with false accusations, which will happen even if they're 95% accurate in detecting true AI cheating.
reply
School and university became hopelessly broken by late 2023 because of AI. As of May 2025, AI is so good that school must be totally reinvented. The old ways are broken. A child now has the capabilities of a PhD student.
If Harvard cracks down on AI, they will cease to teach effectively from now on.
reply
30 sats \ 0 replies \ @aljaz 17h
The fact that anyone thinks they can control or block it clearly shows a fundamental misunderstanding of reality.
reply
Cracking down never helps, and embracing the scams is dangerous.
What I'd want to do in their place is shape the use of LLMs: not forbidding it, not encouraging slop. I think it's important to teach non-end-user topics: how the models work, training, finetuning, modifying them.
If students graduate only to digress into having a relationship with Elon's virtual horny girlfriend (#1042803), or asking "grok is this true", or whatever else dumb shit normiespace does at that time, we have a real problem because it means they have not been educated on what LLMs are.
reply
I still think the most useful degrees are technical: engineering, medicine, etc. From my experience with a mechanical engineering degree, most of the grade came from tests, and the homework was essentially just your own practice and reinforcement. If you aren't learning the equations and solving the problems on your own, there's no way to reasonably complete the tests. I don't see how AI significantly impacts those. Computer science degrees may be in trouble, even though I think there's still value in learning and pursuing a degree there.
I think the technologies we interact with impact the wiring of our brains at a fairly deep level. Even the ancient Greek orators worried that reading and writing would weaken students' memories. AI offers some compelling uses, but I worry a lot about the side effects of dependency and mental outsourcing. Especially when the results can be mediocre or outright flawed, what happens when the ability to think critically about AI output is simultaneously impaired?
reply
I am def thinking less when using AI/LLMs, but I'm not a student and at my age learning is not my top priority... I tell myself that I'm learning through osmosis from the AI/LLM.
reply
0 sats \ 0 replies \ @BeeRye 9h
If a university wants to be relevant, it needs to figure out how to deliver its value proposition regardless of existing technology. Otherwise, it's offering a shit product.
reply
Embrace it, but give the students a decent foundation in the subject and topic.
reply