this post was submitted on 13 Aug 2023
745 points (98.1% liked)

College professors are going back to paper exams and handwritten essays to fight students using ChatGPT

The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.

[–] Mtrad@lemm.ee 14 points 1 year ago (3 children)

Wouldn't it make more sense to find ways to utilize AI as a tool and set up criteria that incorporate its use?

There could still be classes / lectures that cover the more classical methods, but I remember being told "you won't have a calculator in your pocket".

My point is, they should be prepping students with the skills to succeed with the tools they will have available, and then give them the education to cover the gaps that AI can't solve. For example, you basically need to review what the AI outputs for accuracy. So maybe a focus on reviewing output and better prompting techniques? Training on how to spot inaccuracies? Spotting possible bias in a system skewed by its training data?

[–] Atomic@sh.itjust.works 7 points 1 year ago (1 children)

That's just what we tell kids so they'll learn to do basic math on their own. Otherwise you'll end up with people who can't even do 13+24 without having to use a calculator.

[–] Arthur_Leywin@lemmy.world 0 points 1 year ago (3 children)

When will people need to do basic arithmetic in their head? The difficulty rises dramatically between 13+24 and 169+742. Yeah, it makes your life more convenient if you can add simple numbers, but is it necessary when everyone has a calculator?

[–] Atomic@sh.itjust.works 2 points 1 year ago

Like someone said. It's not just about knowing what something is, but having the ability to recognize what something isn't.

The ability to look at a result and be skeptical if it doesn't look reasonable.

169+742: just by looking, I can tell it has to be pretty close to 900, because 160+740 is 900. That gives me a good estimate to go by. So when I arrive at 911, I can look at it and say: yeah, that's probably correct, it looks reasonable.
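That estimate-then-check habit is simple enough to sketch in code. A minimal, hypothetical Python example (the function name and tolerance are my own choices, and `round()` rounds to the nearest ten rather than truncating like the comment above):

```python
def looks_reasonable(a: int, b: int, claimed_sum: int, tolerance: int = 20) -> bool:
    """Return True if claimed_sum lands near a rounded estimate of a + b."""
    estimate = round(a, -1) + round(b, -1)  # round to nearest ten: 169 -> 170, 742 -> 740
    return abs(claimed_sum - estimate) <= tolerance

print(looks_reasonable(169, 742, 911))   # True: estimate is 910, and 911 is close to it
print(looks_reasonable(169, 742, 1001))  # False: far from the estimate, so be skeptical
```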

[–] Mtrad@lemm.ee 1 points 1 year ago

That sounds like it could be a focused lesson. Why try to skirt around the desired goal?

That could also be applied to detecting when something from AI is wrong. Teach people the things that help them spot these errors.

In my experience, it's so much more effective to learn how to find the answers and spot the issues than to memorize how to do everything. There's too much now to know it all yourself.

[–] settxy@lemmy.world 6 points 1 year ago

Some universities are looking at AI from this perspective, finding ways to teach proper usage of AI and then building testing methods around the knowledge that students will use it.

Your point about checking for accuracy is spot on. AI doesn't always puke out good information, and ensuring students don't just blindly believe it NEEDS to be taught. Otherwise you end up being these guys... https://apnews.com/article/artificial-intelligence-chatgpt-courts-e15023d7e6fdf4f099aa122437dbb59b

[–] revv@lemmy.blahaj.zone 5 points 1 year ago

Training how to use "AI" (LLMs demonstrably possess zero actual reasoning ability) feels like it should be a separate pursuit from (or subset of) general education to me. In order to effectively use "AI", you need to be able to evaluate its output and reason for yourself whether it makes any sense or simply bears a statistical resemblance to human language. Doing that requires solid critical reasoning skills, which you can only develop by engaging personally with countless unique problems over the course of years and working them out for yourself. Even prior to the rise of ChatGPT and its ilk, there was emerging research showing diminishing reasoning skills in children.

Without some means of forcing students to engage cognitively, there's little point in education. Pen and paper seems like a pretty cheap way to get that done.

I'm all for tech and using the tools available, but without a solid educational foundation (formal or not), I fear we'll end up a society of snake-oil users in search of blinker fluid.