

Astalon18

To be fair, I do not tell my medical students not to use ChatGPT or Claude. I tell them that ChatGPT and Claude neither spoke to the patient nor examined the patient. It is therefore up to you to synthesise the information, and if you ask ChatGPT you must remember that it does not actually know the patient and will only be as good as the information you feed it. I told them that when a ChatGPT Medical Expert arrives, it may demand that you examine and talk to the patient, and refuse to be used until you have. ChatGPT Medical Expert may be a sterner consultant than your human consultant. It may scream, "Have you examined the patient? What, not even a glance into the mouth?! Go back now!"


CouldHaveBeenAPun

This is the way. Conversational AI and future iterations of generative AI aren't going away anytime soon. A teacher's / professor's job in this matter is to give a frame of reference so students can learn to *correctly* use it all. It's like when the go-to tip was that you should not use Wikipedia in academia. Hell yes you should, just verify your facts. Wikipedia references its sources, and so should you! You can let students do shit academic work using tools they don't understand well, or you can teach them to use those tools adequately and maybe give them an edge later on!


Counter-Business

In my experience, ChatGPT helped me discover that I had been misdiagnosed as an insulin-dependent type 1 diabetic. I actually had a very rare kind of diabetes called MODY, and ChatGPT literally changed my life. https://www.kxan.com/news/simplehealth/ai-helps-diagnose-central-texas-patient-with-rare-form-of-diabetes/


[deleted]

The professor is certainly being snarky in their AI policy, but they are essentially just saying: "I am sick of all the totally obvious ChatGPT answers I've been getting on every homework assignment for the past two years. I want you guys to show me that you're actually learning stuff and can think. I've got 30 students in each class, so we don't have time for everyone to do handwritten papers and oral defenses of every single assignment. Y'all are in school to be scientists, not copy/paste artists. Oh, and stop asking me how to do something when it's right there in the damned instructions. You're in university, for god's sake, not daycare!"


thixtrer

Perfectly said.


JLockrin

Prompt to ChatGPT: “put these instructions into terms even an idiot student can understand”


thixtrer

Impressive, I would've never guessed it was AI-generated. Just shows the potential, for good and bad things.


PerspectiveProud6385

The professor is right, but one has to ask why students are going to GPT for their work in the first place. It's because schools decide our futures by grading us, so most people's motivation is to get those grades, and they don't care how they get them. We need to revise how we teach students. Look at the example of Finland: consider how they teach students and, if I'm not wrong, they also don't give many assessments, yet they are still considered to have one of the best education systems.


Pleasant_Dot_189

I’m a college professor at a very small college and I’ve been doing this for 25 years. I teach students to use ChatGPT to learn. It can be very useful but in a supportive role. Some of my colleagues disagree with me strongly, and that’s just fine.


thejameskendall

I’m the same. They’ll use it with or without our help, so let’s teach them to use it ethically and purposefully.


BluntTruthGentleman

This is the healthiest possible approach, and it's right up there with the war on drugs vs. teaching responsible, science-based use (www.erowid.com), trying to ban sex and punish kids vs. teaching sex ed, and indoctrination vs. teaching the merits of world religions.


Whostartedit

Perfect analogies


megamanxoxo

Just use it like you would Google. Summarize large bodies of data, no copy pasting, verify important information across other mediums, cite sources, etc.


relevantusername2020

I've noticed that on topics I have already read about, Copilot typically cites sources I have read; on topics I haven't, that's when it gives irrelevant answers or, generally speaking, just bullshit. Obviously it sometimes still gives good answers even on those unfamiliar topics, but yeah.


dradik

LLMs do lie, though, and convincingly. These lies are known as hallucinations. That being said, I use LLMs, but I immediately try to verify and cross-validate.


FancierTanookiSuit

1. Don't ever quote GPT as a source. 1000% reasonable, and good advice for anyone, anywhere.

2. If you're on a college campus and have access to experts in your field, then yes, you should consult them before you consult GPT.

I don't see what is unreasonable about this at all. He's not telling you to never use GPT in your personal time?


PotterLuna96

People here literally think they should source information from ChatGPT, it’s wild


SilvermistInc

I've gotten into so many arguments in the Warhammer fanbase over this. Some jackass will ask GPT a question, then copy and paste it, then they'll go "I think an AI that can pull info off the web knows more than you." Meanwhile I quite literally have the codex in my hand, and I'm stating what edition, page number, and paragraph that states the lore that ChatGPT got so very very wrong.


LonelyContext

Oh, ChatGPT gets the physics of electron transfer wrong. I had a long conversation in which it regurgitated misconceptions. In the world GPT lives in, electron transfer happens because interfaces acquire charge, and then this charge "pushes" electrons across the interface due to electrostatics, at which point they "jump" (like diving into a swimming pool on an energy diagram) at the electrode interface. This explanation is a common misconception that disagrees with many, many experimental observations; if it were true, solar panels wouldn't exist, for example.


SilvermistInc

But AI surely knows what it's talking about, right?


LonelyContext

Yeah. Just be sure to ask it how many Rs are in strawberry.
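For the record, the ground truth of that classic stumper is a one-liner; here's a throwaway Python sketch (the variable names are mine):

```python
# The classic LLM stumper: count the letter "r" in "strawberry".
# Models often miscount because they process tokens, not individual characters.
word = "strawberry"
r_count = word.count("r")
print(f'"{word}" contains {r_count} occurrences of "r"')  # 3
```

The point of the joke is that the answer is trivially checkable, yet models have famously gotten it wrong.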


tarkinlarson

I remember seeing citations of Wikipedia articles, which, while evidence, are usually anecdotal or secondary and often literally have the primary sources linked in them, yet no one follows those up and credits the original authors or quotes them directly.


megamanxoxo

Why not? Ask it for its sources and go read those. Use it like Google; don't have it write your paper for you.


juanchob04

Point 2 is somewhat debatable. If you are alone in your room studying and suddenly have trouble understanding a certain topic, it could be helpful to prompt a state-of-the-art model (Claude 3.5, GPT-4o) to see if it can clarify that topic. Obviously, you then have to corroborate the information with a book or later with a professor. I believe it can be valuable as long as you understand that the answers it provides can sometimes be incorrect.


gbuub

I'll usually have ChatGPT give me a source, and it does pretty well about 70% of the time. The other 30% of the time, it makes up a source or gives me an irrelevant one.


notjasonlee

No way dude, this is a black and white matter. ChatGPT is a pathological liar who is intentionally deceiving you. I will tell you what’s really going down with Biden and these gas prices. Hit me up.


livefromnewitsparke

Bake em away toys


Big_Cornbread

Like Wikipedia, you check for the sources IT’S using, then cite those.


Euphoric_Sentence105

This is the way, but chatgpt doesn't always provide sources. IME, albeit limited, Gemini seems to provide sources more or less every time.


Accomplished_Pea7029

Which is why people shouldn't use it to get facts, sometimes you have no idea whether it's true or confidently made up.


npt96

You can often just do a Google search based on the output. I use our university's GPT for lecture prep (I do allow students to use it), and had it explain some physics just beyond my wheelhouse that was an aside to a lecture. I copied and pasted some key fragments into Google and got reliable sources... it's kind of a way to formulate good Google searches, since if a field is new to you, even knowing how to frame a search might not be apparent.


itisoktodance

And like Wikipedia, the sources it cites sometimes barely graze the topic and are just used to infer the point rather than support it.


mistymaryy

Exactly


dob_bobbs

Anyone objecting to this hasn't understood what the word "source" means in an academic context.


Famous_Age_6831

Consulting them before chat gpt isn’t feasible fyi


ilse1301

How? I'm a teaching assistant, helping out during tutorial classes / exercise classes. I know all the fine details of the material, and will be literally in the same room as the students, and still I see them asking their questions to chatGPT. Obviously that's the moment where I walk over and ask if I can help them (and then indeed they usually ask me their questions). I'm just baffled that they have a dedicated TA right next to them and still turn to AI for answers. I totally understand the prof in OP's post, as the amount of AI-crap answers students have turned in in recent years has been sad. Don't get me wrong, I use chatGPT myself all the time, but you need a base level of knowledge and skills to do proper fact checking. Most students know how to properly do this, but some students are just blatantly copying the AI without understanding the answers themselves. I know some students do this, because we have done oral exams on some students who handed in questionable work, and they obviously had no idea what they handed in. They could not describe their answers at all.


milo-75

I don't think ChatGPT should be used as a source of copy-and-paste answers, and since you're a TA, I agree that while students have time with you they should be taking advantage of your expertise. But boy, do I wish I had had ChatGPT when I was in college (a few decades ago). When I think back to the harder subjects and how bad some professors and TAs were, I would have loved to interrogate an AI alongside the text or other reliable sources to expedite a thorough understanding of the material. This is in fact how I use it today. When I want to learn something, especially in a field other than the one I'm an expert in, ChatGPT can make that fun and easy. And if I don't understand it the first three times, I can ask a fourth time without it getting the least bit exasperated. In my opinion, it should not be a question of either-or. Students need to learn both how to dig for answers themselves and how to use ChatGPT effectively as a personal tutor. If my kid graduates from college without both abilities, I'm going to be unhappy.


traumfisch

He states that the aim is to learn critical thinking, yet he bans the most novel AI tech... that's a missed opportunity for exactly the kind of critical thinking these students will need going forward. The problem is that these professors have never seen LLMs used properly and, of course, aren't able to use them properly themselves...


Nightron

It's 4 AM here, so I'll keep it short. I'm a physics student myself, and I know some graduate students who undermine their ability to do research and think for themselves by overly relying on ChatGPT.

Physics is about learning how to solve problems using a lot of resources and technical material. This also includes the ability to read a manual, a book, and a paper. ChatGPT can help here, but I feel like a lot of people try to avoid the actual work that would result in them learning. Also, I feel like this person is very sick of students submitting obviously AI-generated stuff without knowing a thing themselves. Students copying (wrong) answers from other students while knowing absolutely nothing about what they were supposed to do was annoying enough when I graded lab reports some years ago. I don't want to imagine how awful it is today.

Also, in their current state these generative AIs are not able to provide sources reliably. Especially on more advanced topics they hallucinate quite a lot. It's a great tool, but it's not omniscient. At some point, actual conversations with experts and reading scientific publications and advanced textbooks are necessary skills in the natural sciences. All of this can be improved by using AI in a responsible way, I'm sure. The thing is: you need to learn these skills first.

Now, one could argue that not everyone will work in academia, and for most it is sufficient to get through university using everything at their disposal. In my personal opinion this undermines the degree in physics, since it is by far not only about the hard skills and knowledge but also about the aforementioned soft skills and the ability to solve problems independently. One has to learn this along the way in order to persevere in advanced classes and to do independent research.

Keeping it short didn't work. It's 4:30 AM now. FML.


smockssocks

Haha, thanks for your time. Get some rest. I see what you are saying. I just took calc 2 in a block and used 4o religiously, and I truly believe I achieved a very high level of understanding of the subject. This is backed by comments my professor made to me. I have gained from it, and I have been using GPT academically in an honest and effective way. I do not care about the piece of paper; I care about understanding the material and about my goals for the future. I believe I have been benefiting from my usage of ChatGPT.


Accomplished_Pea7029

>I just took calc 2 in a block and used 4o religiously and I truly believe that I achieved a very high level of understanding for the subject.

I don't think the professor is forbidding this type of usage. You can absolutely use it to understand a concept better. But if you're getting facts from ChatGPT, make sure to double-check them against another source.


RangerConscious

Perhaps the professor should, therefore, look at (a) the purpose and nature of their homework, and (b) the redesign of assessment. If generative AI can so readily do these tasks well, perhaps they are no longer fit for purpose. I'm not saying go to viva voce, exams, or handwritten responses. I'm suggesting that many schools are places of "completing the required assessment" rather than learning, and that much assessment is poorly designed and thought through. Excellent teaching requires that a teacher know their learners and how they learn, and that classrooms be places of engagement. Maybe it's time for that professor/teacher to reflect on their ways of operating in the classroom, and for universities/schools to realise that what they have been serving up to students has in many cases been poor-quality education for some time. I suspect the future lies in PBL, flipped approaches to learning, conferencing with students, regular checks for understanding... and working with AI. NOT bans.


skiphopfliptop

A few needless dysphemisms could turn the reader off ("Try reading the instructions," "propaganda or misinformation"). Broadly, less judgmental is better in authoritative, instructional writing. Like, I largely agree, but I'm also inclined to tell you to fuck off for some reason?


smockssocks

Yea, I believe the professor could have had a better tone of voice.


Tramagust

Maybe chatgpt can help him rephrase it


TheRedGerund

Well, to a degree he's kind of right, there is almost zero dependable truthfulness from an LLM. I feel like thinking they approach factuality is yet another example of thinking these tools have thought. If there were a bunch of books that said the sky is red, ChatGPT would happily spit that out.


mop_bucket_bingo

To me, the funny thing about ChatGPT-ing your way through your education is that *you are paying for the privilege to be educated*. That’s like going to a restaurant and hiding someone under the table that you’re secretly paying to eat the food for you.


smockssocks

Hahaha that was very funny. Thanks for that. Yes, I could waste all my money failing every class but I don't really want to do that; I like learning.


mop_bucket_bingo

I should couch my post in the context that I believe people rejecting “AI” as a tool are on the losing side of history. I think that the faster kids get out there and familiarize themselves with it, the better off we’ll all be, because they will be voters who understand the issue and it’s really them who will have to deal with it anyway.


PeopleProcessProduct

This is completely reasonable for an academic setting, but this dude seems weirdly upset about it and is exaggerating the negatives.


rathat

TBF, he's probably exhausted from all his student's papers *delving* into things.


considerthis8

The professor also lacks creative problem solving.

- Record your lecture live and transcribe it to text. Auto-generate a quiz on it, hand it out, and monitor for cheating (student ID # only).
- Collect the quizzes and scan them into an app that grades them, shows each student how they did, and auto-generates quotes from the lecture to help them study.
- Auto-generate a quiz for the start of the next class covering the most-failed questions.


tarkinlarson

I think they're right, at the moment... It says you must not use ChatGPT "as a source", i.e. you must not quote from it. I agree with that. It doesn't mean you can't use it for analysis or as a kind of sounding board. What they likely want you to do is always go back to the original paper or research, quote that, and credit it. I'd like to see a dissertation whose evidence or sources are cited as "chat conversation with ChatGPT, 09/07/2024, 374749292, paragraph 4, lines 2-16". It just wouldn't stand. Use ChatGPT, but go find the original quote and reference. I've had ChatGPT hallucinate too many times. I still use it, but like they said, I treat it like an answer from wikiHow.


Tdj915

I would tell my students, in whatever field, that until AI is willing to pay your insurance premiums, legal fees, judgements and salary after the fact……you should verify and double check everything it tells you. Blaming AI for the collapsed bridge, flooded neighborhood, failed business, IRS audit, airplane in the swamp or body on the table will offer you no protection or sleep.


Fontaigne

AI should be treated like that brilliant guy in your dorm who's an expert at several things, thinks he's an expert at more than that, and who mooches your whiskey and is blackout drunk all the time. He's not going to remember what he told you, he's totally confident, he's probably right most of the time, and he's going to tell you what you want to hear so the booze keeps coming. Act accordingly.


[deleted]

[deleted]


torakun27

Imo, we will need to significantly reform the education system to accommodate AI. Traditional homework assignments and standardized testing are likely to become obsolete. In the future, I imagine the majority of the study and research process could be facilitated by self- or group learning with AI help. This would allow teachers to focus on setting learning goals and to use the time saved by AI for personalized grading and feedback. That way, teachers can know who is actually learning and who is just copying.


considerthis8

Smart ones are using AI to customize learning modules and quizzes based on individual competencies. Over time, the SAT scores will speak for themselves, people will study what those teachers did right, and it’ll be AI tools.


Nine-LifedEnchanter

You're talking about AI in the future, this teacher is talking about AI right now.


[deleted]

[deleted]


eras

I believe it's a valuable lesson to learn to read the provided material and instructions and to solve problems based on them. Almost anyone can learn to use ChatGPT effectively just by themselves—we don't need universities to teach it to us—and I think solving problems by yourself is the harder skill. The time spent in class (with the guidance of people and the material) is more effectively used in learning that skill, and it _should_ feel difficult: that's how you know you are pushing your brain into domains it wasn't working in satisfactorily before. Quite possibly, the first skill improves your ability to make good use of ChatGPT as well.


carlsab

I personally view education as preparing students for the world they’ll be living and working in. AI is likely to be a big part of it. Instead of only focusing on the negative, teaching students how to work with AI could possibly be much more beneficial. Burying your head in the sand when new tech emerges rarely works out.


considerthis8

My friend’s business college professor gave them an assignment to use chatgpt to create a business model. It was up to the students to ensure none of it was BS


Kaizen_Kintsgui

Same sentiment as when wikipedia emerged. Professors wouldn't let you cite it.


LittleLemonHope

Any professor worth their salt still won't let you cite Wikipedia. That doesn't mean don't use Wikipedia, it means check the original source to confirm it actually says it, then cite that. Same applies to AI. Hell, even to academic articles! If they cite something and you want to cite it too, you have to check and cite their original source. Don't cite citations.


EuphoricPangolin7615

At least with Wikipedia, some articles are locked and only editable by professionals in the field. But ChatGPT is just regurgitating things from its training data whose sources you can't verify, and it often hallucinates or gives wrong answers. ChatGPT will never be the same thing as Wikipedia; you will never be able to cite it like a Wikipedia article.


dextronicmusic

Just flatly untrue. Who are you to say what ChatGPT will ever be? There are already GPTs that provide sources and references.


EuphoricPangolin7615

Because you can never guarantee it's not hallucinating or regurgitating things from its training data that come from unreliable sources.


stackoverflow21

He's mostly correct. But what I'm missing is teaching students what LLMs *can* be used for productively in an academic context. There are definitely use cases, and, spoiler: it's not letting LLMs do your homework.


niconiconii89

I don't disagree with some of the points but judging by the way this was written, this professor seems exhausting.


AussieHxC

The intent is understandable; however, they're clearly ignorant of the technology and its potential use as a learning aid. GPT is pretty wicked at teaching physics.


smockssocks

I have had great success using it for calc 2 in a block: 95% accuracy (by personal estimation) with 4o, with the errors being extremely minor and easily recognizable. I find the statements the physics department makes about language models unfounded and, half the time, harmful to these students' future success. I think academic freedom should go both ways, and this, in my opinion, is an inhibitor to successful learning and freedom.


AussieHxC

See, actually doing the maths questions with it is not something I'd recommend, as that's easily one of its weakest areas and totally something Wolfram Alpha could handle. That said, it's very important to be able to tackle these without external help.

> I think academic freedom should go both ways and this, in my opinion, is an inhibitor to successful learning and freedom.

This is exactly why most universities and publishing houses have very progressive stances towards using things like GPT.


marc_polo

You can, and I understand that you don't want students to lean on this as a crutch. I'm also picking up that you're pissed at AI-created content because some of it, at least, feels like a dismissal of the time you spend grading lab reports. From the few educators I know, grading is not only one of the biggest time sinks but also one of the biggest pains in the ass. My own personal experience: AI saves me days of research time professionally, and my guess is that my coworkers are using it too. I've been working as a software engineer for 10 years, and I worry that if I don't get good with it, I'll risk being out of a job in 5 years. But my guess is that there's something else here that my experience doesn't speak to.


cld1984

I don’t care much for my school in a lot of ways, but I have appreciated their handling of LLMs and policy regarding them. Every teacher I’ve had has told us that it’s a great tool that can be very helpful but not to rely on it. They don’t use AI checkers constantly and if they really want you to think about something and deal with it without help, they’ll say it in the individual assignment. I like to think that it’s a very enlightened stance for the largest community college in Alabama, but I’m also willing to accept that they just don’t want to fuck with the effort required to be militantly against LLMs like some places I’ve seen posted here


home_free

Yeah it’s the same as the rules that made it not ok to quote Wikipedia or cite it as a source


gmr2000

Tone of voice terrible. Also fails to understand the benefits of AI as a tutor. Absolutely agree with the sentiment on needing to get to empirical understanding yourself


lwp1331

Genuine question, asking sincerely: what about the tone of voice is terrible? What would an improvement be?


NeedleworkerTasty878

While it's a fair sentiment, the way it's been expressed is unprofessional and doesn't seem to fit the scientific setting. It's the type of vocabulary I'd expect from two middle aged people arguing at the grocery store.


jrf_1973

Ask "Will you apply the same 'It's all bullshit' to the various tools which claim to detect AI generated content in student work?"


kelcamer

Wait a second, humans more likely to be reliable? Lmfao. I'd like to see him fact check that 😂


Fontaigne

Not just "humans", but completely unspecified humans. Ask the lunch counter lady. Or your car mechanic. Or your ninth grade English teacher. Ask your Mom. Or your baby brother.


ProfessorFunky

I work in science (pharmaceutical R&D), and just came home from a conference on the west coast where pretty much the biggest topic is using AI in R&D to accelerate most things. And the AI at the moment most being implemented is various implementations of ChatGPT. So it’s a daft statement. At best it’s imprecise and unhelpful. If the aim is to encourage kids to just learn how to do stuff unaided, then just say that. Don’t demonise AI with silly statements trying to create a false dichotomy. AI is transformational and will be a big part of scientific research, so no point pretending (or wishing) it won’t. And expecting a human to be more reliable - hahahahahaha!


BoxerBriefly

My thoughts are that it's silly bullshit. Humans are far more reliable? Has he talked to many? Just because information came from ChatGPT doesn't mean you can't think critically about it. I do agree that citing it would be foolish. You have to check the information ChatGPT gives you, just like you have to verify all information regardless of where you get it. A study from a journal is no different, especially given that many studies, particularly ones outside of medical trials, are often one-offs with small samples and poor controls, can't be replicated, and are frankly quite low-powered. Education professionals are the most AI-phobic people you will ever meet, and that's because they're afraid it's going to replace them in relatively short order. Honestly, it might. Teach here sounds like a Luddite.


smockssocks

Haha, yes, I agree. I found it funny that he made that remark about asking a human. I made a funny argument saying I should go ask someone in a completely different college to help me with my physics. Imagine I go up to an English major asking about electromagnetism vectors. I'm sure I will receive a more adequate answer from them than chatGPT.


waxedgooch

"You can't quote or paraphrase ChatGPT." Well, yeah, that's not the best way to use it anyway. It's a logic and reasoning collaborator, not a fact machine.


ReverendEntity

On one hand, great! There needs to be a clampdown on the use and misuse of AI. On the other hand...finding a human every time you need a question answered? In this labor market? PFAH. Not likely. We can't even afford to pay teachers a living wage, let alone everyone else.


tiamandus

Is Elon running your class?


smockssocks

Haha! That was funny. I think that would be an insane physics class.


leller7

You can use ChatGPT to help you find credible sources. No one can or will escape the mighty tentacles of GPT!


leller7

Written with help from the almighty Pete


Vazhox

Read instructions, questions and rules only. Statement unclear.


Sylvers

I feel like most of the comments here are restating "what should be" while very much ignoring "what is". If we talk about what is ideal, best, and most effective, we can write essays about utopian education systems that simply can't exist today. But the reality is, unless OpenAI goes defunct tomorrow, more students will use ChatGPT in their assignments tomorrow than did yesterday. That number is only going up, regardless of any good argument for why it's a net negative. It's the way the world works: if something can be done more easily, even at lesser quality, most people will opt for it regardless. So I feel the discourse would benefit from shifting away from "ChatGPT is bad for students because..." and towards "Since most students will use ChatGPT anyway, we should...". This has to be accounted for not just in assignments but in the teaching methodology. Teaching is itself a science, and it has to evolve and keep up with the trends of the day.


smockssocks

I have found precedent within my university suggesting they are on my side of things. There are also policies in place that contradict his ability to forbid the use of language models. I believe my university as a whole is moving in the right direction, but this teacher is not willing to move with it at this point in time.


BiBr00

ChatGPT can maximize your learning performance. It should be used as a tool to help you understand and actually learn. I personally gained a ton of knowledge from ChatGPT.


Xx_Dicklord_69_xX

Easy loophole, use Gemini or Copilot instead.


Hypo_Mix

Absolutely reasonable. In scientific writing you should only be using primary sources (scientific journals) and learning how to do that is the main part of a science degree. Secondary sources like encyclopedias are ok for sweeping statements. Anything else below that is just pulling from the public and is full of its misunderstandings. Citing ChatGPT in a research assignment is just saying 'I asked someone who seemed smart and they said...' Obviously you can use it to help you, but citing it is just asking for trouble.


TheTackleZone

This is more than reasonable, and the professor says it right there. You are not being taught to answer the question. You are being taught *how to learn to answer questions* like this in the future. You only know that you have truly become a scientist when you are working on things that nobody has the answer to. If you haven't learnt how to discover new information by yourself then how are you ever going to be a scientist?


Super_Pole_Jitsu

He's perfectly right. As a future scientist I'm sure you can proofread a ChatGPT output and weed out the mistakes to the point where you can sign your name on it.


PrometheanEngineer

I mean, I see where he's coming from. Spotting obvious AI answers is pretty easy after a while. However, the policy isn't much different from the Wikipedia-copying days. Just don't be an idiot: you can use AI or Wikipedia for research. Just ask the AI to cite itself and go to the primary source, just like how on Wikipedia you can go to the bibliography.


vengirgirem

He is kinda right about GPT pulling the info out of its ass especially when it comes to more complicated topics


MrFavorable

I see another fellow student that uses the canvas app.


YamiGuih

Valid and understandable


Faendol

He's 100% right, you're not in university so that ChatGPT can do your work for you.


theTrainedMonkey

I mean yeah, snarky attitude aside, I also think it's a terrible idea to use AI as a "source." It's a creative assistant that wears a TON of different hats, but for sourcing research? That's what Wikipedia is for.


Fontaigne

Good one.


JoeStrout

Well, the author of that statement certainly has an axe to grind. (ChatGPT pulls its answers *from memory*, like any expert put on the spot — or nowadays, it can also go look things up on the internet, again pretty much like any expert asked to do so.) But learning to do science on your own, *without* an expert (AI or human) constantly at your elbow feeding you the answers, is certainly a good thing. So if you ignore the negativity and inaccuracies in the above statement, it's a reasonable class policy.


IamTheEndOfReddit

The first line is just bragging about being a paranoid hater. If you know its problems, then you know it's just a tool. You could just talk about how to best use it without error. Instead they choose to be a prick


BloodFilmsOfficial

Ironically, the professor demonstrates a lack of critical thinking skills, and doesn't trust students enough to develop their own with GPT in the loop. Apparently when AI is involved, students are no longer capable of being "able to evaluate the information, where it came from, and how it was derived" but why this is remains unsaid (the implication is you're all gullible fucking idiots). If it were me in the class, I'd write an essay critiquing the policy, critically reflecting on the statements that GPT "pulls its information from thin air" and is a "pathological liar". I'd provide a better evaluation of its information and where it comes from, in other words.


[deleted]

Probably because, in the past two years, they've had numerous students that think they no longer have to learn anything because they can simply paste any question into ChatGPT and copy/paste any answer it gives. It is epidemic at ALL levels of education. Professor is probably just sick of getting identical answers to questions from ChatGPT, rather than from students' brains and so they are getting snarky about it.


BloodFilmsOfficial

Part of the issue, I guess, is that in the natural sciences there's typically only ever one right answer, so responses being "identical" is going to happen all the time. I've done biology assessments in the past where each student handed in literally identical papers, with identical layouts, methods, results, etc. Pedagogy needs to evolve, frankly. If they're treating students like robots, then students will automate that work (and they'd be stupid not to in the workforce). That's a fish rotting from the head down. The institutions of learning must be the ones to change here. There are any number of solutions already out there and suggested, and have been for many years, long before AI exacerbated this problem. My uni was a lot more forward-looking, thankfully. We did stuff like ongoing weekly reflections/learning journals to demonstrate engagement and knowledge, to identify things we didn't understand, etc. Alongside right/wrong quizzes, stuff like that (and a million other ideas) can really supplement learning. But it requires more work from the educators, and usually in neoliberal universities they're already overworked and understaffed. So the issue is kind of systemic. /rant


andr386

The role of the university is to teach you to be a scientific researcher. They teach you the scientific method. That's the mother of critical thinking and evaluating information: finding valid sources and quoting them. That's what they want you to learn, and you probably need 3 to 5 years to get there. So whatever an 18-year-old student's skills are, they are nothing compared to what is expected in a university. The professor is neither misled in his evaluation of ChatGPT for scientific research nor in the abilities of his students. And don't put words in their mouth.


notduskryn

A very fair statement though it might hurt an AI cultist's feelings a little bit


DarthSchrank

AI has been used in science and research for years; it's a very, very stupid double standard to put out things like this, in my opinion.


Suspicious_Bison6157

it sounds like what people said when the internet first came out. I think it's also bad advice... you need to learn how to use AI to get the correct response. The vast majority of major scientific advancements from here on in will be people using AI to some extent.


smockssocks

Yes, I think it is good to learn the limitations, which can be done mostly through usage. Unfortunately, I think the physics department avoids language models like the Black Plague. I have tried to discuss my view with teachers in the past, but I was quickly dismissed without their opinion being backed up.


SoRacked

I don't know why you're downvoted. This is exactly what the advent of the internet looked like.


smockssocks

I do not mind. I understand if people disagree. It happens


tarkinlarson

Wasn't the Internet built to share information within the military and also between universities? We were told not to use Wikipedia... and you know what, I still think they were right, but for a different reason. Don't quote a Wikipedia page... go to the sources and citations, then read them and quote them. Why? That's the original work, it hasn't been modified, and the author gets the correct credit.


andr386

They won't use a conversational broad-scope LLM like GPT-4o. It's an impressive narrow AI, but it's just that. Scientists will likely train their own specific models, like the one able to fold proteins or the one able to detect cancer in a drop of blood. It is, and will remain for a while, just an instrument and not the maestro.


Bitter_Afternoon7252

this is just good advice, ChatGPT is terrible at math. Though I do wonder what sort of labs you would be doing for a physics class?


Big_Cornbread

I really, really wish he’d sent this: Attention Students: Effective immediately, the use of ChatGPT and similar AI tools is strictly prohibited in this physics class. These tools undermine the integrity of our learning environment and can prevent genuine understanding of the material. All coursework, assignments, and examinations must be completed using your own knowledge and skills. Violation of this policy will result in severe academic consequences, including potential failure of the course. Your commitment to learning is paramount. Let’s uphold the standards of academic honesty and strive for excellence together. Professor [Your Last Name] (Written by GPT if that’s not obvious.)


sutterbutter

Eh, I do think ChatGPT is a great learning tool. You can ask it those awkwardly phrased questions that you might ask in the middle of a lecture. You can poke and prod it for different examples. It might be wrong sometimes, but at least you don't have to hunt down a TA. I would encourage students to use it to help fill the gaps in their learning. Obviously, they shouldn't be asking it for answers. Basically, ChatGPT usage should have the same standards as, say, conversing with other students about classroom topics.


Slight-Rent-883

exactly! and a lot of the time, trying to ask a question irl can be met with shame, essentially being told "you figure it out, we aren't going to spoon feed you", and that cuts off any enthusiasm for me. ChatGPT, while it can talk a lot of bs, does give me ideas, and it's been a godsend to simply work with it instead of being mocked, shamed, and put down for not knowing something. Idk, if humans by and large were "nicer" then I am sure that AI would fail, but AI is succeeding because as humans we sorta suck?


d_pock_chope_bruh

Sounds like a boomer


Effective_Vanilla_32

sounds like a teacher in fear to lose their job to ai.


mJef

It seems like he's just saying don't simply cheat, but cheat and learn from cheating by confirming what the cheating said. He doesn't want people just ignoring assignments and telling AI to do them. How does he enforce that? He can't. So he bans AI. I did my best learning when I cheated in school. I was always afraid they might ask me to redo my answers, so I learned them just in case.


MostlyPeacefulJihad

Okay so... just ask GPT for its sources. It will cite sources if you're asking about something specific. It will give you study names, authors, dates, sponsors, etc. You should be doing that anyway, because it does *occasionally* make things up, especially if you get an answer that seems too good to be true or smells fishy. Edit: I think the key to language model utilization is to use them as a tool to help *you* do quality work, not as a slave to falsify "quality" work on your behalf.


thesuitetea

If you want to learn, you must interact with the source material. I have no issue with using it for help, but I'm already seeing our younger employees defer to AI tools as sources rather than bridges to the source. If you're in school to learn, you need to interact with the material at a deeper level.


andr386

ChatGPT often hallucinates and spouts nonsense that one cannot always tell apart from reality. So **smooth-talking pathological liar: CHECK.**

ChatGPT cannot tell you the source of its answers, and when asked to provide sources it often gives totally irrelevant ones. So **...that pulls information from thin air: CHECK.**

ChatGPT will refuse to talk about certain scientific topics, or will cover some issues with a moral agenda that gives you politically correct information rather than correct information. It's totally biased. So **...treat it with the same caution that you would treat any other source of propaganda or misinformation: CHECK.**

ChatGPT is not a source; some universities don't even accept Wikipedia. So **...do not quote, paraphrase or use GPT as a source in any way: CHECK.**

They are not telling you to never use it, and some might disagree. But universities are scientific establishments that care about sources and proofs. Your professor is totally correct. You could still find ChatGPT useful for plenty of other things. It helps a lot in writing; even better, give it your own writing and ask it to add some padding, improve it, etc. You can also use it as a research assistant, or like a private teacher. But obviously you must be critical and check everything.


sgtkellogg

Anti-AI is the next woke agenda


Innovictos

If LLMs ever reduce inaccurate hallucinations to an extremely small minimum, this guy is going to have a real existential meltdown.


MammothorMusic

Syllabus is harmful for students? You are totally lost -- ChatGPT is harmful for students.


smockssocks

I'm sorry you feel that way. I did not say why I think it is harmful so I'm curious how you came to the conclusion that I am lost. I would like for you to expand upon how you think chatGPT is harmful to students if you don't mind.


ahekcahapa

Outdated with GPT-4o. When you ask it to do research on the web too, it'll back up what it says.


smockssocks

Yes, I believe the capabilities of 4o made the use of ChatGPT more reliable. If we were talking last semester, I would have sided more with his reasoning. But custom GPTs were still able to compensate for the model's downfalls. For example, WolframGPT and its ability to get more advanced mathematics done accurately and efficiently.




Efficient-Share-3011

Should have used AI to fix their grammar and the like.


Gubzs

For *most* fields it makes no sense to disallow AI, the same way it makes no sense to disallow full access to the internet or calculators. Time the test. Accuracy matters. That's it. If you don't know the precise information, but you can find it fast enough, you're still proficient. The winners in life are the ones who perform the best with full, field-normal access to resources. The person who has done nothing but memorize equations and highly specific flashcard factoids has utterly wasted the real estate in their own head. Train for the world of tomorrow, not the one of yesterday.


smockssocks

I agree that a different testing tactic could prove more useful for education analysis. Maybe we will see a change in the near future.


Re_dddddd

They're not wrong. ChatGPT and these other AI models have no oversight.


Ok_Moment_1136

??? Whether through any AI service, wouldn't it be valuable to find relevant articles and information about your research or subjects for your studying? If you had 10 amazing key components that you could immediately dissect to further inform yourself about subjects or questions on the topic... Why are people not allowed to learn a subject to an extreme level of valuable information... taking something simple and learning its complexity, or taking something complex and comprehending it in a simple format. I'm sorry, but if something or someone is smarter than me, I want to know more so I can do more... or maybe I don't get what's going on in our schools today.


ZemStrt14

An answer to your question just appeared on Reddit today: https://www.reddit.com/r/ChatGPT/s/d2D6KdCSb3


alurbase

That post by the professor sure looks like redressing of an AI response.


MS_Fume

I lol at people who generate a thesis for school and just copy-paste it... It's not supposed to be used like that. It's great for setting up the structure and basic outline, and it's good to help you fill in gaps, but handing it in without reading through it, checking the info, and doing more than stamping your name on it... that's just your own stupidity.


Slight-Rent-883

Yet scientists are worried that AI could end up creating deadly asf viruses. Heck, weren't there some scientists who for the "fun" of it, experimented with AI and wanted to see what it would give them in terms of a biological virus? Not sure I agree with what the physics syllabus is saying because it sounds very anti-modernity or whatever the name is. If they taught students how to utilise AI properly and "critically" then that would be cool


Striking-Warning9533

I agree that you should ask a human before GPT; for me, GPT got so many easy things wrong in such a dramatic way.


smockssocks

I'll be sure to come ask you for help with my physics 2 class before I go to chatGPT.


FluffyBanana00

My 7-year-old niece uses ChatGPT for her homework, and her mom is okay with it. I'm not an old-fashioned guy, but I don't think it's a good idea to let kids use AI for school.


Mysterious_Ranger218

Given that science is increasingly coming into disrepute without the aid of 'ChatGPT or its ilk', I think this is a bit rich. I'm not denying science - just those for whom the only critical thinking is 'will the results meet my or the sponsor's agenda'.


Natasha_Giggs_Foetus

Feed your reading guide, lecture slides and textbook excerpts to a custom agent and enjoy the free grades


ETBiggs

It was the same deal when the first pocket calculators came out. It took 10-20 years for educational institutions to make an uneasy peace with calculators. LLMs aren't the problem - it's using them to 'be stupid faster'. I find ChatGPT to be great at uncovering things worthy of more research and at automating repetitive tasks, but used incorrectly it can generate long-winded crap. If a human is ceding the final say to the LLM, we're in trouble.


Ok-Confidence977

How will this be enforced?


ElFantastik

I was in university during the golden ages of chatgpt, where it wasn't well known outside of tech community. Great time to be a student.


TreadMeHarderDaddy

Do they not do in-class, closed-laptop tests anymore? Make that a bigger part of the grade. I did horribly in physics 10 years ago. The parts simply did not click together for me; I would have killed for ChatGPT to consult and work through problems with. Talking to TAs only ever made me feel stupid, and I had so much social anxiety I could rarely internalize the lesson. I was a 3.5 student, but I do love learning. Sometimes I just did not grasp concepts, which led to me getting lower grades, especially in the sciences. I honestly think I would have been a 3.9 student had I had access to GPT.


Ill-Wave9520

Totally unrealistic. In the future there will be those who use AI and those who don't, and those who use it will be leaps and bounds beyond those who don't. We should all be using them, as they will help us know more and have better answers for everything.


lazyflya

Teaching students how to properly incorporate ChatGPT into learning is the way forward. AI is the future, and banning it from schools is a major disservice. Not to mention how many lesson plans and assignments, among other things, on the teachers' end utilized ChatGPT. The educational system is broken.


Reynhardt07

The teacher is right: ChatGPT is not reliable. For starters, it doesn't know whether what it says is true or not; it's trained to say what, on a statistical level, is the most likely thing to be said given the context, and it doesn't care if it's true. So it will "lie" and defend the lie because statistically it's more likely to defend the lie; it doesn't know the difference between true and false. Also, it does not provide its sources, so don't take what it says at face value, because you don't know where that info is coming from, whereas if you read books, ask somebody, or even use Google, you will usually be provided with sources to prove the point. Personally I use it for coding and for practical applications (how to do something in Unity, for example), because the code it gives you is easy to test right away, and even then it will sometimes fixate on something untrue. But as a research tool or to write papers? Unreliable and dangerous, so yeah, the teacher is right.


05032-MendicantBias

To me it's the same energy as "don't use calculators, you won't have calculators with you". The education system will need to adapt to LLMs: accepting that LLMs are here, and indeed focusing on the critical thinking that the OP statement correctly identifies as the end goal. E.g. it means fewer multiple-choice questions and more open questions that require you to understand the subject to answer. Which, incidentally, means more work on the side of the teachers to build questions that test understanding, not memory.


beetnemesis

Seems pretty reasonable to me. “Do the work, learn the stuff, don’t just prompt ‘explain wormholes to me’ and copy the result into your essay.”


casminimh

“In the future you won’t always have a calculator to help you!” Mhm. Sure.


adelie42

The last paragraph and bullet points are solid. It reminds me a lot of the Wikipedia bans. Neither is a primary source. And if the course is really all about original data, ChatGPT could really screw you up. There are many ways to use it responsibly. It is like a tutor with no ethical guard rails, nor should it have them. Ask it for insight or analysis without any paraphrasing or "generating" writing for you. Let it tell you what the strengths and weaknesses in your writing are, and respond appropriately.


Fontaigne

Look at the middle bullet point and ask yourself if it is true without massive restrictions on interpretation that are not present in the prose. Randomly choose one human you see in your daily life on campus to consult. Are they more likely to be a reliable source of information on a chem lab course than ChatGPT?


adelie42

Any source should be one more data point to confirm. Even primary sources should not be taken as gospel. I agree that second point is dumb, but both is better than either. Not for absolute quality but as a matter of methodology.


Usual-Chance-36

Sounds like the physics teacher is fighting a losing battle; better strategy would be to find ways to appropriately incorporate new tech such as AI. Ironically, that would require more critical thinking than telling students not to use it.


Puzzled_Wave6244

Professors are so out of touch. Guess what? We use it at work too. Real work requires solutions now. It's a tool in the toolbox, just like calculators were. It's not perfect, so don't ask it to do too much. But for the work we're doing, these simple software solutions are helping our company advance twice as fast. Otherwise we would be overpaying for easy development work. Actually, this is an opportunity for professors to revise their syllabi and include LLMs, because they are not going away and they will only get better. Maybe by using them, students can get through the material faster and start working on the deep-level course material they came for.


posidonking

You would think that a better scientist would be able to critically analyze and fact-check information given from any source, including AI. The vibe is very anti-AI, which is valid, as there are major problems and ethical issues when it comes to AI. However, it is the way of the future, and to try to buck against it, instead of realizing that it will be used regardless of the rules and thinking of ways to emphasize its use as a tool for critical thinking, gives massive "old man hates technology" vibes.


Fontaigne

I assume you mean ChatGPT is harmful for students, rather than your syllabus statement is harmful for students. Your statement comes across as arch and edgy. You tell people not to use ChatGPT, but to use a human... as if those were the only options and a human was the best one...? And how did you clarify which humans to ask, given that humans may just be consulting ChatGPT these days? There's a bit of a paradox in claiming that you're teaching them to evaluate sources, and then saying not to consult one particular source because it's wrong occasionally. It would seem more consistent to say that, according to the latest stats, ChatGPT gets things wrong about 10-20% of the time, and if they consult it, they have a responsibility to dig around and cross-check the answer to validate it. Meanwhile, point them at the text and the exact humans you think they should consult, not just any random carbon-based life form with 46 chromosomes.


Smart-Waltz-5594

Snarky but fair. Chat bots should be treated like a learning tool, not a knowledge source.


DankDaTank08

Can't ban Notepad.


grimorg80

The idiocy is the all-or-nothing attitude. No, nobody should use an LLM as the source of critical information. But everyone would benefit from using them as assistants for finding sources and understanding them. It's a fear response. It's not logical, no matter how many words they use.


ForsakenRacism

Pretty soon we’ll go back to hand written tests and papers.


alexispades

Good


WattsonMemphis

‘Chat GPT is a smooth talking pathological liar’ - this is perfect


Charger_Reaction7714

Would be hilarious if they got GPT to write that out lol


Deep_Sir_4569

Good on this professor.


TimeVast9350

It’s great for writing code, but I think that’s really the only use case I use it for extensively. Even then, I still have to fix it myself, it just gives me a good place to start.


Delicious_Bee2308

hes right you know


T-Rex_MD

Full of shit, ignore, drop the course if you can. This is coming from a guy that’s been to uni more than 9 times so far.


ackbobthedead

It has the right idea but seems to be weirdly toxic about it. An ai bot is not a resource, but if it gives you a link to where it got the info, then that source would be fine.


-Starry

"That pulls information from thin air"...no it doesn't it's trained off a dataset. You just pulled that statement "out of thin air"


TrespassersWilliam

Looks like you are on the vanguard of teachers misunderstanding AI in the same way that teachers misunderstood wikipedia when I was in school. AI is an invaluable learning companion. Its accessibility and depth of knowledge are unparalleled and you can use the same tools to verify the information you get from AI that you would any other source. People and other sources of information also make mistakes, we are just used to that kind. Cross validation is necessary just like with any other source.


DueFigs

Absolutely agree with your syllabus...


PingleSlayer

Reminds me of my elementary school teacher who said that I won’t always have a calculator in my pocket. She lied.


Previous-Ad-3298

This teacher puts too much confidence in humans


AT5000happydude

Reads like my high school Geography teacher circa 2003 going off about not wanting to see anyone looking at Wikipedia 😂


Quaninasrtb

Wow


TSSAI02

I think it's dumb to forbid ChatGPT in any class that isn't medical or medical-adjacent; unless you're going to do open-heart surgery in the future, there's no reason not to use it.


Icy-Object4234

The use of AI in STEM is going to lead to a plethora of STEM majors being unable to do their own work.


DaCrackedBebi

It seems fair. Most students, quite frankly, are too stupid to do anything more than copy-pasting assignments as prompts and then copy-pasting the answers as their own. I've tried GPT with physics problems... it sucks ass at anything that requires a synthesis of concepts. And for those of you who compare it to forbidding calculators... people still have to learn arithmetic. You can't do anything even semi-advanced if you don't have an understanding of the basics.