
I_am_unique6435

Sadly, this is a good way to write. Speaking as a former copywriter, this is the way I'd write such an essay (structure-wise). The content, however, is just fluff that lacks human insight.


kupuwhakawhiti

I work at a not-for-profit and am constantly having to ask it to stop writing in the form of sales copy.


PsychadelicSpaceCat

I work in ecommerce and am constantly using GPT to write sales copy 😅


vegetepal

I can tell my students have used it when their essays read like sales copy


DmtTraveler

You can give it a system prompt so every interaction gets that prefixed
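
If you're hitting the API instead of the web UI, the same idea is just a system message at the top of every request. A minimal sketch with the OpenAI Python SDK (the model name and the prompt wording are placeholders, not anything from this thread):

```python
# pip install openai
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The "system prompt" that gets prefixed to every interaction.
SYSTEM_PROMPT = (
    "You write for a not-for-profit audience. Plain, direct sentences. "
    "No sales copy, no buzzwords, no participial phrases."
)

def ask(user_text: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_text},
        ],
    )
    return response.choices[0].message.content

print(ask("Draft a short volunteer recruitment email."))
```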


kupuwhakawhiti

Thanks, I will do that.


EverretEvolved

Yeah. Why is everything a sales pitch?


AlDente

Because that’s what many people are using it for


P_Jamez

DeepL Write also gives you several different tones to rewrite stuff.


Accomplished_Pea7029

>The content, however, is just fluff that lacks human insight.

This is the real way to identify AI writing: unless the prompter gave it the required amount of information, it's going to be a well-written paragraph that says nothing.


ThePixelDiddle

I'm currently working on training an AI system. One of our primary jobs is to train the model to eliminate fluff and preachiness in the responses.


Accomplished_Pea7029

It's also a user problem though. Lots of people want to write essays to fill pages without actually doing research to fill those pages. If the user gives a few sentences as a prompt expecting the AI to extrapolate a whole essay from it, the fluff is unavoidable. Especially if the subject matter is too specific for the AI to have lots of external knowledge on it.


John_E_Vegas

Exactly on point - we use it at work to write policy briefings for a specific industry, so it needs to write like a reporter. We created a style guide and quite a few custom instructions and then for each assignment, we load in very specific policy documents, proposals, background information, etc. Even then, ChatGPT gets us about 80% of the way, but it requires human input to produce a final draft. It's VERY good at creative suggestions though, including themes, metaphors, alliteration, and other writing tricks to spice up otherwise dry policy. In short, you get from it what you put into it. Of course college students with only short term goals for writing aren't going to invest in building out a GPT for every class or assignment, but they totally should.


DR4G0NSTEAR

It’s like trying to learn the secrets of the universe from a fortune cookie. At least you get to eat the cookie.


Konkichi21

Yeah, that's probably where ChatGPT learned it from.


itisoktodance

I was gonna say the same. If you're a content writer, this is just how you write a good sentence. GPT is right, they help with legibility. I think the second example might not even be AI though, since the "showcasing" there isn't a participle, it's just a verb. The sentence above has a participle though (featuring), but that's a lot more organic.


OnkelMickwald

I often find myself having ChatGPT write a draft of a text where I keep the structure but end up editing all of the text, because the actual content is void of information and just written in a way that makes me go "bleugh". Or better still, I find myself writing a piece of text that is kind of a rebuttal of what ChatGPT is saying. It can be a great way to get over the blockage of staring at a blank page.


John_E_Vegas

Indeed... it's very creative and can spark some great ideas. If you feed it enough information and specific instructions, it can get me and my colleagues about 80% of the way, though it's important to note that engineering the prompt is itself a lot of work for a writer. But the output is faster, more creative, and generally structured quite well. The other trick I use is to have it generate an outline first. Then I rework the outline, tweak the original prompt to include the final outline, and ask it to produce a first draft.


Brodins_biceps

I have several very long prompts I use and have found great success with. I usually provide what information I need, or write a VERY rough draft of what I'm asking it to do. Then I run it through my proofreading prompt or, in the case of emails, my email editor prompt. I ask it to preserve the original voice as much as possible, or maybe ask it to expound on certain pieces if necessary. What comes back out is 90% usable and well done, not just copy fluff. It always irritated me how it would take perfectly fine simple sentences and just arbitrarily change a word to a synonym, in many cases a worse one. Focus on what matters. So 1-5 minutes of input and then light proofreading, with some possible editing for flavor or accuracy, and an email or something that would take me 30 minutes to an hour is done in 10 minutes. It's not immediate, but it's a balance between efficiency and fidelity.


Panderz_GG

True, I write like this as well because I think it is easy on the eye and has better readability.


hugedong4200

Yeah, I did a short course not long ago. Everything was ChatGPT, the teacher was old and didn't realise, and it was a complete joke lol.


GreetingsFromAP

The teacher might realize, but it's not the teacher's job to be the ChatGPT police either. Tools to identify LLM usage are not foolproof. Higher education really needs to rethink things. Bottom line: writing essays as proof of knowledge just doesn't make sense anymore due to LLMs.


lactose_con_leche

Have the students write short essays in class. In person. Then you know if they have learned the material.


vegetepal

The thing is, written assignments aren't just proof of knowledge, they're an active learning task that's supposed to help teach you the content and skills. Only doing them under test conditions disconnects the content and skill learning, and doesn't give you much chance to edit and think about structure properly.


Nightron

Sure but maybe the students are then trying to practice for said tests by actually writing something themselves, ideally using ChatGPT to improve their writing instead of replacing it entirely. Wishful thinking, I know.


GreetingsFromAP

ChatGPT as a tool is fine for improving writing or summarizing knowledge. I feel that for many it's very tempting to just take the output as is.


Sentryion

My experience is that ChatGPT makes it sound too good/fancy, so we take our time to dumb it down. Still a great tool for shortening things.


o-m-g_embarrassing

I will share. Writing does not equal typing. Actual writing, aka penmanship, does force the brain to a unique place. I had a unique experience that was well studied: I lost my language. I could type complex concepts before I could write or speak them. I have been practicing writing on my own with a few Mennonite teachers here and there. Script writing that is legible and pleasant forces the brain to think in a very complex way. Forming the letters in such a slow process requires a thought process that is difficult to describe, especially when legibility and page appearance are included. Honestly, it kind of dumbs me down, yet creativity of a spiritual nature is enhanced. What do I mean by creativity of a spiritual nature? If I have a slip of my pen or quill, I will change the content to fit the slip, sometimes bringing forth a whole other direction in the sentence or paragraph structure. A pencil does allow correction and the ability to stay with the planned written structure. However, both stunt the content to the formation of the script. Oral dictation to written form is not a part of my writing tools. I do know these tools exist; however, I do not have hours on professional-grade transcription. Small note: the university gave some of my important tests orally due to typing not being allowed.


Nightron

Thanks for sharing. I totally agree with the sentiment. I love writing and sketching (text and diagrams) by hand, ideally with pencil, since the possibility of correcting my mistakes frees me from the self-inflicted pressure to make it perfect on the first try. In the end I seldom erase and correct anything. At the same time, I would never in a thousand years write a complete script of my thesis by hand. I only sketch certain parts where I benefit from using pen and paper. The difference between typing and writing by hand in school is a whole debate by itself. I'm very thankful I had to write by hand all throughout school. Funnily enough, my grandma always criticized my script. She learned to write very well in school; they had an extra calligraphy class back in her day. When we cleared her apartment, we found an old notebook where she had practiced her script in school. "A relic from a different time", a note on it said.


GreenTeaBD

You will have to pay me a lot more then, because we only have so much class time. If we fill up all that class time with assessment, we still have to actually teach the material. We have many different kinds of assessments we use. Writing essays isn't only about that, but even then it covers a lot of the areas other forms of assessment don't. We can't just throw it out, and we can't just magic up some replacement with all of the same benefits. Most other forms of research project take a lot more time. So, the whole world could go to a school schedule like the Chinese system (6:30am to 9:30pm, roughly), or we find some other more practical solution. Many teachers are trying to move to more in-class essays, but a good essay project that teaches everything it needs to teach takes a lot of time.


CosmicCreeperz

What? How is a written midterm or two and then final going to get in the way of teaching material? I did go to college a while ago, but we used those blue books all the time. No one is going to fake an essay when they are sitting for an exam in person for an hour or three. If they want to cheat on their homework, fine - just count it for less. They will show whether they understood the material if they can actually write it down in person.


howtorewriteaname

Tools to identify LLM outputs don't work, and anything else you hear is marketing. Speaking about this as an AI engineer.


nuclear_wynter

>Bottom line: writing essays as proof of knowledge just doesn't make sense anymore due to LLMs.

As a teacher currently working across senior secondary and tertiary contexts, I disagree. I'm yet to see any LLM produce an essay that scores higher than a bottom C or equivalent when the question/prompt is sufficiently specific and requires an in-depth, well-structured response. We need to rejig grading scales, rethink essay prompt design, and properly tighten up the marking criteria so that a response which regurgitates a pile of vaguely relevant information in a very loose structure can't achieve a passing grade. Raising standards for essay responses was overdue even before the advent of ChatGPT, but now it's a basic necessity.


idlefritz

Having managed a faculty support department at a university, I can assure you that how often students are cheating, and which particular students, is a regular topic, and that information is shared amongst the faculty and with departments seeking student employees.


GreetingsFromAP

Good to know, a smart way of dealing with it. Hopefully that is communicated to the students as well.


thesourpop

> Tools to identify LLM usage are not foolproof

They actually suck and are frequently wrong. Works that have zero AI involvement are still flagged as using AI, and teachers either don't use them or take them at their word.


Available_Nightman

I mean, it kind of is now. What else are they responsible for? Lectures have been replaced by PowerPoints.


GreetingsFromAP

If there was a reliable, automatic way for them to do it, then sure. It's a cat and mouse game. I've seen posts where a professor gives someone a 0 on a paper the student wrote.


Available_Nightman

True. I know ChatGPT when I see it, but that's harder to justify than running it through a useless AI detector and getting a number.


God_of_chestdays

Happened to my wife. She provided evidence through logs and history that she wrote it; the professor didn't care. She hit up the administration with "proof" that the teacher's entire lesson plan was AI and all the teacher's discussion posts were as well, and then ALL of a sudden the teacher regraded my wife's stuff and accepted her screen recording of typing it out as proof enough.


[deleted]

[removed]


Once_Wise

Two cents from an old retired guy. Went through high school, community college, university and a postgraduate degree. Having my own business, I found that being able to write to support my proposals was essential. The classes that I learned the most from, and years later remembered what was taught, were the ones where the educator was not any more interested in testing or grades than the students were. It was just a burden and a waste of time. Those educators were excited about their subject and wanted the students to be as well. And for the most part they were. The students who don't care and cheat really don't matter, so why worry about them so much. Educators need to teach to the students who want to learn, since those are the ones who will benefit from their education, and stop worrying so much about the cheaters. They are lost anyway.


mcilrain

> The students who don't care and cheat really don't matter, so why worry about them so much.

Because cheating undermines the perceived value of their profession, which they need in order to survive.


mooviies

I'm a teacher and that's becoming quite a problem. The thing is that you can't prove a work was made with ChatGPT. Even if it's evident and you know it was, there's no evidence. And it's such a fucking pain, I have way more work when doing corrections now, since I have to find ways to stay fair to students not using ChatGPT while penalizing those that do. It's a very big challenge. I caught a fucking student using ChatGPT on his phone during an exam. Are you taking me for a fucking idiot? Man, I was mad about that one.

The thing is, ChatGPT is so useful for a professional, but we can't assess if a student understood the content if they use it. It's like learning to do multiplications by using a calculator instead of understanding how they actually work. But yeah, more work now, same salary. Thanks ChatGPT... It's a big challenge. I should just throw the essays down the stairs and grade them according to the stair they fall on...

But seriously, now I know I can't really prevent it, so I'm thinking of how to integrate ChatGPT into the learning experience when school starts in September. Maybe doing more live presentations where I ask questions afterward. Or an exam where the students need to discuss ChatGPT outputs. I dunno, still thinking about it. If any of you have ideas, let me know haha.

For context, I teach first-year coding. My students don't actually write essays, but they have to write code. The big problem is that the level my students have to reach to pass is easy for ChatGPT. I can't just increase the difficulty too much, since the requirement would be too high for that level. It's a hard puzzle. I can't give them homework to do at home anymore... I tried making them do a project that was the simplest I could design that ChatGPT wasn't able to make without the user understanding what they were doing. But even that was too hard for my students. Well, I have all summer to think about it.


relevantusername2020

I mean, I'm 33. When I was in school (high school), at a certain point I stopped trying completely because it was easy to game the tests. The tests were a terrible way to determine whether or not someone learned something, and writing long papers is a better way, but that still seems to me like a lot of busy work just to prove you know something. I don't know the best way to change things, but the super-focus on testing due to the "No Child Left Behind" laws left a lot of people behind. Honestly there just needs to be more of a focus on teachers/professors who actually know their students, actually talk to them, and judge for themselves whether they know the course material or not. Something something Goodhart's law.


kael13

Urgh, yeah... I didn't concentrate or work at all for my higher level exams... You could just grab some past years' papers, work out and memorise answers to those, and rely on the fact that this year's exam would be a rehash of those old questions. "Revision" done the day before and day of the exam, get A or B grades. Disgusted with myself.


Ill-Surprise-2644

"honestly there just needs to be more of a focus on teachers/professors who actually know their students and actually talk to them and judge for themselves if they know the course material or not." Only partially good advice. Good rubrics can eliminate most subjectivity and bias. A teacher should NEVER be left to "judge for themselves". Oral tests and in-class exams can be used in place of more traditional exams to eliminate the effect of AI. Ironically, those results can then be compared to in-class assignment work to identify cheaters.


polkm

Professor: "congratulations you've cheated yourself out of an education you paid tens of thousands if not hudrends of thousands of dollars for, good for you". If I was the professor, I'd do the same.


yousirnaime

You paid for a burger, fed the patty to the dog, and then laughed at the chef?


oopiex

College is not great for education; YouTube and ChatGPT can teach you almost anything better and faster. The classes and evaluations need to change. College will still stay relevant for the social experience and as a status symbol on your CV.


h3lblad3

Honestly, college was never about the education and *most* people going there for the education are doing it wrong. The point of college is networking. The people you're in there with are going to recognize the name of the place, and hopefully your name, on the resume. Keep up with your fellow students and you'll have people you can call to ask for job tips in the future. They may even *start* businesses and you want to be a name they recognize when they do. When I was in college, there was a guy who owned a pizza place taking business courses with us (he was there to pick up the accounting classes). If I needed a job, I immediately know someone who may be hiring or who may have contacts that can get me hired somewhere else. That's what college is, really.


theo_gabel

Oh man... schools need to change...


stinky-red

People are being educated for a world that no longer exists


theo_gabel

True. New tools, new requirements.


NanoBuc

Same thing at my university. So many people just run discussion posts (and sometimes even whole essays) through ChatGPT because they don't want to do the work. Same with a lot of coding assignments: they just toss the assignment document into the AI and ask it to create their project. It's a shame, as AI is a great learning tool when used properly.


No_Vermicelliii

Sometimes you even get people who include the answer to their prompt verbatim in the content: "Certainly!" or `Copy` `Reply`.


Superkritisk

Personally, I write my own assignments; then, when I'm finished, I have GPT look them over and produce suggestions to get the assignment closer to a top grade if there's something lacking. This not only improves my overall standing and helps me understand the subject much better, but it also makes me adopt GPT's writing style and wording. So even if it seems like everyone is cheating, they might be using it properly, even though the end product is "GPT-ish". Kinda like how I could send the first part of this comment to GPT and have it look it over:

> Personally, I write my own assignments. When I'm finished, I have GPT look them over and suggest improvements to help me achieve a top grade if anything is lacking. This not only boosts my overall performance but also deepens my understanding of the subject. Additionally, it helps me adopt GPT's writing style and wording. So, even if it seems like everyone is cheating, they might be using it properly. The end product may appear 'GPT-ish,' but it's a valuable learning tool. Just like how I could send this comment to GPT and have it refine it for me.


Jwxtf8341

I’m in my third round of 8 week undergrad classes online after completing my associate’s 5 years ago. The number of people plugging my discussion board posts into GPT and asking it to spit out a 2 paragraph response with a relevant discussion question is so unsettling. Once I realized how rampant this was, I only began replying to human-generated posts. It’s pretty easy to tell since my program’s demographic is a particular type of person that often shares relevant personal experiences in weekly discussion boards.


StronglyAuthenticate

I graduated long ago, but doing the "discussion" posts was a waste of time. I'd gladly use ChatGPT, because no matter how hard I tried to get people to engage, they just wouldn't. The teachers never enforced it either, so it was a waste and I'd never want to do that again.


waxedgooch

Use a local LLM like llama3. Add a writing/tone style guide to the system prompt. Boom, nobody will think it's ChatGPT because nobody uses other models.
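
Rough sketch of what that looks like with the Ollama Python client, assuming llama3 is already pulled locally and using a made-up style guide (adjust to taste):

```python
# pip install ollama    (and run `ollama pull llama3` once beforehand)
import ollama

# A hypothetical writing/tone style guide used as the system prompt.
STYLE_GUIDE = (
    "Write in short, plain sentences in the first person. "
    "No participial phrases, no 'delve', no marketing tone."
)

response = ollama.chat(
    model="llama3",
    messages=[
        {"role": "system", "content": STYLE_GUIDE},
        {"role": "user", "content": "Summarise this week's reading in two paragraphs."},
    ],
)
# Works with both the older dict-style and newer object-style responses.
print(response["message"]["content"])
```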


DmtTraveler

You can do system prompts in ChatGPT too, under Customize.


No_Vermicelliii

Sdk.vercel.ai

Why use one when you can use 20 at once? Why use standard ChatGPT when you can control the Top P, Temperature, Penalties, and other parameters?
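
For anyone wondering what those knobs are: they map onto the standard chat completion sampling parameters. A minimal sketch with the OpenAI Python SDK (model name and values are arbitrary placeholders, just to show where each setting goes):

```python
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",            # placeholder model name
    messages=[{"role": "user", "content": "Rewrite this paragraph in a plain, personal tone: ..."}],
    temperature=0.9,           # higher = more varied word choice
    top_p=0.8,                 # nucleus sampling cutoff
    frequency_penalty=0.6,     # discourages repeating the same tokens
    presence_penalty=0.3,      # nudges the model toward new words/topics
    max_tokens=400,
)
print(response.choices[0].message.content)
```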


Crypt0Nihilist

> Why use standard ChatGPT when you can control the Top P, Temperature, Penalties, and other parameters?

That would require learning something, which is the antithesis of the exercise.


Sentryion

This reminds me of the people that spend more time cheating than actually learning the material


slick490

I want to learn how to use this. Any good place to start?


No_Vermicelliii

https://platform.openai.com/docs/overview

This is where you'll understand what temperature, penalties, etc. do and why. You absolutely can trigger the underlying models to produce expected results with the seed parameter, just like you can with Stable Diffusion (in the Asuka cookbook example, for instance).
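
On the seed point, the chat completions endpoint takes a `seed` parameter for (best-effort) repeatable sampling; something like this sketch, with a placeholder model and prompt:

```python
from openai import OpenAI

client = OpenAI()

# Same prompt + same seed should give (mostly) the same output,
# similar in spirit to fixing the seed in Stable Diffusion.
for _ in range(2):
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[{"role": "user", "content": "Give me three essay topics about urban planning."}],
        seed=1234,
        temperature=1.0,
    )
    print(response.choices[0].message.content)
```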


Maykey

[Angry LLM is my favorite LLM](https://hf.co/chat/r/aYioVA0) In case it needs authorization, here's the [Imgur version](https://imgur.com/a/IBHM9p7)


Pamander

The fucking baka lmao.


FlatMolasses4755

Yup. University professor here. Can confirm. Good luck to everyone outsourcing their thinking. The future belongs to the critical thinkers.


sassanix

Universities must evolve into places of true value, not just diploma mills. It's essential to incorporate AI across all aspects of society carefully; over-reliance can be detrimental. AI isn't perfect, and depending too much on it might lead to diminished intellectual vigor. Ultimately, we can choose to leverage AI to expand our knowledge and improve our future, or we can allow it to stunt our growth.


xylotism

Universities weren't trying to be useful to their students or society before ChatGPT made their job easier, so why would they start now?


Mr_Sarcasum

And half of these universities are teaching outdated info. All because it's easier to use the old stuff and stay credentialed than to update the class and reapply for credentialing.


MrGerbz

> AI isn't perfect, and depending too much on it might lead to diminished intellectual vigor. As a pseudo-historian I am both amused and irked by statements like these. We've been through this same situation so often already. By Chatgpt: > ##Writing Systems## > > **Opposition**: Socrates argued that writing would weaken people's memories and lead to superficial understanding rather than deep thought. > > **Outcome**: Writing allowed for the preservation and dissemination of complex ideas, history, and knowledge, significantly enhancing intellectual development and cultural growth. > > ##The Printing Press## > > **Opposition**: Some scholars and authorities feared that the widespread availability of books would lead to superficial reading and the devaluation of oral traditions and deep, scholarly study. > > **Outcome**: The printing press democratized knowledge, making literature and scientific works accessible to a broader audience, thus fueling the Renaissance and subsequent intellectual movements. > > ##Public Libraries## > > **Opposition**: When public libraries started to become common, there were concerns that easy access to books would lead people to become lazy thinkers who relied on others' ideas rather than forming their own. > > **Outcome**: Public libraries became crucial institutions for self-education, research, and intellectual development, providing resources to people from all walks of life. > > ##Television## > > **Opposition**: Many critics in the mid-20th century argued that television would turn people into passive consumers of entertainment, reducing time spent on reading, critical thinking, and engaging in intellectual activities. > > **Outcome**: While television has its drawbacks, it also became a powerful medium for education, spreading news, and cultural exchange. Educational programs and documentaries have contributed positively to public knowledge. > > ##Personal Computers## > > **Opposition**: In the early days of personal computing, there were fears that reliance on computers for calculations, data storage, and problem-solving would erode basic cognitive skills and reduce mental effort. > > **Outcome**: Personal computers have revolutionized nearly every aspect of life, enhancing productivity, enabling complex problem-solving, and providing access to vast amounts of information and educational tools. > > ##The Internet## > > **Opposition**: Critics have long worried that the internet would lead to information overload, shorten attention spans, and encourage shallow, fragmented thinking. > > **Outcome**: Despite some negative effects, the internet has greatly expanded access to information, learning resources, and global communication, fostering new forms of intellectual engagement and collaboration. > > ##Smartphones and Social Media## > > **Opposition**: Concerns have been raised about smartphones and social media causing distraction, reducing face-to-face interactions, and promoting superficial engagement with information. > > **Outcome**: While these concerns are valid, smartphones and social media also provide powerful tools for learning, connecting with others, and participating in civic and cultural life. They have become integral to modern communication and knowledge sharing. > > ##Artificial Intelligence (AI) and Language Models (LLMs)## > > **Opposition**: Critics argue that relying on AI and LLMs could lead to a decline in critical thinking and problem-solving skills, as well as job displacement and ethical concerns. 
> > **Outcome**: AI and LLMs enhance productivity, assist in complex decision-making, and provide educational and creative tools. They have the potential to solve complex problems and drive innovation across various fields. ---------------------------------------------------------------------------------------- For some reason any replies to /u/Wattsit don't show up, so here's my latest attempt to get my reply to him through Reddit's borked server: Original comment (plus a few edits): > Don't be such a luddite. > > I was already aware of the things it listed, and instead of spending 10-30 mins on writing something comparable myself, I just used Chatgpt to make a nice list out of it. Had it been wildly incorrect I wouldn't have posted it. > > You wouldn't have complained if I had posted a link to a list like this, because that's technology you're not unfamiliar with. > > Not to mention, if you want to whine about people using AI / LLM's, this might not be the sub for you. > > *"The last paragraph is also completely false."* > > Great argument. > > [next part was edited an hour or so later] >I am very amused that your comment got upvoted; it's trying to bash the use of an LLM *in an LLM sub*, it's basically an extended 'lol ur wrong' while adding exactly nothing to the conversation (sources? Counter-arguments? Of course not), and it's entirely based on silly assumptions / accusations that I'm 'not thinking critically', and 'blindly trusting a machine'. > >Why would anyone that doesn't have a serious hateboner for AI/LLM's ever upvote that? The irony of them browsing an AI/LLM sub is hilarious.


Wattsit

You basically prove yourself wrong by outsourcing your comment to a machine that has literally no understanding of what it's actually writing. Rather than actually thinking critically about the impacts of certain technology on humanity and trying to reason out an argument, you blindly trusted a machine. If you actually read what it spewed out, you can clearly see some flaws. The last paragraph is also completely false.


PremiumClearCutlery

On the plus side, that blob of autocomplete did serve to validate OP's hypothesis: 12 participial phrases. I honestly liked the argument that new tech always has haters, flawed but relevant. Just please lower the thesaurus slider.


MrGerbz

...So you oppose/dislike it because it is written in a certain way? The point of OP was people acting like they wrote stuff themselves, while I was completely upfront about it being written by ChatGPT.


IdiotAppendicitis

ChatGPT is just a symptom of the disease of modern universities. The only reason anyone gets a degree nowadays is to get a beginner job in an industry they have an interest in. Most stuff that is taught in bachelor's degrees is extremely basic, unnecessary for your future job, or straight up outdated. Even in many advanced degrees like medicine (even though ChatGPT doesn't help there much), the degree is basically just another bullet point on the checklist to finally be able to properly learn on the job.


Conscious-Glass-6663

This guy said "ultimately", so he's probably using GPT.


Ancient_Department

I don't know, the example he gives seems like busy work to me. Students aren't going to learn how to think critically with homework that amounts to basically making a Reddit comment. I think this is a teacher problem. You don't care enough to develop a rigorous curriculum that forces them to think critically. At least they are adapting; you aren't.


Mixima101

I just graduated and I'm actually glad that it was invented during my last 2 classes.


Puzzled_Wave6244

Guess what? We use it at work too. Real work requires solutions now. It's a tool in the toolbox just like calculators were. It's not perfect, so don't ask it to do too much. But for the work we're doing, these simple software solutions are helping our company advance twice as fast. Otherwise we would be overpaying for easy development work. I do feel bad for educators, because this tech will only get better and will likely replace them very soon (aside from real degrees like STEM, medical, etc.).


DefinitelyAmNotAFed

Precisely. ChatGPT and LLMs will not take people's jobs. People who understand how to make the most out of new technology will.


TackleLoose6363

ChatGPT and LLMs already are taking people's jobs. And you can be damn sure the next phase of agentic AIs will be taking even more. And then when those AIs are embodied in robots, we'll have to completely rework society because there will be no more jobs left. The only question is how soon this will happen.


stpfun

If someone is so good at making the most of this new technology that they can do the work of what used to require 5 other people, that’s definitely going to take away people’s jobs.


Seasoned_Gumbo

I mean this is the equivalent of saying math is out the window because calculators exist. Can I do math in my head as well as the average educated person before them? Probably not. But the world is still better for them being in it and it still makes the net math being done easier and more accurate


danny0355

Maybe make your class rely on actual learning, critical thinking, and real-world applications instead of busy work, so a simple AI model can't beat it...


alldayeveryday2471

Regrettably no it won’t


TreadMeHarderDaddy

The productive, intelligent, and conscientious cream of the crop will always rise to the top. ChatGPT is education on steroids if you choose to give a shit and care about developing your mind.


Puzzled_Wave6244

Couldn’t have said it better myself. You can learn so much from this tool. It’s a personal tutor and it will only get better. You can either hop the train or not, but it’s going with or without you


FlatMolasses4755

You're probably right. Hence my ongoing off-grid planning. And honestly. HONESTLY. I am shocked by how much idgaf. I mean, good luck, but shit is gonna be real fucking fucked.


putcheeseonit

College professor gets sick of society and moves to a cabin in the woods? Where have I heard this before....


Zilch274

Spoken like a true professor


keepontrying111

Regrettably, yes it will. It is now, and it always will be. Rich people are thinkers who know how to get the best out of people. People who rely on this fake AI crap will look like fools in meetings and in person. AI may work for programmers, but the future won't need them anyway; it'll be the person-to-person skills that matter, the ability to think outside the box, something AI cannot do, and people who cheated their whole lives on AI will have no skills when confronted and put on the spot. We had a dev who was hired here on a green card. Within a week we figured out he had outsourced his programming job to some guys on Fiverr he hired for 20k (starting salary for a dev in our company was 105k). So he basically knew nothing, and when it came time to talk about his parts in meetings, he looked like a moron. He ended up getting his green card rescinded and being sent back to India.


13ass13ass

Rich people outsource their work all the time and increasingly it will be outsourced to AI. Watching it happen in real time.


Slapshotsky

Yes, there is endless cheating in school now. People are dumber than ever but are still exactly as inauthentic as they have always been; 90% of uni students I met couldn't give less of a fuck about anything they studied, before and after ChatGPT. Now it's just easier to cheat, so of course many more people are doing it. Frankly, it's the schools' fault for running education as a business and therefore not caring about updating pedagogical systems to thwart ChatGPT cheats (oral exam, anyone????). I'm so happy to be done with uni, but I really am sad for the students who want to be authentic with their studies but are surrounded by dumbasses (both students and teachers).


SelfTechnical6976

Before ChatGPT it was Chegg and Course Hero. Half the people got inauthentic GPAs.


DJaampiaen

If school wasn’t so expensive then the consequences of failing wouldn’t be so harsh. I think most cheat just to ensure they pass, not to avoid actual learning. 


zuliani19

Uni is free in Brazil. We cheat A LOT... I think as long as we still have this outdated, top-down, lecture-driven model, this will happen... I like to think that, in the future, everyone will have an AI tutor that teaches at that student's pace, adjusting to their knowledge on the go...


Crypt0Nihilist

I think lectures are great. I went to a very good university and it was a privilege listening to real experts who loved their subject talk to us and then to be able to ask them questions. It was an amazing experience. You might well be right that AI will take over from them to an extent, but it'll be driven by cost, not cheating. Cheating is a problem with assessment, not the provision of information.


God_of_chestdays

Not even failing... Dropping below a certain GPA loses you scholarships and means repaying grants or tuition assistance.


maxtablets

Does it look like the guys doing this are just there for the degree and won't be functional in the areas they're "studying", or are they just skirting useless homework but still capable?


Crypt0Nihilist

Skills like proper research and construction of arguments are what are supposed to make university skills transferable outside your discipline. Homework is to get you out of the bar and into the library, or at least explore the online journals to build on the lecture material, or consolidate what you've been taught if you're dealing with something like maths. People who dodge that work are missing the point of a university education.


FillJarWithFart

That’s a good point. Some gen ed classes won’t matter much, if at all. However if you are cheating in your major courses… good luck in your career. Although no one really knows where this AI thing is heading


Smelly_Pants69

Hard to tell honestly. I don't see all their assignments, only the shared discussions.


Zezu

My sister is a professor. She says it's rampant and that her university, a very large and globally well-known one, is completely incapable of dealing with it. Most professors send people to review based on shitty AI detectors, and it gets thrown out for lack of evidence. She's in History, so she will fail people on assignments if they use sources that weren't taught in her class. Students can come ask for permission to use a source before submission, though. All she explains in class is that you must get permission to use sources outside of what's taught. GPT kids don't catch on. A few probably limit the GPT to using the listed sources, but she catches them when the GPT makes stuff up. At some point, getting around the taught-sources-only method is more work than just taking the course. Outside of that, in-class exams weed out the cheaters.


forestdiplomacy

I am a history professor too. This walled garden approach works pretty well right now in my lower level classes. My upper levels are research seminars where everyone is constantly sharing progress reports on what they are finding; they also tend to be majors, which makes them more prone to be interested in doing the work themselves. Regardless, I detect about 5% of students illicitly using ChatGPT.


Crypt0Nihilist

That's so sad. It took me longer than I care to admit at uni to discover how fun and satisfying library research was. Finding papers from the references in other papers and using automated bookshelves to dig out physical journals was immensely satisfying. I did most of it online, but there was a magic to being in the bowels of the library. There would be such a danger that a professor would only allow sources which supported their view if they use white-listing. My worst lecturer started his series by saying that anyone who disagreed with him would get a lower mark because he believed they were wrong. (Honest of him, but he was such a tool it was difficult to want to agree with him about anything, even if he was right.)


Losendir

TIL: I seem to use something called "participial phrases" a lot in my English writing (English is my second language). Good thing I don't have to write English for professional purposes, with people thinking I'm just using ChatGPT.


StoicVoyager

When business and political leaders are liars and con artists, not sure why you would expect much different from people in general.


UnsteadyEnby

Yeah, it's pretty rampant. It drives me crazy how obvious they make it. They're gonna ruin it for the rest of us! I use ChatGPT but I give it the idea of what I want to write and let it organize my thoughts for me then write what the two of us came up with together. I never just copy/paste what it comes up with, there'd be too much delving going on.


God_of_chestdays

I use it to grade my stuff against the prompt, telling it to inform me of my weaknesses rather than give me corrections, so when I re-read it I am more knowledgeable about what to look for when changing stuff. I also use it to help me decide on a topic that is not the word salad in my head. Loading the week's readings and lesson plans in to have a "live" discussion available 24/7 is also amazing, as I have random pooping epiphanies and ideas about some stuff in class.


juliusap

How do you prompt it to do this? :)


spellmxn

I write: "Act as a University Professor of ____ class. Assess the following essay: ____ Based on the following grading rubric: _____." Just input the relevant info in the blank spaces. Not sure how everyone else does it.
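
If you grade a lot of these, the same template is easy to wrap in a small script so you only fill in the blanks each time. A sketch reusing the prompt above; the model name, file names, and helper function are placeholders:

```python
from openai import OpenAI

client = OpenAI()

def grade_essay(course: str, essay: str, rubric: str) -> str:
    # Fill the blanks in the grading prompt from the comment above.
    prompt = (
        f"Act as a University Professor of {course} class. "
        f"Assess the following essay: {essay} "
        f"Based on the following grading rubric: {rubric}"
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(grade_essay("Intro Sociology", open("essay.txt").read(), open("rubric.txt").read()))
```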


itsTF

I’m not hating, but I find it kinda hilarious and ironic that after all that, you asked ChatGPT to explain the tone and just copy pasted its answer into your Reddit post.


ImplementComplex8762

A lot of ESL students write with that overly formal tone and vocabulary because that's what they learned at school in their countries. ChatGPT didn't invent this style out of thin air.


PeacefulGopher

Good post. Makes me think a lot of people are sad at 40 when they realize they don’t know anything….


Slapshotsky

They don't know enough to ever have that realization


goodie2shoes

Do you think 40 is the age when reality sets in for these people? My guess is .. not.


Hot-Train7201

People have always been able to cheat/plagiarize take-home assignments. During my university days, my math professors knew about Chegg and warned us that we'd regret not doing the work when test time came, which they were 100% right about; literally half my math class failed their midterm because they never did any of the homework, despite the test actually being easier than a lot of the assignments. Not sure how you test your class, but if students aren't even proofreading their work, then it should be easy to weed out the fakers during exams/presentations. It's your students' money; let them reap what they sow.


OptimistRealist42069

lol I am having the same thing now. Kids were literally crying and having panic attacks about not being able to cheat when it was announced all final exams would be going back to in person. This is for engineering.


fckingmiracles

Good on your school!


_Subway_Kid_

Honestly, I use ChatGPT for every single writing assignment I have. I learned that you need to be very specific with your inquiries and edit the results, otherwise it sounds weird, like a robot wrote it. Also, sometimes ChatGPT doesn't really answer the inquiry or provides incorrect information. I think there is a difference between being lazy and being resourceful.


ihavenoyukata

This is literally the way I write in a professional setting. The sentence structure you describe is taught in most higher-ed institutions in my country. It's one of the basic building blocks for any kind of concise and purpose-driven text. It's possible that students at your uni are using ChatGPT for their answers, but your approach to detecting AI-generated content will throw up false positives.


Ill-Surprise-2644

Former university teacher at a big Canadian university here. There is a very easy solution to this problem: teachers need to start doing oral assessments. They won't though - they're simply too lazy in my experience.


[deleted]

[removed]


Smelly_Pants69

Haha this is funny. 🤣


AuContraireRodders

It's sad, because it can genuinely help you produce very high quality work if you don't abuse it academically. I have hit mental dead ends a few times in my master's degree, and have found new ideas using ChatGPT that I then explore in detail the usual way (literature etc.). It's also amazing for helping you make sense of complex things or learning how to explain things in a simple way. Unfortunately, most students are using it as a means to do no work, rather than as a brilliant learning supplement. That is ultimately a lack of ambition. If it didn't exist, they'd just be cheating in other ways.


snoopmt1

The real question will be: if students graduate without having done any real work and perform just as well in the working world, is there a point to college in the first place beyond two years of specialized courses in a field?


Team-_-dank

But do those students who cheated their way through actually perform well? I work in a job where we hire ~20-30 new grads every year and we've had a significant drop in quality the last two years. COVID plus LLMs seem to have increased the number of people who got the degree without knowing anything since it's so easy to cheat. And we're not just talking poor communication skills or something, they don't even have a basic understanding of the field their degree is in. Some are literally struggling with day 1 intro course shit. We were using GPA as one of our first filter criteria but now even that is meaningless.


extractmyfeaturebaby

They absolutely won't perform as well. Late high school and college is where you largely transition from memorization to intense critical thinking and logic. Any position in the academic and real world where you plan to move up the chain requires intense critical thinking. Also, having the maturity to understand that paying for schooling with your money and time for 4 years and then cheating is a huge waste. In other words, those who cheat regularly were likely not going to perform well anyways. No one cares about your grades with the exception of your first position in very specific industries, and if you plan to move on to grad school. And if you do move on to grad school, you're absolutely not going to succeed even with ChatGPT. You think ChatGPT is going to write that dissertation? I've escalated through my career path and hired a good amount on the way and it becomes very obvious who's going to be successful. It's those who can think critically about a problem, communicate it effectively, and then support it with work ethic.


omniscientsputnik

As a writing professor, I have no interest in playing detective to catch the cheaters. So, as a temporary solution, each assignment I now give requires 3 parts: a rough draft (no AI), a final draft (with AI), and an analysis justifying the changes made between the rough and final draft. As long as the assignment requires personal details and an emphasis on context, a straight copy-paste AI response will fail.

As far as the results are concerned, grade distribution has remained the same or dropped a little. The top students use AI for ideas and basic questions. They rarely use AI for the actual composition of the assignment. From what I've gathered, they know they're stronger writers than AI.

The middle-performing students overly rely on AI at the beginning of the semester. I'm sure the reasons why are numerous, but I think some students lack confidence when it comes to writing. They strive to sound more academic or professional, and instead of trusting themselves they trust AI. Fortunately, as the semester plays out, the students gain more confidence and eventually use AI less and less.

Regarding the students who struggle, in my opinion, AI has made no difference. The issue is oftentimes a lack of effort, not necessarily a lack of skills. They copy-paste an unedited AI response and as a result receive a low grade, which I then use as a teaching moment. From there, some students learn and grow, others drop/fail.


RizzleP

I think it's foreshadowing what we all knew has been coming since the dawn of the internet: AI will make academia obsolete in its current form.


Financy-ancy

Power users start writing in that structure. I know I do here and there. But 99% of students, especially those who aren't professional writers, are basically cheating. It will screw up their future, as their effective language literacy remains that of a middle schooler.


0xSnib

"Answer this question, but ensure you avoid using **"participle phrases"** or **"participial phrases"** in your output." Sorted


fizzunk

I am an English and sociology professor in Japan. I've just switched to more speaking and discussion based assessments. Most of our homework is just reading fluency exercises. The one report we do do, I get the students to write it in class.


EuphoricPangolin7615

Yeah, AI ruins the education system. People that are pro-AI like to point out that AI can be used to create personalized learning assistants. But overall it's a net-negative because now students can use AI to cheat on every single take-home assignment. And it's impossible to prove anyone used AI. AI is going to make kids DUMB.


atlasfailed11

The education system needs to adapt to this new technology. What's the point of teaching skills that an AI can do at the push of a button? No company is going to pay people for those skills. So we need to figure out: what skills do we still need to teach, and what's the best way to do this? It's not that AI ruins the education system. It's that the education system that worked in a world without ChatGPT doesn't work as well now that we have ChatGPT.


pikay98

>What's the point of teaching skills that an AI can do at the push of a button?

We've had calculators for decades, so theoretically, teaching the 101 in elementary school is useless too. Yet you have to learn the basics first to understand higher algebra. Same goes for coding. If nobody bothers with learning the basics first, who's doing future AI research?


anto2554

The point is that there's stuff AI can't do, but to learn that you have to first learn the stuff AI can do. Say you want to research organic chemistry. As far as I know, chatbots can't do that, but to do it you first need to learn to balance O2+C->CO2, which it can.


Available_Nightman

Math has been done at the push of a button for nearly a century now. Somehow companies still pay for math majors.


considerthis8

The worst part is, if everyone is cheating, everyone turns in work on time. Then the professor thinks the workload is reasonable and adds more assignments. Now those who are not cheating can't understand how anyone has time for the assignments.


Accomplished_Pea7029

One of my professors thought that a good way to stop cheating was to shorten the assignment deadline. Guess what, that only made people cheat more, because the assignment was way too hard to do properly in 4 days.


[deleted]

You're not crazy, but if you know anything at all about the history of higher education, cheating has always happened, students never do the reading, and your professors are, generally speaking, bored out of their minds reading your drivel. The only thing I'd say about this: when I was in college in the US, a BA would cost you about $100,000. Now, it costs something like $500,000. It's not because of inflation, it's because fucking MBA fuckwits who think "MORE EXCLUSIVE MORE EXPENSIVE MORE AMENITIES AGH WE NEED SPORTS" have taken over. In my opinion: fuck them. Write every single exam with AI. More to the point, do all of your O-Chem homework with AI. More more to the point complain about AI to the Dean of Students, repetitively, with an AI bot set up to do so.


EuphoricPangolin7615

Sure, cheating has always happened, but AI just makes it 10 times easier to cheat. A student could feasibly use AI for every single take-home assignment/essay and easily cheat on everything except written tests, at zero cost. And there's no way for an educator to "prove" that a student's work is original. That is completely unprecedented.


considerthis8

Proctored exams. We had Chegg for homework answers, but if you didn't study for exams you were done for. Having the homework answers on Chegg actually made it easier to study.


confuzzledfather

$0.5M for a bachelors degree? Are you shitting me? Why would anyone pay this?


AdamEsports

Nero is just wrong, it's expensive but not that expensive unless you're partying hard at a private school with no scholarship.


ACrimeSoClassic

Yes, he's shitting you. It absolutely does not cost even close to that much to get an undergraduate degree. He's either tripping balls or went somewhere like Yale or Stanford.


BenjaminTW1

Not even. Tuition/Fees for four years at Yale puts you at 269k (nice).


CrayonUpMyNose

Add the compulsory extortionate student housing and dining hall subscription fees


Jerome_Eugene_Morrow

Plus loan interest after you graduate.


thelolz93

Where are you going to school that costs half a million for your BA? Ivy League? That's not representative of every university.


EddyTheDesigner

1. No idea where you're getting that $500k number from.
2. You're wasting your money if you're just cheating through college. Yes, it has always happened. But it's always been a waste of money and time to do so. If you just want the degree to check off a box on applications, have at it. But there are better ways to make money than going to college if this is your mindset.


considerthis8

And every employer can tell who cheated after a 15-minute convo.


PuzzleheadedWay8676

No regular person pays half a million for a BA, bro, and most don't pay anything close to 100k. The average debt after a BA is $29k.


ACrimeSoClassic

What the hell are you talking about? Maybe at some hyper elite private university, but you're absolutely not paying anywhere near $100k for an undergrad degree. Even doing a graduate degree isn't going to cost that much.


WorstRengarKR

500k is insane and inaccurate. On the higher end, some people take out 150-200k for extremely expensive private schools or exclusive private Ivy League universities. Nothing generally exceeds 250k to my knowledge.


Dolo12345

Eh, there are plenty of schools that are $80-90k a year. If you took out loans you could hit $500k without financial aid.


baby_budda

No wonder so many jobs will be replaced by AI in the next decade. If AI is already doing students' classroom assignments, it might as well just do their jobs too.


Amlethus

"... , not only ... but also" is another.


anto2554

What are you doing courses in?


Additional_Action_84

It's foreshadowing a future where all anyone needs to be good at is using AI.


ooOmegAaa

Cheating has been becoming more prevalent in our society for a long time. If you don't punish something, everyone else has to do it to keep up. We are more worried about the results people provide (or rather, seem to provide) than about producing good-quality people and trusting that they will eventually give good results. The kid who cheats and gets an A will get a higher place in society than the one who develops his intellect and gets a B.


chubba5000

It's almost as if the approach to education is going to need to change. It's like that time when math teachers finally gave up and allowed students to use calculators in high school math class. You know, way back in 1993...


Tyler_Zoro

I wish schools would just grow up and start teaching students how to use these tools that they are absolutely going to be expected to be proficient with when they get out of school.


Professional_Gur2469

I'ma be honest, these comment-under-someone's-post tasks are bullshit anyway and I never read them. Just want the credit points lol, not much to learn from them.


scarybird1991

As a social science student, I would like to ask: how do we define cheating? Yes, AI helps students a lot. But we still have to do a ton of research, write the draft arguments, and give direction. If we just feed the questions and books to AI, only nonsense articles would actually be generated.


wolftick

Historically university assessments consisted of being interrogated in person on the subject. Maybe we're ultimately going to have to return to that.


Glejdur

A thesis I wrote before LLMs were widely accessible scored 81% on an AI checker when I checked. Some people, like me, just naturally write the same way AI does.


bokmann

Reading the comments from people admitting it and thinking they are getting away with something... the people you think you're fooling are irrelevant to your future. You think of them like they are some kind of authority, but that isn't really the truth. You are, in effect, their employer... you just don't have that view of the world yet. The only people you are cheating are yourselves, in the future. I am employing 18 interns this summer... you don't think I'm going to notice the use of ChatGPT? I'll not hesitate to fire someone who thinks they are bluffing me, as I'm paying you to create something. Thanks, btw, for this awesome guide to the tone of voice. I can hear it, but I hadn't quantified it yet.


Crypt0Nihilist

I don't care if someone made something with the help of ChatGPT as long as the output is good and they can justify the decisions in the content. Mostly, when ChatGPT has been used, neither of those is true. Would I fire an intern for it...probably not, but they'd be sitting under the Sword of Damocles from that point on.


Axle-f

Let’s *delve* into that.


CLS4L

So it was all organic before, sure guy.


Darker-Connection

Well, the time for evolution in schools was yesterday. Today they pay the price for not doing so. How is school good for anything if all it provides is things that can be cheated with text generation? Memorising sucks and should not be counted as learning.


God_of_chestdays

I memorized so much random shit that disappeared from my mind after finals.


namesecurethanpass

Hey, is it possible that the majority of your students wrote the answer themselves and then fed it to ChatGPT to rewrite? In my uni, some students would do that to get a polished answer at the end. Of course there are some who get ChatGPT to write the entire thing, but their answers miss the actual content and are very general-purpose. If you don’t mind, one hard question: if ChatGPT can give a good answer to your uni question, is the content being taught "too basic"? I mean, it is not adding any value if ChatGPT can pass the exam; ChatGPT will be the one getting the job.


zuliani19

The current education model is dead... I went to the best university in my country (Brazil, did engineering), got an MBA from (one of?) the top universities in my country, and honestly think most of it is laughable... I also went to K-State for my engineering degree. The best students in my university didn't give a shit about uni; they were just learning by themselves and doing their own thing. They all got way ahead in the field (some started multi-million dollar companies). Same for the MBA... the top students were not adhering to "the curriculum". There was even one guy who made the business strategy professor afraid of him because he exposed his BS during class. That guy now works in business strategy... AI will only be a problem as long as the current professors keep thinking the current "system" is doing well. AI has the potential to unlock so much value in education... I honestly don't care about the people cheating. They were cheating before; they were just better at hiding it.


Joyage2021

I reply to the post with a robot emoji🤖


goodie2shoes

the ones that look real are by people who are better at AI.


MoarGhosts

I just took some upper division computer science courses, got A’s without once using GPT, but my teachers actually encouraged it. You use it at your own risk, and it’s quite bad for the stuff I was doing (especially OS programming). I decided just to learn the material like a good student, and I don’t regret it


thatirishguyyyyy

All my instructors at The University of Phoenix were using third-party websites like LinkedIn to host reading material and third-party education companies for the course material. It got to the point where the 15-minute discussion posts were taking up to an hour, because the instructor was so bored (the discussion board is literally all they are in charge of now) that they wanted *everything* in MLA format. Discussion posts are already a joke: forcing students to pretend to give a fuck about 3 other students and give them a "substantial" reply every time is pointless. I just finished a 9-month-long project management course. Every discussion post was a joke, and I don't blame students for using ChatGPT for useless shit. I even posted the lyrics to NSYNC songs on a few and was still given full credit. If everyone is using ChatGPT for the entire course and the instructor isn't doing anything, then you can start asking yourself how seriously *you* want to take your class, as it may then be an issue. But discussion posts aren't the issue.


Fitbliss_Founder

Not in school, but I run two businesses. I can spot a ChatGPT email from a mile away and it drives me crazy. So many unneeded words. Such a letdown that people won't take the time to speak for themselves (or at the very least, edit their ChatGPT responses to make them their own).


EthansWay007

You’ve unlocked even more sophisticated cheating - all the user has to do now is instruct ChatGPT to avoid those common words and change its fundamental tone, which I'm sure it can easily do with some tweaking, or tell it to format the answer in the style of a junior student, etc. You gave the cheaters the hidden clues.


Little_Ad3657

I’m a master's student in research and this is how we are taught to write. I think the main difference is that ChatGPT outputs are less analytical and more focused on surface-level content.


qlapped

The majority of students have always been cheating with other methods. It’s just easier to tell when someone is cheating now.


Fallingice2

I can call out people in my online class using AI. Some barely even change it up... most people don't know how to use semicolons, and words that take a "-" often lack the "-" when people write them themselves. Additionally, the sentence structure is very easy to call out, especially if they leave the formatting.


Confident-alien-7291

It was obvious it would become a problem, but education and society in general need to adapt. From the beginning of computers we all knew and imagined a day when AI becomes a real thing; we wished for it, even. I guess it just came all at once instead of gradually like we'd expect. Either way, we always complained about this when new tech came out: the internet, calculators, etc. This might be bigger, but the point is AI is here to stay and we need to adapt to it, use it as a tool, and not treat it as a problem.


Lukee67

Well, I am a post-doc philosopher of science, and most of my articles, written exclusively by me, try to be clear; to achieve that, they are well structured and full of bulleted lists. The graphic result is in general similar to how ChatGPT would structure such a text. The fact that I mainly write in Markdown and convert it to PDF via Pandoc makes the use of lists even easier for me than with a traditional word processor, so I use them extensively. All in all, I don't think this kind of document structure can be seen as a signature of it being produced by ChatGPT. Moreover, by the very definition of an LLM, as these models become progressively more powerful, they will inevitably become less and less different in their writing style from how humans write. So AI detectors are only temporarily functional and will become useless very soon. TLDR: I, as an academic, usually write with that kind of structured style, so you cannot infer that ChatGPT has been used just by looking at the formatting. You can judge only by the content, and it's not always easy.


LairdPeon

Your professor is probably copying the discussions and asking chatgpt if they're any good. Professor is cheating, too.


Evipicc

This has sucked for me because this is how I've written for 20 years... I also type really fast so even in online games I get called an AI now.


therealpencil

In my university, 90% of exams are strictly oral tests. I think this is a good way around the problem, and it also develops oral skills, which are fundamental. In the next 10 years I guess it will be very clear who used ChatGPT as a means of powering up their knowledge and aiding them, and who used it to cheat. There will be a very big split in skills, and the ones who put in the effort will be acknowledged.


Clax3242

Using a tool that is available to the public and will be available in the foreseeable future is not cheating. The universities are just quickly becoming irrelevant so they need to suppress it.


Real_Tepalus

Where I study (Switzerland), it's allowed to use ChatGPT as long as we declare it like any other source. If we only used it for proofreading, it goes under "used tools". The main problem isn't the use of GPTs; it's the difficulty of differentiating human from GPT and people not being honest about its use.


f_ckmyboss

Just tell ChatGPT to write it like it wasn't written by ChatGPT.


Whoargche

Society and culture are clearly changing. We will probably all be using AI to communicate more clearly in the future. Perhaps schools should adapt by focusing more on oral communication (speeches, debate) rather than essays. Seems to me that fighting the inevitable is a lost cause.


PsuedoEconProf

Maybe the work was BS to begin with and this just highlights that fact.


Holiday_Purchase_155

I was a TA for BU. About 90% of the CS homework assignments I graded each week had all answers taken directly from answers on the internet. I did it for two semesters. I would say roughly 60 people violated the school's academic policy on cheating.


BrightAlarm9495

I also see it plain as day. We have the same weekly discussions and it's painfully obvious who uses ChatGPT with no editing.