Any current teachers/professors here? How bad is the AI situation for coursework now...

Any current teachers/professors here? How bad is the AI situation for coursework now? I keep seeing ads that say ChatGPT is free during finals week.
I worry this current gen is going to be FUCKED after college

I think ChatGPT helps fuel ideas for thinking. AI is amazing for brainstorming and not so good at recalling facts without the internet, meaning using ChatGPT to write your essay makes you fail but using it to brainstorm your essay makes you do well

imagine just screenshotting your paper of complex multidimensional theoretical calculus problems and asking chatgpt to solve it for you

imagine being able to upload pdfs or screenshots of an essay + the sources and instructing the AI to search your works cited to find useful information you may have missed while writing the paper
it is an extremely useful tool if you learn to use its multimodal capabilities

all writing assignments are now bluebooks taken in-class.

It's easy to tell because you have literal retards submitting post collegiate level work but, unfortunately I don't think the administration allows them to do anything about it

Ok so now college kids are outsourcing their brains to the AI junk too? Oh that should be really good. I'll get the popcorn. You do realize that the more dependent you become on cheap junk tech machines that break down, the more you will run into antisocial White males like me who would exploit you, don't you? And you'll never see it coming either, they never do.

I'm a shop teacher so i have no fucking clue, I don't give written assignments

We don't teach thinking and the majority of all education is just memorizing shit you forget after the class so AI is perfect. Why memorize anything?

It's education that needs to adapt to AI instead of thinking of it as a cheat.

Because you can use the AI far more effectively if you yourself know and understand what you're asking it to do.

chatgpt cant do advanced calculus. trust me i tried. that was a year ago so it might have changed, though, tbf. scary

this is very true. ai really just cuts the menial bullshit mostly. if you dont understand the actual ideas, it wont help you. you'll mess up

no, chatgpt was capable of it. you just failed at prompting it correctly.
what you are is the equivalent of someone who could type well but refused to adopt word processors, or a number cruncher who prefers to do things by hand rather than utilizing excel.

AI is the new microsoft office

learn to prompt

I don't get this, chatgpt is free all the time even without an account or anything. Why do people pay when it does the same shit?

I only ask AI to perform basic calculations and arithmetic tables because I think it's funny.

if you already know the answer you don't need the AI

It’s often obvious when something is written by AI, it has the same formulaic linguistic style, I won’t say how because I don’t want to encourage more retards to outsource all their thinking to AI, but there’s certain tells, plenty of people can tell when something’s AIslop garbage, and it will only respond within the framework that you give it, and it can easily make mistakes too.
Honestly the best way to see an AI break down and short circuit is if you start asking it about sensitive topics, or jews (unironically), you can break it to the extent that it no longer can answer the questions because it’s programming won’t let it

a friend says that most of his uni class uses it to write assignments for them

no. it was a year ago and it couldnt do it regardless of how i prompted it. im talking about actual advanced calculus though. complex integrals, for example, with unique solution strategies

Imajin being a college professor and actually accessing this site and actually reading your shitty thread. Now imajin actually replying to you. Fat chance, huh

More like it will mess up and you won't notice. People like to take the AI as the supreme source on all topics but you have to be careful with the answers it gives because despite it stating them as if hard fact, there are often errors.

have it write paper

just rewrite the paper in your own words

woah...that was insane...

It's not about knowing the answer, it's about knowing how to find the answer.

See here’s grok shutting down and sperging out about “antisemitism” and then it cited the ADL and went missing and stopped responding to any questions kek


More like it will mess up and you won't notice

Yes, this is indeed true

I am a teacher and can tell you that the kind of kid who's going to use chatGPT to churn out writing assignments definitely isn't the kind who is going to do any brainstorming or creative thinking. It is a more bespoke version of copypasting a wikipedia article for a school presentation in the 00s

It’s already been possible to do stuff like that. AI is also only as creative as the person using it; sometimes it’s just a tool with clear limitations.
You can of course use it to make certain research less arduous, like asking it to sort out what the optimal stats on something should be tho

I worry this current gen is going to be FUCKED after college

This is like math teachers saying "you cant use a calculator on the test because you won't have one as an adult" except we all have one, all the time.
AI is here and everyone will have it, all the time. Why should a person need to perform without it?

I'm in WGU and Grok 3 is a superior teacher to any other resource they provide, the human instructors meanwhile don't seem to know anything at all. AI will btfo academia and thats a good thing

Yeah even if you get it to summarize a research article for you, it often mangles the findings and gives you plain wrong information. As an experiment I read an article on population genetics, uploaded it to ChatGPT and told it to summarize the conclusions, and it told me the exact opposite of what the authors concluded in the article lol

The retarded niggerbrains already do it. AI summarizes the whole of books and writes the whole essay. They won't even bother rewriting it or just taking ideas, they just drop the essay or if they think they're clever they run it through other AIs to obfuscate. If you even insist that the purpose of college isn't just submitting shit and getting a participation award they get mad. Jews have thoroughly rotted higher education and turned it into a debt scheme. No hell is hot enough for them.

If there's anyone smart left they'll require students to justify their essays and arguments when submitting. Just take 5 minutes to summarize in person, like a little thesis defense. Even if they didn't use AI most students would fail which might even be a benefit.

It's pretty gay that everything is paywalled. Slader was pretty cool

Free limits you to a daily total: how many times you can ask it a question/clarification, how many images you can ask it to make, etc

You cant ask a calculator to solve a complex equation, YOU still have to know what numbers to give it and what operations are needed

Most of these high school kids are going to turn out even more brain dead than they used to be. When you had to copy paste a Wikipedia article, you had to at least take the time to read the prompt and look for info. Now, without even reading: paste prompt, copy response, done.

Inadvertently, this is teaching them to be AI handlers, and nothing else.

I just want these retard chatbots to solve Navier Stokes etc and win me the million dollar prize...too bad they can't actually do creative work like that. Somehow the brain channels that sort of shit out of the ether - figure that out and you have AGI.

Ultimately it's just one more thing we need to adapt to - our department is already having discussions about how to work around it or incorporate it. Students are going to use it which means we need to pivot to more in-class assessments or try to develop assessments or projects that are 'AI resilient'.

The biggest problems right now are
a) It's early days and there's still a lot of misinformation and misconceptions about what LLMs are, how they work, and what they're capable of (or not capable of)
b) There's a lot of students (and faculty) who use it as a substitute for doing work rather than a tool
c) There's a lot of students who take everything AI generates as fact without trying to reproduce what it generates or examine the results to see if they make sense

I've used LLMs before to generate permutations of problems I've written or help problem-solve code for simulations or analysis, and I'm open with my students about that, but I also make it clear that I don't use it to do the work for me. The first thing I do when I use an LLM to generate a problem is work the problem out myself to see if it's still solvable with the changes that have been made, and the first thing I do when I use an LLM to generate a segment of code is parse through it line by line to evaluate what every single part is actually doing.

My bigger concern is how this may compound an already serious lack of intellectual curiosity and basic problem-solving skills among students. We already see how much of a problem it is when people take whatever they read on the internet as fact or let influencers or clickbait do their thinking for them (regardless of where you fall on the political spectrum this has become a serious problem over the last decade), and my greatest fear is seeing that same trend reflected across education.


Classic.
I like it when it cites a meta-analysis with a clear conclusion and tells me the opposite. And we're speaking about models that are hardcoded to just cite from the internet and paraphrase.

There’s certain obvious tells when they’re using it too. Jeets on X with their spambots always have the same bland machine-like uncanny typing style and the overuse of hyphens ect, and they take paragraphs to explain something that could be easily explained in a couple sentences

I'm imagining that paying for it is only useful for a student who already knows what to do, has all the sources, and does this. Otherwise, the latest models have free API keys online or websites that let you talk with them forever, so you only exhaust your credits on those when you 100% know what to do in the least number of steps.

I think that Gemini, to be fair, is superior to anything from OpenAI. They have a tool suited just for notetaking and doing what the quoted post suggests, but it uses the dumber, faster model for everything there. It's still free up to 50 prompts, but google accounts are infinite... I miss when it was infinite prompts, before they tried to make money out of it.

It's still dumb, but it helps you see things faster; it will say contradictory things and quote the same source, same paragraph.

I'm going to assist with a diesel engine high pressure common rail class tomorrow.
Just sat in on one a few weeks ago. The instructor told me at dinner that a tech took a picture of a schematic in his hydraulic class and AI answered every test question right.
I don't want dumb techs, but if it can assist them I'm cool
t. work for an OEM

AI, please transform this problem into a Mathematica prompt or functional code

Get an accurate, step-by-step solution that's computed classically.

AI, I didn't understand this step. Can you go in-depth?

I think that Mathematica even has some AI model integrated into it already, so you can do it from the same place.

Think about it, if there's a programming language that can do anything, these AI models can automate its usage, for example, quick and efficient video editing with Avisynth or Vapoursynth, complex computing with Mathematica, automating boring stuff with Python, and automating things around your PC with AutoHotkey.
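The workflow above (have the model emit code, then get the answer computed classically instead of trusting the model's own arithmetic) can be sketched in plain Python. This is a hypothetical stand-in for what a generated Mathematica or SymPy script would do; the `midpoint_integrate` helper and the example integral are illustrative, not anything from the thread:

```python
import math

def midpoint_integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Classic sanity check: the Gaussian integral of exp(-x^2) over the real
# line is sqrt(pi). Truncating to [-10, 10] loses only a negligible tail.
approx = midpoint_integrate(lambda x: math.exp(-x * x), -10.0, 10.0)
print(approx)  # close to sqrt(pi) ~ 1.7724538509
```

The point is that once the model has produced runnable code, the number you get comes from a deterministic computation you can inspect, not from the model guessing digits.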

aistudio.google.com/welcome
here, youll likely need to watch some yt tuts on how to use this interface but from what you wrote it sounded like you were just using gemini itself.
the free tier when using ai studio is better

There's basically no limits in AI studio. There is a daily cap but it's quite high; I have only hit it a couple of times using 2.5 pro experimental.

The issue is that if you rely on technology to do all your work for you, your skills to do similar work (especially work that the technology can't do for you) atrophy and you ultimately start to lose the ability to check if the technology you're using is giving you correct/sound/reasonable results.

viggle and the other tools people were using to make the minions videos turn video editing into drag and drop.
CapCut is also incredible for being able to throw footage at it and it makes it into something good for tiktok

How many jeets get fake degrees writing a bunch of AIslop gibberish and then get jobs they’re clearly not qualified for

very nice, might start pretending to read books again

Why yes I do only use ChatGPT to generate futanari images of Marge Simpson and Bob the Builder

atrophy and you ultimately start to lose the ability to check if the technology you're using is giving you correct/sound/reasonable results.

I don't disagree, but kids are fucking retarded.

cheat sheet to not being retarded, in any life situation even with AI

you can do the higher order thinking and grouping and let the AI prime the material for you or re-order as it makes more sense to it

Acting like college hasn’t been filled with retards for decades

why shouldn't learning be easier?

mathematical proofs, do those and ai sucks at them. in fact ai sucks at high level thinking. ask phd level questions and use deduction to weed out low-quality ai results

Yeah nigger let’s not use calculators anymore. Cnc machines are also bad because we will forget how to hand carve

No employer judges an applicant by their degree anymore. Not that people haven't just been paying essay writers to do their homework for years, but now it's on overdrive. They will have to just do IQ tests instead

Well let's see. You can either 1) sit down and do hours of independent research, collect information together with sources and citations, and write out the information into a research paper over the course of many hours or days, or 2) Type one sentence into ChatGPT and spend the rest of the day smoking weed and jacking off.

What do you think?

Bullshit, you're either not really using it to its full capabilities, or you just don't know enough. Even basic bitch vanilla GPT can create a 20-page research paper with full citations and then give you a PDF or DOC link to download a finished document. You can accomplish this with a single prompt.

I didn't say let's not use calculators anymore, faggot, but if you raise an entire generation of kids that are incapable of doing basic arithmetic without a calculator, then they're going to struggle with doing anything more complex than that.

Tools make work easier, that's what they're for - the problem comes when people treat a tool as a total replacement for the underlying skill or knowledge or work.

It's not like your typical university student learns anything worthwhile and it's been like that for a long time.
The smart and competent people will do the valuable degrees that actually require thinking while everyone else gets channelled through the degree mills into their worthless zoom meeting and email response jobs until we've finished replacing ourselves with thirdies and society finally implodes.