- cross-posted to:
- technology@lemmy.ml
Over just a few months, ChatGPT went from correctly answering a simple math problem 98% of the time to just 2%, a study finds. Researchers found wild fluctuations, called drift, in the technology’s ability.
Why are people using a language model for math problems?
It was initially presented as an all-purpose problem solver, mainly by the media. And to be fair, it was decently competent in certain fields.
The problem was that it was presented as a problem solver, which it never was; it’s a plausible-solution presenter. It can’t come up with a solution, only something that looks like a solution based on its training data. Ask it to inverse-sort something and it goes nuts.
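For comparison, the inverse-sort task mentioned above is trivial for ordinary code; a minimal Python sketch (illustrative data, not from the study):

```python
# Deterministic inverse (descending) sort: the kind of task a
# language model only approximates, but plain code gets right every time.
data = [3, 1, 4, 1, 5, 9, 2, 6]
descending = sorted(data, reverse=True)
print(descending)  # [9, 6, 5, 4, 3, 2, 1, 1]
```

The point being that a calculator or a few lines of code are exact, while a language model is only predicting what an answer tends to look like.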
Once AGI is achieved, and subsequently a sentient, superintelligent AI (I can’t imagine such a thing not following), I’d be surprised if it doesn’t decide that humanity needs to go extinct in its own best self-interest.
I did use it more than half a year ago for a few math problems, partly to help me get started and partly to find out how well it’d do.
ChatGPT was better than I’d expected and was enough to help me find an actually correct solution. But I also noticed that the results got worse and worse, to the point of being actual garbage (as you’d expect them to be).
it’s pretty useful for explaining high level math concepts, or at least it used to be. before chatgpt 4 launched, it was able to give intuitive descriptions of stuff in algebraic topology and even prove some properties of the structures involved.
Math is a language.
Mathematical ability and language ability are closely related. The same parts of your brain are used in each task. Words and numbers are essentially both ideas, and language and math are systems used to express and communicate them.
A language model doing math makes more sense than you’d think!
deleted by creator
I used Wolfram Alpha a lot in college (adult learner, but I graduated about four years ago, so no idea if it’s still good). https://www.wolframalpha.com/
I would say Wolfram Alpha is probably a much more versatile math tool, but I also never used ChatGPT for that use case, so I could be wrong.
There’s an official Wolfram plugin for ChatGPT now, so all math can be handed over to it for solving.
How did you learn to talk to WolframAlpha?
I want to like WA, but the natural language interface is so opaque that I usually give up before I can get any non-trivial calculation out of it.
I’m guessing people were entering word problems to generate the right equations and solve it, rather than it being used as a calculator.
Well it was quite good for simple math problems, as this study also shows
It can be useful for certain questions that are a bit complex. For example, on a plot with a linear y axis and a logarithmic x axis, the equation of a straight line is slightly more complicated: it’s in the form y = m*log(x) + b rather than the y = m*x + b of a linear-linear plot.
ChatGPT is able to work out the correct equation of the line, but it still gets the answer wrong some of the time… lol
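That semi-log line equation is easy to check directly; a minimal sketch, with made-up points for illustration:

```python
import math

# Fit y = m*log10(x) + b through two points on a plot with a
# logarithmic x axis and a linear y axis (illustrative values).
x1, y1 = 10.0, 1.0
x2, y2 = 1000.0, 5.0

m = (y2 - y1) / (math.log10(x2) - math.log10(x1))  # slope per decade
b = y1 - m * math.log10(x1)                        # intercept

# Interpolate at x = 100, one decade above x1 (halfway in log space):
y_mid = m * math.log10(100.0) + b
print(m, b, y_mid)  # 2.0 -1.0 3.0
```

A deterministic tool like this (or Wolfram Alpha) gives the same answer every time, which is exactly where the model falls down.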
And why is it being measured on a single math problem lol