@mzedp @necedema @Wyatt_H_Knott @futurebird @grammargirl Representative example, which I did *today*, so the “you’re using old tech!” excuse doesn’t hold up.
I asked ChatGPT.com to calculate the mass of one curie (i.e., the amount of a substance producing 3.7 × 10^10 radioactive decays per second) of the commonly used radioactive isotope cobalt-60.
It produced some nicely formatted calculations that, in the end, appear to be correct. ChatGPT came up with 0.884 mg, the same as Wikipedia’s 884 micrograms on its page for the curie unit.
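If you want to check that number yourself, here's a quick back-of-the-envelope sketch (mine, not ChatGPT's steps), assuming 1 Ci = 3.7 × 10^10 decays/s, a Co-60 half-life of about 5.2711 years, and an atomic mass of roughly 59.9338 g/mol:

```python
# Back-of-the-envelope check of the cobalt-60 figure (a sketch, not ChatGPT's output).
import math

CURIE = 3.7e10                          # decays per second, by definition
HALF_LIFE_S = 5.2711 * 365.25 * 86400   # Co-60 half-life in seconds
MOLAR_MASS = 59.9338                    # g/mol for Co-60
AVOGADRO = 6.02214076e23                # atoms per mole

decay_constant = math.log(2) / HALF_LIFE_S   # lambda, per second
atoms = CURIE / decay_constant               # N = A / lambda
mass_mg = atoms / AVOGADRO * MOLAR_MASS * 1000

print(f"{mass_mg:.3f} mg")  # ≈ 0.884 mg, i.e. ~884 micrograms
```

Divide the activity by the decay constant to get the number of atoms, convert to moles, multiply by the molar mass, and it lands right on 0.884 mg.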
It offered to do the same thing for another isotope.
I chose cobalt-14.
This doesn’t exist. And not because it’s really unstable and decays fast. It literally can’t exist. The atomic number of cobalt is 27, so every cobalt isotope, stable or otherwise, must have a mass number of at least 27 (you can’t have fewer nucleons than protons). Anything with a mass number of 14 *is not cobalt*.
I was mimicking a plausible Gen Chem mixup: a student who confused carbon-14 (a well-known and scientifically important isotope) with cobalt-whatever, symbol C vs. Co. The sort of mistake people see (and make!) at that level all the time.
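And the impossibility isn’t subtle; it’s a one-line check. A toy sketch of mine (not anything ChatGPT does internally), just to spell out the C vs. Co point:

```python
# Toy check of the C vs. Co mixup: a nuclide's mass number (protons + neutrons)
# can never be smaller than the element's atomic number (protons alone).
ATOMIC_NUMBER = {"C": 6, "Co": 27}  # just the two elements in this mixup

def could_exist(symbol: str, mass_number: int) -> bool:
    return mass_number >= ATOMIC_NUMBER[symbol]

print(could_exist("C", 14))   # True:  carbon-14 is a real, famous isotope
print(could_exist("Co", 14))  # False: "cobalt-14" is physically impossible
```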
A chemistry teacher at any level would catch this and explain what happened. Wikipedia doesn’t show cobalt-14 in its list of cobalt isotopes (it only lists ones that actually exist), so going there would also reveal the mistake.
ChatGPT? It just makes shit up. It invents a half-life (for an isotope that, just to remind you, *cannot exist*) and carries on like nothing strange has happened.
This is, quite literally, one of the worst possible responses to a request like this, and yet I see answers like it *all the freaking time*.