@marshray @futurebird It's more prominently an issue with image-detection machine learning AIs: when given a set of pictures and asked to identify whether a skin mole is cancerous or not, is the model actually identifying cancer in the picture, or just identifying whether there's a ruler in the picture?
In the same way, the LLM is not necessarily doing X, but identifying things that are markers of X, even though producing those markers doesn't actually require doing X.
That distinction matters because it means it won't solve the problems that actually trip us up.
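A toy sketch of that "ruler effect" (everything here is invented for illustration and assumes scikit-learn is available): a classifier can score near-perfectly on a mole dataset while only ever learning whether a ruler is in frame, because the ruler happens to co-occur with the malignant label in the training photos.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
has_ruler = rng.integers(0, 2, n)        # suspicious moles got photographed next to a ruler
is_cancer = has_ruler.copy()             # so in this toy data the label and the ruler coincide
lesion_signal = rng.normal(0, 1, n)      # the "real" medical signal here is pure noise

X = np.column_stack([has_ruler, lesion_signal])
clf = LogisticRegression().fit(X, is_cancer)
print(clf.score(X, is_cancer))           # ~1.0, yet nothing about cancer was learned

The point of the sketch is that high accuracy on the training distribution says nothing about whether the model learned the thing you care about, only that it found some feature that correlates with the labels.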

at1st@mstdn.ca
"Chat GPT told me that it *can't* alter its data set but it did say it could simulate what it would be like if it altered it's data set" -
"Chat GPT told me that it *can't* alter its data set but it did say it could simulate what it would be like if it altered it's data set"@marshray @futurebird The problem is that it's good at predicting what metacognition should look like, better than your average human who hasn't read the internet's worth of documents on metacognition.
That doesn't mean that it's actually good at metacognition.