



It doesn’t; it generates incorrect information. That’s because AI doesn’t think or dream: it’s a generative technology that outputs information based on whatever went into it. It can’t hallucinate because it can’t think or feel.
It’s not the only step; it’s just that half the country has lost its mind, and that half seems to have itchy trigger fingers. Meanwhile, a huge part lives paycheck to paycheck and can’t just walk out and strike, starving themselves and their dependents. Capitalism has the country by the balls.
Trying to strip out the “emotional baggage” is not how language works. A term means something and applies in a specific context. Generative models don’t have the capacity to hallucinate. If you need to apply a human term to a non-human technology that pretends to be human, you might want to use “confabulate” instead: a hallucination is a response to a stimulus, while confabulation is, in simple terms, bullshitting.