

You may already know this, but just to make it clear for other readers: it is impossible for an LLM to behave as described. What an LLM does is generate stuff; it does not search, it does not sort, it only makes stuff up. There is nothing that can be done about it, because an LLM is a specific type of algorithm, and that is what the program does. Sure, you can train it with good-quality data and only real cases and such, but it will still make stuff up by mixing all the training data together. The same mechanism that makes it “find” relationships in the data it is trained on is the same one that will generate nonsense.
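To make that concrete, here is a toy sketch (a made-up bigram model, nothing like a real LLM, but the same basic idea of sampling the next word from statistics of the training data). The corpus and output shown are hypothetical examples I made up for illustration:

```python
# Toy illustration: a tiny bigram "model" built from a few factual sentences.
# Sampling from it can stitch fragments into a fluent-looking sentence
# that was never in the training data -- the same mixing that finds
# relationships also produces nonsense.
import random
from collections import defaultdict

corpus = [
    "firefox uses google as the default search engine",
    "google pays mozilla for the default search slot",
    "mozilla builds firefox",
]

# Record which word follows which (the "relationships" in the data).
follows = defaultdict(list)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        follows[a].append(b)

def generate(start, max_len=12):
    out = [start]
    while len(out) < max_len and follows[out[-1]]:
        # Same mechanism at every step: pick a statistically likely next word.
        out.append(random.choice(follows[out[-1]]))
    return " ".join(out)

print(generate("google"))
# Possible output:
# "google pays mozilla builds firefox uses google as the default search slot"
# Fluent-sounding, but a claim that appears nowhere in the training data.
```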
I don’t know if that is the reason, but I wonder if the recent ruling that made Firefox lose the cash income from Google as the default search engine has them doing a similar type of deal with AI companies, maybe even Google: Firefox has a built-in interface for AI where you can choose the backend, but the default is one that some AI company pays a fee to be.
If that is the case, I think it is fine; it is a wink-wink situation. It has to be enabled by default, with a default provider, for it to be worth something for someone to pay for the privilege, and then users can simply change it or be done with it without affecting the payout. (Unless the payment, or its renewal, is tied to some metric like usage statistics.)