Even with LG’s concession, it may become more difficult to avoid chatbots on TVs.
LG says it will let people delete the Copilot icon from their TVs soon, but it still has plans to weave the service throughout webOS. The Copilot web app rollout seems to have been a taste of LG’s bigger plans to add Copilot to some of its 2025 OLED TVs. In a January announcement, LG said Copilot will help users find stuff to watch by “allowing users to efficiently find and organize complex information using contextual cues.” LG also said Copilot would “proactively” identify potential user problems and offer “timely, effective solutions.”
Some TVs from LG’s biggest rival, Samsung, have included Copilot since August. Owners of supporting 2025 TVs can speak to Copilot using their remote’s microphone. They can also access Copilot via the Tizen OS homescreen’s Apps tab or through the TVs’ Click to Search feature, which lets users press a dedicated remote button to search for content while watching live TV or Samsung TV Plus. Users can also ask the TV to make AI-generated wallpapers or provide real-time subtitle translations.



I am genuinely curious: this whole thing is most likely an effort to sell more TVs, but does that actually work? Is there a significant segment of customers who buy TVs based on whether or not they have a (link to a) chatbot in them? Or did some exec just decide "our products need to have AI now" with zero research done?
I would really like to see data on this.
Or did MS pay them to include it, knowing they could hoover up a lot of data, perhaps even with a clause in the contract to also share that data with LG?
They do it because those TVs are selling.
What many people today seem to dramatically misunderstand is that no sane major manufacturer will push a genuinely risky feature. On the contrary, if something like this makes it into a product, it's because there is an expectation of immediate or medium-term profit, backed by extensive market research. Companies aren't stupid; they are highly optimised for this kind of decision-making. And I would honestly be glad to be proven wrong.
In other words, if the feature is there, it means that people either like it or simply don’t care enough to make it into a problem.
And here's the hot take: don't blame the manufacturer, blame the people. Collectively, consumers have shown almost no resistance to the ongoing enshittification of the last decade.
I’m glad you’re opposed to it, and many people here are too, but in the bigger picture it is just a drop in the ocean, unfortunately.
That hot take ignores human psychology’s known weaknesses.
Blaming the public for falling victim to psychological manipulation that has been perfected over generations is like blaming a stabbing victim for bleeding so much.