With 68% of consumers reporting using AI to support their decision making, voice is making this easier. [1]
Does anybody actually believe that 68% of consumers use or even want Copilot? But they included a source for this very generous assertion at the bottom of the page:
[1] Based on Microsoft-commissioned online study of U.S. consumers ages 13 years of age or older conducted by Edelman DXI and Assembly, 1,000 participants, July 2025.
Oh yeah, that’s compelling: US consumers, 13 years old and older. An entire thousand of them!
So the only question I have left is which junior high principal Microsoft “compensated” for this survey, and what happened to the 320 summer school attendees who said “fuck you, no” anyway.
- 68% of people who answered the survey full of loaded questions they sent to a curated demographic
Microsoft is doing more for Linux adoption than anyone else ever has lol
Valve with Steam Deck and Proton development: “Am I a joke to you?”
They are helping, yes, but Windows 11 is a driving force like I’ve never seen.
It’s only really viable, though, because of Steam and Proton
First, you should be able to interact with it naturally, in text or voice, and have it understand you.
Second, it should be able to see what you see and be able to offer guided support.
And third, it should be able to take action on your behalf.
Fourth and finally, it should be able to destroy your corporeal form and live on as digital-you liking your aunt’s dog pictures on Instagram and writing Facebook posts about immigrants taking our jobs, with just as many creative slurs as you would use.
I asked Copilot…
How would Microsoft’s loyalty to US or other governments’ national security possibly compromise a user’s PC if that user gives Copilot permission to operate their PC?
How national-security or law-enforcement demands could lead to compromise
- Compelled access to data Microsoft controls — If Microsoft stores or indexes any of your Copilot data in the cloud, Microsoft can be required by lawful process to produce that data to U.S. or other governments with legal jurisdiction.
- Compelled changes to services — Governments can issue orders that require providers to change logging, enable access, or install monitoring in ways that may be secret or accompanied by gag orders.
- Access to telemetry and diagnostics — Diagnostic or telemetry data that Copilot or Windows collects can include evidence of your activity and may be producible under legal request.
- Local features that capture content — Features designed to assist (for example, an automated “Recall” that snapshots screens) create local records of sensitive material that increase exposure risk if accessed by an authorized party.
- Remote-control or agent capabilities — If Copilot is granted elevated permissions (input control, script execution, system configuration), those same capabilities could be exercised under legal compulsion or via back-end access mechanisms.
- Software updates and maintenance — Lawful orders can compel vendors to ship updates or config changes that alter how a product behaves, increasing access to user systems.
Practical ways those legal obligations could manifest on your PC
- Production of synced or cloud-stored Copilot logs, prompts, or screenshots to authorities.
- Microsoft being required to provide live access or historical logs from back-end services that the Copilot agent uses.
- A secret court order or national security letter forcing Microsoft to enable additional logging, monitoring, or remote access for a targeted account or device.
- Disclosure of keys, tokens, or server-side records that tie on-device events to your identity.
Why elevated permissions matter
- Read access to files and memory lets an agent capture documents, credentials, or keys.
- Input/automation control enables actions on your behalf (open files, send messages, change settings).
- Persistence (services/agents) makes it easier for any compelled access to be effective and harder for you to detect or disable.
Risk-reduction steps you can take now
- Limit permissions: Grant only the minimal Copilot privileges needed; avoid giving input control, admin rights, or system-level automation.
- Disable features that capture content: Turn off any screen-snapshot or automatic indexing/Recall-like features if you don’t need them.
- Avoid cloud syncing for sensitive data: Keep sensitive files off services that sync or index them in the cloud.
- Prefer on-device-only models: Use local-only AI modes when available so prompts and context do not leave your machine.
- Use full-disk and file-level encryption with keys you control; avoid storing keys where the vendor could be compelled to retrieve them.
- Harden account security: Use strong, unique passwords and MFA on accounts tied to Copilot or Microsoft services.
- Audit logs and telemetry: Review and reduce diagnostic/telemetry settings; regularly review logs for unexpected activity.
- Segmentation: Use a separate machine or VM for highly sensitive work so a general-purpose Copilot-enabled device doesn’t hold those secrets.
- Legal & contractual protections: For high-risk contexts, seek contractual commitments, data residency options, or legal counsel about how a vendor handles lawful demands.
Quick practical checklist
- Turn off automatic screenshot/Recall features.
- Remove admin permissions from Copilot agent.
- Keep sensitive work on a non-synced, encrypted volume.
- Use local-only AI options where offered.
- Enable and monitor endpoint and network logging for unusual remote access.
Short conclusion
Allowing Copilot to operate your PC raises the surface where lawful government demands or compelled actions could expose data or enable access. The likelihood and scope depend on which features you enable (especially cloud sync, screen capture, and elevated permissions) and the legal jurisdiction over the vendor or the data. Minimize permissions, disable content-capture features, and isolate truly sensitive workflows to reduce exposure.
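If anyone wants to actually script a couple of those checklist items instead of clicking through Settings, here’s a minimal sketch in Python. It assumes the commonly documented TurnOffWindowsCopilot and AllowTelemetry policy values still apply on current Windows 11 builds, so verify against your build first; the HKLM write needs an elevated prompt.

```python
# Sketch only: apply two of the checklist items via policy registry values.
# TurnOffWindowsCopilot and AllowTelemetry are the commonly documented names;
# confirm they still apply to your Windows build before relying on this.
import winreg

def set_policy_dword(root, path, name, value):
    """Create/open a registry key and write a REG_DWORD policy value."""
    with winreg.CreateKeyEx(root, path, 0, winreg.KEY_WRITE) as key:
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)

# Turn off the Copilot shell integration for the current user.
set_policy_dword(winreg.HKEY_CURRENT_USER,
                 r"Software\Policies\Microsoft\Windows\WindowsCopilot",
                 "TurnOffWindowsCopilot", 1)

# Dial diagnostic data down to the minimum your edition honors
# (0 = Security on Enterprise/Education; other editions fall back to Basic).
# Requires an elevated (admin) process for the HKLM hive.
set_policy_dword(winreg.HKEY_LOCAL_MACHINE,
                 r"SOFTWARE\Policies\Microsoft\Windows\DataCollection",
                 "AllowTelemetry", 0)
```

Same caveat Copilot itself gives above: none of this matters if you then turn around and hand the agent admin rights and input control.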
So uhm, Copilot just told us not to use these new Copilot features.
But if Copilot is untrustworthy according to Copilot, then why would you believe Copilot?
Nice update!
…open O&O ShutUp10, and disable, disable, disable, disable. Sweet.
Disable? I think you mean Remind Me in Three Days! - Clippy
First steps of a Windows install:
- No to everything for data monitoring
- Google or Opera default browser
- Disable or ignore all Copilot icons
- Unstick all user folders from OneDrive
- TranslucentTB
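Steps 3 and 4 of that list can be scripted too. A minimal sketch, assuming the commonly cited ShowCopilotButton and KFMBlockOptIn registry values still exist on your build; the OneDrive one is a policy key (elevated prompt needed) and it only blocks future known-folder moves, it won’t move already-redirected folders back.

```python
# Sketch: hide the Copilot taskbar button and stop OneDrive from auto-moving
# the known user folders (Desktop/Documents/Pictures). Value names are the
# commonly cited ones; verify against your build before relying on this.
import winreg

# Hide the Copilot button on the taskbar for the current user.
with winreg.CreateKeyEx(
        winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced",
        0, winreg.KEY_WRITE) as key:
    winreg.SetValueEx(key, "ShowCopilotButton", 0, winreg.REG_DWORD, 0)

# Block OneDrive "known folder move" prompts and silent opt-in (needs admin).
with winreg.CreateKeyEx(
        winreg.HKEY_LOCAL_MACHINE,
        r"SOFTWARE\Policies\Microsoft\OneDrive",
        0, winreg.KEY_WRITE) as key:
    winreg.SetValueEx(key, "KFMBlockOptIn", 0, winreg.REG_DWORD, 1)
```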
You forgot the step where you ignore step 2 and use Firefox.
google as default browser
You sweet summer child, Google is as bad as Microsoft, if not worse. Chrome is no longer the browser the old memes glazing it made it out to be.
Also, didn’t Opera sell to some spyware company? I’m team Zen (a Firefox fork with some very neat extra features), btw.
Opera has been owned by a Chinese company since 2016.
From what I’m reading, it’s just Cortana 2.0
Hey Copilot, what happened to Cortana?
Copilot: ł ₭łⱠⱠɆĐ ⱧɆⱤ
The only reason I have a Windows box is for gaming, specifically sims (racing and flying)
All the more reason to test and see if the wheel and flight stick work under Proton.
Bazzite, my dude. Check it out: super easy, and it’s set up for easy dual boot so you can give it a shot without clearing Windows (if shit’s partitioned right)
And for some reason, when I buy a laptop, I also have to pay for that disgusting spyware. How is this scam still going on?
Only when you buy a Windows laptop. You can buy macOS, Android, ChromeOS, or Linux laptops.
You can also buy a Framework, which doesn’t come with anything preinstalled
Is “agentic” even a real word?
There’s no committee that approves words being added to the English language. Anything that’s understood by the group that uses it is a real word. We make up new words and change the definition of old ones all the time; dictionaries are descriptive, not prescriptive.
That doesn’t stop the concept of ‘agentic AI’ being a pile of bullshit being peddled by snake-oil salesmen, of course, but you don’t have to be Shakespeare to be permitted to make up new words.
The question should be understood more as “was the word agentic even in use before AI people started slapping it on everything?” It was a genuine question; I had never heard it until it started being used in this context.