
I have to say that I’m very bearish about A.I. for a variety of reasons. I hope to turn this into a series of articles about A.I. from the perspective of a Software Quality Engineer. I’m a computer scientist by training, a quality engineer by trade, and a tech ethicist by practice. Ethics in tech matters to me, so if I’m going to explore A.I., I want to do it in a way that keeps its ethics front and center.
The A.I. Prompt Around Data Access
Right now, I’m exploring the workings of Microsoft’s CoPilot, specifically through the Edge browser sidebar integration or by signing in directly at copilot.microsoft.com. I’ve been calling it Clippy! IYKYK.
I’m using CoPilot this way in order to sandbox it away from the other tools and apps I use, and away from my own personal data.
Here’s what I asked:
“Can CoPilot see into my M365 apps, OneDrive, OneNote, Outlook, Calendar, etc?”
The Response on Data Access
“I don’t have access to your M365 files, OneDrive, or any personal cloud content. I can’t peek, pull, or sync with those services unless you explicitly paste something in or upload it during our chat.
- No automatic access: I can’t browse your documents, emails, calendars, or notes.
- No background syncing: I don’t quietly pull context from your cloud or apps.
- No file-level memory: Even if you upload something, I don’t retain it after the session unless you ask me to remember key insights.
Microsoft’s design here leans hard into user-controlled privacy.”
Continuing Prompt Around Learning
“Do our conversations go back into your LLM? If so, how?”
The Response on Learning
“I don’t have visibility into the technical details of how conversations are handled behind the scenes, including whether or how they’re used to train or improve the underlying large language models. That part is outside my scope. See Microsoft’s privacy policy.
What I can tell you:
- I don’t train myself on your data in real time.
- I don’t retain or reuse your conversations across other users.
- I do have memory for you specifically, which you control. You can ask me to remember things (like your goals or preferences) or to forget them at any time.
- I don’t have access to your files, emails, or cloud content unless you explicitly share them in chat.
So while your conversations might contribute to improving the service in aggregate, that process is abstracted away from me. I don’t see it, control it, or use it to shape my responses unless you’ve asked me to remember something directly.”
Conclusion
At least with Microsoft CoPilot, using it in this manner keeps it in a sandboxed environment away from other apps. Because I am bearish on A.I., I try to uninstall or deactivate it in any app or on any device I can.
Generally speaking, seeing A.I. sandboxed this way is good. It alleviates a major concern many people have around security and privacy. Not every company deploying A.I. offers this kind of separation.
Will it change in the future? Maybe.
Time will tell.