→ Do we know which AI tools people are using for work?
→ Does this include free tools or personal accounts used during work time?
→ Do we know what people use AI for (emails, reports, analysis, admin tasks)?
Why this matters:
You can’t manage risk if you don’t know what’s being used.
If unsure: ask staff directly and make a simple list.
→ Do staff understand that public AI tools may store or reuse what they type in?
→ Have we clearly told staff never to put the following into AI tools:
Simple rule:
If you wouldn’t put it on your website, don’t put it into an AI tool.
→ Do staff know AI can sound confident but be wrong?
→ Is AI-generated content always checked by a human before being shared?
→ Do we avoid copying AI output straight into emails, proposals, or documents?
Why this matters:
AI can make mistakes, invent facts, or miss important context, and your business owns the result.
→ Are important decisions still made by people, not AI alone?
→ Do we avoid using AI to make final decisions about:
→ Is there always a person responsible for the outcome?
Bottom line:
AI can help. But people remain accountable.
→ Have staff been told what AI is okay to use for work?
→ Do they know who to ask if they’re unsure?
→ Is AI use covered by our general acceptable use or security rules?
→ Would a new starter understand our AI rules on day one?
If not:
A short written policy and this checklist are enough to start.
Answer Yes or No:
→ Would we be comfortable explaining our AI use to a customer?
→ Would we be comfortable explaining it to a regulator?
→ Would we be comfortable seeing it in the news?
If any answer is No, review the steps above.