Security researchers recently claimed they accessed McKinsey's internal AI platform -- used by more than 40,000 consultants -- in under two hours.
No credentials. No insider access.
What they reportedly found inside should make every company using AI stop for a moment.
Millions of chat messages. Hundreds of thousands of internal files. And the system prompts controlling how the AI behaves -- all in the same database.
According to the report, an autonomous security agent discovered a SQL injection that traditional scanners had missed. In theory it could have provided read/write access to a massive amount of internal data.
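The flaw class reported here is well understood. A minimal sketch in Python (purely illustrative, not the platform's actual code) shows how SQL built by string concatenation leaks rows that a parameterized query would not:

```python
import sqlite3

# Toy database standing in for a chat-history store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE chats (user TEXT, message TEXT)")
conn.execute("INSERT INTO chats VALUES ('alice', 'confidential strategy')")
conn.execute("INSERT INTO chats VALUES ('bob', 'hello')")

# Attacker-controlled input crafted to break out of the quoted string.
malicious = "bob' OR '1'='1"

# Vulnerable: user input concatenated directly into the query.
# The OR clause makes the WHERE condition true for every row.
vulnerable = conn.execute(
    "SELECT message FROM chats WHERE user = '" + malicious + "'"
).fetchall()
print(len(vulnerable))  # 2 -- every user's messages, not just bob's

# Safe: the driver binds the value, so the payload is treated
# as a literal username rather than SQL.
safe = conn.execute(
    "SELECT message FROM chats WHERE user = ?", (malicious,)
).fetchall()
print(len(safe))  # 0 -- no user is literally named "bob' OR '1'='1"
```

The fix is decades old; the point is that the same pattern keeps reappearing in new AI stacks that glue a chat layer onto a database.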
But the interesting part is not McKinsey.
They have world-class security teams and virtually unlimited resources.
The real question is architectural.
What does this mean for a 50-person Canadian company using cloud AI tools with no dedicated security staff?
The "too small to target" argument is disappearing. Automated scanning agents do not care about brand names. They scan everything.
Every time a business sends financials, client data, or internal strategy through a third-party AI platform, it extends its attack surface.
And for Canadian companies, PIPEDA obligations do not disappear just because the tool is convenient.
After 40+ years of building enterprise systems, I keep relearning the same lesson: complexity compounds risk.
When we chain multiple AI services, APIs, prompt layers, and external storage systems together, the real issue is not model accuracy. It is loss of control over the data boundary.
That is why we are starting to see a shift toward private AI infrastructure.
Not to reject AI. But to run it where the data already lives.
Modern quantized models now run very well on dedicated local hardware. For many internal workflows, companies no longer need to send proprietary data across the internet.
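As one illustration (assuming a local model runner such as Ollama; the model name below is an example, not a recommendation), an internal workflow can query a quantized model over localhost so the prompt and response never leave the machine:

```shell
# Download a quantized open-weight model onto local hardware
# (pick one sized for your machine).
ollama pull llama3.1

# Query it over the local HTTP API -- nothing crosses the
# network boundary to a third-party platform.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.1",
  "prompt": "Summarize this internal memo: ...",
  "stream": false
}'
```

This is a sketch, not an endorsement of a specific tool; the architectural point is that the inference endpoint sits inside the same boundary as the data.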
The goal is simple: use AI, but keep control of the system that runs it.
For Canadian SMBs, this is not about fear. It is about making the right architectural decision before the scale of the problem forces it.