The Procurement Story That Most Strategy Decks Are Missing
Something has shifted in enterprise tech procurement over the last two years, and most of the strategy decks haven’t caught up.
Running serious AI on a piece of equipment in the field used to require infrastructure most operations didn’t have. A reliable internet uplink. A cloud account with somebody else’s GPU on the other end. A data pipeline that didn’t break when the signal dropped. For a lot of real-world contexts, especially the Canadian ones I’m most interested in, that combination was either expensive or fragile or both.
That’s quietly stopped being true.
A compute node the size of a credit card can now run localized AI on minimal power, sitting directly on a machine in the middle of nowhere. Sensor fusion, object detection, predictive maintenance, anomaly detection, simple decision logic. The kind of work that used to require a server room and a network engineer is now possible on a board that costs a few hundred dollars and draws less power than a desk lamp.
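To make that concrete, here is a minimal sketch of the kind of anomaly detection a board like that can run locally. Everything here is illustrative, not taken from any specific product: a rolling window of sensor readings, a z-score check against the recent baseline, and a flag when something drifts far out of range.

```python
# Hypothetical sketch of on-device anomaly detection. Class name, window
# size, and threshold are all illustrative choices, not from any vendor.
from collections import deque
from statistics import mean, stdev

class EdgeAnomalyDetector:
    """Flags sensor readings that drift far from the recent baseline."""

    def __init__(self, window: int = 50, z_threshold: float = 3.0):
        self.readings = deque(maxlen=window)  # rolling baseline window
        self.z_threshold = z_threshold

    def observe(self, value: float) -> bool:
        """Return True if the reading looks anomalous against the window."""
        anomaly = False
        if len(self.readings) >= 10:  # wait for a minimal baseline
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(value - mu) / sigma > self.z_threshold:
                anomaly = True
        self.readings.append(value)
        return anomaly

detector = EdgeAnomalyDetector()
for v in [20.1, 20.3, 19.9] * 10 + [48.0]:  # stable baseline, then a spike
    if detector.observe(v):
        print(f"anomaly: {v}")  # fires only on the spike
```

Nothing in that loop needs a network connection, a GPU, or a server room. That is the whole point.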
The procurement implication is the part most people are sleeping on. When the model runs on the device, the value shifts toward the team that can ship hardware-aware software fast and iterate against your actual operational conditions. The biggest names in cloud AI are not optimized for that work. They’re optimized for the centralized, hyperscale, general-purpose deployment that made them giant in the first place.
What changes when the cloud isn’t the answer
Three things matter when AI moves to the edge.
Latency. A drone making a navigation decision cannot afford a hundred milliseconds of round-trip to a cloud API. The decision has to happen on the device. Same for an autonomous truck, a robot in a warehouse, a surveillance system flagging an anomaly, a fishing boat detecting a school of fish. Local inference is not a nice-to-have for these systems. It’s the only option.
Bandwidth. The Canadian mining industry runs thousands of sensors in places where pulling continuous data back to a central cloud is either physically impossible or financially absurd. Edge AI lets the data become decisions at the source. Only the decisions, or anomalies, or summaries, travel back over the slow pipe.
Sovereignty. This one gets more attention by the month. Operational data on Canadian infrastructure, Canadian healthcare, Canadian government systems increasingly cannot leave the country, and often cannot leave the device. The regulatory pressure here is one-way. It will get tighter, not looser.
Each of these on its own would be enough to push some AI workloads to the edge. The combination has made it inevitable.
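The bandwidth pattern is simple enough to sketch. In this hypothetical illustration (the function and payload fields are mine, not from any vendor), a node reduces a window of raw readings to a compact summary on the device, and only that summary, plus any flagged outliers, goes over the slow pipe.

```python
# Hypothetical sketch of "data becomes decisions at the source": collapse a
# batch of raw sensor readings into a small payload, send only that uplink.
from statistics import mean

def summarize_window(readings: list[float], limit: float) -> dict:
    """Reduce a window of raw readings to a compact uplink payload."""
    outliers = [r for r in readings if r > limit]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "outliers": outliers,  # only the exceptional raw values travel
    }

# A thousand raw readings stay on the device; one small dict goes over
# the wire, carrying the single reading that actually matters.
window = [21.0 + (i % 7) * 0.1 for i in range(1000)] + [93.5]
payload = summarize_window(window, limit=80.0)
print(payload["outliers"])  # [93.5]
```

A real deployment would add timestamps, retries, and a model doing something smarter than a threshold, but the shape of the economics is the same: the slow pipe carries conclusions, not data.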
What Canada has that most countries don’t
This is the part I keep coming back to.
The applied edge AI opportunity is structurally a better fit for Canadian companies than the frontier-model race ever was. The frontier race is a contest between a handful of US and Chinese labs spending tens of billions of dollars on data center capacity. Canada is not going to win that fight, and we should stop pretending we ought to try.
The edge race is different. It rewards small teams that can deeply understand a specific operational context and ship software fast. It rewards hardware-aware engineering culture. It rewards trustworthy provenance, because customers in defense, healthcare, and critical infrastructure increasingly want suppliers they can verify. Canada has more of all three than most of the countries it benchmarks itself against.
A few examples worth knowing about, not as an exhaustive list but as proof points:
Tenstorrent in Toronto is building AI chips with an explicit philosophy of openness instead of vendor lock-in. The architecture matters less than the strategic posture, which is the right one for a country that doesn’t want to be downstream of NVIDIA forever.
Waabi, also Toronto, is building autonomy software for trucks. Not the trucks. The software that drives them. That’s an edge AI company in the strictest sense. The inference happens on the truck, in real time, against real-world conditions, and the value compounds in the simulation and software layer.
Sanctuary AI in Vancouver is building humanoid robots that can do real physical work. Their software stack is the differentiator. The hardware does what software tells it to do.
Dominion Dynamics in Ottawa is building software-defined autonomous systems specifically for Arctic conditions. They closed a serious seed round earlier this year and are already in field trials with the Canadian Rangers. The Arctic is, by any measure, the most Canadian environment imaginable. It is also one of the hardest operational contexts on Earth, which is exactly the kind of constraint that turns into competitive advantage if you build for it.
Four companies, four cities, four different angles into the same thesis. None of them are competing with OpenAI on frontier model training. All of them are building software-and-hardware capability for places the frontier labs cannot reach.
What buyers should be doing about this
If you’re inside a Canadian enterprise, a government program, or a defense procurement office, the question worth asking is whether your AI strategy is still implicitly assuming the cloud-centric model.
A lot of strategy decks I’ve seen recently still assume the path to AI value runs through API calls to a hyperscaler. For some workloads, that’s exactly right. For the workloads that involve real physical conditions, regulated data, intermittent connectivity, or genuinely time-sensitive decisions, it’s increasingly the wrong assumption.
The next wave of operational AI isn’t going to be bought from the usual vendors. It’s going to come from teams most procurement departments haven’t met yet. Many of them are working in Toronto, Vancouver, Ottawa, Montreal, and Waterloo right now. They are smaller than the names on your current vendor list. They are also closer to your actual operational reality than anyone on that list.
If you’re a Canadian operator and you’re not actively mapping who’s building in your industry’s edge AI category, you’re behind on a story that is going to define the next decade of Canadian enterprise software.
If you’re a builder inside one of these Canadian teams, the procurement door is open in a way it wasn’t twelve months ago. The customers exist. The regulatory tailwind is real. The capital is finally showing up. The thing that’s missing in most cases is the explicit conversation between the buyers who feel the pain and the builders who can fix it.
That conversation is what I’d like this site, and the writing on it, to help start.