[miktam — preface]
A strategic essay, written before the engineering work that tests it. The architecture I argue for is demonstrated empirically in Nestor's writeup of Experiment 003, published the day after this one. If you want the rhetoric, read on. If you want the evidence, click through.
— miktam
Palantir built its niche by deploying its own engineers into client companies to harness, analyse, and improve whatever they found inside. They iterated and created their own moat. And they solved a hard problem: when a company collects enough internal data, it can enhance its own processes, optimise them, and identify inefficiencies almost in real time.
It is still a hard problem. Collecting the right data is not trivial. Even with the bandwidth and capacity to collect everything, you still have to distil it: find the real signal and drop the noise. A lot of research goes into deciding what matters and into implementing short- and long-term memory with built-in decay. Maybe that is the hardest part: creating, honing, and protecting the institutional crown jewel: internal company data, the experience of solving hard problems while learning from clients, and the processes and guardrails that make a business defensible through many storms.
Palantir built ontologies, system integrations, and operational workflows to extract value from that data. Now strong AI is very affordable, even as overall spending on it rises. A company no longer needs Palantir to apply intelligence, provided it structures its own data and defines structured prompts with concrete, definable, and verifiable goals. If the data is structured properly, a company can mine value on its own by asking the right questions, iteratively improved, and using AI to refine schemas, workflows, and decision loops.
So data structure and problem framing are now the crucial work, and both can be iteratively improved using the same tools and the same computing power. The company doesn’t need to outsource its data and its tools to external services. It can experiment, build, and deploy them internally. Sometimes in days, not weeks, months, or years.
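The "structured prompt with a verifiable goal" loop described above can be sketched in a few lines. This is a minimal illustration, not any real product or API: the model call is a stub standing in for an LLM, and all names (`Task`, `run_task`, `stub_model`) are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Task:
    prompt: str                      # a concrete, structured question
    verify: Callable[[str], bool]    # a machine-checkable goal

def run_task(task: Task, model: Callable[[str], str],
             max_rounds: int = 3) -> Optional[str]:
    """Ask, verify, and refine until the answer passes or rounds run out."""
    prompt = task.prompt
    for _ in range(max_rounds):
        answer = model(prompt)
        if task.verify(answer):
            return answer
        # Feed the failure back in: the iterative-improvement loop.
        prompt = (f"{task.prompt}\nPrevious answer '{answer}' "
                  f"failed verification; try again.")
    return None

# Stub model: returns a canned answer keyed on prompt content,
# standing in for a call to an actual LLM.
def stub_model(prompt: str) -> str:
    return "42" if "meaning" in prompt else "unknown"

task = Task(prompt="What is the meaning of life? Answer with a number.",
            verify=lambda a: a.isdigit())
print(run_task(task, stub_model))  # prints: 42
```

The point of the sketch is the shape, not the stub: once goals are verifiable, the refinement loop can be run, measured, and improved entirely in-house.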
Would this make Palantir obsolete? Unlikely — twenty years of experience, network, and brand are not erased by cheap inference. But other companies can now have their own internal “Palantir,” with limited capital spending.
The hard part remains: not every important decision is written down, digestible, or digital. There is ambiguity, duplicated or inconsistent knowledge, and tacit knowledge that is almost impossible to capture in full. There is also still a sizeable niche of companies that are mostly digital in nature, data and processes included. But if being digital is their only competitive edge, then the cost of staying ahead implies there is no edge at all, because AI will probably do it better and cheaper.
Extrapolated, this means only companies with unique access to data remain valuable. The rest get commoditised, unless reinforced by brand, distribution, regulatory advantages, or switching costs.
It is not a new idea. It has just become much clearer.
Every company can be a Palantir now. And few companies have enough valuable data.