
Image - Salvador Rios
Palantir has grabbed the attention of government institutions and corporate giants alike over the past few years, becoming one of the fastest-growing companies on the Nasdaq and boasting a 43% profit margin that rivals other tech outperformers like Nvidia and OpenAI. Yet customers and employees of Palantir alike struggle to define what the company actually does. Even former staff are hard-pressed to explain it: “Even as someone who worked there, it’s hard to figure out,” says Linda Xia, an engineer at Palantir from 2022 to 2024. The Denver-based company specialises in organising large datasets into digestible, functional information that can be used effectively without changing the underlying architecture of the customer’s software. This is strengthened by its state-of-the-art AI, which has proven that Palantir is more than just a data clean-up crew. Having secured over 24 contracts with UK public institutions alone, as well as a $10 billion US defence contract awarded in August of last year, Palantir has seemingly reached the point of being too big to fail, clearly carving itself out as the next big thing in the West.
However, there has been strong public backlash over why every institution and business wants a piece of it, with critics highlighting the lack of transparency that frustrates those affected, as well as Palantir's access to sensitive data. These outcries become increasingly justified when you look at exactly what Palantir is involved in. The growing realisation that Palantir's technology could be applied, extremely effectively, to fostering an authoritarian state raises the question: is Palantir a step towards dystopia?
What is Palantir?
As mentioned, the US company found its claim to fame by being extremely effective at turning large datasets into digestible and functional information. “20 people can now do the work that needed 2,000 people,” claims Palantir UK boss Louis Mosley, the grandson of British fascist Oswald Mosley. What sets Palantir apart from the thousands of other tech startups that tried to gain traction at the same time, however, is integration. Instead of changing how data is collected, stored and organised, Palantir's software sits on top of the customer's existing system and integrates into it, without altering the fundamental architecture of the system itself. To put this into perspective, Palantir software could, in theory, integrate with a legacy system plastered together with 1960s-era programming languages, which is the makeup of most government agencies.
Palantir’s sales pitch to potential customers essentially boils down to simplicity of use and the ability to purchase one system that replaces potentially dozens of others, according to Ben Rogojan’s 2022 analysis of Palantir’s products. This applies to both of Palantir’s primary platforms: Foundry, for commercial use, and Gotham, for government entities. Most of the Fortune 500 companies you can name use Foundry; the list runs from Amazon to JPMorgan to Ferrari. The platform focuses on helping businesses manage their data across all departments to improve efficiency and cut costs, a no-brainer investment for any company handling big datasets. The controversy, however, lies in the government-facing platform: Gotham.
How does Gotham differ?
Gotham is the platform offered to government clients, primarily focused on investigative tools that connect people, places and times of interest. Foundry and Gotham work the same way, taking in data and converting it into a neat, digestible set of information for the customer to use; it is the practical application of Gotham, however, that reveals the terrifying potential of the software. Palantir staff overseeing Gotham have claimed that within minutes, a government official can centralise a person’s entire network and documents and cross-reference them with areas of interest, such as the time of a crime, the whereabouts of a military target, or how many times someone has taken the train without buying a ticket, making it extremely easy to build a detailed report on anyone.
Gotham never creates data but feeds on it as fuel, requiring massive amounts to do its job effectively. The LAPD has used Palantir since 2015 and has built an extremely robust database on countless individuals, guilty and innocent alike. The LAPD has released training guides on how to use Palantir on the job, with the courses stating that police can search for people by name and by characteristics such as tattoos, race, gender and scars, as well as any other connections the system can make, including friends and family. The same document states that those officers ran over 60,000 searches in support of over 10,000 cases, underscoring how much data Palantir acquired, and now controls, within only a single year of its implementation.
Applied on a global scale, with entire nations in service to the defence and surveillance industry, this is a recipe for authoritarianism and oppression. An overreach this extensive, without a public mandate or built-in checks and balances, may make decision-making quicker and more effective in law enforcement and the military, but it overwhelmingly hands power to the individuals in control, who can use this data to strengthen their own narratives and feed a human nature that always strives towards power and control.
Palantir In Practice
Unfortunately, this inevitability has already begun to play out in a dystopian manner. With contracts secured in the UK and the US, and NATO incorporating Maven AI, the advanced military artificial intelligence programme offered by Palantir, the future of warfare and surveillance looks frightening. The company is well aware of the implications, too. While Oppenheimer created the atomic bomb as part of the Manhattan Project and dreaded its impact, Palantir CEO Alex Karp boasts about the destructiveness of his platform, “when it’s necessary to scare enemies, and on occasion, kill them.”
The most shocking example of Palantir’s role in the hell of modern warfare is the war in Palestine, where the Israeli military has secretly acquired Palantir AI technology to strengthen its targeting of Hamas fighters, and of civilian targets. In April 2024, the Israeli military struck three well-marked, fully government-approved aid vehicles belonging to the World Central Kitchen, killing all seven humanitarians on board and ensuring that the supplies would never reach those dying of starvation. Immediately afterwards, Karp travelled to Tel Aviv to secure an upgraded contract with Israel’s Ministry of Defence, posting “we stand with Israel” to further rub salt in the wound.
This attack was aided by ‘Lavender’, an AI drawn from Palantir technology and used to mark tens of thousands of Gazans as suspects for assassination. According to Israeli intelligence officers, Lavender has played a key role in unprecedented bombings of Palestinians; its influence on military operations was such that they treated the outputs of the AI “as if it were a human decision.” The system was designed to mark out all potential bombing targets in Hamas and across Palestine, with some operators claiming that the war relied almost completely on Lavender, which marked over 37,000 Palestinians as suspected militants, flagging them and their homes for air strikes. While it is not yet clear whether the AI has been given sole responsibility for the targeting and execution of Palestinians without human authorisation, the military’s reliance on Palantir’s AI system implies that such a reality could soon become possible, as its efficiency is unmatched by a human mind.
Without question, the incorporation of Palantir into government agencies will carry some of the most dire consequences for surveillance and warfare. Given the extreme efficiency of Palantir’s software, which is arguably far ahead of any competition, the technology is understandably an obvious investment for businesses and governments alike. However, the implications of these systems must be placed under scrutiny, as they inherently encourage the power-seeking side of human nature and introduce limitless possibilities in the horrific products of war. Palantir can unquestionably lay the foundations for an authoritarian dystopia.