Weekly Backlog Week 19/2026
🧠 Editorial: The Tech World is Writing Its Own Rules

The German Armed Forces reject Palantir because software is no longer just software. Microsoft begins embedding AI visibly into commit histories, quietly altering one of the most important conventions of open source. Europe is once again discussing digital taxes, despite being deeply entrenched in the very platforms it seeks to limit.
And while all this is happening, a Linux vulnerability quietly reveals how fragile many of our security assumptions have become beneath the surface.
The interesting part isn’t the individual news item. It’s the pattern behind it.
Because suddenly, everything revolves around the same question: Who actually controls the systems on which almost everything now depends?
Enjoy reading.
The German Armed Forces reject Palantir. This is not a minor decision, but rather overdue, as the Palantir Manifesto has shown in the past.
Palantir is not a neutral software provider. The company itself claims to understand technology as an instrument of power. In its manifesto, it recently declared software the foundation of geopolitical dominance. Statehood, security, military strength: everything is being technologically redefined. This is not a product description; it is a political program.
The leadership level adds to the complexity. Alex Karp and Peter Thiel have been positioning themselves aggressively for years: relativizing democratic norms, prioritizing efficiency over legal considerations, and calling for a tighter integration of state and tech industry in the name of military strength.
So, those who deploy such systems are no longer making purely technical decisions. The Bundeswehr (German Federal Armed Forces) draws a clear line here: No external providers accessing national data. No dependency on actors with their own political agendas.
And that’s more than right.
Because as soon as companies not only deliver but want to shape, control shifts. Data becomes the interface of power – and that’s exactly where our sovereignty ends.
The decision against Palantir is therefore more than just “caution.” It is the necessary step to secure state action capability.
If the German Armed Forces were to now question the dependency on US cloud technology, the picture would be almost complete 😉
🔗 https://www.heise.de/hintergrund/Technologie-als-Staatsraeson-Was-Palantir-mit-seinem-Manifest-bezweckt-11272183.html & https://www.zeit.de/politik/deutschland/2026-04/palantir-bundeswehr-nato-software-gxe
Microsoft makes Copilot a co-author, by default. What looks at first glance like a small UX decision touches one of the central mechanisms of software development: the attribution of authorship.
In Git, this attribution was previously clearly defined. Whoever is listed in the commit has actually contributed. This logic is not only a technical convention but the basis for responsibility, traceability, and reputation within projects.
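For context, the co-author credit is technically just a trailer line at the end of the commit message, following Git's plain "Key: value" convention, which platforms like GitHub parse to display co-authors. A minimal sketch of how such trailers can be read (names and addresses here are purely illustrative):

```python
# Minimal sketch: parsing "Co-authored-by" trailers from a commit message.
# Trailers are "Key: value" lines in the final paragraph of the message.

def coauthors(message: str) -> list[str]:
    """Return the values of all Co-authored-by trailers in a commit message."""
    # Trailers live in the last paragraph of the message body.
    last_block = message.strip().split("\n\n")[-1]
    found = []
    for line in last_block.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "co-authored-by":
            found.append(value.strip())
    return found

msg = """Fix parser edge case

Handle empty input gracefully.

Co-authored-by: Copilot <copilot@example.com>
Co-authored-by: Alice <alice@example.com>
"""

print(coauthors(msg))
# → ['Copilot <copilot@example.com>', 'Alice <alice@example.com>']
```

The point is precisely how lightweight this convention is: a few lines of plain text are enough to change who a platform credits as an author.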
This clarity is now being diluted: Copilot is listed as a co-author whenever a corresponding feature was active during development, or is merely considered to have been active. Users already report the label appearing without any actual use, which makes the problem worse, because it detaches attribution from actual contribution.
What emerges here is a new form of attribution: visibility is decided not by contribution, but by interaction with a tool.
This shifts control over a central element of open source. The commit history is not an accessory but the collective memory of a project. When a provider begins to systematically write itself in without the community defining these rules, a tool becomes a shaping actor.
The decisive lever is the default. The feature is optional, but it ships switched on. Those who do not actively opt out adopt the system’s logic. This is how standards are established: not through consensus, but through default settings.
For open source, this is a structural problem. Because attribution there not only documents but distributes: visibility, influence, and ultimately economic opportunities. If this distribution is no longer cleanly linked to actual contributions, the system loses integrity.
And at this point, it becomes political. Because the question of when AI is considered a co-author is not openly negotiated here but answered by a product.
By the way: The fact that this can currently still be turned off does not change the direction. Microsoft is already defining how authorship in AI-supported development environments will be understood in the future.
🔗 https://www.heise.de/news/WTF-Microsoft-erzwingt-Co-Authored-by-Copilot-in-Commits-11279525.html
The EU Parliament is once again calling for a digital tax on large tech companies. Finally, one might think, had something similar not already happened in 2025.
Back then, the Commission withdrew a comparable initiative just in time once the first threats came from Washington. A few hints at possible tariffs were enough, and European tax policy suddenly became a matter of foreign-policy consideration.
So much for strategic autonomy.
Now, the next attempt. With big numbers, big goals, and the familiar argument: Big Tech must finally make a fair contribution. In substance, this is correct. Value creation also happens in Europe, but taxation unfortunately does not.
The real question is a different one: will Trump even allow it?
Or will we see the familiar pattern again: pressure from the USA, flanked by the interests of the very digital companies on whose capital and influence he relies – followed by the next tariff threat. And Brussels? Withdraws. Like last year?
What exactly has changed this time?
The situation is identical. Europe remains massively dependent – technologically, infrastructurally, and thus economically. This dependency cannot be resolved by a tax decision.
On the contrary: Even if the levy comes, it will hardly change the fundamental dynamic. Costs will be passed on. Products and services will become more expensive, while companies stick to existing systems because a change is “too cumbersome” for most.
The result is predictable:
More money flows out of the system, and less remains for innovation of our own.
The structural course was set decades ago: back then, when open technologies were dismissed as playgrounds and proprietary systems were treated as having no alternative; and again later, when dependencies on cloud infrastructures were knowingly accepted, against all warnings.
The arguments were always the same: too complex, too inefficient, not scalable. Presented by those who today manage or benefit from exactly these dependencies.
Now, they are trying to reclaim through taxes what was previously given up in control.
That will not work.
Because as long as Europe does not determine itself under what conditions digital value creation arises, taxation remains dependent on how much pressure is exerted from outside.
Report shows how the federal government remains heavily dependent on US providers; spending policy increases risk of vendor lock-in and complicates European alternatives. 🔗 https://www.golem.de/news/anfrage-der-linken-bund-zahlt-weiterhin-hunderte-millionen-an-microsoft-und-co-2604-208108.html
Open-source PaaS on self-hosted infrastructure demonstrates how to reduce dependencies on hyperscalers; shows practical alternatives for European infrastructure.
BSI-certified messenger Wire to replace Signal; strengthens state communication infrastructure and reduces phishing risks.
Digital Independence Day #5 Shows open-source alternatives as everyday infrastructure, reduces dependencies on proprietary platforms; supports Europe’s ability to act through sovereignly usable systems.

It’s one of those bugs that shouldn’t exist – and that’s exactly why it went unnoticed for years. “Copy Fail” is not a spectacular memory corruption trick but a logic error in the Linux kernel, sitting precisely where you’d rather not look: between the crypto subsystem and the page cache.
The result is as simple as it is unpleasant. A local user can deliberately change a few bytes in the page cache without the kernel registering the change as a write and marking the affected pages dirty. Everything remains clean on disk, every integrity check nods approvingly, but as soon as the file is executed, the manipulated version from the cache wins. At that moment, reality is what resides in memory, not what is on disk.
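To make the logic error graspable, here is a deliberately simplified toy model in plain Python (not kernel code, and no exploit): the integrity checker hashes what is on "disk", while execution is served from the "cache", so a cache-only change passes every check:

```python
# Toy model (not kernel code): why a page-cache/disk split defeats
# integrity checks. The checker hashes disk contents; the loader
# serves executables from the cache.

import hashlib

disk = {"/usr/bin/tool": b"original code"}
cache = dict(disk)  # the cache starts as a clean copy of disk

# The bug class: cache bytes change, but the page is never marked
# dirty, so nothing is ever written back to disk.
cache["/usr/bin/tool"] = b"malicious code"

def integrity_check(path: str) -> str:
    """What a file-integrity scanner sees: the on-disk bytes."""
    return hashlib.sha256(disk[path]).hexdigest()

def execute(path: str) -> bytes:
    """What actually runs: the in-memory (cached) bytes."""
    return cache[path]

baseline = hashlib.sha256(b"original code").hexdigest()
print(integrity_check("/usr/bin/tool") == baseline)  # → True (check passes)
print(execute("/usr/bin/tool"))                      # → b'malicious code'
```

The dictionaries stand in for the block device and the page cache; the actual kernel mechanics are far more involved, but the asymmetry between what is verified and what is executed is exactly the one the model shows.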
That setuid binaries can be hijacked this way, ultimately yielding root, would be annoying enough. What makes it truly explosive is the architecture of modern systems: the page cache is not an isolated place but a shared one. Containers access the same cache as their host. What looks like a classic local privilege escalation has the potential to develop into a container escape waiting to happen.
The exploit itself fits into a few hundred bytes of Python and runs reproducibly on practically everything that has been considered a “stable Linux stack” for years. That alone says more about the quality of the vulnerability than any CVSS number.
Remarkable is also how it was found: not by chance, but with AI support in analyzing complex subsystem interactions. Exactly where human reviews eventually capitulate, these tools are just getting started.
Patches are on the way and should be at the top of the priority list. Because “Copy Fail” is less a single bug than a reminder: Relying on kernel details for isolation is playing a game whose rules you don’t fully control.
