Weekly Backlog Week 13/2026
🧠 Editorial

This week feels a bit like an infrastructure reality check.
While Germany suddenly wants to build data centers on an industrial scale, Europe continues to debate what “sovereignty” actually means – and the USA casually shows us how quickly security standards can soften when dependencies become too large.
Spoiler: It’s all the same problem.
Just from three perspectives.
Germany wants to massively expand data centers: AI capacities are to quadruple by 2030, and traditional infrastructure is to at least double.
It sounds ambitious, but currently the bottleneck is electricity rather than hardware.
The strategy therefore relies on existing locations – old power plants, industrial sites, anything with sufficient connection capacity. Technically sensible, politically pragmatic.
The real problems remain open.
And in the background, the real question lingers: 👉 Who will ultimately operate this infrastructure?
Because while Germany builds, a clear answer on the platform side is still missing.
🔗 https://www.heise.de/news/Verwaltung-Open-Source-wird-zum-Standard-11219607.html#
The open letter from 25 European cloud providers hits a sore spot that the EU has been avoiding for years: Europe talks about digital sovereignty while systematically expanding its dependencies.
US hyperscalers dominate the market – and yet their offerings are increasingly classified as “sovereign” in political and regulatory terms. That equation is not just inaccurate but misleading: anyone operating infrastructure with AWS, Microsoft, or Google inevitably operates within the reach of US law. The CLOUD Act is merely the best-known example, but by no means the only one. FISA, especially Section 702, also creates far-reaching access possibilities. The physical location of the data in Europe changes nothing about this.
Thus, the definition of sovereignty shifts. It is no longer based on actual control but on certifications, contractual constructs, and political framing.
The consequences are already visible: public institutions and companies outsource critical systems to providers whose legal obligations are anchored outside Europe, while European alternatives are politically invoked but in practice barely considered. Lock-in is not only accepted but structurally reinforced.
This development is particularly critical in the context of AI. Whoever controls the infrastructure today decides tomorrow about data access, value creation, and innovation opportunities. If Europe continues to rely on external platforms here, it will not only lose market share but also shaping power.
CADA is thus not another regulation but a directional decision: Either real frameworks for European providers emerge – or the existing dependency is politically cemented.
The ProPublica report on Microsoft’s Government Cloud shows a familiar pattern – rarely so clearly documented.
At its core, it’s about a system whose security could never be fully assessed. For years, auditors failed to clarify basic questions: How exactly do data flows work? Where and how is encryption performed? Even the third-party auditors involved did not have full insight. The result was clear: no reliable security assessment was possible.
Nevertheless, the system was approved at the end of 2024.
The reason is as simple as it is problematic: the system was already in use. A halt would have meant breaking existing dependencies – operationally and politically hardly enforceable. Risk assessment thus becomes a formality: it is not security that decides, but actual usage.
This dynamic is reinforced by structural weaknesses. Audits are conducted by third parties paid by the provider, while oversight is simultaneously weakened and put under pressure. Control takes place – but only as long as it has no consequences.
A former NSA expert sums it up: This is not a security proof, but “Security Theater.”
The case is not an outlier but a symptom of a highly concentrated cloud market, in which dependency gradually erodes the ability to act on identified risks at all.
🔗 https://www.propublica.org/article/microsoft-cloud-fedramp-cybersecurity-government
We were on-site with the brand team for two days at CloudFest – and yes, the event remains an exceptional format.
Organizationally, there’s little to complain about. The event runs smoothly, the paths are clear, and the mix of expo, talks, and networking works. The Come2Gather on the eve of the event was, as every year, a highlight: networking in the streets of Europapark has its own dynamic, and the catering was, unsurprisingly, outstanding again. Plus good weather – you can’t really ask for more from such a setup.
Content-wise, we attended the lecture by Daniel Menzel, among others. Broadly speaking, it was about the practical handling of modern cloud infrastructure and the challenges that arise when complexity grows faster than one’s own processes. No buzzword bingo, but quite close to the reality of many teams.
The trade fair itself was well attended – at times almost too well. The halls got quite crowded, which speaks for interest but not necessarily for good conversations.
Less successful, in my view, was once again the placement of the startups: packed into a narrow corridor that effectively creates a bottleneck. For the visibility of these companies, this is anything but ideal – especially for startups that depend on attention, there is simply not enough space.
Last year, this was solved much better when the startups were more integrated into the halls. More space, more foot traffic, more real interaction. This year it felt like a separate area again, more overlooked than discovered.
In the end, CloudFest remains what it is: one of the few events that sensibly combines professional exchange, community, and festival character – with small but recurring weaknesses in detail.

The current warnings about Cisco, SharePoint, and Zimbra follow the usual pattern: gap known, patch available, please act.
What’s missing: the actual context.
The Cisco gap was actively exploited for weeks before it became public – without operators having any way of knowing.
This is not an isolated case but systemic.
“Patch faster” falls short here. The most critical phase is often over before it even becomes visible.
As long as this dependency on a few, opaque platforms exists, security remains reactive.

A new online tool shows for the first time where internet coverage is lacking – based on the broadband atlas, mobile network data, and provider information.
Still in testing, but an important step: 👉 Infrastructure problems become visible instead of estimated.
The BSI is taking a clear position on the implementation of the EU AI Act.
The decisive factor will be how deeply the BSI is actually involved in testing processes – or whether formal compliance will suffice in the end.
🔗 https://www.bundestag.de/resource/blob/1157054/Stellungnahme-KI-VO_BSI-23-03-2026.pdf
Bernd Korz describes the current contradiction very precisely:
While politically speaking of sovereignty, dependencies continue to grow.
What’s interesting is less the content than the shift in the debate: Criticism increasingly comes from business and practice – no longer just from the tech bubble.
The topic is slowly shifting from narrative to reality.

Public IT procurement in Germany has been standing in its own way for years. Proprietary software was practically the default, while open source was often treated as a legal risk. The result: high dependencies, little competition, and solutions paid for multiple times because they could not be reused.
With the revision of the EVB-IT model contracts, this logic is noticeably turning for the first time.
Open source becomes the standard for new software. What has long been politically demanded is now operationally feasible: authorities can procure open-source solutions with legal certainty, without ruling themselves out through contractual constructs. At the same time, the market opens up for providers who were previously structurally disadvantaged.
Particularly relevant is the mandatory publication via OpenCoDE. For the first time, a systematic mechanism for reuse within the administration is created. Software is no longer developed in isolation but can be shared, reviewed, and further developed. This not only saves costs but also increases quality because more eyes are on the code.
With the introduction of SBOMs, the security logic also shifts. Dependencies become transparent, risks traceable. Black-box software thus loses part of its previous advantage – namely, its opacity.
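The point of an SBOM is precisely that dependencies become machine-readable. As a minimal sketch – the component names, versions, and the helper function below are made up for illustration, and real CycloneDX documents carry far more metadata (licenses, hashes, suppliers) – this is roughly what such an inventory looks like and how easily it can be queried:

```python
import json

# Simplified, CycloneDX-style SBOM (illustrative only; components are made up).
SBOM_JSON = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "components": [
    {"type": "library", "name": "openssl", "version": "3.0.13"},
    {"type": "library", "name": "zlib", "version": "1.3.1"}
  ]
}
"""

def list_dependencies(sbom_text: str) -> list[str]:
    """Return the declared components as name@version strings."""
    sbom = json.loads(sbom_text)
    return [f'{c["name"]}@{c["version"]}' for c in sbom.get("components", [])]

print(list_dependencies(SBOM_JSON))
```

Once an inventory like this exists, a newly published vulnerability in, say, a specific OpenSSL version can be matched against the entire software portfolio automatically – exactly the transparency that black-box software denies.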
This is not a technical detail change but a structural shift. Competition is strengthened, lock-in reduced, and European providers are given realistic chances in the public sector for the first time.
The open question remains how consistently these principles will be implemented – and whether they will also be extended to cloud and platform services. It is precisely there that it will ultimately be decided whether “Public Money, Public Code” becomes more than just a software principle.