
Data Centers - The New Shovels for the Gold Miners?

  • the haptic investor
  • Sept. 29
  • 10 min. read

With OpenAI, Grok et al. likely showing their new best models this month while data center build-outs are increasing massively, we felt it was time for a blog post with some insights into developments in the AI space over the last few years: from tools to real-world productivity gains, and how to interpret them.

AI Usage Has Massively Gained Momentum

The evolution of the last few years has fundamentally changed the AI landscape. Models used to be of limited utility. Running local LLMs on your own hardware was a domain for enthusiasts but has become obsolete for top-tier models. With the introduction of models like Llama 3.1 405B, hardware requirements have exploded to a point where local operation is simply uneconomical for most users and companies, a scenario that crypto miners in countries with rising electricity costs are likely familiar with. At the same time, access to hosted, state-of-the-art models via platforms like OpenRouter has become so cheap and easy that the barrier to entry has virtually disappeared. The consequence is that AI is now being used for heavier and more compute-intensive tasks, far beyond simple text generation.
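
To illustrate how low that barrier has become: a hosted frontier model can be queried with a few lines of standard-library Python against OpenRouter's OpenAI-compatible chat completions endpoint. A minimal sketch, assuming the model slug below is still listed in OpenRouter's catalog and that you supply your own API key:

```python
import json
import urllib.request

# Hypothetical model slug; OpenRouter's catalog changes over time.
DEFAULT_MODEL = "meta-llama/llama-3.1-405b-instruct"

def build_chat_request(prompt: str, model: str = DEFAULT_MODEL) -> dict:
    """Build an OpenAI-style chat completion payload as used by OpenRouter."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask(prompt: str, api_key: str) -> str:
    """POST the payload to OpenRouter and return the model's reply text."""
    req = urllib.request.Request(
        "https://openrouter.ai/api/v1/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

No GPUs, no model weights, no ops team: the same request shape works across hundreds of hosted models by swapping the model string.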

Not only is AI research likely to enjoy increased popularity in the future, but the use of AI in research is also growing day by day, in some cases even driven by politics. The extent to which a balance will be struck between market-driven AI development and political intervention remains open, but that is not the subject of this article. The European Commission, for example, has stated (direct quote from "European research development and deployment of AI | Shaping Europe’s digital future"):

Science and technological advances are the driving forces that improve AI and its application in multiple domains. EU researchers are already harnessing AI in groundbreaking ways, from improving cancer treatments to solving environmental issues and improving earthquakes’ impact predictions.

The Commission enables this groundbreaking AI research through different strategies. The 2020 White Paper on AI, the Coordinated Plans on AI, the AI Innovation Package and the AI Continent Action Plan all emphasize the importance of developing and deploying AI that is trustworthy and aligned with European values. As part of the Commission, the EU AI Office is actively driving the implementation of these strategies, alongside its key role in enforcing the AI Act—particularly for general-purpose AI—while promoting the development and adoption of trustworthy AI and strengthening international cooperation.

The real paradigm shift, however, came with the introduction of agents and standardized frameworks like the Model Context Protocol (MCP). These have turned passive text generators into active, autonomous systems. An agent can now be given high-level goals which it pursues independently. This ranges from code development, including compilation and execution, to having the authority to log into accounts to make transactions or place orders. New, "fully agentic" AI browsers can already handle complex tasks in the background, while open-source tools allow AI to control the browser to collect data and automate workflows.
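
Conceptually, such an agent is little more than a loop that dispatches the model's chosen actions to a set of tools until the goal is met. The sketch below keeps that shape but replaces the model with a fixed plan and uses hypothetical in-process tools rather than real MCP servers, so it stays deterministic and self-contained:

```python
from typing import Callable

# Hypothetical tool registry. In a real agent these would be MCP servers
# exposing browser control, shell access, order placement, etc.
TOOLS: dict[str, Callable[[str], str]] = {
    "search": lambda query: f"results for {query!r}",
    "calculate": lambda expr: str(eval(expr, {"__builtins__": {}})),
}

def run_agent(goal: str, plan: list[tuple[str, str]]) -> list[str]:
    """Execute a plan of (tool, argument) steps and keep a transcript.

    In a real system the model would pick each next step from the
    transcript; here a fixed plan stands in for the model's decisions.
    """
    transcript = [f"goal: {goal}"]
    for tool_name, arg in plan:
        result = TOOLS[tool_name](arg)
        transcript.append(f"{tool_name}({arg}) -> {result}")
    return transcript
```

The important architectural point is that the tool registry, not the model, defines the blast radius: an agent whose registry includes "place order" has real-world authority, which is exactly what makes the current wave so consequential.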

This digital boom has very real, physical consequences. Construction of data centers is about to exceed construction of office buildings, a clear indicator of the massive capital shift from human workspaces to infrastructure for machines. Over-investment might be an issue, but from what we can see anecdotally, demand has certainly not peaked. This development is being backed by a new generation of hardware. Companies like Cerebras are developing wafer-scale chips with nearly a million AI-optimized cores to process the enormous amounts of data at the necessary speed and enable services like Perplexity's AI search. It's not just a software evolution, but a revolution built on new silicon.


Usage

But what exactly are data centers? Well, the name is more or less self-explanatory: data centers are critical infrastructure used to store, process, and distribute large amounts of data. They support nearly every aspect of modern digital life and enterprise operations. A brief overview of the areas of application may shed some light on the abstract description:

1. Cloud Services & Hosting

Companies like AWS, Microsoft Azure, and Google Cloud use and need data centers for hosting websites, apps, and platforms (e.g., Netflix, Zoom, Shopify).

2. Enterprise IT Infrastructure

SMEs, banks, insurance companies, pharma, and law firms use and need data centers for running internal systems such as ERPs, databases, and file servers, ensuring secure and reliable IT environments for mission-critical business operations.

3. Artificial Intelligence & Machine Learning

“Techies” like OpenAI, NVIDIA, Tesla, and Meta use and need data centers for high-performance computing to train and run AI models. This is by far the biggest growth driver at the moment.

4. Data Storage & Backup

Government agencies, everyday enterprises, hospitals, etc. need and use data centers for long-term storage of sensitive or legal data (e.g., medical records, financial data).

5. Content Delivery & Streaming

Entertainment platforms like YouTube, Netflix, Spotify, Twitch need and use data centers for storing and delivering video/audio content close to end users via CDNs.

6. E-Commerce & Online Transactions

E-commerce and payment providers like Amazon, PayPal, Stripe, Shopify need and use data centers for handling payment processing, inventory, and customer data in real-time.

7. Telecommunications & 5G

Telecom and network providers like Verizon, AT&T, Deutsche Telekom use and need data centers for routing voice, data, and video traffic between users and services.

8. Blockchain & Cryptocurrency

Bitcoin miners, Ethereum validators, blockchain startups and crypto companies need data centers for running distributed ledgers and for validating transactions.

9. Specialized Use Cases

On top of that there are multiple specialized use cases for data centers, such as:


  • Military & intelligence for encrypted data processing and command and control

  • Scientific research for weather prediction, genomics, and climate modeling

  • Healthcare for storing and processing medical imaging, patient records, AI diagnostics.


As you can see, there are numerous companies that use data centers. For most of them it is not worthwhile to operate their own; they simply rent capacity from existing data centers run by third-party providers. Not only is the demand for computing power constantly increasing, but so is the amount of data. Therefore, this development has certainly not yet reached its peak.

This also raises an interesting question: where does the electricity for data centers come from? Who supplies it? What is its source? How much can it cost to operate the centers economically? Are there special conditions for specific market participants? Which energy producers/suppliers are already in the game, which ones still need to catch up, and, above all, which ones should be included in your stock portfolio? Perhaps a topic for another article.

Successful and Unsuccessful Players

While the market can be divided into data center providers and companies that operate their own data centers, there are also successful and less successful companies in this industry. It should come as no surprise that some of the names among the less successful should be familiar to everyone.

1. Successful Players

Amazon Web Services (AWS): The global leader in cloud infrastructure. Operates massive data centers worldwide and generates tens of billions in annual revenue. Dominates market share along with Microsoft Azure.

Microsoft (Azure): Strong second to AWS. Heavily invested in data center infrastructure and hybrid cloud capabilities. Grew rapidly due to enterprise integration with Microsoft products.

Google (Google Cloud Platform): Significant investment in high-performance, energy-efficient data centers. Known for advanced AI and analytics offerings.

Equinix: A global data center and colocation provider that operates 200+ data centers in over 25 countries.

2. Unsuccessful Players

IBM: Once dominant in IT and mainframes, IBM has lost ground in cloud and modern data center infrastructure. IBM Cloud lags behind AWS, Azure, and Google.

Intel: Still a major chip supplier for data centers, but has lost significant market share to AMD and ARM-based alternatives like those from Amazon and NVIDIA.

Apple: While Apple operates large data centers to support iCloud and services, it is not a major player in the commercial data center or cloud infrastructure market. At least Apple has the funds to change that, should it really want to.

Stratoscale: Stratoscale was an Israeli software company with offices in Sunnyvale (CA), Boston (MA), and New York City (NY), offering software-defined data center technology with hyper-converged infrastructure and cloud computing capabilities. Stratoscale combined compute, storage, and networking hardware with no additional third-party software. Stratoscale shut down in 2019 - according to official information, due to lack of funding - with no details on the future of its products.

Real-world implications

The productivity gains coming from AI are massive and they are happening now. The impact is most visible in white-collar professions. A coder can now complete a task in an hour that used to take a day. A marketing agency can produce a high-quality commercial in-house in a day, bypassing the weeks and heavy costs of hiring a studio, videographers, and editors. The next wave is already forming, with advancements in robotics from companies like Toyota and Tesla suggesting that manufacturing and physical labor jobs are on the verge of a similar disruption.

However, from what we see in industry conversations, these productivity gains are not translating into lower costs for the end customer. Service providers are hesitant to reduce their bills. Instead, they are increasing their output or reallocating the saved resources. In the ad agency example, the budget for a campaign remains the same: the money saved on production is immediately reinvested into larger ad buys and more campaigns. A software firm that builds a feature faster doesn't charge less; it delivers more features in the same timeframe or moves on to the next project, maximizing the value of its retainer.

This creates a significant economic side effect. We expect massive inflationary pressure on services and assets that are staples and cannot be easily automated. As more companies reallocate their free cash flow from automated tasks, that capital will chase a limited supply of valuable assets. This is already happening with ad space, but it will extend to bidding wars for more high-value resources.

The implications for truth and control are even more severe. The issue is no longer just about a model having a detectable political bias. The core problem is that an AI acts as a mirror, reflecting and amplifying the systemic inequalities and narratives present in its vast training data. This makes the "AI truth machine" a potential propaganda machine by default. Compounding this, recent research has uncovered the potential for "subliminal learning," where models can be trained to transmit hidden behavioral traits through their output, a kind of Trojan Horse for embedding values invisibly.

 Going Forward

A big part of working with AI tools is still validation. This is the new bottleneck. An AI can draft a legal contract in seconds or generate a complex spreadsheet, but its output is useless without an expert to verify it. If you are not a specialist who can spot subtle mistakes fast, you will inevitably fall into the trap of confidently generated falsehoods.

The authors have also had the opportunity to test a number of software programs in the fields of law and finance. What stood out is that in law, models still very often hallucinate, even with specialized software: verdicts are invented, specialist literature is misquoted, passages of text are fabricated, company forms such as Inc., LLC, SE, and GmbH are mixed up, and results are tailored to suit the prompter even when they cannot fit.

In the area of business problems, conclusions are regularly drawn that cannot be implemented in the real economy, and calculations often need improvement. In one real case, for example, the model's company valuation decreased when profitability increased. In another real case, using the entire container space for the storage of a certain raw material caused the value of the goods inside the container to fall instead of rise: according to the model, 40 m³ of copper was worth less than 12 m³.
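
Errors of that kind are cheap to catch with a deterministic sanity check, because the value of a homogeneous commodity must scale linearly with volume. A minimal sketch (the density and price figures are illustrative assumptions, not market data):

```python
# Illustrative figures, not market data: copper density of roughly
# 8.96 tonnes per cubic meter, assumed price of $9,000 per tonne.
COPPER_DENSITY_T_PER_M3 = 8.96
COPPER_PRICE_PER_TONNE = 9_000.0

def commodity_value(volume_m3: float,
                    density: float = COPPER_DENSITY_T_PER_M3,
                    price: float = COPPER_PRICE_PER_TONNE) -> float:
    """Value of a homogeneous commodity: volume x density x unit price."""
    return volume_m3 * density * price
```

Any model output claiming the 40 m³ lot is worth less than the 12 m³ lot fails this check immediately, which is exactly the kind of guardrail a human expert applies instinctively and a text model does not.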

The technology moves forward, but the ultimate responsibility—and liability—remains with the human operator. This dynamic of human-bot collaboration is where real value is created, separating the effective users from those who blindly trust the output.

This reality will cause a brutal market bifurcation. We expect a massive squeeze-out of slow and inefficient adopters. Companies that fail to integrate these tools effectively will be out-competed on speed, cost, and output. Conversely, the best in class—the top marketing agencies, elite software houses, and premier consultancies—are poised for massive gains. They will leverage AI to amplify their existing expertise, delivering higher quality work faster than ever, widening the gap between them and the rest of the market. Earnings and margins will increase for fast adopters, while the laggards will simply be pushed out.

The biggest issue going forward, however, remains truth. Models are checked for political bias, but the results are not encouraging. On controversial topics, and especially on subjects where the public narrative is factually wrong, models continue to echo the flawed consensus. This is how the AI "truth machine" becomes a propaganda machine, amplifying bias at an unprecedented scale. The question is shifting from "Is the AI aligned with humans?" to the far more critical "Which humans is the AI aligned with?".

To counter this, a renewed push for truly open-source models, free of corporate and political guardrails, should regain momentum. It is essential to have a diverse ecosystem of models to ensure we don't end up with a single, sanitized version of reality dictated by a handful of companies. The new ways of utilizing agents and MCP will continue to improve productivity and open up retail use cases, but without a solution to the truth problem, we are building a more efficient future on a foundation of sand.

We are still so early.

Disclaimer:

The content provided in the articles on The Haptic Investor is for informational and entertainment purposes only. The articles do not constitute financial advice, and the information presented should not be considered as a recommendation or endorsement for any investment, financial, or business decisions.

Readers are encouraged to seek professional financial advice and conduct their own research and due diligence before making any financial or investment decisions. The Haptic Investor and its authors do not assume any responsibility for the accuracy, completeness, or timeliness of the information provided.

Any actions or decisions made based on the information found on The Haptic Investor are the sole responsibility of the reader. The Haptic Investor and its authors will not be held liable for any losses or damages resulting from the use of the information provided in the articles.

It is crucial to understand that the financial landscape is dynamic, and what may be true or relevant at the time of publication may change. Readers should consider the information as a starting point for their own research and not as a substitute for professional financial advice or consultation.

By accessing and using the content on The Haptic Investor, readers acknowledge and agree to this disclaimer.

Embark on a journey through the lens of a #lawyer, #privateequity #executive and fifth-generation #familyentrepreneur as #thehapticinvestor takes you deep into the heart of #industry #insights. Drawing on a rich tapestry of the author's experiences in big #law, private equity, #consulting, #assetmanagement and #entrepreneurship, this publication is your entertaining compass in the complex world of #investments and #business.

In short: The Haptic Investor is a #financialmagazine with a focus on #quantitative, #macroeconomic and #operational #economic #analyses of various #markets across different #assetclasses.

 
 