D2D: Data Center Math, the Hard Way
In which we try to come to terms with the complexities of going to market with data center hardware and the madness-inducing task of trying to quantify the size of that market.
Highlights from the past two weeks include the multiple ways in which we are exploring the new AI data center landscape - from building servers to counting servers to assessing Nvidia’s options. We also began an exploration of what comes next for the VC industry, because it is clearly about to change, even if the shape of that change is not yet obvious.
Just a reminder that we operate our newsletters on a paid model. Paid subscribers will get three newsletters a month including our China Deep Tech notes. Paid subscribers will also get early access to the newsletter as well as other benefits coming soon. Please subscribe and support our work.
Highlights from our Blog
The Venture Capital industry is changing. Like a caterpillar wrapped in a cocoon, our sense is that a metamorphosis is coming. Over the past decade much of the industry has become stuck in the rut of an overly quantitative approach, counting CAC, LTV and all the other SaaS metrics. That model is likely gone forever, but it is not clear what comes next. The biggest funds will continue to grow, but are transforming into “asset managers”, while smaller firms are finding it impossible to raise LP capital. Eventually this will all turn, but the new model will likely be very different.
Like everyone else, we are spending a lot of time digging into AI data centers. First, we explore all the ways that servers get built: every buyer wants a custom server, but no one will pay for an infinite number of designs. The design process is a major barrier to entry into this market for new silicon vendors, and for some incumbents as well. Next, we tried to calculate how big the “AI server” market is, and found that it is just a journey into madness. No one seems to have a clear idea of just how many AI servers will ship this year or next, or shipped last year for that matter. Finally, we tried to put a framework around how big “AI” can grow. There is “AI as Magic”, or AGI, which does not seem imminent. Then there is “AI as a feature”, using AI to do things we already do a little better, which is where almost all the gains from AI have accrued so far. In between, there is “AI as an application” or even a platform; this is the ground that everyone is fighting over so fiercely, but the lack of clear consumer use cases remains significant.
Nvidia reported another strong quarter. This has become so routine that we sense people are starting to take it for granted. In their latest quarter, they added $16 billion in revenue compared to last year, but only added $800 million in opex. They now account for 80% of merchant data center processor spending. This raises the question of how much power Nvidia has in the industry right now and how it will exert that power. As much as they sometimes say they “just” want to sell semiconductors, their growing reach in networking and software poses a serious threat to the hyperscalers.
If you like this content, you should check out our podcast The Circuit
Semis, Hardware and Deep Tech
We meant to link to this new Substack called Construction Physics, which is awesome at detailing how things get built. They recently did a post on “How to Build a $20 Billion Semiconductor Fab”. Worth paying for a subscription to get the reading list behind the post.
Magnetic tape shipments reached a record high last year (as measured by capacity). Most of us assume that this stopped being a viable product category decades ago, but it turns out to be a great, low-cost, ultra-long-term storage solution.
Timothy Prickett Morgan of The Next Platform wins the headline contest again: “Intel Brings a Fork to a Server CPU Knife Fight”.
This is a good primer on optical sensors; it is a couple of years old, but will still come in handy.
Three old men get dinner at a night market stand - except the three men are the founders of Quanta, Nvidia and TSMC. There is another, more infamous photo of Huang at Computex making the rounds on the Internet, but we cannot publish it here, and we think this one is much more interesting anyway.
Networking and Wireless
HPE Aruba launched its private networking suite. On paper, this looks like a fairly interesting solution, but the heart of private networking is the software and services layer. HPE has a mixed track record on this front, but the launch is still a good sign for this slowly emerging sector.
Among the many press releases coming out of Computex, the one that intrigued us the most came from Arm. In addition to the standard upgrades to existing product lines (more cores!), they also extended their CSS program beyond the data center to include chips for mobile and PCs. Arm needs to do a better job of explaining CSS, and we will likely write more on it soon. The basic idea is that chip designers can go from concept to working silicon much faster if they use CSS, where Arm has done a lot of the product work already. This lowers the bar for custom chips in phones and definitely hints at more Arm-based PCs soon.
Google is leading a new fiber link that runs along Eastern Africa and then crosses the Indian Ocean to connect with Australia. This is the first link to cross the southern portion of the Indian Ocean. It is also important that it links a half dozen African countries, rather than being just a hub-and-spoke project for the benefit of outsiders.
Research firm Quilty Space released a financial analysis of Starlink. The article focuses on the astonishing revenue growth of the company, but also says it is massively profitable. We are very curious about that profitability: operating expenses for satellite networks can be pretty steep, and we would love to know how Starlink manages those.
Software and the Cloud
Google published a report on the productivity gains they are seeing from the use of AI-based coding tools. The results are prosaic but still meaningful. They saw growing use of these tools among their coders, and a slow but steady improvement in productivity. But they also note that their developers are spending more time reviewing that AI-generated code. AI is not magic, but it is still useful.
We do not do much work on security, but we think very highly of the team at Packet Pushers. They attended the recent RSA security conference and recommend following these four companies, so we will be watching.
Diversions
Using LLMs to generate financial trading models. This paper, from no less than the University of Chicago Booth School, demonstrates that LLMs can build useful predictive models. Algorithmic trading is not new, but in the past new strategies quickly permeated the market and diluted their own effect (efficient markets and all that). So it is intriguing to think about how LLMs, which can ‘learn’ (adapt), will respond to other LLMs making use of the same models. If it has not already begun, an AI arms race is coming to Wall Street soon.
Image by Microsoft Copilot