Data Center Firms 'Getting Creative' To Squeeze AI Into Older Buildings
The high-powered computing equipment needed for artificial intelligence has forced fundamental changes in how new data centers are designed. But it’s creating headaches for operators of older colocation data centers, who are being asked to shoehorn AI systems in alongside legacy information technology equipment in facilities that weren’t built for the task.

Nearly all new data centers under construction today are being built from the ground up to support AI infrastructure, with their designs optimized for the denser concentrations of power, advanced cooling systems and heavier weight that comes with high-performance computing. But the vast majority of data centers already in operation weren't designed with AI in mind.
As major tenants increasingly look to deploy AI workloads in the colocation data centers where they already run their older IT infrastructure, the operators of those facilities have had to scramble for ways to integrate AI processing on top of underlying infrastructure ill-suited to support it.
At Bisnow’s National DICE Data Center Management, Operations and Cooling event, held last month at The Hyatt Regency Reston in Virginia, colocation providers and enterprise tenants said wedging AI into legacy facilities presents an enormous challenge. With full-scale retrofits rarely a viable option, firms are looking for creative solutions so they don't miss out on the AI opportunity, even as the rapid evolution of AI hardware and demand makes predicting future tenant needs nearly impossible.
“Sometimes I feel like the team and I are in Disney World, and we're the Imagineers trying to develop the next big fun thing with an eye far out into the future,” said Julie Coates, vice president for life cycle management at 1547 Critical Systems Realty, which operates a portfolio of connectivity-focused data centers, many of them converted from other uses. “Our buildings are over 100 years old — they are very legacy and not even purpose-built, so they have all of these weird quirks that you have to solve for to be able to bring in these larger groups.”
The need to redesign data center infrastructure for AI stems largely from racks of AI processors using far more power per square foot than traditional data center servers. While the average rack of data center servers globally consumes 12 kilowatts, racks built around Nvidia’s newest Blackwell processors use 130 kilowatts, according to JLL. These rack densities are expected to continue skyrocketing over the next 24 months.
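The scale of that jump is easy to put in back-of-envelope terms. The sketch below uses the JLL figures cited above; the rack footprint is a hypothetical assumption for illustration only, not a number from JLL or this article:

```python
# Back-of-envelope comparison of traditional vs. AI rack power density.
# The 12 kW and 130 kW figures are from JLL, as cited in the article.
# The rack footprint is an assumed, illustrative value (hypothetical).

AVG_RACK_KW = 12.0          # global average server rack (JLL)
AI_RACK_KW = 130.0          # Nvidia Blackwell-class rack (JLL)
RACK_FOOTPRINT_SQFT = 10.0  # assumed footprint incl. clearance (hypothetical)

density_ratio = AI_RACK_KW / AVG_RACK_KW
avg_w_per_sqft = AVG_RACK_KW * 1000 / RACK_FOOTPRINT_SQFT
ai_w_per_sqft = AI_RACK_KW * 1000 / RACK_FOOTPRINT_SQFT

print(f"Density ratio: {density_ratio:.1f}x")           # ~10.8x
print(f"Average rack:  {avg_w_per_sqft:,.0f} W/sq ft")  # 1,200 W/sq ft
print(f"AI rack:       {ai_w_per_sqft:,.0f} W/sq ft")   # 13,000 W/sq ft
```

On the assumed footprint, an AI rack draws roughly an order of magnitude more power, and sheds an order of magnitude more heat, per square foot than the gear most existing facilities were designed around.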
For data centers, the infrastructure changes needed to support high-density AI computing go beyond installing electrical systems capable of such concentrated power delivery. Higher-density deployments produce far more heat than the servers most data centers were designed around, and the most powerful AI systems often require fundamentally different cooling systems that use liquid coolant instead of circulated air. Additionally, AI racks and liquid cooling are both heavier than older systems, potentially exceeding the structural load limits of many facilities.
For these reasons, hyperscale tech giants like Amazon Web Services, Microsoft and Meta are building and leasing new data centers designed specifically for AI. But for other companies looking to deploy their own AI computing, this kind of greenfield development for AI is rare. The vast majority of the time, even the largest corporate enterprise tenants and the colocation data center firms hosting them are finding ways to wedge AI alongside traditional IT gear in the same data halls.
“Whenever people talk about AI, they think that everybody builds greenfield data centers that are end-to-end AI, but it's never like that. You can have a 480 kW rack … and next to it I might have a storage array that’s 4 kW,” said Dagi Berhane, senior director for data center architecture and engineering at Salesforce. “When you're a software service company like us and PayPal and Uber, it's not our bread and butter to do brick-and-mortar builds.”
But incorporating AI computing next to older servers in a data center not purpose-built for the task has presented enormous challenges — and made deploying new capacity significantly more complicated — for data center firms and tenants. For providers, data center space is suddenly far less one-size-fits-all than it was just a couple of years ago, with a growing need to customize supporting infrastructure in sections of legacy facilities for specific AI equipment and use cases.
Where possible, this can mean installing electrical systems capable of handling higher densities and plumbing infrastructure for liquid cooling. But such partial retrofits aren't always viable due to building design, cost and risk to existing workloads, so firms are getting creative.

Some providers are turning to computational fluid dynamics models of how air and heat flow through their data halls, then redesigning air handling systems and rearranging where certain IT equipment sits to optimize the removal of heat from high-density racks.
Other providers are taking similar approaches to overcome the structural limitations of their buildings, Coates said. She pointed to her firm’s efforts to accommodate a megawatt-scale deployment from a cloud provider at a data center housed in a hundred-year-old building, in which neither the ceiling heights nor the structural limits of the floor plates were appropriate for the tenant’s equipment.
“We are getting very creative with what we can do,” Coates said. “We can't do everything that they're asking for, so we’re looking at whether we can use multiple floors because we're restricted on our height limitations. We're restricted on our structural … so where can we spread certain things out?”
The rapid evolution of the AI landscape has made adapting legacy facilities even more complicated. AI computing hardware is advancing at a breakneck pace, with firms like Nvidia releasing more advanced chips, each with more complex infrastructure requirements, on an annual cadence.
While the technology is changing, there is also little clarity on the future of corporate AI adoption and the specific AI use cases that companies will invest in. Uncertainty around tenants’ AI use cases translates into uncertainty about how AI computing will fit into existing IT ecosystems in the months ahead.
Compu Dynamics CEO Steve Altizer said the industry should be embracing the unpredictable AI ecosystem and learning curve that lie ahead, even as firms are already scrambling to incorporate these new technologies into their facilities.
“It's a fabric of mechanical and electrical and network utilities all woven together into a very complex system that nobody really knows how to do yet,” Altizer said. “I think we're all trying to figure it out on the fly, and we're all going to learn a lot over the next 24 months.”