Arc Brief ARCBRIEF.NEWS · APAC TECHNOLOGY INTELLIGENCE · MARCH 2026
EDITORIAL · ISSUE 001 · DATA CENTERS & INFRASTRUCTURE

What Is a Data Center?

The buildings that run the world — and why Asia Pacific is waking up to them

You walk into a data center and the first thing you notice is the sound. Or rather, the absence of what you expected.

No human voices. No keyboards. No footsteps. Just a wall of white noise — the combined hum of ten thousand cooling fans moving air at a steady 18 degrees Celsius, around the clock, every day of the year.

The second thing you notice is the cold. Not uncomfortable cold. Clinical cold. The kind that reminds you this space was not designed for people.

The third thing — if you are allowed into the server hall itself, which most people never are — is the scale. Row after row of black metal racks, each two metres tall, stretching back until they blur. Every rack holding perhaps forty servers. Every server handling thousands of requests simultaneously. The building you are standing in might consume more electricity than the suburb surrounding it.

This is a data center. And there are thousands of them.

[Diagram annotations — power zone: 2 × 2 MW generators, N+1 UPS, 11 kV → 415 V transformer, 2N power distribution units. Server hall ("white space"): hot aisles 40 °C+, cold aisles 18–21 °C, three rows of 8 racks (960+ servers shown), overhead fibre and power trays, 600 mm raised floor with cold air beneath. Cooling zone: 8 CRAC units, chiller plant at 7 °C supply, outdoor evaporative cooling towers, WUE 1–2 L per kWh. Network operations: 24/7 NOC, core switches, meet-me room, BGP peering. Headline figures: 10–100 MW power, PUE 1.2–1.6, 99.999% uptime SLA, 18–27 °C.]

FIG. 1 — CROSS-SECTION: HYPERSCALE DATA CENTER · Power zone (left) · Server hall (centre) · Cooling zone (right)

What It Actually Is

Strip away the mystique and a data center is three things in a building: computers, power, and cooling.

The computers — called servers — are simply powerful machines stripped of everything unnecessary. No screen. No keyboard. No case to make them look attractive. Just processors, memory, and storage, mounted on sliding rails so they can be pulled out and replaced in seconds.

The power is staggering. A single modern server rack draws around 10 kilowatts — roughly what ten suburban homes consume. A facility with five hundred racks draws five megawatts. The largest hyperscale campuses — AWS, Google, Microsoft — draw one hundred megawatts or more. These are not buildings that plug into the grid. They are facilities that negotiate directly with power utilities, sometimes building their own generation capacity.

The cooling exists because electronics generate heat, and heat destroys electronics. Every watt of computing power produces a watt of heat. Remove that heat or your servers fail. The engineering challenge of a data center is, at its core, a physics problem: how do you move heat out of a box as fast as you put electricity in?

The answer is cold aisles and hot aisles. Servers face alternating rows — cold air pushed in from one side, pulled hot from the other. Air conditioning units the size of shipping containers line the walls. Chillers on the roof reject heat to the atmosphere. In the most advanced facilities, liquid cooling runs directly to the chip, removing heat a thousand times more efficiently than air.
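The physics problem above can be put in numbers. A minimal sketch, using standard air properties and an assumed hot/cold aisle temperature split (the figures are illustrative, not from the article):

```python
# Back-of-envelope cooling maths: heat carried by air is P = m_dot * c_p * dT,
# where m_dot is mass flow (kg/s), c_p the specific heat of air, dT the
# temperature rise between cold intake and hot exhaust.

RHO_AIR = 1.2    # kg/m^3, air density near 20 degrees C
CP_AIR = 1005.0  # J/(kg*K), specific heat of air

def airflow_for_heat(power_watts: float, delta_t: float) -> float:
    """Volumetric airflow (m^3/s) needed to remove power_watts of heat
    with a cold-to-hot aisle temperature rise of delta_t kelvin."""
    mass_flow = power_watts / (CP_AIR * delta_t)  # kg/s of air
    return mass_flow / RHO_AIR                    # convert to m^3/s

rack_kw = 10.0  # one modern rack, per the article
flow = airflow_for_heat(rack_kw * 1000, delta_t=20.0)  # 18 C in, ~38 C out
print(f"{flow:.2f} m^3/s")  # ~0.41 m^3/s of air, per rack, continuously
```

Scale that across five hundred racks and the fans must move hundreds of cubic metres of air every second, which is why liquid cooling — water carries heat far more densely than air — is taking over at the high end.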

114 GW · GLOBAL INSTALLED CAPACITY · 2025 · UP 5× SINCE 2005
$3T · PROJECTED SECTOR INVESTMENT BY 2030
14% · GLOBAL CAGR THROUGH 2030 · DRIVEN BY AI
33% · OF 2025 CAPACITY DEDICATED TO AI WORKLOADS

Where It Came From

The concept is not new. Banks and governments have kept rooms full of mainframe computers since the 1960s. But the modern data center as we know it — always-on, internet-connected, scale-at-will — was born from a single moment.

In March 2006, Amazon Web Services launched S3, a storage service that let any developer rent space on Amazon's computers, paying only for the gigabytes stored and transferred. For the first time, a company did not need to buy its own servers. It could borrow Amazon's. The implications were quiet at first, then enormous.

Within a decade, every major technology company was racing to build facilities large enough to rent at scale. Google. Microsoft. Meta. The facilities grew from single buildings to campuses. From campuses to entire industrial zones. From industrial zones to the dominant form of infrastructure investment on the planet.

"By 2025, global installed data center capacity had reached 114 gigawatts — up fivefold from 2005. The industry was spending more on new construction than most countries spend on roads."

Where They Are — And Why

The map tells a story of geography, power, and regulation — in that order.

[Map annotations — why the US dominates: 33% of global capacity in N. Virginia; $425B US data center spend, 51% of hyperscale. APAC: 20,320 MW in 2025, 13.1% CAGR → 37,580 MW by 2030; Johor fastest growing; Singapore PUE <1.3. Global total, 2025: 114 GW installed.]

FIG. 2 — GLOBAL DATA CENTER CAPACITY MAP · 2025 DATA · DOT SIZE = INSTALLED MW · HOVER FOR DETAIL · Sources: CBRE, Knight Frank, JLL, IEA

Northern Virginia in the United States holds roughly thirty-three percent of global data center capacity. This is not an accident. In the early 1990s, a company called MCI laid the backbone of the commercial internet through the region. Every major technology company building online services in the 1990s connected to that backbone. The concentration compounded. Today, an estimated seventy percent of global internet traffic flows through a corridor of buildings in Loudoun County, Virginia.

Europe's cluster — London, Amsterdam, Frankfurt, Dublin — reflects a different logic. These cities became hubs because they were financial centres first, technology hubs second, and because European data protection law created a regulatory incentive to keep European data in Europe. Dublin, population 1.4 million, hosts 1,471 megawatts of live capacity — more than Singapore — because Amazon, Google, Meta, and Microsoft chose it as their European headquarters for tax and regulatory reasons.

Asia Pacific tells the most interesting story. Tokyo at 1,489 megawatts is the region's most mature market. Sydney at 1,123 megawatts is growing fast as Australian enterprises and government agencies modernise. Singapore at 1,091 megawatts is the most connected node in Southeast Asia — but growth there was deliberately constrained by the government's 2019 power moratorium, which paused new data center licences amid concerns about electricity consumption on an island that imports most of its energy.

That constraint created Johor. When Singapore said no more, Malaysia's southernmost state — separated from the city-state by a causeway — said yes. Investment flooded in. In 2023 and 2024 alone, fifteen billion dollars was committed to the Johor corridor. A single project, YTL's Green Data Center Park, is being powered by a 500-megawatt solar farm with Nvidia's infrastructure built in from the start. Johor went from an afterthought to one of the most consequential digital infrastructure corridors in the world in approximately three years.

What Still Needs to Be Said

The map and the cross-section tell you what a data center is and where they sit. They do not tell you what they cost communities, how much power they draw from local grids, or how much water evaporates from cooling towers in water-stressed regions. They do not tell you about the land — farmland, in many cases — that disappears under concrete foundations. These are conversations worth having, and we will have them.

They also do not tell you what runs inside. The servers in these buildings are no longer just storing files or serving websites. Increasingly they run the training workloads for large AI models — clusters of tens of thousands of GPUs, each cluster drawing forty to fifty megawatts, sustained over weeks. The character of demand is shifting. A facility built in 2018 for cloud storage is not the same animal as one built in 2026 for AI inference. The engineering, the power requirements, the cooling systems, the economics — all different. This shift, and what it means for the APAC region specifically, is a story we will return to.
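The cluster arithmetic behind those megawatt figures is simple to sketch. Every number below — GPU count, per-board wattage, overhead multipliers — is an illustrative assumption, not a sourced specification:

```python
# Hedged sketch: why a GPU training cluster lands in the tens of megawatts.

gpus = 30_000            # "tens of thousands of GPUs" (assumed count)
watts_per_gpu = 700      # accelerator board power (assumed)
server_overhead = 1.6    # CPUs, memory, fans, networking per GPU (assumed multiplier)
pue = 1.3                # facility overhead: cooling, power conversion (assumed)

it_load_mw = gpus * watts_per_gpu * server_overhead / 1e6
facility_mw = it_load_mw * pue
print(f"IT load: {it_load_mw:.1f} MW, facility draw: {facility_mw:.1f} MW")
```

Under these assumptions the facility draws roughly 44 MW — squarely in the forty-to-fifty megawatt band, and sustained continuously for the weeks a training run takes.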

And there is the question of who decides where these buildings go — not in investment memos, but in communities. That gap deserves more scrutiny than it gets.

Why APAC Cannot Ignore This Anymore

For decades, data centers were an American conversation. The infrastructure was built there, the companies that needed it were headquartered there, and the regulatory questions were posed there. Asia Pacific watched from a distance.

That distance is closing — and not only because hyperscalers are building here.

Across the region, governments are passing laws that require data about their citizens to stay within their borders. Two forces drive this: sovereignty and speed.

COUNTRY · FRAMEWORK · LOCALISATION POSITION · LEVEL
🇨🇳 China · PIPL + Cybersecurity Law 2017 · Mandatory for critical data categories. Security assessments required for cross-border transfers. · STRICT
🇮🇳 India · DPDP Act 2023 · Government authority to restrict transfers. Significant Data Fiduciaries may face localisation. Implementation advancing 2025. · STRICT
🇮🇩 Indonesia · PDP Law 2022 · Localisation required for public electronic system operators. Transfers require equivalent protection. · MODERATE
🇻🇳 Vietnam · Cybersecurity Law 2018 · Telecom, e-commerce, and cloud storage must store user data in-country. Regulator can inspect transfer applications. · STRICT
🇦🇺 Australia · Privacy Act (APP 8) · Exporters remain liable for misuse even after data leaves the country. Tightening enforcement in 2025. · MODERATE
🇸🇬 Singapore · PDPA + AI Governance · Requires comparable protection or binding contracts before export. Global CBPR participant. Advanced AI frameworks. · MODERATE
🇲🇾 Malaysia · PDPA Amendment 2024 · Adequacy-based model effective April 2025. Stricter in financial services, telecom, health. · EMERGING
🇯🇵 Japan · APPI (amended) · Adequacy findings or standard contractual clauses. Closely aligned with GDPR. · MODERATE

The implication is direct: if your data must stay in the country, there must be a building in that country capable of storing it. Data sovereignty laws are, among other things, a mandate for local data center investment. The regulatory map and the infrastructure map are becoming the same map.

There is a simpler reason too. When data is stored closer to you, services are faster. The gap between a server in Singapore and a server in Virginia is roughly 170 milliseconds of round-trip time. That is imperceptible for email. It matters for real-time financial transactions, AI assistants, and any application that needs to respond within a human interaction window.
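That 170-millisecond figure is mostly physics. A minimal sketch, assuming light in fibre travels at roughly two-thirds of its vacuum speed and an illustrative cable-path distance (real routes are longer than the great-circle line):

```python
# Why Singapore <-> Virginia round trips take ~170 ms: signal speed in glass
# is ~200,000 km/s, and a request must travel the route twice.

C_FIBRE_KM_S = 200_000  # km/s, roughly 0.67c in optical fibre
route_km = 16_000       # assumed cable-path distance (illustrative)

one_way_ms = route_km / C_FIBRE_KM_S * 1000
rtt_ms = 2 * one_way_ms  # the physics floor, before router and queueing delays
print(f"{rtt_ms:.0f} ms")  # ~160 ms; measured latency lands near 170+ ms
```

No amount of engineering removes that floor. The only fix is moving the server closer — which is to say, building a data center in the region.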

And then there is AI. Every model that generates text, images, or decisions runs somewhere physical. The race to build AI capability is, at its foundation, a race to build and control the infrastructure that AI runs on. Data centers are not the background story of the AI era. They are the story. We have simply not been treating them that way.


Data centers moved from the edge of our awareness to the centre of our infrastructure without most of us noticing. They power every search, every stream, every transaction, every AI response. The buildings are getting larger, the investment is accelerating, and the decisions about where they go and who controls them will shape digital life in Asia Pacific for decades.

We are going to cover this closely — not as a technology story, but as an infrastructure, policy, and community story. Because that is what it is.

Arc Brief tracks AI, technology, and digital infrastructure across Asia Pacific, with focus on the markets, decisions, and communities the mainstream technology press overlooks. This is the first in a series on the infrastructure running the region's digital economy. The next piece examines what actually happens in the 170 milliseconds between you pressing send and a message arriving on the other side of the world.