How the internet democratises access to the future of computing

06 May 2025 | Consulting

James Allen



One of the original goals of the internet was to connect research centres and allow them to share access to a small number of powerful supercomputers. Today, computing power has advanced dramatically – high-performance computers are faster than ever, quantum computers are emerging, and massive data centres dedicated to artificial intelligence (AI) and machine learning (ML) are coming online. The internet remains the essential infrastructure for the world to access services provided using these powerful, centralised computing facilities.

The shift between centralised and local computing

For decades, computing has shifted between centralised and local models. In the early days, large mainframe computers were centralised and shared by multiple users, who accessed them through simple terminals. Then, with the rise of personal computers (PCs) and workstations, more computing tasks could be performed locally. Later, cloud computing brought another shift back to centralisation, with users relying on powerful remote servers instead of their own hardware. Today, even small devices like smartphones and embedded systems can handle complex computations, yet they still connect to centralised computing resources when the task demands it.

A common way to structure this access is through a client–server model, where the client (a local device such as a laptop or phone) sends requests to a remote, more powerful server, which processes the data and sends results back. Despite personal devices becoming more powerful, centralised shared computing facilities are still needed – and access to such facilities through the internet is thus a critical part of our day-to-day life.
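As a minimal sketch of this pattern (hypothetical code, not tied to any particular service), the Python example below stands up a toy ‘server’ locally and has a ‘client’ post it a job over HTTP. In practice the server would sit in a remote data centre and the work would be far heavier than a sum of squares:

```python
# Minimal client-server sketch: a local 'client' offloads a computation
# to a 'server' over HTTP. Hypothetical example using only the Python
# standard library; real systems would use authenticated, production APIs.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request


class ComputeHandler(BaseHTTPRequestHandler):
    """Server side: receive a request, do the heavy work, return the result."""

    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        numbers = json.loads(body)["numbers"]
        # Stand-in for an expensive computation run on powerful hardware.
        payload = json.dumps({"sum_of_squares": sum(n * n for n in numbers)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)


def ask_server(numbers, url="http://localhost:8000"):
    """Client side: send data to the server and read back the computed result."""
    req = request.Request(
        url,
        data=json.dumps({"numbers": numbers}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    server = HTTPServer(("localhost", 8000), ComputeHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    print(ask_server([1, 2, 3]))  # {'sum_of_squares': 14}
    server.shutdown()
```

The same request–response shape underlies real remote computing, whether the server is a web API, a supercomputer’s job scheduler or a cloud inference endpoint; only the transport details and the scale of the computation change.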

Why shared computing facilities matter

The most advanced computing tasks require enormous resources that are uneconomic for individuals or small organisations to maintain on their own. Shared computing facilities are therefore essential, and they serve several distinct needs:

  • AI and machine learning: the latest AI systems, including large language models (LLMs) like ChatGPT, require vast computational power to train. Training these models involves processing enormous datasets and can take weeks or months on specialised hardware. Even using trained AI models (inference) often requires significant computing power, which, while sometimes available on local devices, still typically relies on centralised servers (see the sketch after this list).
  • Scientific simulations and forecasting: supercomputers are crucial for tasks such as climate modelling, weather prediction and aircraft design; leading-edge computations of this kind are only feasible on specialised high-performance computing (HPC) systems.
  • Quantum computing: while still in its early stages, quantum computing has the potential to revolutionise fields like cryptography and materials science. Because quantum computers require highly controlled environments, and because the greatest advantages over ‘classical’ computers will be derived from the most capable quantum computers, they are likely to remain centralised resources that researchers and businesses access remotely.
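
To make the inference point concrete, the sketch below shows what calling a remote AI model typically looks like from the client side. The endpoint URL, token and response field are hypothetical placeholders rather than any specific provider’s API; the essential point is that the prompt travels over the internet while the heavy computation runs in a centralised data centre:

```python
# Hypothetical remote-inference call: the local device sends a prompt to a
# centralised AI service and receives generated text in return. The endpoint,
# credential and field names are illustrative, not a real provider's API.
import json
from urllib import request

API_URL = "https://api.example-ai-provider.com/v1/generate"  # hypothetical endpoint
API_TOKEN = "YOUR_API_TOKEN"  # hypothetical credential


def generate(prompt: str) -> str:
    """Send the prompt over the internet; the model runs in a remote data centre."""
    req = request.Request(
        API_URL,
        data=json.dumps({"prompt": prompt, "max_tokens": 100}).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["text"]  # hypothetical response field


if __name__ == "__main__":
    print(generate("Explain why supercomputers are shared resources."))
```

A local alternative is to run a small model on the device itself, but for the largest models the hardware requirements and economics keep inference centralised, reachable only over the internet.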

The role of the internet

From its inception, the internet has enabled remote access to computing power. Today, it remains the backbone that connects users to high-cost computing facilities worldwide. Large-scale AI training facilities can cost billions of dollars to build, while regional HPC centres often require investments in the tens or hundreds of millions. Given these costs, only large enterprises and well-funded research institutions can afford to own such facilities. However, through the internet, businesses, researchers and even start-ups can access these resources on a pay-per-use basis via cloud computing services like Microsoft Azure, Google Cloud and Amazon Web Services.

The location of these facilities is also influenced by infrastructure considerations. AI and HPC data centres are often built in areas with access to renewable energy, water for cooling and fibre-optic networks for fast data transmission. Quantum computing introduces additional challenges, as future quantum networks may need specialised connections that can maintain the integrity of quantum information (qubits) over long distances.

Beyond making raw computing power available remotely, the internet also plays a vital role in enabling the collection, storage and sharing of data. AI models, for example, are trained on vast amounts of publicly available and proprietary data, including books, scientific papers, websites and sensor readings. The ability to gather and process this data at scale would not be possible without the global connectivity the internet provides.

The internet has always been essential for accessing computing resources, from early supercomputers to today’s advanced AI and quantum computing facilities. As computational demands continue to grow, the ability to remotely access high-performance computing will remain critical for scientific discovery, business innovation and technological progress. Whether for training AI models, running climate simulations or advancing quantum research, the internet continues to democratise access to the world’s most powerful computing tools, making them available to a broader audience than ever before.


Author

James Allen

Partner, expert in regulation and policy