AI workloads are driving up data center power consumption, and the industry is adopting two major approaches to address the challenge.

Arm recently released its AI Readiness Index Report, which points out that AI workloads are extremely energy-intensive, typically relying on data centers that draw large amounts of power for both computing and cooling. AI workloads are estimated to account for roughly 10-20% of total data center electricity use, a level of demand that raises persistent concerns about carbon emissions.

From the current perspective, AI-related energy efficiency faces two major challenges: data center power consumption and carbon footprint. First, the power required to train large AI models is staggering. Training a single large language model, for example, can consume as much energy as hundreds of households use in a year, and the cooling systems themselves draw considerable electricity. Second, AI's carbon footprint, given its substantial environmental impact, has become a major obstacle to widespread adoption. According to International Energy Agency (IEA) data, modern data centers account for about 1% of global greenhouse gas emissions.
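The "hundreds of households" comparison can be checked with a rough back-of-envelope calculation. The figures below are illustrative assumptions, not from the Arm report: a widely cited estimate of ~1,287 MWh to train GPT-3, and roughly 10,600 kWh of electricity per year for an average US household.

```python
# Rough sanity check of the "hundreds of households" claim.
# Both inputs are assumptions for illustration, not from the Arm report:
# - ~1,287 MWh: a widely cited estimate for training GPT-3
# - ~10,600 kWh: approximate annual electricity use of an average US household

TRAINING_ENERGY_MWH = 1_287      # estimated energy to train one large LLM
HOUSEHOLD_ANNUAL_KWH = 10_600    # average US household, per year

households_per_training_run = (TRAINING_ENERGY_MWH * 1_000) / HOUSEHOLD_ANNUAL_KWH
print(f"One training run ≈ {households_per_training_run:.0f} households' annual use")
```

Under these assumptions, a single training run lands in the low hundreds of household-years of electricity, consistent with the claim above.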

To address these power consumption challenges, AI chipmakers and integrators mainly adopt two approaches. The first is energy-efficient silicon: tackling energy efficiency through chip design innovation, such as low-power Arm architectures that minimize energy consumption and improve performance per watt without sacrificing computing power. The second is sustainable data centers: modular facilities powered by renewable energy, or operations that use advanced cooling and virtualization technologies to improve efficiency.

Arm notes that AI data centers are currently transitioning to renewable energy sources such as solar and wind power. This shift not only reduces greenhouse gas emissions but also aligns with global sustainable development goals. Companies like Google and Microsoft have pledged to run their data centers on 100% renewable energy, setting the bar for other infrastructure suppliers in the industry.

In addition, Google recently partnered with Kairos Power to deploy small modular nuclear reactors to power its AI data centers; meanwhile, Microsoft is helping restart a shuttered reactor at Three Mile Island, which is expected to power its operations by 2028.

Arm believes that as AI applications grow increasingly large and complex, the facilities underpinning AI technology must evolve. On key infrastructure challenges such as balancing power and efficiency, processing data at scale, and reducing latency, the industry has made major breakthroughs through hardware and design innovation, and sustainability has become a top priority, to be achieved by reducing the environmental impact of AI operations.
