Information and Communications Technology and Policy

Information and Communications Technology and Policy, 2025, Vol. 51, Issue 6: 44-51. DOI: 10.12267/j.issn.2096-5931.2025.06.008

Low-carbon AI: research on green training and inference optimization methods for large models

GE Jian1, NIU Xiaoyan2, BI Ran2, HUANG Yongtao2   

  1. Institute of Technology and Standards, China Academy of Information and Communications Technology, Beijing 100191, China
  2. Institute of Security Research, China Academy of Information and Communications Technology, Beijing 100191, China
  Received: 2025-04-22; Online: 2025-06-25; Published: 2025-07-04

Abstract:

In recent years, the scale of artificial intelligence (AI) models has grown continuously, driving up the energy consumed by training and inference and spurring research on low-carbon AI. This paper systematically reviews key technologies for green computing with large models, focusing on low-carbon training and inference optimization methods. In the training phase, existing studies reduce energy consumption through model architecture optimization, computational precision adjustment, and resource scheduling strategies, including neural architecture search, mixed-precision computing, and distributed training. In the inference phase, optimization centers on techniques such as model pruning, quantization, edge computing, and cache reuse to cut computational cost and carbon emissions. The paper also summarizes the challenges facing low-carbon AI, including hardware energy-efficiency bottlenecks, uncertainty in carbon-footprint quantification methods, and the limited use of green energy in data centers. Finally, it explores future development trends.
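The mixed-precision computing idea mentioned for the training phase can be sketched numerically: run the forward and backward arithmetic in float16 while keeping a float32 "master" copy of the weights, and scale the loss so small float16 gradients do not underflow. The toy least-squares problem, learning rate, and loss-scale value below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def fp16(x):
    """Cast to half precision, simulating low-precision compute."""
    return np.float16(x)

# Fit w to minimize (w*x - y)^2; the fp32 master weight accumulates updates.
master_w = np.float32(0.0)
x, y, lr = 2.0, 6.0, 0.05
loss_scale = 1024.0  # keeps scaled fp16 gradients above underflow range

for _ in range(100):
    w16 = fp16(master_w)                 # cast master weight down for compute
    pred = fp16(w16 * fp16(x))           # forward pass in float16
    # Backward pass in float16, with the gradient scaled up by loss_scale
    grad_scaled = fp16(2 * (pred - fp16(y)) * fp16(x) * fp16(loss_scale))
    grad = np.float32(grad_scaled) / loss_scale  # unscale in float32
    master_w -= np.float32(lr) * grad    # update the fp32 master copy

# master_w converges toward the exact solution y / x = 3.0
```

Keeping the master weights in float32 is what lets the run converge despite float16's ~3 decimal digits of precision; frameworks such as PyTorch automate this same pattern with autocast and gradient scaling.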
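On the inference side, the quantization technique the abstract lists can likewise be illustrated with a minimal sketch of symmetric post-training int8 quantization: a float32 weight tensor is mapped to int8 with a single per-tensor scale factor, cutting memory traffic by 4x at the cost of bounded rounding error. The function names and example tensor are hypothetical.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor quantization of a float32 array to int8."""
    scale = np.max(np.abs(w)) / 127.0          # one step of the int8 grid
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Reconstruct approximate float32 values from int8 codes."""
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.27, 0.02, 1.0], dtype=np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
err = np.max(np.abs(w - w_hat))  # rounding error is at most scale / 2
```

Real deployments add refinements (per-channel scales, activation calibration, quantization-aware training), but the scale-round-clip core shown here is the same.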

Key words: large language model, AI, low carbon, green computing