By analyzing the outputs of AI algorithms and applications reported in related literature and projects, the AI Carbon Efficiency Observatory (AI-CEO) project aims to provide comprehensive evaluations of various AI models and applications, assessing whether AI-driven empowerment is efficient from a carbon-efficiency perspective. Project AI-CEO seeks to promote the sustainable, green development of AI and of AI empowerment.
1. The carbon emissions of all models are estimated with the ML CO2 Impact calculator (https://mlco2.github.io/impact/), assuming by default that training runs on the Google Cloud platform in the us-central1 region. A minimal sketch of this estimate appears after these notes.
2. Wudao comprises eight models according to its official website: GLM, CPM, Transformer-XL, Lawformer, EVA, CogView, BriVL, and ProteinLM. Among these, Transformer-XL, EVA, and ProteinLM do not provide training configurations and are not calculated here. GLM, CPM1, and BriVL report their training configurations in their papers and can therefore be calculated directly. We assume that the training configuration of CPM2 is the same as that of CPM1. The Lawformer paper reports a configuration of 8 V100 GPUs but does not give the training time; we assume 20 days. CogView reports a configuration of 512 V100 GPUs for training but likewise does not give the training time; we assume 1 day. A worked example using these assumed configurations follows the sketch below.
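Below is a minimal sketch of the kind of estimate the ML CO2 Impact calculator performs: energy drawn by the GPUs over the training time, scaled by a data-center efficiency factor and the regional grid carbon intensity. The per-GPU power draw, PUE, and us-central1 carbon intensity used here are illustrative assumptions, not the calculator's internal values; the figures reported by AI-CEO come from the calculator itself.

```python
def estimate_co2_kg(gpu_count: int, hours: float,
                    gpu_power_kw: float = 0.3,          # assumed ~300 W per V100
                    pue: float = 1.1,                    # assumed data-center PUE
                    carbon_intensity: float = 0.47       # assumed kg CO2eq/kWh for us-central1
                    ) -> float:
    """Estimate training emissions in kg CO2eq: GPU energy (kWh) x grid carbon intensity."""
    energy_kwh = gpu_count * hours * gpu_power_kw * pue
    return energy_kwh * carbon_intensity
```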
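Applying the sketch above to the two assumed Wudao configurations (8 V100s for an assumed 20 days for Lawformer; 512 V100s for an assumed 1 day for CogView) illustrates how the assumptions translate into GPU-hours and emissions; the resulting numbers are illustrative only and do not replace the calculator's output.

```python
# Hypothetical usage of the sketch above with the assumed training durations.
lawformer_kg = estimate_co2_kg(gpu_count=8, hours=20 * 24)    # 8 V100s, assumed 20 days
cogview_kg = estimate_co2_kg(gpu_count=512, hours=1 * 24)     # 512 V100s, assumed 1 day
print(f"Lawformer ~ {lawformer_kg:.0f} kg CO2eq, CogView ~ {cogview_kg:.0f} kg CO2eq")
```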