May 8, 2024


How many nuclear power plants does Meta need to power its AI? The company's engineer has the answer

Sergey Edunov is clear about the number of nuclear power plants needed to power AI in 2024, and if Meta's Director of Engineering for Generative AI says so, it is worth listening

The number of nuclear power plants needed to support the development of artificial intelligence in 2024 is lower than expected

Artificial intelligence (AI) is one of the most innovative and in-demand technologies of the 21st century, but it is also one of the most energy-hungry. How can the growing demand for electricity to power AI applications, especially generative AI, be met without compromising the environment and sustainability? According to Meta's Director of Engineering for Generative AI, Sergey Edunov, the answer could be nuclear energy.

Edunov is one of the engineers responsible for training Llama 2, Meta's open language model. In a panel discussion at the Digital Workers Forum in Silicon Valley, he said that only two new nuclear power plants would be needed to meet the demand for AI for an entire year, and that this is an acceptable amount.

This is the number of nuclear power plants needed to power AI for an entire year

In response to a question about whether the world has enough capacity to handle the growing energy needs of artificial intelligence, he said: "We can definitely solve this problem." Edunov explained that his answer was based only on rough calculations he had made beforehand.

However, he said they give a good estimate of how much energy is needed for so-called AI "inference". Inference is the process by which a trained model is used in an application to answer a question or make a recommendation.


Inference is distinct from training an AI model, the phase in which the model learns from massive amounts of data so that it is then ready to make inferences.

Edunov explained how he kept the calculation simple on the inference side. He said that Nvidia, the leading supplier of AI processors, appears set to ship between one million and two million H100 GPUs next year.

If all of those GPUs were used to generate tokens for reasonably sized language models, he said, they would produce about 100,000 tokens per day for every person on the planet, which he admitted is a fair number of tokens. Tokens are the basic units of text that language models use to process and generate language.

They can be words, parts of words, or even individual characters, depending on how the tokenizer is designed. For example, the word "hello" might be a single token, or it might be split into smaller pieces such as "hel" and "lo".
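To make the idea of tokens concrete, here is a minimal sketch using the open-source tiktoken tokenizer. This is purely illustrative: it is not the tokenizer Meta uses, and how a given word splits depends entirely on the tokenizer's vocabulary.

```python
# Minimal sketch: how a language-model tokenizer splits text into tokens.
# Requires the open-source `tiktoken` library (pip install tiktoken).
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

for text in ["hello", "hello world", "unbelievable"]:
    token_ids = enc.encode(text)                 # integer IDs for each token
    pieces = [enc.decode([t]) for t in token_ids]  # the text piece behind each ID
    print(f"{text!r} -> {len(token_ids)} token(s): {pieces}")
```

In general, common words map to a single token while longer or rarer words are split into several pieces, which is why token counts rather than word counts are what drive inference cost.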

Edunov then estimated that generating each token requires about 10 microjoules of energy, which works out to a total of roughly 10^15 microjoules per day for all of humanity. Dividing that figure by the number of seconds in a day, he arrived at an average power of about 10 GW.

Comparing that figure with the output of a typical nuclear power plant, about 5 GW, he concluded that only two nuclear power plants would be needed to power next year's AI inference. Naturally, these are rough estimates, so further work would be needed to verify them.
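As a rough sketch of that back-of-the-envelope chain, the snippet below starts from the figures quoted in the talk (about 100,000 tokens per person per day, roughly 10 GW of average power, about 5 GW per plant, and one to two million H100s) and works out what they imply. The world population of about 8 billion is an assumption added here for illustration and is not from the article.

```python
# Rough cross-check of the back-of-the-envelope estimate described above.
# Quoted figures: ~100,000 tokens per person per day, ~10 GW average power,
# ~5 GW per plant, 1-2 million H100 GPUs shipping next year.
TOKENS_PER_PERSON_PER_DAY = 100_000
WORLD_POPULATION = 8e9              # assumption: ~8 billion people (not from the article)
AVERAGE_POWER_W = 10e9              # ~10 GW, as quoted
PLANT_OUTPUT_W = 5e9                # ~5 GW per plant, as quoted
H100_COUNT = 2e6                    # upper end of the 1-2 million range
SECONDS_PER_DAY = 24 * 60 * 60      # 86,400 s

tokens_per_day = TOKENS_PER_PERSON_PER_DAY * WORLD_POPULATION   # ~8e14 tokens
energy_per_day_j = AVERAGE_POWER_W * SECONDS_PER_DAY            # ~8.6e14 joules

# What the quoted totals imply per token and per GPU:
joules_per_token = energy_per_day_j / tokens_per_day                      # ~1 J/token
tokens_per_gpu_per_second = tokens_per_day / H100_COUNT / SECONDS_PER_DAY # ~4,600 tok/s

plants_needed = AVERAGE_POWER_W / PLANT_OUTPUT_W                 # ~2 plants

print(f"Tokens generated per day: {tokens_per_day:.1e}")
print(f"Implied energy per token: {joules_per_token:.2f} J")
print(f"Implied throughput per GPU: {tokens_per_gpu_per_second:,.0f} tokens/s")
print(f"Nuclear plants needed at 5 GW each: {plants_needed:.0f}")
```

The point of the exercise is the same as Edunov's: at the scale of GPUs expected to ship next year, the electricity needed for inference is on the order of a couple of large power plants, not hundreds.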

