AI generation requires large amounts of energy and water, both during model training and during everyday chatbot use. The Washington Post and researchers at the University of California, Riverside assessed what it costs ChatGPT, running the GPT-4 model, to write a single e-mail of about 100 words.
How much does ChatGPT consume?
When a user asks ChatGPT to write a 100-word e-mail, the request (input) is processed on servers located in data centers (in OpenAI's case, mostly those of Microsoft's Azure service). As they run, these servers generate heat, which is removed by water-based cooling systems. So beyond the electricity that powers the servers directly, electricity is also used to lower the temperature inside the data center.
Water consumption depends on the data center's geographic location. According to the UC Riverside researchers, generating a 100-word e-mail consumes 1,468 milliliters of water, roughly 1.5 liters, in Washington state (where Microsoft has its headquarters), but only 235 milliliters in Texas.
On average, generating one such e-mail requires 519 milliliters of water. If one in ten US workers (about 16 million people) made at least one request per week for a year, water consumption would exceed 435 million liters, as much water as all the households of Rhode Island consume in 1.5 days.
Writing the e-mail requires 0.14 kWh of electricity, equivalent to keeping 14 LED bulbs lit for about an hour. Under the same assumption (one request per week for a year by one in ten US workers, about 16 million people), energy consumption would exceed 121,000 MWh, equivalent to the electricity consumed by all households in the District of Columbia over 20 days.
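The scale-up behind both figures is simple multiplication of the per-email consumption by the assumed user base and frequency. A minimal recomputation (the per-email figures come from the study; the slight shortfall versus the article's rounded totals presumably reflects more precise per-email and population numbers in the original analysis):

```python
# Per-email consumption reported by the UC Riverside / Washington Post analysis
WATER_PER_EMAIL_L = 0.519     # 519 milliliters of water
ENERGY_PER_EMAIL_KWH = 0.14   # 0.14 kWh of electricity

USERS = 16_000_000            # one in ten US workers
WEEKS = 52                    # one request per week for a year

total_water_l = WATER_PER_EMAIL_L * USERS * WEEKS
total_energy_mwh = ENERGY_PER_EMAIL_KWH * USERS * WEEKS / 1000  # kWh -> MWh

print(f"Water:  {total_water_l / 1e6:.0f} million liters")  # ~432 million liters
print(f"Energy: {total_energy_mwh:,.0f} MWh")               # ~116,480 MWh
```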
AI models keep getting bigger (billions of parameters), so the servers needed to train them quickly must be ever more powerful. The data centers Microsoft used to train GPT-3 (not even the latest model) consumed more than 700,000 liters of water, the amount needed to produce nearly 100 pounds (45 kg) of beef.
Meta, for its part, consumed 22 million liters of water training Llama 3, the amount required to produce roughly 4,439 pounds (2,013 kg) of rice. The Big Tech companies have all pledged to reduce their greenhouse-gas emissions, but with the rise of chatbots those targets will be very hard to meet (Microsoft is even turning to nuclear power).