• #OpenAI and partners are building a massive AI #data_center in #Texas

    On Tuesday, OpenAI announced a partnership with #Oracle to develop 4.5 gigawatts of additional data center capacity for its #Stargate AI infrastructure platform in the US. The expansion, which TechCrunch reports is part of a $30 billion-per-year deal between OpenAI and Oracle, will reportedly bring OpenAI’s total Stargate capacity under development to over 5 gigawatts.

    The data center has taken root in #Abilene, Texas, a city of 127,000 located 150 miles west of Fort Worth. The city, which serves as the commercial hub of a 19-county region known as the “Big Country,” offers a location with an existing tech and employment ecosystem, including Dyess Air Force Base and three universities. Abilene’s economy has evolved over time from its agricultural and livestock roots to embrace technology and manufacturing sectors.

    “We have signed a deal for an additional 4.5 gigawatts of capacity with oracle as part of stargate. easy to throw around numbers, but this is a gigantic infrastructure project,” wrote OpenAI CEO Sam Altman on X. “We are planning to significantly expand the ambitions of stargate past the $500 billion commitment we announced in January.”

    The new agreement builds on OpenAI’s initial $500 billion commitment announced at the White House in January to invest in 10 gigawatts of AI infrastructure over four years. The company estimates that the 4.5 GW expansion will generate jobs across construction and operations roles, including direct full-time positions, short-term construction work, and indirect manufacturing and service jobs.

    The 5 gigawatts of total capacity refers to the amount of electrical power these data centers will consume when fully operational—enough to power roughly 4.4 million American homes. It turns out that telling users their every idea is brilliant requires a lot of energy.
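
    As a rough cross-check on that equivalence, here is a back-of-the-envelope sketch. The 5 GW figure comes from the article; the average household draw of about 1.2 kW (roughly 10,500 kWh per year) is an assumption for illustration, not a number from the source.

    # Rough sanity check of the "4.4 million homes" comparison.
    stargate_capacity_w = 5e9      # 5 gigawatts of data center demand (from the article)
    avg_home_draw_w = 1.2e3        # assumed continuous draw per US home (~10,500 kWh/year)
    homes_equivalent = stargate_capacity_w / avg_home_draw_w
    print(f"{homes_equivalent / 1e6:.1f} million homes")  # ~4.2 million, close to the cited 4.4 million
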
    Stargate moves forward despite early skepticism

    When OpenAI announced Stargate in January, critics questioned whether the company could deliver on its ambitious $500 billion funding promise. Trump ally and frequent Altman foe Elon Musk wrote on X that “They don’t actually have the money,” claiming that “SoftBank has well under $10B secured.”

    Tech writer and frequent OpenAI critic Ed Zitron raised concerns about OpenAI’s financial position, noting the company’s $5 billion in losses in 2024. “This company loses $5bn+ a year! So what, they raise $19bn for Stargate, then what, another $10bn just to be able to survive?” Zitron wrote on Bluesky at the time.

    Six months later, OpenAI’s Abilene data center has moved from construction to partial operation. Oracle began delivering Nvidia GB200 racks to the facility last month, and OpenAI reports it has started running early training and inference workloads to support what it calls “next-generation frontier research.”

    Despite the White House announcement with President Trump in January, the Stargate concept dates back to March 2024, when Microsoft and OpenAI partnered on a $100 billion supercomputer as part of a five-phase plan. Over time, the plan evolved into its current form as a partnership with Oracle, SoftBank, and CoreWeave.

    “Stargate is an ambitious undertaking designed to meet the historic opportunity in front of us,” writes OpenAI in the press release announcing the latest deal. “That opportunity is now coming to life through strong support from partners, governments, and investors worldwide—including important leadership from the White House, which has recognized the critical role AI infrastructure will play in driving innovation, economic growth, and national competitiveness.”

    https://arstechnica.com/ai/2025/07/openai-and-partners-are-building-a-massive-ai-data-center-in-texas
    #centre_de_données #infrastructure #aménagement_du_territoire

  • In #Marseille, data centers are consuming land and a lot of #énergie

    The American company #Digital_Realty wants to build its fifth #centre_de_données right in the heart of the city. A residents’ collective opposes it, denouncing a harmful fiscal, urban and environmental impact.

    (#paywall)
    https://www.alternatives-economiques.fr/a-marseille-data-centers-consomment-terrain-beaucoup-denergie/00112632
    #data_center #résistance

    • #merci @sombre, so here it is:

      In Marseille’s main port, the old grain silo, visible from the highway into the city center, has been torn down. In its place, the Texas-based company Digital Realty plans to install its fifth Marseille data center. By 2026, the 12,000 square meters of data storage of the MRS5 “flagship” will be added to the current 24,000.

      Marseille’s strategic location, connected to the rest of the world by 17 undersea cables buried off the Prado beaches, has allowed a “data center campus” to rise from the ground, according to Digital Realty, and the city to become “the world’s fifth-largest internet hub.”

      These storage factories continuously host our films and video games, but also our telecoms, our banking data and part of our online public services. Cryptocurrencies and artificial intelligence, particularly voracious consumers of digital data, are making demand explode: by 2030, data centers’ processing and storage needs are expected to grow by 160%, according to the bank Goldman Sachs.

      A battle over energy

      “This project must be stopped, because it will have serious consequences for the environment and for residents’ health,” protests one resident in the public inquiry opened on the project. “Why should we, the residents of the northern districts, have to endure all these ecocidal projects?” worries another.

      Massive appropriation of land, water and electricity, potential nuisances and pollution, further erosion of already limited access to the sea… The list of grievances is long, so much so that a collective of residents, associations and elected officials wants to impose a moratorium and refer the project to the Commission nationale du débat public (CNDP) to halt it, while France nature environnement 13 plans to challenge its building permit and its operating authorization.

      Digital Realty nevertheless promises to “minimize the impact of its data centers on the environment.” Stéphane Coppey, of France nature environnement, is doubtful. Running servers 24 hours a day, with backup generators in case of outages, takes an enormous amount of electricity: the equivalent of the consumption of a city the size of Arles for a 10,000 m² warehouse.

      “To cool the computers, they would use groundwater from Gardanne, which could instead be made drinkable, for example, but also fluorinated liquids, which are major emitters of greenhouse gases (GHG).”

      Data centers’ GHG emissions may even be nearly eight times higher than officially reported, according to a Guardian analysis. Not to mention that, in a city already overheating, the lack of urban planning is creating conflicts over electricity use.

      Even though Christophe Castaner, chair of the supervisory board of the port of Marseille Fos, insists this is the last project of its kind, Stéphane Coppey is taken aback: “Residents of the 15th and 16th arrondissements were expecting the shore-side electrical hookup of cruise ships instead!” Those hookups, now being rolled out, keep docked ships from burning their fuel and worsening air pollution.
      Projects of national interest?

      Data center development is accelerating in France, and subsidies are multiplying, notably under the France 2030 plan. With the promise of a carbon-free energy mix thanks to nuclear power and a reduced rate of the domestic tax on final electricity consumption (TICFE), the sector is growing seven times faster than the rest of the economy.

      Digital Realty has thus begun building France’s largest data center in La Courneuve, in Seine-Saint-Denis, where their concentration keeps increasing.

      Before the dissolution of the National Assembly, the bill to simplify economic life even envisaged granting data centers the status of major projects of national interest (PINM) in order to fast-track certain procedures, such as building permits or connection to the electricity grid, even if that means short-circuiting dialogue with local elected officials.

      “This bill is absolutely absurd. At the very moment when we should be much more restrictive, we are doing the opposite,” protests Sébastien Barnes, an EELV city councillor in Marseille and a member of the collective against the MRS5 project.

      “We need European regulation, including on taxation, to keep territories from being played off against one another. Taxing these facilities according to their storage volume could be one solution,” he adds, pointing to Germany, which wants to make their installation conditional on environmental criteria.

      Elsewhere than in Marseille, opponents are demanding accountability and winning their first victories: in 2021, for instance, the prefect of Essonne refused approval for an Amazon project. To polish their image, the operators of France’s 250 data centers sometimes turn patrons, as Digital Realty has done with the Musée des civilisations de l’Europe et de la Méditerranée (Mucem). Some also promise to study the possibility of reusing the heat they produce... even though that is already a prerequisite for the TICFE exemptions.

      For Clément Marquet, a researcher at the École des Mines de Paris, that will not be enough: “Even if these projects are supported environmentally, fiscally and in urban planning terms, we are well outside planetary boundaries. The digital economy is completely untethered from the land.” Yet it keeps taking up more and more territory.

  • AI tools consume up to 4 times more water than estimated

    A new report shows that artificial intelligence tools, including ChatGPT, are using up to four times more water than previously believed. This discovery raises concerns about the sustainability of #data_centers as AI continues to expand.

    Researchers from the University of California, Riverside found that processing 10 to 50 queries on AI chatbots can consume up to 2 liters of water, far exceeding the earlier estimate of half a liter (https://www.thetimes.com/uk/technology-uk/article/thirsty-chatgpt-uses-four-times-more-water-than-previously-thought-bc0pqsw). The increase is attributed to the intense cooling needs of data centers, where the servers generate significant heat.
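
    Read per query, the revision works out to roughly four times the earlier figure, which can be checked directly from the numbers quoted above (simple arithmetic on the article's figures; nothing here comes from elsewhere):

    # The headline "4 times more" follows from the two batch estimates quoted above.
    earlier_estimate_l = 0.5   # liters per batch of 10-50 queries (previous figure)
    revised_estimate_l = 2.0   # liters per batch of 10-50 queries (new report)
    print(revised_estimate_l / earlier_estimate_l)           # 4.0
    print(revised_estimate_l / 10, revised_estimate_l / 50)  # 0.2 L to 0.04 L implied per single query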

    According to Microsoft, the energy and water demands of AI models are much higher than anticipated. Between 2023 and 2024, Google, Microsoft, and Meta have reported water usage increases of 17%, 22.5%, and 17% respectively, further highlighting the growing environmental footprint of AI.

    This is not just a U.S. issue. In the U.K., planned data centers are expected to consume as much water as a city the size of Liverpool. Meanwhile, in Ireland, data centers now account for 21% of the country’s electricity consumption.

    OpenAI CEO Sam Altman recently presented a proposal to the White House to build at least five massive data centers, with plans for unprecedented energy expansions. However, critics argue that the energy production process for AI remains inefficient, with 60% of resources wasted.

    While tech companies pledge to offset their water usage by 2030, critics warn that these efforts may not sufficiently address water scarcity in regions where AI data centers are located.

    https://san.com/cc/ai-tools-consume-up-to-4-times-more-water-than-estimated
    #eau #chatgpt #IA #AI #intelligence_artificielle #centre_de_données

    • AI programs consume large volumes of scarce water

      Every time you run a ChatGPT artificial intelligence query, you use up a little bit of an increasingly scarce resource: fresh water. Run some 20 to 50 queries and roughly a half liter, around 17 ounces, of fresh water from our overtaxed reservoirs is lost in the form of steam emissions.

      Such are the findings of a University of California, Riverside, study that for the first time estimated the water footprint from running artificial intelligence, or AI, queries that rely on the cloud computations done in racks of servers in warehouse-sized data processing centers.

      Google’s data centers in the U.S. alone consumed an estimated 12.7 billion liters of fresh water in 2021 to keep their servers cool — at a time when droughts are exacerbating climate change — Bourns College of Engineering researchers reported in the study, which was published online as a preprint on arXiv and is awaiting peer review.

      Shaolei Ren, an associate professor of electrical and computer engineering and the corresponding author of the study, explained that data processing centers consume great volumes of water in two ways.

      First, these centers draw electricity from power plants that use large cooling towers that convert water into steam emitted into the atmosphere.

      Second, the hundreds of thousands of servers at the data centers must be kept cool as electricity moving through semiconductors continuously generates heat. This requires cooling systems that, like power plants, are typically connected to cooling towers that consume water by converting it into steam.

      “The cooling tower is an open loop, and that’s where the water will evaporate and remove the heat from the data center to the environment,” Ren said.

      Ren said it is important to address the water use from AI because it is a fast-growing segment of computing demand.

      For example, a roughly two-week training for the GPT-3 AI program in Microsoft’s state-of-the-art U.S. data centers consumed about 700,000 liters of freshwater, about the same amount of water used in the manufacture of about 370 BMW cars or 320 Tesla electric vehicles, the paper said. The water consumption would have tripled if the training had been done in Microsoft’s data centers in Asia, which are less efficient. Car manufacturing requires a series of washing processes to remove paint particles and residues, among several other water uses.
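
      Dividing the training figure by the two vehicle counts shows the per-car water use the comparison implies (plain arithmetic on the paper's numbers, nothing more):

      # Per-car water use implied by the paper's comparison.
      training_water_l = 700_000     # GPT-3 training run in Microsoft's US data centers
      print(training_water_l / 370)  # ~1,892 L per BMW
      print(training_water_l / 320)  # ~2,188 L per Tesla
      print(training_water_l * 3)    # ~2.1 million L had the training run in the Asian data centers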

      Ren and his co-authors — UCR graduate students Pengfei Li and Jianyi Yang, and Mohammad A. Islam of the University of Texas, Arlington — argue big tech should take responsibility and lead by example to reduce its water use.

      Fortunately, AI training has scheduling flexibility. Unlike web search or YouTube streaming, which must be processed immediately, AI training can be done at almost any time of day. To avoid wasteful water usage, a simple and effective solution is to train AI models during cooler hours, when less water is lost to evaporation, Ren said.

      “AI training is like a very big lawn and needs lots of water for cooling,” Ren said. “We don’t want to water our lawns at noon, so let’s not water our AI at noon either.”

      This may conflict with carbon-efficient scheduling that particularly likes to follow the sun for clean solar energy. “We can’t shift cooler weather to noon, but we can store solar energy, use it later, and still be ‘green’,” Ren said.

      “It is truly a critical time to uncover and address the AI model’s secret water footprint amid the increasingly severe freshwater scarcity crisis, worsened extended droughts, and quickly aging public water infrastructure,” reads the paper, which is titled “Making AI Less ‘Thirsty:’ Uncovering and Addressing the Secret Water Footprint of AI Models.”

      https://news.ucr.edu/articles/2023/04/28/ai-programs-consume-large-volumes-scarce-water

  • Data centers threatened with obsolescence by the rise of AI
    https://www.latribune.fr/technos-medias/informatique/les-data-centers-menaces-d-obsolescence-a-cause-de-l-essor-de-l-ia-1007271

    Up to 80% of data centers may prove impossible to adapt to the outsized needs of artificial intelligence, industry players warn. In trying to keep up with the insatiable consumption of the large models from OpenAI and Meta, these facilities are becoming ever more gigantic, creating unprecedented pressure on land, notably in Marseille.

  • #Data_center emissions probably 662% higher than big tech claims. Can it keep up the ruse?

    Emissions from in-house data centers of #Google, #Microsoft, #Meta and #Apple may be 7.62 times higher than official tally.

    Big tech has made some big claims about greenhouse gas emissions in recent years. But as the rise of artificial intelligence creates ever bigger energy demands, it’s getting hard for the industry to hide the true costs of the data centers powering the tech revolution.

    According to a Guardian analysis, from 2020 to 2022 the real emissions from the “in-house” or company-owned data centers of Google, Microsoft, Meta and Apple are probably about 662% – or 7.62 times – higher than officially reported.
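
    The percentage and the multiplier are two expressions of the same gap, as a quick check on the figures in the sentence above shows:

    # "662% higher" and "7.62 times" describe the same ratio:
    # actual = reported * (1 + 662/100) = reported * 7.62
    reported = 1.0
    print(reported * (1 + 662 / 100))   # 7.62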

    Amazon is the largest emitter of the big five tech companies by a mile – the emissions of the second-largest emitter, Apple, were less than half of Amazon’s in 2022. However, Amazon has been kept out of the calculation above because its differing business model makes it difficult to isolate data center-specific emissions figures for the company.

    As energy demands for these data centers grow, many are worried that carbon emissions will, too. The International Energy Agency stated that data centers already accounted for 1% to 1.5% of global electricity consumption in 2022 – and that was before the AI boom began with ChatGPT’s launch at the end of that year.

    AI is far more energy-intensive on data centers than typical cloud-based applications. According to Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity to process as a Google search, and data center power demand will grow 160% by 2030. Goldman competitor Morgan Stanley’s research has made similar findings, projecting data center emissions globally to accumulate to 2.5bn metric tons of CO2 equivalent by 2030.
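
    To put the per-query ratio in concrete terms, here is a small illustration. The tenfold ratio and the 160% growth figure come from the paragraph above; the ~0.3 Wh baseline for a conventional search is a commonly cited estimate assumed here for illustration, not a number from the article.

    # Illustrating the "nearly 10 times" per-query comparison with an assumed baseline.
    search_wh = 0.3                 # assumed energy per conventional Google search (illustrative)
    chatgpt_wh = search_wh * 10     # per the Goldman Sachs ratio quoted above
    print(chatgpt_wh)               # ~3 Wh per ChatGPT query under that assumption
    print(1 + 1.60)                 # 160% growth means 2030 demand is 2.6x today's level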

    In the meantime, all five tech companies have claimed carbon neutrality, though Google dropped the label last year as it stepped up its carbon accounting standards. Amazon is the most recent company to make the claim, asserting in July that it met its goal seven years early and that it had implemented a gross emissions cut of 3%.

    “It’s down to creative accounting,” explained a representative from Amazon Employees for Climate Justice, an advocacy group composed of current Amazon employees who are dissatisfied with their employer’s action on climate. “Amazon – despite all the PR and propaganda that you’re seeing about their solar farms, about their electric vans – is expanding its fossil fuel use, whether it’s in data centers or whether it’s in diesel trucks.”
    A misguided metric

    The most important tools in this “creative accounting” when it comes to data centers are renewable energy certificates, or Recs. These are certificates that a company purchases to show it is buying renewable energy-generated electricity to match a portion of its electricity consumption – the catch, though, is that the renewable energy in question doesn’t need to be consumed by a company’s facilities. Rather, the site of production can be anywhere from one town over to an ocean away.

    Recs are used to calculate “market-based” emissions, or the official emissions figures used by the firms. When Recs and offsets are left out of the equation, we get “location-based emissions” – the actual emissions generated from the area where the data is being processed.
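
    In outline, the two methods differ only in whether Rec purchases are allowed to zero out consumption that was actually served by the local grid. A minimal sketch of the distinction, with invented numbers (none of these figures come from any company's reporting):

    # Location-based vs market-based scope 2 accounting, in miniature.
    consumption_mwh = 1_000_000     # electricity actually consumed by the data center (illustrative)
    grid_intensity = 0.4            # tCO2e per MWh of the local grid (illustrative)
    rec_covered_mwh = 950_000       # consumption "matched" by purchased Recs (illustrative)

    location_based = consumption_mwh * grid_intensity
    # Market-based accounting treats Rec-matched consumption as zero-emission,
    # regardless of where or when the renewable power was actually generated.
    market_based = (consumption_mwh - rec_covered_mwh) * grid_intensity

    print(location_based)  # 400,000 tCO2e: what the local grid actually emitted
    print(market_based)    #  20,000 tCO2e: the much smaller "official" figure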

    The trend in those emissions is worrying. If these five companies were one country, the sum of their “location-based” emissions in 2022 would rank them as the 33rd highest-emitting country, behind the Philippines and above Algeria.

    Many data center industry experts also recognize that location-based metrics are more honest than the official, market-based numbers reported.

    “Location-based [accounting] gives an accurate picture of the emissions associated with the energy that’s actually being consumed to run the data center. And Uptime’s view is that it’s the right metric,” said Jay Dietrich, the research director of sustainability at Uptime Institute, a leading data center advisory and research organization.

    Nevertheless, the Greenhouse Gas (GHG) Protocol, a carbon accounting oversight body, allows Recs to be used in official reporting, though how far they should be allowed remains contentious among tech companies and has led to a lobbying battle between two factions over GHG Protocol’s rule-making process.

    On one side there is the Emissions First Partnership, spearheaded by Amazon and Meta. It aims to keep Recs in the accounting process regardless of their geographic origins. In practice, this is only a slightly looser interpretation of what GHG Protocol already permits.

    The opposing faction, headed by Google and Microsoft, argues that there needs to be time-based and location-based matching of renewable production and energy consumption for data centers. Google calls this its 24/7 goal, or its goal to have all of its facilities run on renewable energy 24 hours a day, seven days a week by 2030. Microsoft calls it its 100/100/0 goal, or its goal to have all its facilities running on 100% carbon-free energy 100% of the time, making zero carbon-based energy purchases by 2030.

    Google has already phased out its Rec use and Microsoft aims to do the same with low-quality “unbundled” (non location-specific) Recs by 2030.

    Academics and carbon management industry leaders alike are also against the GHG Protocol’s permissiveness on Recs. In an open letter from 2015, more than 50 such individuals argued that “it should be a bedrock principle of GHG accounting that no company be allowed to report a reduction in its GHG footprint for an action that results in no change in overall GHG emissions. Yet this is precisely what can happen under the guidance given the contractual/Rec-based reporting method.”

    To GHG Protocol’s credit, the organization does ask companies to report location-based figures alongside their Rec-based figures. Despite that, no company includes both location-based and market-based metrics for all three subcategories of emissions in the bodies of their annual environmental reports.

    In fact, location-based numbers are only directly reported (that is, not hidden in third-party assurance statements or in footnotes) by two companies – Google and Meta. And those two firms only include those figures for one subtype of emissions: scope 2, or the indirect emissions companies cause by purchasing energy from utilities and large-scale generators.
    In-house data centers

    Scope 2 is the category that includes the majority of the emissions that come from in-house data center operations, as it concerns the emissions associated with purchased energy – mainly, electricity.

    Data centers should also make up a majority of overall scope 2 emissions for each company except Amazon, given that the other sources of scope 2 emissions for these companies stem from the electricity consumed by firms’ offices and retail spaces – operations that are relatively small and not carbon-intensive. Amazon has one other carbon-intensive business vertical to account for in its scope 2 emissions: its warehouses and e-commerce logistics.

    For the firms that give data center-specific data – Meta and Microsoft – this holds true: data centers made up 100% of Meta’s market-based (official) scope 2 emissions and 97.4% of its location-based emissions. For Microsoft, those numbers were 97.4% and 95.6%, respectively.

    The huge differences in location-based and official scope 2 emissions numbers showcase just how carbon intensive data centers really are, and how deceptive firms’ official emissions numbers can be. Meta, for example, reports its official scope 2 emissions for 2022 as 273 metric tons CO2 equivalent – all of that attributable to data centers. Under the location-based accounting system, that number jumps to more than 3.8m metric tons of CO2 equivalent for data centers alone – a more than 19,000 times increase.

    A similar result can be seen with Microsoft. The firm reported its official data center-related emissions for 2022 as 280,782 metric tons CO2 equivalent. Under a location-based accounting method, that number jumps to 6.1m metric tons CO2 equivalent. That’s a nearly 22 times increase.
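
    The “nearly 22 times” multiplier follows directly from the two Microsoft figures above (arithmetic only):

    # Ratio of Microsoft's location-based to official data center emissions, 2022.
    official_tco2e = 280_782
    location_based_tco2e = 6_100_000
    print(location_based_tco2e / official_tco2e)   # ~21.7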

    While Meta’s reporting gap is more egregious, both firms’ location-based emissions are higher because they undercount their data center emissions specifically, with 97.4% of the gap between Meta’s location-based and official scope 2 number in 2022 being unreported data center-related emissions, and 95.55% of Microsoft’s.

    Specific data center-related emissions numbers aren’t available for the rest of the firms. However, given that Google and Apple have similar scope 2 business models to Meta and Microsoft, it is likely that the multiple on how much higher their location-based data center emissions are would be similar to the multiple on how much higher their overall location-based scope 2 emissions are.

    In total, the sum of location-based emissions in this category between 2020 and 2022 was at least 275% higher (or 3.75 times) than the sum of their official figures. Amazon did not provide the Guardian with location-based scope 2 figures for 2020 and 2021, so its official (and probably much lower) numbers were used for this calculation for those years.
    Third-party data centers

    Big tech companies also rent a large portion of their data center capacity from third-party data center operators (or “colocation” data centers). According to the Synergy Research Group, large tech companies (or “hyperscalers”) represented 37% of worldwide data center capacity in 2022, with half of that capacity coming through third-party contracts. While this group includes companies other than Google, Amazon, Meta, Microsoft and Apple, it gives an idea of the extent of these firms’ activities with third-party data centers.

    Those emissions should theoretically fall under scope 3, the category covering all emissions a firm is responsible for that can’t be attributed to the fuel or electricity it consumes.

    When it comes to a big tech firm’s operations, this would encapsulate everything from the manufacturing processes of the hardware it sells (like the iPhone or Kindle) to the emissions from employees’ cars during their commutes to the office.

    When it comes to data centers, scope 3 emissions include the carbon emitted from the construction of in-house data centers, as well as the carbon emitted during the manufacturing process of the equipment used inside them. It may also cover those same categories, plus electricity-related emissions, for the third-party data centers a company partners with.

    However, whether or not these emissions are fully included in reports is almost impossible to prove. “Scope 3 emissions are hugely uncertain,” said Dietrich. “This area is a mess just in terms of accounting.”

    According to Dietrich, some third-party data center operators put their energy-related emissions in their own scope 2 reporting, so those who rent from them can put those emissions into their scope 3. Other third-party data center operators put energy-related emissions into their scope 3 emissions, expecting their tenants to report those emissions in their own scope 2 reporting.

    Additionally, all firms use market-based metrics for these scope 3 numbers, which means third-party data center emissions are also undercounted in official figures.

    Of the firms that report their location-based scope 3 emissions in the footnotes, only Apple has a large gap between its official scope 3 figure and its location-based scope 3 figure.

    This is the only sizable reporting gap for a firm that is not data center-related – the majority of Apple’s scope 3 gap is due to Recs being applied towards emissions associated with the manufacturing of hardware (such as the iPhone).

    Apple does not include transmission and distribution losses or third-party cloud contracts in its location-based scope 3. It only includes those figures in its market-based numbers, under which its third-party cloud contracts report zero emissions (offset by Recs). Therefore, in both of Apple’s total emissions figures – location-based and market-based – the actual emissions associated with its third-party data center contracts are nowhere to be found.

    2025 and beyond

    Even though big tech hides these emissions, they are due to keep rising. Data centers’ electricity demand is projected to double by 2030 due to the additional load that artificial intelligence poses, according to the Electric Power Research Institute.

    Google and Microsoft both blamed AI for their recent upticks in market-based emissions.

    “The relative contribution of AI computing loads to Google’s data centers, as I understood it when I left [in 2022], was relatively modest,” said Chris Taylor, current CEO of utility storage firm Gridstor and former site lead for Google’s data center energy strategy unit. “Two years ago, [AI] was not the main thing that we were worried about, at least on the energy team.”

    Taylor explained that most of the growth that he saw in data centers while at Google was attributable to growth in Google Cloud, as most enterprises were moving their IT tasks to the firm’s cloud servers.

    Whether today’s power grids can withstand the growing energy demands of AI is uncertain. One industry leader – Marc Ganzi, the CEO of DigitalBridge, a private equity firm that owns two of the world’s largest third-party data center operators – has gone as far as to say that the data center sector may run out of power within the next two years.

    And as grid interconnection backlogs continue to pile up worldwide, it may be nearly impossible for even the most well-intentioned of companies to get new renewable energy production capacity online in time to meet that demand.

    https://www.theguardian.com/technology/2024/sep/15/data-center-gas-emissions-tech
    #données #émissions #mensonge #ChatGPT #AI #IA #intelligence_artificielle #CO2 #émissions_de_CO2 #centre_de_données

    • AI’s emissions are about to skyrocket even further

      Data center emissions have tripled since 2018. As more complex AI models like OpenAI’s Sora see broad release, those figures will likely go through the roof.

      It’s no secret that the current AI boom is using up immense amounts of energy. Now we have a better idea of how much.

      A new paper, from teams at the Harvard T.H. Chan School of Public Health and UCLA Fielding School of Public Health, examined 2,132 data centers operating in the United States (78% of all facilities in the country). These facilities—essentially buildings filled to the brim with rows of servers—are where AI models get trained, and they also get “pinged” every time we send a request through models like ChatGPT. They require huge amounts of energy both to power the servers and to keep them cool.

      Since 2018, carbon emissions from data centers in the US have tripled. For the 12 months ending August 2024, data centers were responsible for 105 million metric tons of CO2, accounting for 2.18% of national emissions (for comparison, domestic commercial airlines are responsible for about 131 million metric tons). About 4.59% of all the energy used in the US goes toward data centers, a figure that’s doubled since 2018.
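
      Those shares also let you back out the totals they imply (a sketch using only the figures in the paragraph above; the derived values are approximations, not numbers stated in the paper):

      # Totals implied by the shares quoted above.
      dc_emissions_mt = 105            # million metric tons CO2, 12 months ending August 2024
      dc_share_of_emissions = 0.0218   # 2.18% of national emissions
      dc_share_of_energy = 0.0459      # 4.59% of all US energy use

      print(round(dc_emissions_mt / dc_share_of_emissions))  # ~4,817 million metric tons nationally
      print(dc_share_of_energy / 2)                          # ~0.023, i.e. roughly a 2.3% share back in 2018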

      It’s difficult to put a number on how much AI in particular, which has been booming since ChatGPT launched in November 2022, is responsible for this surge. That’s because data centers process lots of different types of data—in addition to training or pinging AI models, they do everything from hosting websites to storing your photos in the cloud. However, the researchers say, AI’s share is certainly growing rapidly as nearly every segment of the economy attempts to adopt the technology.

      “It’s a pretty big surge,” says Eric Gimon, a senior fellow at the think tank Energy Innovation, who was not involved in the research. “There’s a lot of breathless analysis about how quickly this exponential growth could go. But it’s still early days for the business in terms of figuring out efficiencies, or different kinds of chips.”

      Notably, the sources for all this power are particularly “dirty.” Since so many data centers are located in coal-producing regions, like Virginia, the “carbon intensity” of the energy they use is 48% higher than the national average. The paper, which was published on arXiv and has not yet been peer-reviewed, found that 95% of data centers in the US are built in places with sources of electricity that are dirtier than the national average.

      There are causes other than simply being located in coal country, says Falco Bargagli-Stoffi, an author of the paper and Assistant Professor at UCLA Fielding School of Public Health. “Dirtier energy is available throughout the entire day,” he says, and plenty of data centers require that to maintain peak operation 24-7. “Renewable energy, like wind or solar, might not be as available.” Political or tax incentives, and local pushback, can also affect where data centers get built.

      One key shift in AI right now means that the field’s emissions are soon likely to skyrocket. AI models are rapidly moving from fairly simple text generators like ChatGPT toward highly complex image, video, and music generators. Until now, many of these “multimodal” models have been stuck in the research phase, but that’s changing.

      OpenAI released its video generation model Sora to the public on December 9, and its website has been so flooded with traffic from people eager to test it out that it is still not functioning properly. Competing models, like Veo from Google and Movie Gen from Meta, have still not been released publicly, but if those companies follow OpenAI’s lead as they have in the past, they might be soon. Music generation models from Suno and Udio are growing (despite lawsuits), and Nvidia released its own audio generator last month. Google is working on its Astra project, which will be a video-AI companion that can converse with you about your surroundings in real time.

      “As we scale up to images and video, the data sizes increase exponentially,” says Gianluca Guidi, a PhD student in artificial intelligence at University of Pisa and IMT Lucca and visiting researcher at Harvard, who is the paper’s lead author. Combine that with wider adoption, he says, and emissions will soon jump.

      One of the goals of the researchers was to build a more reliable way to get snapshots of just how much energy data centers are using. That’s been a more complicated task than you might expect, given that the data is dispersed across a number of sources and agencies. They’ve now built a portal that shows data center emissions across the country. The long-term goal of the data pipeline is to inform future regulatory efforts to curb emissions from data centers, which are predicted to grow enormously in the coming years.

      “There’s going to be increased pressure, between the environmental and sustainability-conscious community and Big Tech,” says Francesca Dominici, director of the Harvard Data Science Initiative, Harvard Professor and another coauthor. “But my prediction is that there is not going to be regulation. Not in the next four years.”

      https://www.technologyreview.com/2024/12/13/1108719/ais-emissions-are-about-to-skyrocket-even-further

  • Extinction Rebellion Ireland warns new #Google data centre will be met with ’massive pushback’

    Extinction Rebellion Ireland (ERI) has warned that if a planned new Google Ireland #data_centre gets the planning go-ahead for south Dublin “it will be met with massive local and national pushback and action”.

    In an objection against the planned new data centre expansion by Google Ireland for Grange Castle Business Park in south Dublin, Emer Connolly of Extinction Rebellion Ireland has told the Council: “This expansion is a disaster for local communities, water shortages, transition to a more sustainable economy, and reaching our climate targets.

    Ms Connolly states that “if this planning goes through, it will be met with massive local and national pushback and action”.

    She said: “Environmental groups are watching closely and won’t let this go through easily.”

    In a separate submission, An Taisce has warned that planning for the data centre "would further compromise our ability to achieve compliance with our carbon budget limits and would put additional pressure on renewables capacity to deal with the significant additional power demand”.

    The scheme is the third phase of the Google Ireland data centre campus at Grange Castle Business Park and will involve the creation of 800 construction jobs and 50 jobs when operational.

    The new 72,400 square-metre data centre will involve the construction of eight data halls on a 50-acre greenfield/brownfield site.

    However, in the eight-page submission, An Taisce’s Planning Officer, Sean O’Callaghan, states that the proliferation of data infrastructure has largely gone unchecked, and data centres now consume 21 per cent of Ireland’s total metered electricity.

    He said that this is up from 5% in 2015 and represents more electricity use than all urban households in Ireland combined.

    Mr O’Callaghan has stated that the planned data centre will put great pressure on an already strained electricity grid in the Dublin region, particularly in light of the large number of existing and proposed data centres already in the area.

    Mr O’Callaghan stated that a projected increase of 0.44 per cent in national emissions from the project "is entirely incompatible with our obligations to reduce emissions”.

    Mr O’Callaghan states: "Also, we consider an increase in national emissions of almost half a percentage point as a result of one singular development to be very significant.”

    He said that granting development consent for the data centre, on its own terms and when considered cumulatively with the high concentration of other data centres, “would greatly risk Ireland’s ability to meet carbon budget and sectoral emissions ceiling obligations for the electricity sector”.

    Head of Policy at Friends of the Earth, Jerry MacEvilly has called on the Council to reject the planning application.

    Mr MacEvilly states: “our concern is that the proposed development would actively undermine the achievement of the state’s carbon budget programme”.

    Dr Colin Doyle has told the council that claims of commitment by Google and Google Ireland Ltd “to decarbonisation amount to greenwashing”.

    He said: “The claims are all based on purchase of renewable electricity. While these purchases can be reported in corporate greenhouse gas (GHG) accounting systems, they do not mitigate or offset in any way the physical additional GHG emissions caused by Google’s activities in Ireland.”

    In a separate submission, Gino Kenny TD (People Before Profit Solidarity) along with party colleagues, Cllr Madeleine Johansson and Cllr Darragh Adelaide have told the council that the development “would have an adverse impact on the local community, the electricity grid and on Ireland’s carbon emissions”.

    Raising concerns about possible black-outs in the area from data centre development, the three state that “Grange Castle has seen a significant number of data storage developments, some of which have not yet started operating, and we would be extremely concerned about the capacity of the electricity grid at this time to cope with any further developments”.

    A decision is due on the application later this month.

    https://www.breakingnews.ie/ireland/extinction-rebellion-ireland-warns-google-data-centre-will-be-met-with-
    #Irlande #data_center #résistance #environnement #Grange_Castle_Business_Park #centre_de_données

  • Data centers: their water consumption is set to explode
    https://reporterre.net/Data-centers-leur-consommation-d-eau-va-exploser

    The equation is actually fairly simple. Data centers, ever more numerous and ever larger, concentrate machines that produce heat and have to be cooled. Several competing techniques exist: conventional air conditioning, but also water circuits (open, with wastewater discharge, or closed) to cool the air inside the facilities, or so-called “adiabatic” systems that spray water.

    “What is rather paradoxical is that just as we are putting in place indicators to measure the energy efficiency of data centers, such as PUE [Power Usage Effectiveness, the ratio of the total energy consumed by a data center to the energy consumed by its IT equipment alone], one way to get a good PUE and present yourself as ‘greener’ is to use more water and less electricity,” explains Clément Marquet, coordinator of the working group on the environmental politics of digital technology at the Centre national de la recherche scientifique (CNRS).
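
    The trade-off Marquet describes follows from the definition of PUE itself: evaporative cooling lowers the electricity in the numerator, while the water it consumes appears nowhere in the ratio. A minimal illustration (the facility figures are invented for the example):

    # PUE = total facility energy / IT equipment energy; water never enters the metric.
    it_energy_mwh = 10_000        # energy used by the servers themselves (illustrative)

    # Facility A: conventional chillers, no evaporative cooling.
    pue_chillers = (it_energy_mwh + 5_000) / it_energy_mwh       # 1.50

    # Facility B: evaporative ("adiabatic") cooling cuts cooling electricity
    # but consumes large volumes of water that the ratio simply ignores.
    pue_evaporative = (it_energy_mwh + 1_500) / it_energy_mwh    # 1.15

    print(pue_chillers, pue_evaporative)  # the "greener-looking" PUE belongs to the thirstier facility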

    “We are aware that water consumption is a huge issue,” says urban planner Cécile Diguet, co-author of a study on data center development in Île-de-France and of the book Sous le feu numérique (with Fanny Lopez). “We know that data center operators drill boreholes into the aquifers, more or less officially declared. We can see that in Île-de-France some aquifers are increasingly fragile, notably in Essonne. But to my knowledge no one has dug into the question yet.”

    This issue comes on top of soil sealing and data centers’ dizzying energy consumption, and it could become a major concern with the rollout of artificial intelligence. “That is the big issue behind all of this,” says sociologist Clément Marquet. “With AI, we use computers that consume 5 to 10 times more electricity, and that electricity is dissipated as heat by the machines. For now, no solution as cheap as water is known.” Already today, usage conflicts around data centers are multiplying in the United States, Uruguay, the Netherlands, Ireland and Spain, where the “Tu nube seca mi rio” (“Your cloud is drying up my river”) movement has emerged.

    While industry players fall back on their techno-solutionist credo of improving the efficiency of their cooling systems, the only sustainable horizon remains frugality, according to information science researcher Olivier Ertzscheid.

    He draws a parallel between the farms of intensive agriculture and server farms, two models that in his view share the same extractivist DNA: “The central confrontation in the coming usage conflicts [over access to water] will play out along two front lines: on one side the technology lobby, its data centers and its server farms, and on the other the industrial agriculture lobby [...]. Where the two models converge is that they contribute to the impoverishment and exhaustion of soils as much as to the water stress of neighboring populations. And that, for the most part, they could not care less.”

    #Data_centers #Ecologie #Numérique #Eau

  • Drought in Spain: the digital sector is pumping drinking water for the metaverse: 600,000 m³, or 600 million liters _ France Télévisions -

    To develop the metaverse and artificial intelligence, the American giant Meta wants to set up a new data center in Spain, in the Castilla-La Mancha region. A very water-hungry project that raises questions in this province, which is regularly hit by drought.

    The land Meta has its eye on sits in a protected nature reserve, a refuge for the region’s black eagles. If it obtains all the permits, its next data center is set to be built on nearly 180 hectares along the industrial zone of Talavera de la Reina.


    We are 130 km west of the Spanish capital, in an agricultural part of Castilla-La Mancha that is very arid despite the presence of the Tagus. Here the drought has lasted since February. While farmers face no water restrictions yet, all of them fear running short and having to give up part of their production, as in the summer of 2022.

    Aurora Gomez is an environmental activist and a child of the region. As a farmer’s daughter she knows the water problem well, and that, she says, is why she cannot understand why Meta wants to set up here. “This place is going to look like a giant warehouse… from the outside, you won’t see the horizon anymore. Everything you see around you is going to disappear!” she laments through the window of her electric car.

    600 million liters of water per year
    Meta’s project calls for 102 hectares of warehouses to house the thousands of servers that will run around the clock. These computer systems need to be cooled to avoid overheating. To do so, Meta plans to use around 600 million liters of water per year, 200 million of which would be drawn from the drinking-water network of the city of Talavera de la Reina, the rest from a tributary of the Tagus.

    Meta plays this down: “The data center could potentially use up to 200 million liters of drinking water per year, with 82.5 million liters returned to the wastewater network. Overall consumption would therefore be estimated at 117.5 million liters.”
    For the environmental activist, this could create shortages in a region hit by repeated droughts.
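
    Meta’s “overall consumption” figure is simply the potable withdrawal minus the volume returned to the sewer; the 600 million liters cited earlier is the larger total withdrawal (a sketch using only figures quoted in this article):

    # Net potable consumption as presented by Meta, versus total withdrawal.
    potable_withdrawal_l = 200_000_000   # drawn from Talavera's drinking-water network per year
    returned_l = 82_500_000              # returned to the wastewater network
    print(potable_withdrawal_l - returned_l)           # 117,500,000 L, the figure Meta cites

    total_withdrawal_l = 600_000_000     # total planned water use per year
    print(total_withdrawal_l - potable_withdrawal_l)   # 400,000,000 L drawn from a Tagus tributary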

    A data center consumes, on average, as much water in a day as a Spaniard does in a year. That is already a lot, and that is in a context with no global warming, no rise in temperatures… Do we want to deprive the people here of tap water so that others can watch cat videos on the internet? It makes no sense to me.
    Aurora Gomez, environmental activist, “Tu nube seca mi rio”

    Are Meta and the digital giants underestimating the energy bill of their data centers? In any case, this is not the first time big names have been called out over large-scale projects.

    In 2022, in Ireland, Europe’s data center capital, the servers caused power cuts and consumed half a million liters of water per day, and up to five million liters per day in hot weather.

    In the Netherlands, the facilities of a large American group used four times more water than planned... so much so that the government decided to restrict these infrastructures’ access to water with a moratorium.

    Contacted for comment, Meta maintains that the ecological cost of this project in Spain will be offset by funding other projects around the world, notably in the United States:
    As part of our sustainability commitments (...) we will restore more water than the Talavera data center consumes, through water recycling and conservation projects.
    Meta

    For their part, the Spanish public authorities have just declared the project to be of “general utility”, a classification that speeds up the procedure and allows special exemptions.

    In this province, where unemployment affects more than 15% of the working population, this construction project is a priority, they say.

    It will create more than 500 jobs during the construction phase, but above all, once the site is completed, it will generate more than 250 direct jobs in the region and in the city of Talavera de la Reina.
    Patricia Franco Jiménez, economy minister of the Castilla-La Mancha region

    On the banks of the Tagus, some farmers doubt that this project is the boon for employment it is made out to be. Starting with Luis Miguel Pinero Sanchez, a cereal and vegetable grower settled about ten kilometers from Meta’s future site. Last year, restrictions on drinking water (the only water suitable for irrigating his crops, the Tagus being, he says, “too polluted”) cost him almost his entire tomato crop. Farms, he says, would be put at risk by the American giant.

    When Meta moves in and starts pumping the water, we farmers will be even shorter of it. It seems very good to me that there should be activity here; it’s true that we lack jobs, that the area is depressed. But it can’t be done on the backs of another part of the population, like us, who will end up unemployed or with fewer resources.
    Luis Miguel Pinero-Sanchez, farmer

    An environmental public inquiry is under way. If it gives the green light, construction should begin by the end of the year.

    #mark_zuckerberg #facebook #whatsApp #instagram #data_center #eau #gaspillage #pollution #vol #Espagne

    Source : https://www.francetvinfo.fr/meteo/secheresse/secheresse-en-espagne-quand-le-numerique-pompe-l-eau_5841185.html

    • Note: this cooling water will come out “highly polluted”, given that roughly as many noxious additives go into it as into the water used for hydraulic fracturing.

    • As El País explains, “total” consumption, including non-potable water, could reach “120 liters per second in the data center and 33 liters per second” in the rest of the facilities. At that rate, we are talking about roughly 4.8 billion liters of water per year. The company has not, however, officially confirmed these estimates.
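
      That annual figure is consistent with the flow rates quoted (a quick unit conversion; the rounding is mine):

      # Converting the flow rates quoted by El País into an annual volume.
      flow_l_per_s = 120 + 33             # data center plus the rest of the facilities
      seconds_per_year = 365 * 24 * 3600
      print(flow_l_per_s * seconds_per_year / 1e9)   # ~4.8 billion liters per year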

      Meta currently has three facilities of this type, in Sweden, Denmark and Ireland. A few months ago, an investigation by the outlet Noordhollands Dagblad showed that Microsoft’s data center in the Netherlands consumed 84 million liters of water in 2021, whereas the company had announced a consumption of 12 to 20 million liters.
      . . . . . .
      This “under-reporting” is not a first. A similar pattern occurred in the Netherlands: Microsoft’s Middenmeer data storage centers consumed up to seven times more water than had been planned when the project was launched.

      Source : https://www.francesoir.fr/societe-environnement/le-megacentre-de-donnees-de-zuckerberg-en-espagne-necessitera-600-million

  • “As you may have seen on Twitter, I recently set up a complete infrastructure in a data center. From installing the servers to configuring the network, come and see how I built my new infra!”

    A very detailed write-up of installing servers and hooking them up to the network. A reminder that the Internet (SeenThis, for example) does not run in a cloud!

    https://blog.ataxya.net/une-infra-en-datacenter

    #centre_de_données #Internet

  • “How, when the exponential growth of information seems so loaded with negative connotations, can it be made attractive? Through photography, of course! If the aesthetics of plugs, cables and storage racks turn you on, you will love photos of data centers.”

    Very pretty photos, indeed: http://www.ourageis13.com/quotidien-2/infobesite-la-photo-peut-elle-rendre-un-data-center-sexy

    #photographie #datacenter #centre_de_données

  • Sovereign cloud: a waste, French-style
    http://www.lesechos.fr/idees-debats/editos-analyses/0204173981400-cloud-souverain-un-gachis-a-la-francaise-1096130.php#

    Five years after work began, the two cloud projects financed by the French state and French industry are sinking. The public authorities made the mistake of believing that innovation could be decreed from an office at Bercy.

    #Centre_de_données #Cloud_computing #Cloudwatt #Dassault_Systèmes #Data_center #France #Numergy #OVH #Orange_(entreprise) #Économie_numérique