
Artificial Intelligence

The Biggest Energy Miscalculation of 2024 by Global Leaders – Artificial Intelligence


From EnergyNow.ca

By Maureen McCall

It’s generally accepted that the field of Artificial Intelligence (AI) was launched at Dartmouth College in a 1956 workshop that brought together leading thinkers in computer science and information theory to map out future paths of investigation. Workshop participants John McCarthy, Marvin Minsky, Nathaniel Rochester and Claude E. Shannon coined the term “artificial intelligence” in the proposal they wrote for that conference. The workshop established AI as a field of study, and John McCarthy is generally considered the father of AI.

AI developed through the 1960s, but in the 1970s-1980s, a period generally referred to as “the AI Winter,” progress stalled amid a focus on the limitations of neural networks. In the late 1980s, advancement resumed with the emergence of connectionism and neural networks. The 1990s-2000s are considered the beginning of the AI/machine learning renaissance. In the 2010s, further growth was spurred by the expansion of Big Data, deep learning, computing power and large-scale data sets. In 2022 an AI venture capital frenzy took off (the “AI frenzy”), and AI plunged into the mainstream in 2023, according to Forbes, which was already tracking applications of AI across various industries.

By early 2024, the implementation of AI across industries was well underway in healthcare, finance, creative fields and business. In the energy industry, digitalization conferences were addressing digital transformation in the North American oil & gas sector, with speakers and attendees from E&P majors, midstream, pipeline and LNG companies and more, alongside multiple AI application providers. The companies speaking and attending already had AI implementations well underway.

So how did global leaders not perceive the sudden and rapid rise of AI and the power commitments it requires?

How has the 2022 “AI frenzy” of investment and the subsequent industrial adoption been off the radar of global policymakers until just recently? Venture capital is widely recognized as a driver of innovation and new company formation, and leaders should have foreseen the surge of AI improvement and implementation by “following the money,” so to speak. Perhaps the incessant focus on “blaming and shaming” industry for climate change blinded leaders to the rapid escalation of AI development that was signaled by the 2022 AI frenzy.

As just one example of this lack of foresight, Canada’s grossly delayed 2024 Fall Economic Statement contained a last-minute insertion of “up to $15 billion in aggregate loan and equity investments for AI data center projects.” This policy afterthought comes two years after the onset of the AI frenzy and more than 12 months after the industrial adoption of AI. In addition, the Trudeau/Guilbeault partnership is still miscalculating the enormous power requirements of AI.

As an example of the size of AI’s power requirements, one can look at the Wonder Valley project, the world’s largest AI data center industrial park, in the Greenview Industrial Gateway near Grande Prairie, Alberta. It is planned to “generate and offer 7.5 GW of low-cost power to hyperscalers over the next 5-10 years.” The cost of just this one project is well beyond the funding offered in the 2024 Fall Economic Statement.

“We will engineer and build a redundant power solution that meets the modern AI compute reliability standard,” said Kevin O’Leary, Chairman of O’Leary Ventures. “The first phase of 1.4 GW will be approximately US$ 2 billion with subsequent annual rollout of redundant power in 1 GW increments. The total investment over the lifetime of the project will be over $70 billion.”
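
As a back-of-envelope check on those figures, the short Python sketch below works out what the stated numbers imply: how many further 1 GW increments are needed after the 1.4 GW first phase to reach the planned 7.5 GW, and the rough capital cost per gigawatt of phase one. It uses only the figures quoted above; the cost of the later increments is not given in the article, so no lifetime total is derived here.

```python
# Back-of-envelope arithmetic on the Wonder Valley figures quoted above.
# Only the numbers stated in the article are used; the cost of the later
# 1 GW increments is not given, so no lifetime total is computed here.

target_gw = 7.5           # planned power offered to hyperscalers
phase1_gw = 1.4           # first-phase capacity
phase1_cost_busd = 2.0    # ~US$2 billion for the first phase

remaining_gw = target_gw - phase1_gw
increments_1gw = round(remaining_gw)          # subsequent annual ~1 GW rollouts
cost_per_gw = phase1_cost_busd / phase1_gw    # phase-one cost intensity

print(f"Capacity remaining after phase one: {remaining_gw:.1f} GW "
      f"(~{increments_1gw} further 1 GW increments)")
print(f"Phase-one cost intensity: ~US${cost_per_gw:.2f}B per GW")
```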

To further explore the huge power requirements of AI, one can compare individual AI queries with traditional non-AI searches. As reported by Bloomberg, “Researchers have estimated that a single ChatGPT query requires almost 10 times as much electricity to process as a traditional Google search.” Multiply this electricity demand by the millions of industrial users as industrial AI implementation continues to expand worldwide. As the same Bloomberg article notes, “By 2034, annual global energy consumption by data centers is expected to top 1,580 terawatt-hours—about as much as is used by all of India—from about 500 today.”
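
A minimal sketch of that arithmetic, assuming roughly 0.3 Wh for a traditional search (a commonly cited figure that does not appear in the article itself): it applies the ~10x multiplier quoted above and derives the annual growth rate implied by moving from about 500 TWh today to 1,580 TWh by 2034.

```python
# Rough math on the Bloomberg figures quoted above. The 0.3 Wh per
# traditional search is an assumed illustrative value; the article only
# gives the ~10x ratio and the terawatt-hour projections.

search_wh = 0.3                 # assumed energy per traditional search (Wh)
ai_query_wh = search_wh * 10    # ~10x, per the researchers cited by Bloomberg

today_twh, future_twh, years = 500, 1_580, 10   # roughly 2024 -> 2034
implied_growth = (future_twh / today_twh) ** (1 / years) - 1

print(f"Assumed AI query energy: ~{ai_query_wh:.0f} Wh vs {search_wh} Wh per search")
print(f"Implied annual growth in data center demand: ~{implied_growth:.1%}")
```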

This is the exponential demand for electricity that North American and global leaders did not see coming: a 24/7 demand that cannot be satisfied by unreliable and costly green energy projects and that requires an “all energies” approach. Exponential AI demand threatens to gobble up supply and dramatically increase electricity prices for consumers. Likewise, leadership does not perceive that North American grids are vulnerable and outdated, and would be unable to deliver reliable supply for AI data centers that cannot tolerate even a few seconds of power outage. Grid interconnections are unreliable, as noted in the following excerpt from a September 2024 article on cleanenergygrid.org.

“Our grid, for all of its faults, is now a single interconnected “machine” over a few very large regions of the country. Equipment failures in Arizona can shut the lights out in California, just as overloaded lines in Ohio blacked out 55 million people in eight states from Michigan to Boston – and the Canadian province of Ontario – in 2003.”

AI’s power demands are motivating tech companies to develop more efficient ways of building AI. Along with pressure to keep fossil fuels in the mix, billions are being invested in alternative energy solutions such as nuclear power produced by small modular reactors (SMRs).

Despite SMR optimism, the reality is that no European or North American SMRs are in operation yet. Only Russia and China have SMRs in operation, and most data centers are focusing on affordable natural gas power as the reality sets in that nuclear energy cannot scale quickly enough to meet urgent electricity needs. New SMR plants could be built and operational possibly by 2034, but for 2025 Canada’s power grid is already strained, with electricity demand expected to grow significantly, driven by electric vehicles and data centers for AI applications.

AI has a huge appetite for other resources as well. For example, the most energy- and cost-efficient ways to chill the air in data centers rely on huge quantities of potable water, and the exponential amount of data AI produces will require dramatic expansion of internet networks as well as increased demand for computer chips and the metals they require. There is also an intense talent shortage, creating recruitment competition for the pool of individuals trained by companies like Alphabet, Microsoft and OpenAI.

AI development is now challenging the public focus on climate change. In Canada as well as in the U.S. and globally, left-leaning elected officials who focused keenly on policies to advance the elimination of fossil fuels were oblivious to the tsunami of AI energy demand about to swamp their boats. Canadian Member of Parliament Greg McLean, who has served on the House of Commons Standing Committees on Environment, Natural Resources and Finance, and as the Natural Resources critic for His Majesty’s Loyal Opposition, has insight into the reason for the change in focus.

“Education about the role of all forms of energy in technology development and use has led to the logical erosion of the ‘rapid energy transition’ mantra and a practical questioning of the intents of some of its acolytes. The virtuous circle of technological development demanding more energy, and then delivering solutions for society that require less energy for defined tasks, could not be accomplished without the most critical input – more energy. This has been a five-year journey, swimming against the current — and sometimes people need to see the harm we are doing in order to objectively ask themselves ‘What are we accomplishing?’ … ‘What choices are being made, and why?’…. and ‘Am I getting the full picture presentation or just the part someone wants me to focus on?’”

With the election of Donald Trump, the “Trump Transition” now competes with the “Energy Transition” focus, changing the narrative in the U.S. to energy dominance. For example, as reported by Reuters, the U.S. solar industry is now downplaying climate change messaging.

“The U.S. solar industry unveiled its lobbying strategy for the incoming Trump administration, promoting itself as a domestic jobs engine that can help meet soaring power demand, without referencing its role in combating climate change.”

It’s important to note here that the future of AI is increasingly subject to societal considerations as well as technological advancements. Political, ethical, legal, and social frameworks will increasingly impact AI’s development, enabling or limiting its implementations. Since AI applications involve “human teaming” to curate and train AI tools, perceptions of the intent of AI implementations are key. In the rush to implementation, employees at many companies are experiencing changing roles with increased demand for workers to train AI tools and curate results. Will tech optimism be blunted by the weight of extra tasks placed on workers and by suspicions that those workers may ultimately be replaced? Will resistance develop as humans and AI are required to work together more closely?

Business analyst Professor Henrik von Scheel of the Arthur Lok Jack Global School of Business describes the importance of the human factor in AI adoption.

“It’s people who have to manage the evolving environment through these new tools,” von Scheel explains. “It’s been this way ever since the first caveperson shaped a flint, only now the tools are emerging from the fusion of the digital, physical and virtual worlds into cyber-physical systems.”

A conversation with a recent graduate who questioned the implementation of AI, including the design of guardrails and regulations by members of an older generation in management, made me wonder: is a generational conflict brewing? There may be a lack of trust between the large proportion of baby boomers in the workforce, predominantly in management, and younger workers who may not have confidence in the ability of mature management to fully understand and embrace AI technology and to make informed decisions about regulating it.

It’s something to watch in 2025.

Maureen McCall is an energy professional who writes on issues affecting the energy industry.

Artificial Intelligence

UK Police Chief Hails Facial Recognition, Outlines Drone and AI Policing Plans

Any face in the crowd can be caught in the dragnet of a digital police state.

The steady spread of facial recognition technology onto Britain’s streets is drawing alarm from those who see it as a step toward mass surveillance, even as police leaders celebrate it as a powerful new weapon against crime.
Live Facial Recognition (LFR) is a system that scans people’s faces in public spaces and compares them against watchlists.
Civil liberties groups warn it normalizes biometric monitoring of ordinary citizens, while the Metropolitan Police insist it is already producing results.
Britain’s senior police leadership is promoting these biometric and artificial intelligence systems as central to the future of policing, with commissioner Sir Mark Rowley arguing that such tools are already transforming the way the Met operates.
Speaking to the TechUK trade association, Rowley described LFR as a “game-changing tool” and pointed to more than 700 arrests linked to its use so far this year.
Camera vans stationed on streets have been deployed to flag people wanted for serious crimes or those breaking license conditions.
Rowley highlighted a recent deployment at the Notting Hill Carnival, where he joined officers using LFR.
“Every officer I spoke to was energized by the potential,” he said to The Sun. According to the commissioner, the weekend brought 61 arrests, including individuals sought in cases of serious violence and offenses against women and girls.
Rowley claimed that the technology played “a critical role” in making the carnival safer.
Beyond facial recognition, Rowley spoke of expanding the Met’s reliance on drones. “From searching for missing people, to arriving quickly at serious traffic incidents, or replacing the expensive and noisy helicopter at large public events,” he said, “done well, drones will be another tool to help officers make faster, more informed decisions on the ground.”
The commissioner also promoted the V100 program, which draws on data analysis to focus resources on those considered the highest risk to women.
He said this initiative has already led to the conviction of more than 160 offenders he described as “the most prolific and predatory” in London.
Artificial Intelligence is being tested in other areas too, particularly to review CCTV footage.
Rowley noted the labour involved in manually tracing suspects through crowded areas. “Take Oxford Street, with 27 junctions—a trawl to identify a suspect’s route can take two days,” he explained.
“Now imagine telling AI to find clips of a male wearing a red baseball cap between X and Y hours, and getting results in hours. That’s game-changing.”
While the Met portrays these systems as advances in crime prevention, their deployment raises questions about surveillance creeping deeper into everyday life.
Expansions in facial recognition, drone monitoring, and algorithmic analysis are often introduced as matters of efficiency and safety, but they risk building an infrastructure of constant observation where privacy rights are gradually eroded.
Shaun Thompson’s case has already been cited by campaigners as evidence of the risks that come with rolling out facial recognition on public streets.
He was mistakenly identified by the technology, stopped, and treated as though he were a wanted suspect before the error was realized.
Incidents like this highlight the danger of false matches and the lack of safeguards around biometric surveillance.
For ordinary people, the impact is clear: even if you have done nothing wrong, you can still find yourself pulled into a system that treats you as guilty first and asks questions later.

Artificial Intelligence

What are data centers and why do they matter?

From The Center Square

Data centers may not be visible to most Americans, but they are shaping everything from electricity use to how communities grow.

These facilities house the servers that process nearly all digital activity, from online shopping and streaming to banking and health care. As the backbone of artificial intelligence and cloud computing, they have expanded at a pace few other industries can match.

Research from Synergy Research Group shows the number of hyperscale data centers worldwide doubled in just five years, reaching 1,136 by the end of 2024. The U.S. now accounts for 54% of that total capacity, more than China and Europe combined. Northern Virginia and the Beijing metro area together make up about 20% of the global market.

John Dinsdale, chief analyst with Synergy Research, said in an email to The Center Square that a simple way to describe data centers is to think of them as part of a food chain.

“At the bottom of the food chain, you’re sitting at your desk with a desktop PC or laptop. All the computing power is on your device,” Dinsdale said.

The next step up is a small office server room, which provides shared storage and applications for employees.

“Next up the chain, you can go two different directions (or use a mix),” he explained.

One option is a colocation data center, where companies lease space instead of running their own physical facilities. That model can support a multitude of customers from a single operator, such as Equinix.

The other option is to move to public cloud computing.

“You buy access to computing resources only when you need them, and you only pay for what you use,” Dinsdale said.

Providers like Amazon, Microsoft and Google run massive data centers that support tens of thousands of servers. From the customer perspective, it may feel like having a private system, but in reality, these servers are shared resources supporting many organizations.

Cloud providers now operate at a scale that was “unthinkable ten years ago” and are referred to in the industry as hyperscale, Dinsdale added. These global networks of data centers support millions of customers and users.

“The advent of AI is pushing those data centers to the next level — way more sophisticated technology, and data centers that need to become a lot more powerful,” he said.
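
To keep Dinsdale’s “food chain” straight, here is a toy Python model of the tiers he describes, from a personal device up to hyperscale cloud. The field names and example scales are illustrative only and are not taken from the article.

```python
# A toy model of the compute "food chain" described above:
# personal device -> office server room -> colocation -> public cloud / hyperscale.
# Field names and example scales are illustrative, not from the article.

from dataclasses import dataclass

@dataclass
class ComputeTier:
    name: str
    who_runs_the_hardware: str
    typical_scale: str

FOOD_CHAIN = [
    ComputeTier("Personal device", "the user", "one desktop PC or laptop"),
    ComputeTier("Office server room", "the employer", "a few shared servers and storage"),
    ComputeTier("Colocation data center", "a colocation operator such as Equinix",
                "leased space serving many tenant companies"),
    ComputeTier("Public cloud / hyperscale", "providers like Amazon, Microsoft and Google",
                "tens of thousands of servers, paid for only as used"),
]

for tier in FOOD_CHAIN:
    print(f"{tier.name}: run by {tier.who_runs_the_hardware}; scale: {tier.typical_scale}")
```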

What is a data center?

At its simplest, a data center is a secure building filled with rows of servers that store, process and move information across the internet. Almost every digital action passes through them.

“A data center is like a library of server computers that both stores and processes a lot of internet and cloud data we use every day,” Dr. Ali Mehrizi-Sani, director of the Power and Energy Center at Virginia Tech, told The Center Square. “Imagine having thousands of high-performance computers working nonstop doing heavy calculations with their fans on. That will need a lot of power.”

Some are small enough to serve a hospital or university. Others, known as hyperscale facilities, belong to companies such as Amazon, Microsoft, Google and Meta, with footprints large enough to be measured in megawatts of electricity use.

How big is the industry?

Synergy’s analysis shows how dominant the U.S. has become. Fourteen of the world’s top 20 hyperscale data center markets are in the U.S., including Northern Virginia, Dallas and Silicon Valley. Other global hotspots include Greater Beijing, Dublin and Singapore.

In 2024 alone, 137 new hyperscale centers came online, continuing a steady pace of growth. Average facility size is also climbing. Synergy forecasts that total capacity could double again in less than four years, with 130 to 140 new hyperscale centers added annually.
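
A quick sanity check on those Synergy numbers, sketched in Python below: a count that doubles over five years implies roughly 15% annual growth in the number of facilities, and adding 130-140 centers a year gives a rough facility count a few years out. (The forecast doubling refers to total capacity rather than facility count, so this is only an illustration.)

```python
# Quick arithmetic on the Synergy figures above. The forecast doubling
# refers to total capacity, not facility count, so this is illustrative only.

count_end_2024 = 1_136
count_five_years_prior = count_end_2024 / 2      # "doubled in just five years"
annual_growth = (count_end_2024 / count_five_years_prior) ** (1 / 5) - 1

added_per_year = 135                             # midpoint of the 130-140 forecast
rough_count_end_2028 = count_end_2024 + 4 * added_per_year

print(f"Implied annual growth in facility count over five years: ~{annual_growth:.1%}")
print(f"Rough facility count four years on at ~{added_per_year}/yr: ~{rough_count_end_2028}")
```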

The world’s largest operators are American technology giants. Amazon, Microsoft and Google together account for 59% of hyperscale capacity, followed by Meta, Apple, and companies such as Alibaba, Tencent and ByteDance.

How much power do they use?

Large data centers run by the top firms typically require 30 to 100 megawatts of power. To put that into perspective, one megawatt can power about 750 homes. That means a 50-70 megawatt facility consumes as much electricity as a small city.

“Building one data center is like adding an entirely new town to the grid,” Mehrizi-Sani said. “In fact, in Virginia, data centers already consume about 25% of the electricity in the state. In the United States, that number is about 3 to 4%.”
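
Applying the homes-per-megawatt conversion quoted above to the 30-100 megawatt range given for large facilities yields the following rough equivalents (a simple illustration, not taken from the article):

```python
# The ~750 homes per megawatt conversion quoted above, applied to the
# 30-100 MW range given for large facilities. Purely illustrative.

homes_per_mw = 750

for facility_mw in (30, 50, 70, 100):
    print(f"A {facility_mw} MW data center draws roughly as much power as "
          f"{facility_mw * homes_per_mw:,} homes")
```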

That demand requires extensive coordination with utilities.

“Data centers connect to the power grid much like other large loads, like factories and even towns do,” Mehrizi-Sani said. “Because they need so much electric power, utilities have to upgrade substations, lines and transformers to support them. Utilities also have to upgrade their control and protection equipment to accommodate the consumption of data centers.”

If not planned carefully, he added, new facilities can strain local power delivery and generation capacity. That is why every major project must undergo engineering reviews before connecting to the grid.

Why now?

The rapid rise of AI has supercharged an already fast-growing sector. Training models and running cloud services requires enormous computing power, which means facilities are being built faster and larger.

“AI and cloud drive the need for data centers,” Mehrizi-Sani said. “Training AI models and running cloud services require massive computing power, which means new data centers have to be built faster and larger than before.”

Dinsdale noted in a report that the industry’s scale has shifted sharply.

“The big difference now is the increased scale of growth. Historically the average size of new data centers was increasing gradually, but this trend has become supercharged in the last few quarters as companies build out AI-oriented infrastructure,” he said.

Why certain states lead the market

Different states and regions offer different advantages. According to a July 2025 report by Synergy Research Group, Virginia became the leading hub because of relatively low electricity costs when the industry was expanding, availability of land in the early years, and proximity to federal agencies and contractors.

Texas and California are also major markets, for reasons ranging from abundant energy to the presence of technology companies.

Internationally, Synergy’s analysis shows that China and Europe each account for about a third of the remaining capacity. Analysts expect growth to spread to other U.S. regions, including the South and Midwest, while markets in India, Australia, Spain and Saudi Arabia increase their share globally.

What is at stake?

For most Americans, data centers are invisible but indispensable. Almost everything digital depends on them.

“Streaming movies, online banking, virtual meetings and classes, weather forecasts, navigation apps, social media like Instagram, online storage and even some healthcare services” all run through data centers, Mehrizi-Sani said.

Synergy’s forecast suggests the trend is unlikely to slow.

“It is also very clear that the United States will continue to dwarf all other countries and regions as the main home for hyperscale infrastructure,” Dinsdale said.

This story is the first in a Center Square series examining how data centers are reshaping electricity demand, costs, tax incentives, the environment and national security.

