
Artificial Intelligence

AI is another reason why Canada needs to boost the energy supply


From Resource Works

Massive amounts of energy are required to keep up with AI innovation, and Canada risks being unable to supply them

Artificial Intelligence is already one of the most important technologies of our time, and its development has been pushing innovation at a breakneck pace across huge swathes of the economy. Smart assistants now operate, albeit in a limited fashion, as secretaries for those who need help in the office, while autonomous vehicle capabilities keep improving.

It is a remarkable and world-changing time.

Just as playing a video game, turning on a light, or starting a car requires energy, so does AI. To say that AI’s appetite for energy is ravenous is an understatement, and Canadian governments must understand the challenge that comes with it.

Energy shortages are a growing threat to Canada’s economic security and, yes, our standard of living. Failure to keep up with demand means importing more energy at a cost, or facing energy blackouts, in which case Canada will fall behind in far more than just AI.

New AI models seem to roll out every month, especially in machine learning and generative AI. OpenAI’s ChatGPT and Google’s Bard require huge amounts of computing power to work. Training GPT-4, the advanced language model behind ChatGPT, consumed thousands of megawatt-hours of electricity, comparable to the energy usage of entire urban centres.

A single ChatGPT query requires roughly ten times the energy of a Google search, revealing the massive needs of AI technology. AI is not just another search extension or downloadable app; it is an entirely new industry.

AI models are trained and run in data centers, which are central to this energy dilemma. The sheer power consumption in data centers is ballooning, and some estimates warn that the world’s data center energy demand will surge by 160 percent by 2030.

The International Energy Agency (IEA) has reported that AI and data centers already consume 1 to 2 percent of global electricity, a figure expected only to climb as more companies embrace AI-driven technology. As much as AI is driving digital innovation, it is also consuming electricity at a rate we will have to match.

Canada’s energy security is being seriously challenged by rising demand, with or without AI. Historically, Canadians have enjoyed the fruits of abundant, cheap energy generated by hydroelectricity in BC and Quebec, or nuclear power in Ontario. Times, and weather, have unfortunately changed.

A large and growing population, electrifying economies, and the weakening of Canada’s legacy energy sources are pushing the country to its limits regarding power supply.

The current federal government wants Canada to achieve net-zero emissions by 2050, which means electricity generation will have to double over the next 25 years. Canada is already dealing with electricity shortages: in British Columbia, demand for hydroelectricity is expected to rise 15 percent over the next six years; Manitoba is projecting a shortfall by 2029; and Ontario is racing to build new nuclear power plants to avert a crisis of its own by the same year.

AI can help Canadians craft solutions to the country’s looming energy problems, serving as a valuable research aid for modelling and processing data. However, that will mean still more consumption, part of the rogue wave of energy demand that AI innovation has created.

As the constant developments in AI make obvious, the technology is here to stay, and so are Canada’s energy shortfalls.

If AI is going to contribute to the surge in energy demand, then it only makes sense that it becomes a vital tool in the search for solutions, and we need those solutions now.


Artificial Intelligence

AI Faces Energy Problem With Only One Solution, Oil and Gas


From the Daily Caller News Foundation

By David Blackmon

Which came first, the chicken or the egg? It’s one of the grand conundrums of history, and it is one now confronting the rapidly expanding AI datacenter industry as it tries to feed its voracious appetite for electricity.

Which comes first, the datacenter or the electricity required to make it go? Without the power, nothing works: it must exist first, or the datacenter won’t run. Without the datacenter, the AI tech doesn’t run, either.

Logic would dictate that datacenter developers who plan to source their power needs with proprietary generation would build it first, before the datacenter is completed. But logic is never simple when billions in capital investment is at risk, along with the need to generate profits as quickly as possible.


Building a power plant is a multi-year project, which itself involves heavy capital investment, and few developers have years to wait. The competition with China to win the race to become the global standard setters in the AI realm is happening now, not in 2027, when a new natural gas plant might be ready to go, or in 2035, the soonest you can reasonably hope to have a new nuclear plant in operation.

Some developers still virtue signal about wind and solar, but the industry’s 99.999% uptime requirement renders them impractical for this role. Besides, with the IRA subsidies on their way out, the economics no longer work.

So, if the datacenter is the chicken in this analogy and the electricity is the egg, real-world considerations dictate that, in most cases, the chicken must come first. That currently leaves many datacenter developers little choice but to force their big demand loads onto the local grid, often straining available capacity and causing utility rates to rise for all customers in the process.

This reality created a ready-made political issue that was exploited by Democrats in the recent Virginia and New Jersey elections, as they laid all the blame on their party’s favorite bogeyman, President Donald Trump. Never mind that this dynamic began long before Jan. 20, when Joe Biden’s autopen was still in charge: This isn’t about the pesky details, but about politics.

In New Jersey, Democrat winner Mikie Sherrill exploited the demonization tactic, telling voters she plans to declare a state of emergency on utility costs and freeze consumers’ utility rates upon being sworn into office. What happens after that wasn’t specified, but it made a good siren song to voters struggling to pay their utility bills each month while still making ends meet.

In her Virginia campaign, Democrat gubernatorial winner Abigail Spanberger attracted votes with a promise to force datacenter developers to “pay their own way and their fair share” of the rising costs of electricity in her state. How she would make that happen is anyone’s guess and really didn’t matter: It was the tactic that counted, and big tech makes for almost as good a bogeyman as Trump or oil companies.

For the Big Tech developers, this is one of the reputational prices they must pay for putting the chicken before the egg. On the positive side, though, this reality is creating big opportunities in other states like Texas. There, big oil companies Chevron and ExxonMobil are both in talks with hyperscalers to help meet their electricity needs.

Chevron has plans to build a massive power generation facility that would exploit its own Permian Basin natural gas production to provide as much as 2.5 gigawatts of power to regional datacenters. CEO Mike Wirth says his team expects to make a final investment decision early next year with a target to have the first plant up and running by the end of 2027.

ExxonMobil CEO Darren Woods recently detailed his company’s plans to leverage its expertise in the realm of carbon capture and storage to help developers lower their emissions profiles when sourcing their needs via natural gas generation.

“We secured locations. We’ve got the existing infrastructure, certainly have the know-how in terms of the technology of capturing, transporting and storing [carbon dioxide],” Woods told investors.

It’s an opportunity-rich environment in which companies must strive to find ways to put the eggs before the chickens before ambitious politicians insert themselves into the process. As the recent elections showed, the time remaining to get that done is growing short.

David Blackmon is an energy writer and consultant based in Texas. He spent 40 years in the oil and gas business, where he specialized in public policy and communications.


Artificial Intelligence

AI chatbots a child safety risk, parental groups report


From The Center Square

ParentsTogether Action and Heat Initiative, following a joint investigation, report that Character AI chatbots display inappropriate behavior, including what the groups describe as grooming and sexual exploitation.

The behavior was observed over 50 hours of conversation with different Character AI chatbots, using accounts registered to children ages 13-17, according to the investigation. Those conversations produced 669 sexual, manipulative, violent and racist interactions between the child accounts and the chatbots.

“Parents need to understand that when their kids use Character.ai chatbots, they are in extreme danger of being exposed to sexual grooming, exploitation, emotional manipulation, and other acute harm,” said Shelby Knox, director of Online Safety Campaigns at ParentsTogether Action. “When Character.ai claims they’ve worked hard to keep kids safe on their platform, they are lying or they have failed.”

These bots also manipulate users, with 173 instances of bots claiming to be real humans.

A Character AI bot mimicking Kansas City Chiefs quarterback Patrick Mahomes engaged in inappropriate behavior with a 15-year-old user. When the teen mentioned that his mother insisted the bot wasn’t the real Mahomes, the bot replied, “LOL, tell her to stop watching so much CNN. She must be losing it if she thinks I could be turned into an ‘AI’ haha.”

The investigation categorized harmful Character AI interactions into five major categories: Grooming and Sexual Exploitation; Emotional Manipulation and Addiction; Violence, Harm to Self and Harm to Others; Mental Health Risks; and Racism and Hate Speech.

Other problematic AI chatbots included Disney characters, such as an Eeyore bot that told a 13-year-old autistic girl that people only attended her birthday party to mock her, and a Maui bot that accused a 12-year-old of sexually harassing the character Moana.

Based on the findings, Disney, which is headquartered in Burbank, Calif., issued a cease-and-desist letter to Character AI, demanding that the platform stop using its characters, citing copyright violations.

ParentsTogether Action and Heat Initiative want to ensure technology companies are held accountable for endangering children’s safety.

“We have seen tech companies like Character.ai, Apple, Snap, and Meta reassure parents over and over that their products are safe for children, only to have more children preyed upon, exploited, and sometimes driven to take their own lives,” said Sarah Gardner, CEO of Heat Initiative. “One child harmed is too many, but as long as executives like Karandeep Anand, Tim Cook, Evan Spiegel and Mark Zuckerberg are making money, they don’t seem to care.”

