(LinkedIn: https://www.linkedin.com/pulse/from-operational-research-agi-ai-electric-utility-industry-marcoux-tlbge/)
Artificial intelligence in utilities has been evolving through complementary techniques such as operational research (OR), machine learning (ML), and, more recently, deep learning (DL), which is the foundation for today’s generative AI (GenAI) systems, like ChatGPT. While discussions about artificial general intelligence (AGI) remain largely speculative, the tools already available are transforming how utilities plan and operate.
In the 1980s, when OR was already well established but ML was still in its infancy, I coded systems that would qualify as both by today's definitions. That early experience does not make me an AI expert, especially today, but it gives me perspective on how technological waves rise, fade, and return in new forms. For utilities, too, the story of AI is less about sudden disruption and more about the gradual evolution of familiar analytical methods.
Operational research has underpinned power systems for generations. Think of time-current (TCC) curves for substation breakers, under-frequency load shedding (UFLS), optimal power flow, hydro scheduling, outage coordination, logistics, and resource allocation. OR remains the bedrock of transmission operations and planning.
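To make the OR lineage concrete, here is a minimal sketch of a toy economic dispatch formulated as a linear program, the same shape of problem that sits inside optimal power flow and hydro scheduling. The generator costs and limits are invented for illustration.

```python
# Toy economic dispatch as a linear program: the kind of OR formulation
# that has long underpinned utility operations. Generator costs and
# limits below are illustrative, not real data.
from scipy.optimize import linprog

costs = [20.0, 35.0, 50.0]        # $/MWh for three hypothetical units
p_min = [50.0, 30.0, 0.0]         # MW minimum output per unit
p_max = [200.0, 150.0, 100.0]     # MW maximum output per unit
demand = 300.0                    # MW system load to be served

# Minimize total cost subject to: sum of outputs == demand,
# and each unit staying within its operating limits.
res = linprog(
    c=costs,
    A_eq=[[1.0, 1.0, 1.0]],
    b_eq=[demand],
    bounds=list(zip(p_min, p_max)),
    method="highs",
)

print("dispatch (MW):", [round(p, 1) for p in res.x])
print("total cost ($/h):", round(res.fun, 2))
```

The solver loads the cheapest unit first, exactly the intuition dispatchers have always applied; the value of OR is doing this rigorously at scale, with network constraints added.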
Machine learning was built on that foundation in the 1990s and 2000s. Utilities began using ML for forecasting: load, renewables, and even prices in deregulated markets. Fault detection, predictive maintenance of transformers and cables, and anomaly detection in both physical and cyber domains are all part of this wave.
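For a flavour of this wave, here is a minimal load-forecasting sketch: a gradient-boosted model trained on calendar features, with a synthetic sinusoidal load standing in for real meter or SCADA history.

```python
# Minimal short-term load forecasting sketch. The "load" series below is
# synthetic (daily cycle + weekend effect + noise), standing in for real data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)
hours = np.arange(24 * 365)                 # one year of hourly steps
hour_of_day = hours % 24
day_of_week = (hours // 24) % 7

# Synthetic load in MW: daily cycle, weekday/weekend effect, noise.
load = (
    1000
    + 300 * np.sin(2 * np.pi * (hour_of_day - 6) / 24)
    - 100 * (day_of_week >= 5)
    + rng.normal(0, 25, size=hours.size)
)

X = np.column_stack([hour_of_day, day_of_week])
model = GradientBoostingRegressor().fit(X[:-168], load[:-168])  # hold out last week

pred = model.predict(X[-168:])
mape = np.mean(np.abs((load[-168:] - pred) / load[-168:])) * 100
print(f"last-week MAPE: {mape:.1f}%")
```

Real forecasters add weather, holidays, and lagged load, but the structure is the same: learn patterns from history, validate on held-out data.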
Deep learning is now emerging and blending into GenAI applications. Drones with computer vision help with vegetation management and line inspections, while natural language processing supports call centres and outage reports. DL and GenAI are also increasingly used in data engineering tasks such as ETL (extract, transform, load) and data validation during IT or OT system upgrades. Large operation manuals running thousands of pages can be loaded into LLMs so that field staff can query them in plain language, reducing the need for extensive retraining. Major technology firms are releasing toolkits designed to assist with these tasks, such as Google’s AI agents for data teams. These tools can accelerate integration but still depend on strong data governance and domain expertise.
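To illustrate the manual-querying idea, here is a sketch of the retrieval step such tools typically rely on: the manual is split into chunks, indexed, and searched for the passages most relevant to a technician's question. In a real deployment the retrieved text would be handed to an LLM to compose the answer; the manual snippets here are invented.

```python
# Retrieval step behind "ask the manual" tools: index manual chunks and
# fetch the passage most relevant to a field question. Snippets are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

manual_chunks = [
    "Section 4.2: Breaker lockout procedure. Open the disconnect, apply tags...",
    "Section 7.1: Transformer oil sampling. Use a clean container and record...",
    "Section 9.3: Recloser settings shall follow the protection coordination study...",
]

question = "How do I take an oil sample from a transformer?"

vectorizer = TfidfVectorizer()
chunk_vectors = vectorizer.fit_transform(manual_chunks)
scores = cosine_similarity(vectorizer.transform([question]), chunk_vectors)[0]

best = scores.argmax()
print(f"most relevant passage (score {scores[best]:.2f}):")
print(manual_chunks[best])
```

Production systems swap the TF-IDF index for semantic embeddings, but the governance lesson is identical: the answer is only as good as the manual chunks being retrieved.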
AGI is the theoretical next step. But utilities should not chase AGI for its own sake. The practical gains come from systems tailored to grid management, customer service, and asset health: concrete problems that demand explainable, regulator-ready solutions.
Many utilities have been reluctant to adopt even proven OR or ML applications. Preventive maintenance, for example, has often been skipped in favour of run-to-failure strategies. The hesitation is not purely technical: weak data governance, inadequate telecommunications, and cultural or regulatory conservatism have all contributed to slow uptake. Paradoxically, these same approaches are sometimes embraced more readily once rebranded under the banner of AI. Yet the true test for utilities is not technological; it is organizational, requiring management, governance, and training that match the pace of innovation.
There are also risks of errors, especially with GenAI and LLMs. These systems, built on deep learning, can generate text, code, or answers to queries, making them powerful tools for simplifying information access. I use GenAI systems myself for writing: they can speed up the first 80% of the work, but they still make mistakes. That is acceptable for drafting texts reviewed by a knowledgeable human, but not for mission-critical systems where safety and reliability are paramount.
Data Challenges
A real constraint for utilities is not algorithms but data quality and management. Transmission networks are relatively well-instrumented, but distribution networks are messy: millions of dispersed assets, fragmented records, and legacy systems. As distributed energy resources (DERs) and smart meters multiply, the volume of data explodes. Governance, interoperability, and cybersecurity must come first. Poor data quality also limits AI training and model validation, making good data management a prerequisite for credible results. DL and GenAI tools can assist with data cleaning and integration, but only if strong frameworks are in place.
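As a small illustration of what "governance first" means in practice, here is a sketch of the kind of data-quality gate that should precede any model training. The column names and thresholds are assumptions for illustration.

```python
# Simple data-quality gate of the kind that must precede AI training:
# flag meter readings that are missing, negative, or implausibly large.
# Column names and thresholds are illustrative assumptions.
import pandas as pd

readings = pd.DataFrame({
    "meter_id": ["M1", "M2", "M3", "M4"],
    "kwh": [12.4, None, -3.0, 98500.0],
})

MAX_PLAUSIBLE_KWH = 10_000  # assumed sanity limit for one interval

readings["issue"] = None
readings.loc[readings["kwh"].isna(), "issue"] = "missing"
readings.loc[readings["kwh"] < 0, "issue"] = "negative"
readings.loc[readings["kwh"] > MAX_PLAUSIBLE_KWH, "issue"] = "out_of_range"

print(readings[readings["issue"].notna()])
```

Trivial as it looks, rules like these are where most distribution-data projects actually begin; models trained on unflagged records simply inherit the errors.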
Workforce and Implementation Challenges
Utilities are reliability-driven and risk-averse, understandably so. But that culture makes adopting digital tools difficult. AI projects require hybrid skills: expertise in power systems and in AI models. Workforce retraining and AI literacy are essential. Change management is often the biggest barrier. Another obstacle is the limited willingness to experiment. Unlike technology firms that dedicate teams to test and iterate on new tools, most utilities hesitate to invest in projects whose outcomes are uncertain, even when the learning value is high. Some tasks can indeed be simplified without much training (LLMs answering queries from massive technical manuals are a good example), but most require deeper integration into existing practices.
Regulators will also demand transparency. If AI is used for outage prioritization or rate optimization, the decisions must be explainable. Some approaches, such as OR and many ML models, are relatively transparent in how results are derived, while deep learning and LLMs often function as “black boxes”, making it harder to justify their outputs. Explainability matters not just for model quality but also for public trust, regulatory compliance, and the ability to audit decisions that affect reliability and customer outcomes. Blind faith in vendor promises of “AI magic” will not withstand regulatory or operational scrutiny. In fact, utilities often place more trust in the real-world experiences of peer utilities than in vendor marketing. Vendors that succeed will be those who act as patient knowledge bridges, bringing lessons and insights from other utilities worldwide on how AI has been deployed effectively, not just selling technology.
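To show what "relatively transparent" means in concrete terms, here is a deliberately simple, hypothetical outage-prioritization score whose every contribution can be read line by line, something a deep model cannot offer as directly. The features and weights are invented.

```python
# Why transparent models are easier to defend: a linear priority score
# can be decomposed feature by feature for a regulator or an auditor.
# Feature names and weights are invented for illustration.
features = {
    "customers_affected":  1200,   # count
    "critical_load":       1,      # hospital or water plant on circuit (0/1)
    "estimated_hours_out": 4.5,
}
weights = {
    "customers_affected":  0.001,
    "critical_load":       2.0,
    "estimated_hours_out": 0.3,
}

contributions = {k: weights[k] * features[k] for k in features}
score = sum(contributions.values())

print(f"priority score: {score:.2f}")
for name, c in contributions.items():
    print(f"  {name:<20} contributes {c:+.2f}")  # auditable, line by line
```

A deep network might rank outages more accurately, but it cannot produce this kind of itemized justification without additional explainability tooling, and that trade-off is exactly what regulators will probe.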
New Electricity Demand
AI is not just a solution; it also drives new electricity demand. Large AI data centres now consume electricity at levels comparable to an aluminum smelter. This rising load adds pressure on already constrained grids. There are also concerns about where the data is stored: if critical data is kept in foreign countries, or could be subject to disclosure demands from foreign governments, questions of sovereignty, security, and regulatory compliance arise.
The Way Forward
The next stage for utilities is not about chasing AGI. It is about becoming AI-ready organizations by focusing on:
- Strengthening data governance and interoperability
- Managing IT/OT projects with discipline
- Building workforce skills in AI and data literacy
- Prioritizing explainability and regulator-ready solutions
AI will not replace the fundamentals of electricity, such as safety, reliability, and efficiency, but it can enhance them. The challenge is less about algorithms than about how utilities manage their data, infrastructure, and people. The utilities that master data and governance today will be the ones shaping how AI transforms the grid tomorrow.