AIoT: Thoughts on Artificial Intelligence and the Internet of Things
It’s time for another convergence. IT/OT convergence is already happening, initiated by mutual cybersecurity interests and now pressure-fit by fuzzy boundaries with computing at the edge and in the cloud. Let’s talk about two other large technology trends forming a junction: Artificial Intelligence and the Internet of Things. We can even squeeze out a letter, and instead of AI-IoT we have AIoT. This becomes more interesting when thinking of AIoT at the edge, where the action is, though perhaps more distributed than ‘classic AI’ (a phrase I never thought I’d write).
There is more data from more instrumentation and omnipresent embedded sensors, and this begins the story of IoT, Big Data, and the connectedness of things. Not only might we learn more by collecting data, but communication improvements (just watch the 5G advertisements) are increasing data velocity as well as volume. Moreover, every object is now a ‘thing’, including the one you see when you look in a mirror. Think about Fitbit and the Apple Watch. For a contrasting view, a good summer read is Jaron Lanier’s “You Are Not a Gadget” from 2011. It was a national bestseller, and very prescient.
Of course, even in basic SCADA, we collect data faster than people can analyze it. If it is unstructured data, such as drone video, an hour’s flight time can take months to analyze. In my area of Smart Grid (which I think of as IoT for Electric Power 1.0), every classic Intelligent Electronic Device is becoming an IoT device, to the point where it is hard to buy a ‘dumb’ electro-mechanical relay.
The thought is sometimes lost that AI means Artificial Intelligence. As with artificial sweeteners, there are good and bad aspects. Making an artificial something, when we don’t completely understand the original workings of intelligence, is ambitious. We tend to start with aspects that represent human intelligence but need to scale. From the Smart Grid area:
- Inspection of images gathered by drones, to keep manual site visits down, is one case. IoT is a general topic; to me, a drone is IoT that flies, and a robot is IoT that senses and interacts with the physical world. Pattern recognition of visual defects (e.g. is there surface rust on transmission line towers, or inside gas pipelines?) may not need a mechanical engineering degree. One could have a cub scout troop walk down a transmission line corridor right of way with binoculars and do a credible survey of tower rust, geotagging photos that looked suspicious. The problem is that it doesn’t scale, so we apply AI, even though people can do it easily at small scale.
- Being so visually focused (which is why most of a TV’s cost is in the picture circuits, not the sound, and why connecting a small stereo system to your TV can add so much to viewing), we forget the field of acoustics. Did you hear that? What does a machine normally sound like? Just because we’re at the top of the food chain does not mean we’re at the top of the sensory chain. For instance, a patent was recently issued for acoustic monitoring of electric transmission lines.
A major utility was years behind in the manual analysis of images of transmission tower rust. Here AI might not replace people; it could help scale a task that humans do well, perhaps better, by having AI act as a pre-screening intern. However, back to those cub scouts for a minute: they probably didn’t call reddish-brown leaves, seen through the tower structural members, rust, even if the leaves were the correct color. When you give a ten-year-old that assignment, they know you meant rust on the metal of the tower, not rust color near the tower. So, another point: AI may be aided by pre-processing with more classic techniques, such as mapping the tower steel elements in the photo before the actual AI part begins.
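The pre-processing idea can be sketched in a few lines. This is a minimal illustration, not a production pipeline: the `steel_mask` input stands in for the output of a classic technique (edge or line detection mapped onto the photo), and the rust-color heuristic is an illustrative threshold only.

```python
# Sketch: mask the image to tower steel before the AI sees it, so
# rust-colored leaves behind the structure are never scored.

def rust_candidates(pixels, steel_mask):
    """Return coordinates of rust-colored pixels that lie on tower steel.

    pixels     -- 2-D grid of (r, g, b) tuples
    steel_mask -- 2-D grid of booleans, True where the tower structure is
    """
    def looks_rusty(r, g, b):
        # crude reddish-brown heuristic, for illustration only
        return r > 100 and r > 1.5 * g and r > 1.5 * b

    hits = []
    for y, row in enumerate(pixels):
        for x, (r, g, b) in enumerate(row):
            if steel_mask[y][x] and looks_rusty(r, g, b):
                hits.append((x, y))
    return hits
```

A rust-colored pixel off the mask (a leaf behind the tower) is never flagged, which is exactly the ten-year-old's intuition encoded as a pre-processing step.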
A similar example in the smart grid is vegetation management: where do distribution power lines and trees intersect, so problems can be detected and preventive tree-trimming optimized? Cameras in trees and on poles are not the answer, but IoT does not mean the sensor has to be on the ‘thing’. Here satellite and LIDAR images can tell us enough about trees via remote sensing. This is a $100M annual cost, even at some mid-size utilities. The amount of data is enormous; again, a scout troop could do it, but how fast? So, in an encore performance, AI is used to correlate satellite imagery (the original high-flying drone, perhaps) with GIS information on power lines, using an advanced geospatial data integration platform.
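The core geometric step of that correlation can be sketched simply: flag LIDAR-derived canopy points that fall within a clearance buffer of a GIS line segment. The data model here (2-D points, segments, a 5 m buffer) is illustrative, not any particular utility's or platform's.

```python
import math

def dist_point_to_segment(p, a, b):
    """Shortest distance from point p to segment a-b (2-D coordinates)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))           # clamp projection to the segment
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def encroachments(canopy_points, line_segments, clearance_m=5.0):
    """Return canopy points closer than clearance_m to any line segment."""
    return [p for p in canopy_points
            if any(dist_point_to_segment(p, a, b) <= clearance_m
                   for a, b in line_segments)]
```

The scale argument in the text is the point: this check is trivial per point, but repeated over millions of LIDAR returns against thousands of line miles it becomes an automation problem, not a binoculars problem.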
Of course, people operating industries in real-time don’t have the luxury of searching through all of this data, so AI comes in again, via natural language processing, speech to text, and text to speech, so it can listen to, and answer questions from, field technicians.
There is a so-called aging workforce problem. Of course, it is not a problem that people get older and more experienced. The problem is ‘tribal knowledge’, a polite way of saying it was no one’s job to organize the company’s learning over decades, a gap discovered only at someone’s retirement party. There was a time when computer science and library science were in the same department in universities – now the importance of that relationship is glaring. AI systems need context and defined taxonomies to derive the most value out of IoT streams.
In fact, there is now AI and robotic process automation (a term I’m ambivalent about because it makes me think of the Jetsons having a robot to do workflow) just to organize the data. Take, for example, DMS, EMS, and AMI – major systems at many utilities. While they are different technologies, they all produce or consume huge volumes of data, and may use different names in different systems for the same data point. This can make upgrades to systems such as a DMS absorb thousands of unplanned utility engineering hours. Is it not reasonable for a DMS vendor to expect that the information the utility needs to deploy its new system is ready, organized, and stored in some XML or systematized taxonomy, perhaps according to an industry standard, as a requirement the customer has to fulfill?
- Some modern utilities (no names are used in this article, to protect the industry leaders) believe all the data is interesting but little of it is important: a needle-in-a-haystack search for the few nuggets of value. Value, of course, is in the eye of the person funding the project – for transmission it may be synchrophasors; for distribution, smart meter data. Data storage is cheap. When contemplating its first analytics projects, the sponsor should first be sure all SCADA data is kept on-line, even if the SCADA system itself has limited disk storage, because the value of that data will increase.
- I like to think of AI as a smart intern, not a replacement. It can help filter through masses of data so the humans can focus on what may be important. I have heard this is done in radiology – computers determine whether the X-ray is OK or has an issue, and the issues go to the M.D.
- With the surge of renewable energy on the transmission system, communication must support data captured at least twice a cycle, over long distances, and this may be problematic, especially since these lines are sometimes in the middle of nowhere. All is well until something is out of alignment – transmission oscillation due to renewable energy fluctuations, perhaps. A potential case for some edge computing.
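The "smart intern" triage in the bullets above can be sketched as a simple pre-screen: pass routine readings through, and route only statistical outliers to a person, the way radiology pre-screening routes suspect X-rays to the M.D. The 3-sigma threshold is an illustrative choice, not a recommendation.

```python
import statistics

def triage(readings, threshold_sigmas=3.0):
    """Split readings into (routine, for_human_review) by deviation from the mean."""
    mean = statistics.fmean(readings)
    sigma = statistics.pstdev(readings)
    routine, review = [], []
    for r in readings:
        if sigma > 0 and abs(r - mean) > threshold_sigmas * sigma:
            review.append(r)      # the intern flags it; a human decides
        else:
            routine.append(r)
    return routine, review
```

The point is the division of labor, not the statistics: the filter never decides anything, it only shrinks the haystack a person has to search.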
Technology Changes, People Often Don’t
Just because AI is here, we need to ask:
- Does the problem need AI?
- Could it be improved by a machine? Much has already been automated for the rapid application of decisions to high volumes of data (95% of the time there is no panic, just observation – then a critical period demands the best decision, quickly).

AI does not solve everything; it needs to be matched to the application. If there is a closed-form analytical solution, do we need AI to rediscover it? Do not have AI ‘learn’ Fourier analysis of harmonics on the grid from inverters. Instead, give AI the harmonic analysis, the grid state, the weather, and the load forecast, and let it help with decision support for future operational contingencies.
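The closed-form point can be made concrete: the magnitude of the k-th harmonic of a sampled waveform comes straight from the Fourier coefficient, no learning required. A minimal stdlib sketch (the sample count and harmonic mix in the test are illustrative):

```python
import math

def harmonic_magnitude(samples, k):
    """Amplitude of the k-th harmonic over one cycle of n samples.

    A direct Fourier coefficient: this is the kind of result an AI
    system should be handed as an input, not asked to rediscover.
    """
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return 2.0 * math.hypot(re, im) / n
```

Given one cycle of a fundamental plus a 20% fifth harmonic (a plausible inverter artifact), this recovers the 1.0 and 0.2 amplitudes exactly; the AI's job starts after that, combining the harmonic picture with grid state, weather, and load forecast.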
Can we trust more to automation just because we can’t do it in time? For example, nuclear has always had more automation because a) things could go wrong faster and b) consequences may be more severe.
Most automation projects since the punched-card era start as an off-line helper, then get trusted by the people in control.
With AI, one must train it, both before it ships and after delivery. Here engineers are key to helping the system learn, and people purchasing AI should consider that there is a cost after delivery, not just buying a year’s worth of software support. AI doesn’t have tribal knowledge. It needs engineers to say, ‘you’ve got this wrong!’ Traditional software doesn’t get better, but it doesn’t get worse either, and needs no guidance, just perhaps human patience. AI is not magic out of a box; it is a middle school student who, with guidance, could become more than its teacher.
Now comes the importance of consistent and available data across large parts of the grid, but data has to be managed properly, and kept private and anonymized. People are good at filling in the blanks in the face of ambiguity. I’ve reviewed many presentations by US authors and explained to them what won’t carry over to speakers of another language or culture – by idiom, by abbreviation, or by typo. People are good, in their native language, at filling in the blanks or discerning new acronyms from context. For AI, this ‘background knowledge’ can be troublesome.
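On the "private and anonymized" point, one common pattern is keyed pseudonymization: the same device gets the same token across datasets, so the data stays correlatable for analytics, without exposing the real identifier. A stdlib sketch; the key below is a placeholder, and a real deployment would manage it like any other credential.

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me-in-production"   # hypothetical key, illustration only

def pseudonymize(device_id: str) -> str:
    """Stable, non-reversible token for a device identifier."""
    digest = hmac.new(SECRET_KEY, device_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]        # short, stable token
```

Because the mapping is keyed, an outsider with the datasets alone cannot enumerate meter IDs and look their tokens up, which is the weakness of plain unkeyed hashing.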
AI can be an advisor – for decision support, with suggestions on how to get operations back on track ASAP. AI can do routine observation: given what is seen from IoT, learn from operator actions and sift out best practices – which may first go into a simulated training environment. No one wants a pompous system making suggestions that haven’t been tried off-line.
AIoT at the edge can effectively mean pushing decisions to the edge. As we consider whether to push decisions and actionability further out, the challenge is also the control of data and its movement; privacy at the edge; and the utility becoming disintermediated (humans always want to push the button). Consider our comfort level – do we trust it enough to let it run critical infrastructure? Will we forgo speed for control? Test and regression analysis in classic software becomes test-and-correct, helping AI differentiate the right answer from the wrong one. There is always the Hollywood aspect, though – the teacher has to consider whether an AI system is on to something, not just that it didn’t do what the human would have suggested.
People are the hard part of technological progress. One needs to think: even when I can trust the AI, how do I integrate it operationally? The grid edge devices are doing things, but I still have a team of people. Change is hard. Where things have failed in the past is change management; it is why perhaps 80% of projects miss their initial project plan dates.
Security – how do I build a network that is impervious? Redundancy and resiliency are especially important if the edge is off on its own, even if on a short leash. If it has a computer and a network connection, it has a cyber vulnerability that needs to be addressed. This is something I believe, but will an AI system alone at the edge be able to recognize a cyber problem?
AI for decision support – what caused the fault in the grid? There are plenty of FLISR (fault location, isolation, and service restoration) solutions. Can AI bring together more information beyond weather and SCADA, such as municipal repair activity or other utility construction? A fire in the area that the utility is not aware of? AI can read news feeds, especially from hyper-local companies such as Patch, and Twitter.
AI for creating feedback loops – does anyone go back to the maintenance records today (maybe the summer intern) and look for abnormalities in the SCADA data after a failure? We could use AI to check anomalous faults and attach annotations to field reports. We could be learning by having AI understand maintenance notes and correlate them with the SCADA data from the two hours before the fault. What’s not getting done because it takes too many people, even when the data is available?
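The data-retrieval half of that feedback loop is simple enough to sketch: for each logged fault, pull the SCADA readings from the window before it, so maintenance notes can be reviewed against what the sensors saw. The record layout (timestamped dicts) is hypothetical.

```python
from datetime import datetime, timedelta

def scada_before_fault(fault_time, scada_records, window=timedelta(hours=2)):
    """Return SCADA records falling in [fault_time - window, fault_time)."""
    return [r for r in scada_records
            if fault_time - window <= r["ts"] < fault_time]
```

The interesting part, which an AI system would add, is reading the free-text maintenance note and deciding which of these pre-fault readings were actually abnormal; the window extraction itself is the kind of chore that today "takes too many people".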
The Keys Tend to Be
- Have a platform – extensible by the user with their own analytics and learning, and with some industry-standard data models. Otherwise, you’ve created your own vendor lock-in of engineering knowledge, whose efforts aren’t portable.
- Put learning in a process – learning instantiated in a system eventually grows in the right direction; just look at most on-line help forums.
- Empower people – they are being assisted, not replaced
- Last, but not least, remember Dilbert – whose biggest problem is (mostly) smart engineers with poor management. Independent AIoT devices will need some clue about the overall system operating state, because actions that are correct for their specific mission, taken independently, might need to differ when the overall scenario is unusual. In the electric power system, for example, there may be protective devices to reduce turbine overspeed driving a generator. An action that causes a sudden loss of load, while locally correct, may result in a plant shutdown. Each IoT system was doing its smart mission locally, in a vacuum, yet the overall result was less than successful. If this were an IEEE journal article, it would be called emergent behavior, an aspect of systems-of-systems science.
These are the author’s views, not necessarily those of IBM. There is a Utility University session on the topics of this article, and more, at DistribuTECH 2020, organized by the author. Utilities interested in speaking at this tutorial may contact the author.
Jeffrey S. Katz is a Senior Member of the IEEE. He is a member of the IBM Academy of Technology and the IBM Industry Academy. He was a co-chair of the IEEE 2030 Standard on Smart Grid Interoperability Guidelines, IT Task Force. He was on the External Advisory Board of the Trustworthy Cyber Infrastructure for the Power Grid and is on the Advisory Board of the Advanced Energy Research and Technology Center. He was on the “Networked Grid 100: The Movers and Shakers of the Smart Grid in 2012” list from Green Tech Media. He was appointed to the IEEE Standards Association Standards Board for 2014. He is an Open Group Distinguished IT Specialist. He co-chaired the first IEEE Power and Energy Society workshop on Big Data in Utilities in September 2017 and co-organized the first PES workshop on Utility Cybersecurity in December 2017. He is a member of the Industry Advisory Committee for the IEEE Intelligent Smart Grid Technologies conference for 2019 and 2020. Prior to IBM he was the Manager of the Computer Science department at the U.S. Corporate Research Center of ABB, and then of ALSTOM (now GE). He can be reached at email@example.com or Jeffrey.firstname.lastname@example.org