
Bringing Automated Inspection From Concept to Scale by “Closing the Loop”

By Gabe Prado | April 27, 2021


Just over a decade ago, I observed my first computer vision demo: a simple, high-performance application that used a smart camera to scan for and analyze product defects on a high-volume production line. Depending on the quality assessment, the line would automatically accept or reject each part, effectively closing the loop in a fraction of a second.

Fast forward to today: computer vision and the quickly evolving field of drones and robotics for inspection have become key enablers in the race to realize automated operations across generation and grid. Tremendous progress has been made on the hardware front, as can be seen in Boston Dynamics’ advanced Spot robot, which can carry different sensing payloads, and in Flyability’s evolved class of drones with their innovative protective caging. Both are already deployed and generating value in their respective domains, yet full automation and scaling remain difficult and elusive. Why? Because data and software have yet to sustainably close the inspection loop.

Today, there are four broad workflows in asset or site inspection: 1) the physical collection of images and other data using human or robotic methods, 2) the human or machine-based analysis of that data, 3) a subsequent decision-making process, and 4) action and resolution. Keen observers will recognize that this closely mirrors the OODA loop – Observe, Orient, Decide, Act.
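To make that loop concrete, below is a minimal, illustrative sketch of the four stages chained into a single pass. It is written in Python purely for readability; every function name, data field and threshold is a hypothetical placeholder, not any particular vendor’s product or API.

```python
from dataclasses import dataclass
from typing import Optional
import time

@dataclass
class Observation:
    """One inspection data point captured in the field (illustrative only)."""
    image_path: str    # raw image from a drone, robot or field worker
    asset_id: str      # which asset was inspected
    timestamp: float   # when it was captured

@dataclass
class Finding:
    """The result of analyzing an observation (illustrative only)."""
    asset_id: str
    severity: float    # 0.0 (healthy) to 1.0 (critical)

def observe(asset_id: str) -> Observation:
    # 1) Collect: a robot, drone or human captures images and metadata.
    return Observation(f"/missions/{asset_id}/img_001.jpg", asset_id, time.time())

def orient(obs: Observation) -> Finding:
    # 2) Analyze: a human reviewer or a vision model assigns meaning to the raw data.
    return Finding(obs.asset_id, severity=0.8)  # placeholder score

def decide(finding: Finding) -> Optional[str]:
    # 3) Decide: turn the finding into an action, such as a work order.
    return f"work_order:{finding.asset_id}" if finding.severity > 0.7 else None

def act(action: Optional[str]) -> None:
    # 4) Act: dispatch the work order and record the resolution.
    if action:
        print(f"Dispatching {action}")

if __name__ == "__main__":
    # One pass through Observe -> Orient -> Decide -> Act for a single asset.
    act(decide(orient(observe("transformer-42"))))
```

In practice, each handoff between these functions is exactly where the loop tends to break: a different team, tool or flash drive sits between every arrow.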

The problem for autonomous inspection is not that the process is wrong, but that the process is perpetually inconsistent, and inconsistency is incompatible with automation. Not only are different groups and dependencies involved in each sub-process – inspection data is still often exchanged by flash drive, for example – but neither the human operators nor the robots have full context or visibility across the process. In short, there is an incomplete bridge between the physical and digital worlds.

At the same time that inspection hardware has been evolving, improvements in industrial data operations and digital twin technologies have brought to market what is likely the most sustainable way of closing the inspection loop from data acquisition to action. Rather than just analyzing raw inspection data linearly, data operations and the digital twin add human-like meaning and association to the thousands of images, time stamps and other data points collected, filling key gaps by providing:

  • Two-way context for both the robots and the operators, enabling robots to construct, “see” and add to the digital operating environment in a human-like manner, while converting data into information that humans can more easily interpret. With a fully contextualized digital twin composed of images, scans, GPS and more, robots have not only “visual” access to the environment but also all of the data and metadata pertaining to their surroundings (see the brief sketch after this list).
  • Fluid handoffs between sub-processes so that data, and even some decisions, can be passed from the original point of analysis to the commissioning of a work order without losing quality or integrity. This minimizes the errors and inconsistencies of what is otherwise a manually demanding process that often leaves inspection data under-analyzed. With this barrier lessened or removed, operations and analytics teams can shift their effort away from data overhead and towards valuable decision-making.
  • Connection to broader digital infrastructure so that inspection data can be reused across the organization for other analysis, including planning, reporting, prediction and optimization. Data operations, as a component of the digital twin, unlocks data from individual silos and contributes to an open ecosystem of interoperable data and analytics technologies. With this open approach, more data consumers are empowered with more data, and innovation can thrive.
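As a simplified illustration of the first two points, the sketch below shows how an inspection image might be enriched with context from a digital twin and then handed off as a work-order payload without losing that context along the way. The data model, the twin lookup and the severity threshold are all hypothetical, chosen only to make the idea tangible.

```python
from dataclasses import dataclass, field, asdict
from typing import Dict
import json

@dataclass
class InspectionRecord:
    """An image enriched with the context a digital twin can supply (illustrative only)."""
    image_path: str
    asset_id: str
    timestamp: str
    gps: tuple                                                   # (latitude, longitude)
    twin_context: Dict[str, str] = field(default_factory=dict)   # e.g. asset type, last inspection

def enrich_with_twin_context(record: InspectionRecord,
                             twin: Dict[str, Dict[str, str]]) -> InspectionRecord:
    # Two-way context: the same metadata the robot navigated by is attached to
    # the data it collected, so downstream consumers see the full picture.
    record.twin_context = twin.get(record.asset_id, {})
    return record

def to_work_order(record: InspectionRecord, severity: float) -> str:
    # Fluid handoff: the analysis result and its full context travel together
    # into the next sub-process (here, a work-order payload) without re-keying.
    payload = {"action": "repair" if severity > 0.7 else "monitor", **asdict(record)}
    return json.dumps(payload, indent=2)

# Usage: a hypothetical twin lookup keyed by asset id.
twin = {"pole-017": {"asset_type": "distribution pole", "last_inspection": "2020-11-03"}}
record = InspectionRecord("/missions/pole-017/img_042.jpg", "pole-017",
                          "2021-04-20T10:15:00Z", (59.91, 10.75))
print(to_work_order(enrich_with_twin_context(record, twin), severity=0.82))
```

Because the resulting work order carries the image, the location and the twin context along with the decision, the same record can also be reused for the planning, reporting and optimization use cases described in the third point.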

Built on these pillars, the contextualized operating environment becomes the true driver of sustainable automation for robotic and drone-based inspections because it exists as a “living” entity. It can be updated daily, or even hourly, with new images taken from exactly the same viewpoints; it can be contextualized with live operational data from across the organization; and it can be leveraged for decision-making applications and work order automation. It also facilitates scale: new site data, robotics hardware and payloads, missions and more can be added without significant system redesign or other major overhead.

But the best part? This transformation doesn’t need to happen all at once or dramatically disrupt your existing momentum. The beauty of a data operations and digital twin-based approach is that the various sub-processes can evolve incrementally while keeping the end in mind: an interoperable ecosystem of robotics, drones, data, analytics and empowered human operators all working in concert.

Frankly, there’s never been a more opportune time to start closing the automated inspection loop.

 

Gabe Prado is the Product Marketing Director for Power & Utilities at Cognite.