Make your data center more competitive with digitalization
Greater efficiency, safety and reliability now plus a future-proof platform for growth.
Power is the lifeblood of data centers, and with 100MW facilities now the norm at the top tier, managing it has become even more essential. Digital technologies offer the means to reduce energy costs and increase flexibility, making a facility more competitive in a crowded market.
Several trends have emerged that appear likely to be with us for the foreseeable future:
Automation across multiple facilities. Modern data centers already rely on a variety of automated processes (e.g., monitoring and control over cooling units), but as the drive toward lights-out operation continues, we can expect even more automation across the industry. The ability to manage multiple facilities from a single control center, for example, will put a premium on presenting relevant data to operators and on configuring or changing a local network remotely.
Simplification. The scale of today’s data centers demands that the systems and processes that support them be as straightforward as possible. Reducing the number of control layers, doing more at the local level (autonomously, if possible), and reducing latency are all part of the solution.
Cost and energy. Both are already front-of-mind for operations managers, who will be looking ever deeper and farther afield in search of savings. One example: Google is using its DeepMind AI to distribute processing load among servers such that the overall cooling load is minimized (by 40%, according to the company). Facebook reportedly even uses temperature sensors to ensure the bolts of busway systems remain within tightness specifications.
What may not be apparent in this short list is how much these trends will depend on digitalization, which ABB distinguishes from simple “digitization” (the replacement of analog equipment with digital). Digitalization refers to the use of data to operate, maintain and optimize equipment and processes. It relies on digitization but goes much further.
A good example of how this plays out in the context of a data center can be found in the power distribution system, specifically digital switchgear.
Flexibility at scale
One of the most important aspects of digitalization is the flexibility and scalability it affords. In the case of switchgear, it means that the equipment can be adapted to changes in power system design even at the final stage of the manufacturing process. Once installed, future changes can be applied via software instead of altering or replacing hardware.
The peer-to-peer architecture of digital systems also allows more decisions to be made at the device/local level, reducing network traffic and demand on supervisory level controls. This is useful in power systems as it allows potential disturbances to be addressed before they can cause significant damage and/or downtime.
One enabling force for such flexibility is the adoption of “open” standards like IEC 61850, which governs communications between power system equipment. This protocol establishes a future-proof platform that can be updated easily via programmable logic in protection relays instead of making physical changes to hardwired connections. Digital signals don’t degrade over distance like analog ones do, so the Ethernet cables that replace miles of copper wire provide a more reliable, consistent signal. The IEC 61850 standard also calls for the wiring and signal transfer to be actively and automatically supervised, to enable fast and precise actions in the event of a failure.
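As a rough illustration of the kind of automatic supervision the standard calls for, the sketch below models a digital signal link with a heartbeat timeout. The class, names and timeout value are invented for illustration; they are not part of IEC 61850 itself.

```python
import time

class SignalLink:
    """Toy model of a supervised digital signal link, loosely inspired by
    the continuous supervision IEC 61850 requires. Names and the timeout
    value are illustrative, not taken from the standard."""

    def __init__(self, name, timeout_s=2.0):
        self.name = name
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()

    def heartbeat(self):
        # A healthy publisher refreshes its message periodically.
        self.last_heartbeat = time.monotonic()

    def is_healthy(self, now=None):
        # If no message arrives within the timeout, declare the link failed
        # so protection logic can react immediately.
        now = time.monotonic() if now is None else now
        return (now - self.last_heartbeat) <= self.timeout_s

link = SignalLink("feeder-7_trip")
link.heartbeat()
print(link.is_healthy())                            # healthy right after a heartbeat
print(link.is_healthy(now=time.monotonic() + 10))   # stale link is flagged as failed
```

With hardwired copper, a broken connection is typically discovered only when the signal is needed; with supervised digital links, the failure is detected within the timeout window.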
Safety is of particular concern with power systems, and digitalization delivers benefits here, too. Compared to conventional equipment, there is less material in digital switchgear that is exposed to high-voltage electrical stress. That means that data center personnel are not exposed to high voltages during inspections or maintenance activities.
Sensors such as those used to monitor digital switchgear are also more flexible and easier to work with than conventional metering transformers. They are easy to re-configure, even remotely, which cuts down on the risk of human error, further enhancing safety.
Smaller, faster, cheaper
Sensors are roughly 30% smaller and lighter than the instrument transformers they replace. They’re also one-size-fits-all, eliminating the need to engineer specific current or voltage transformers for a single installation. Sensors can also meet multiple needs, and are commonly stocked by power equipment distributors. In general, data center users can expect 30% faster delivery (i.e., lead time) and up to 25% faster installation and commissioning when working with sensors as compared to traditional alternatives.
This pays dividends for digital switchgear that uses such devices. A typical 30-panel lineup, for example, uses 80% less relay and protection control wiring, incurs lower energy losses, uses fewer panels with fewer components in the control compartment, and can be installed in two days.
Sensors also collect data that can be gathered for use in predictive maintenance programs and other analytic applications.
We’ve focused here on switchgear, but the digitalization trend extends to other power system equipment as well. Transformers, for example, are now available with built-in sensors with Wi-Fi and Ethernet connections. With no additional wiring or hardware required, they feed operational data via any number of communication protocols into enterprise applications such as predictive maintenance. All of this additional functionality is rolled into the capital expense of the equipment.
Sensor technology continues to advance while costs continue to fall, so it’s likely that this kind of baked-in digitalization will make its way into more asset classes going forward. The challenge for data center operators may well be an “embarrassment of riches” when it comes to the opportunities for fine-tuning operations that digitalization makes possible.
Feeding the data beast
Keeping up with the exponential growth of data
Head of Data Center Business Development
Trying to get your head around the amount of data that is produced by modern society can be mind-boggling. Billions of devices and machines are getting connected to the internet, sending and receiving so much data that global traffic is now measured in zettabytes (that is, 10²¹ bytes).
Most of this traffic goes through data centers, which are springing up rapidly around the world to keep pace. Data centers are energy-hungry, with a large facility devouring as much electricity as a small town. According to ABB assessments, data center traffic will grow by 400% over the next three years, with worldwide data center energy demand increasing to 60 gigawatts, or 2.5% of global electricity consumption, by 2020.
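Those figures are easy to sanity-check. The short calculation below assumes global electricity consumption of roughly 21,000 TWh per year; that figure is an illustrative assumption, not from the article.

```python
# Back-of-envelope check of the 2.5% figure quoted above.
# The global consumption value (~21,000 TWh/yr) is an assumed input.
data_center_power_gw = 60
hours_per_year = 8760
global_consumption_twh = 21_000  # assumed annual world electricity use

data_center_twh = data_center_power_gw * hours_per_year / 1000  # GWh -> TWh
share = data_center_twh / global_consumption_twh
print(f"{data_center_twh:.0f} TWh/yr = {share:.1%} of global consumption")
```

Sixty gigawatts running around the clock works out to roughly 525 TWh a year, which is indeed about 2.5% of that assumed global total.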
As data centers grow in size and number, owners and customers have a major incentive to manage their energy use wisely. Energy accounts for an estimated 40% of the total cost of ownership and, if centers do not become more efficient and innovative, their growth could be constrained by overloaded national power grids.
ABB is a leading partner in the industry’s efficiency drive, and ABB Ability™ digital solutions are bringing down costs and energy-use for customers across the world.
In the Netherlands, for example, Alticom has installed micro data centers in telecommunications towers across the country, using space freed up as telco technology switched from analogue to digital and shrank drastically. The thick walls, cool air at altitude and relative ease of securing the sites make these iconic towers (one of which is the country’s tallest building) ideal for safeguarding data. ABB Ability Data Center Automation allows Alticom to remotely monitor and control the cooling and energy use of all 24 of its towers, a key element in helping Alticom run the centers more efficiently.
Following a recent partnership with IT giant Hewlett Packard Enterprise (HPE), we can expect even more technological breakthroughs in the coming years. The two companies are developing an intelligent solution that promises to optimally balance data center workloads and power resources, and cut costs.
Almost all aspects of our lives are becoming digitalized, from hospitals to financial institutions to transport, and the consequences of losing that data (or access to it) are potentially devastating. ABB is helping data center operators to ensure that our data is stored reliably, efficiently and at the minimum cost to the environment.
The case for IoT…
Waiting for a rock solid business case may put your company on the back foot
Head of Data Center Business Development
The ‘business case’ is a necessary part of business management, but it is not necessarily going to make you more effective. Think about the ubiquitous cell phone. I recall that when I worked in tech ops in the late ’90s, the cell phone became the de facto communications platform for the operations person, but I don’t recall ever seeing a business case; it just made sense. Likewise, in fast-moving tech, waiting for a solid business case before launching a new product or embracing a new capability may mean you are late to the game, so don’t be a slave to the ‘business case’ philosophy.
The Internet of Things (IoT) is here, it’s real and it’s gaining momentum. Some might say this is the advent of a new technology wave that will transform how we live; others might suggest it’s repackaging a concept that is decades old. Regardless, all manner of devices are now smart, connected and increasingly contextual. This expectation is not part of a feature set; it’s a minimum requirement.
At ABB, we have been the ‘T’ in IoT for many, many years. We have worked with customers across many diverse industries to transform how they operate their infrastructure, and we have been at the forefront of delivering secure connectivity and automation for their critical infrastructure.
With an installed base of more than 70,000 control systems connecting 70 million devices, we have been providing digitally connected and enabled systems for a long time. A growing and exciting market we serve is the global data center industry. We are working with some of our customers to revolutionize data center energy optimization, offering smart grid connections, off-grid electrification, power protection, and power management software. Central to our strategy is integrating power optimization systems with sensor-enabled products. The goal is to create a data-driven, intelligently automated solution that balances data center operations with the most cost-efficient power sources in real time, maximizing power efficiency for our customers. Intelligently connected IT reduces costs and optimizes data center resources.
So, what next for the data center operator?
In my view IoT is better described as ‘digital operations’. It’s about providing our customers with the ability (or ‘ABB Ability™’) to control, interrogate and modify their operational tasks in response to external signals. It’s about making operations more reliable, more cost-effective and easier to execute. Digital operations will not replace people; in fact, there is strong evidence that we are facing a skills shortage in the data center operations discipline. Digital operations will simply make people more effective. We want to better understand what our customers are trying to achieve and then couple that expectation with our technology.
We have distilled our view down to three key characteristics of the next generation data center, where digitizing the ‘how’ will make the ‘what’ more effective.
We contend that, rather than being distracted by all that market noise and hype, there are a few simple steps to ensure your business can take advantage of digital operations.
- Make sure all components and systems can be networked, everything from your grid transformer to the humble low-voltage circuit breaker.
- Assess new communications platforms and protocols to ensure you are deploying the most effective network. A great example here is using IEC 61850 to reduce physical cables and points of failure across your infrastructure.
- Challenge your supplier to demonstrate that their solution and approach is elastic, that it can add and subtract capacity on demand and in response to other signals.
Now you should have the digital platform to make your operations smarter. What next?
You can now transform your operational tasks into software driven activity, you can extract operational data to identify trends and failure modes, and you can transform how you manage your maintenance regime to reduce downtime.
At ABB we have a strong portfolio of products and systems that can deliver this transformation, and we strive to give you more harmonized operations and all the benefits that IoT promises. From the humble current sensor to the Emax 2 power manager and all the way to our cloud-enabled automation platform, we have the solutions to digitize your power train. We are reducing the amount of communications cable you need as we shift from analogue to digital, and we are replacing the analogue sensors in our switchgear with their digital equivalents: software-driven and dynamically configurable. We are bringing our experience from all of the diverse industries we serve to our data center clients so they can capture all of the benefits of a truly digitized power train.
A great example here is our customer, Systemec, based in the Netherlands. With a digitally enabled low-voltage power distribution and UPS, they have ultimate visibility and control into how the power is delivered to their clients’ systems, and also have the ability to interrogate and modify operations dynamically.
In conclusion, the quest for the perfect business case may just hold you back while your competition moves to ever-higher levels of efficiency, reliability and flexibility. Get ready today with an infrastructure that can be world class for tomorrow and beyond.
Smarter devices enable smarter data centers
Why are data center owners embracing smarter power control and distribution devices?
Global Head of Technology, Data Center Industry Sector
When I called on data center operators in the recent past, some of them wondered why they should spend time talking to a company that mainly provides utilities and industrial customers with electric power distribution and control equipment. Today, the light is beginning to come on for our data center customers. They increasingly understand that, as huge power users, they actually have several significant things in common with utilities and industrials.
Data center managers are increasingly interested in power management, in addition to their never-ending quest for near-perfect uptime. That’s because the cost of power continues to be a bigger and bigger slice of a data center’s operating expenses, much more than it is for most production or process plants.
Where does the efficiency come from in data center power management? It starts with one of the smallest components in a power system: the lowly breaker. You’re not going to save any money with more-efficient breakers. In terms of power consumption, they are basically the same as a terminal block. Power comes in and, as long as the breaker is closed, the same power goes out.
The efficiency comes from breakers, like our Emax 2, evolving from what’s basically an On/Off switch to an intelligent device at the front lines of your power network. These breakers include built-in metering that provides really useful information to center operators about power use and quality. They can even help diagnose equipment issues. The breaker’s waveform-capture capability, for example, could reveal a power phase shift that indicates a problem in a motor in your HVAC system.
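To illustrate the idea, the sketch below estimates the phase of sampled voltage and current waveforms by correlating them against reference sinusoids at the fundamental frequency. It is a deliberately minimal stand-in for the signal processing a real relay performs, with invented sample data.

```python
import math

def phase_deg(samples, samples_per_cycle):
    """Estimate the phase (degrees) of a sampled sinusoid by correlating
    it with reference cosine/sine waves at the fundamental frequency."""
    c = sum(x * math.cos(2 * math.pi * i / samples_per_cycle)
            for i, x in enumerate(samples))
    s = sum(x * math.sin(2 * math.pi * i / samples_per_cycle)
            for i, x in enumerate(samples))
    return math.degrees(math.atan2(-s, c))

spc = 64                                             # samples per 60 Hz cycle
t = [2 * math.pi * i / spc for i in range(4 * spc)]  # four full cycles
voltage = [math.cos(x) for x in t]
current = [math.cos(x - math.radians(30)) for x in t]  # current lags by 30 degrees

shift = phase_deg(voltage, spc) - phase_deg(current, spc)
print(f"phase shift: {shift:.1f} degrees")  # a drift here can flag motor trouble
```

Tracking this phase shift over time is the sort of diagnostic that turns a breaker’s waveform capture into an early warning for connected equipment.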
Getting that information to the control center and beyond is easier than ever because the Emax 2 incorporates the communication capabilities specified in the IEC 61850 standard. The standard enables any compliant device to talk to any other compliant device, tearing down the Tower of Babel that existed before it. That makes aggregating the data from smart devices like the Emax 2 breaker, as well as switchgear, transformers, and other assets, much easier. Considering that even a relatively small, 2 MW data center will have several hundred breakers, that’s a pretty big potential pool of data.
Another advantage of using IEC 61850-compliant devices is that it creates huge initial and ongoing savings when it comes to connecting the devices. Point-to-point wires to each device are replaced by a vastly smaller number of fiber-optic connections. In one instance, we reduced 2,000 wires to 200.
Imagine the cost reduction of installing one-tenth as many wires when building a new data center. Imagine the reduction in connection issues and maintenance activities if you had one-tenth as many wires to troubleshoot. The smart power-control devices do cost a bit more up front, but that’s clearly a good investment given that construction is only about 10% of a data center’s total cost of ownership. Once those smarter devices are in place, you’ll see reduced energy costs and fewer maintenance issues for the life of the center.
Data center managers operate warehouses full of smart-data storage and routing devices. They understand the value of intelligent technology. For data centers, our new mantra is “intelligent data needs intelligent power.” As data center managers learn more about how they can reduce their costs, increase their reliability, and improve their efficiency by embracing smarter power control and distribution devices, they will be more and more interested in adding smarter electrical devices to their facilities.
The Greening of Data Centers Continues
Digital Crossroads exemplifies a new generation of energy-smart facilities
Data centers account for around 3% of global electric power consumption, and that figure is set to triple by 2030 according to industry analysts. The sector’s carbon footprint is accordingly the subject of much attention. So, it’s not surprising that data center owners are seeking to make their facilities more energy-efficient and to make their energy supply chain as green as possible.
Several of the largest data center operators have invested heavily in renewable energy: Apple and Google are 100% renewable-powered, while Amazon is at 50%. Facebook and Google have both invested in solar PV projects, the latter including the world’s first on-site solar facility at a data center.
Against this backdrop, Peter Feldman, CEO of Digital Crossroads, presented the energy bona fides of his new data center on March 7 at ABB Customer World in Houston. Digital Crossroads is being constructed on the site of a retired coal-fired power plant in Hammond, Indiana, on the shores of Lake Michigan, and takes advantage of several of the site’s features: two wind turbines exploit its breezy conditions, and cooling is provided by lake water.
The facility will also feature solar panels on the roof and an energy storage system that offsets grid power costs, maximizes the use of on-site renewables, and provides additional standby backup power to the UPS systems. It also allows the facility to capture energy from the backup generation units when they’re run for monthly reliability testing. This stacking of use cases is instructive for other data center operators, as it is often what makes the business case for energy storage work.
On the demand side, the building is constructed with a ceiling that is pitched toward the outside walls, allowing convection to direct hot air to intake vents. The HVAC units themselves are located outside on the ground, maximizing free cooling in winter and freeing up roof space for the solar array and floor space inside for more server racks. All these measures earn Digital Crossroads a PUE of 1.18 in summer and 1.10 in winter, putting the facility far below the industry average for similar facilities.
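PUE (Power Usage Effectiveness) is simply total facility power divided by the power delivered to IT equipment, so the quoted figures imply the overhead shown in this small sketch. The load values here are invented to match the article’s numbers.

```python
def pue(total_facility_kw, it_load_kw):
    """Power Usage Effectiveness: total facility power divided by IT power.
    A PUE of 1.0 would mean every watt goes to the servers."""
    return total_facility_kw / it_load_kw

# Illustrative loads chosen to reproduce the figures quoted above:
# for every 1,000 kW of IT load, only 180 kW (summer) or 100 kW (winter)
# goes to cooling, distribution losses and everything else.
print(f"summer PUE: {pue(1180, 1000):.2f}")
print(f"winter PUE: {pue(1100, 1000):.2f}")
```

For comparison, a facility at the oft-cited industry average of around 1.6 would spend six times as much on overhead per IT watt as Digital Crossroads does in winter.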
It’s important to note that the performance of any data center depends in part on the kinds of applications running on its servers. Some tasks (e.g., batch processing) are easily time-shifted whereas others must be always-on. Data centers running more of the former and less of the latter will find more opportunities for peak shaving, demand charge reduction and optimizing on-site generation.
At a macroeconomic level, the inevitability of renewable energy is becoming clearer. Since 2010, nearly all the new generation capacity installed in the US has been gas-fired or renewable. Solar PV is attracting about the same level of investment as gas and current auction prices for onshore wind and utility-scale solar are hovering around $20/MWh—and that’s without any subsidies. Coal capacity growth, meanwhile, is expected to go negative on a global basis by 2025 when retirements will outweigh new construction.
Storage is getting cheaper, too. In 2010 a commercial-scale battery energy storage system clocked in at around $1,000/kWh. By 2018 the cost had fallen to $150 and industry observers now see it dropping below the $100 mark by 2025.
With the energy density of new data centers continuing to rise, the industry’s appetite for energy will only increase, particularly at hyperscale facilities whose demand is already measured in the hundreds of megawatts. Given the trends noted above, it’s clear that the data center industry is poised to play a larger and larger role in the deployment of new wind, solar and storage projects, if not on-site then via investments in greening the industry’s energy supply.
Made-To-Order Digital Solutions Set to Revolutionize F&B Sector
Customizing digital solutions is vital in the food and beverage industry, perhaps the most diverse sector we know of today. Solutions that ensure consistency in milk quality in the dairy industry are vastly different from those used to make milk-based products like butter and cheese. While the digital needs of F&B sectors such as meat, dairy, confectionery and brewing vary, they all focus on similar key performance indicators (KPIs) such as cost, quality, safety and sustainability.
The first step on the road to digitalization is a thorough plant assessment to pinpoint processes that could potentially benefit from new technology. While some F&B companies have gone full steam ahead with integrated digital solutions, most still have basic levels of sensing and gathering data – sometimes as rudimentary as using paper for data storage.
In mid-October, ABB launched an assessment portfolio for food and beverage manufacturers to help them understand how their technology fares in the Industrial Internet of Things age. F&B companies can benefit greatly from data analytics and the implementation of internet-based tools, which can help them realize the following:
- Higher productivity and equipment effectiveness
- Greater accuracy in processes and machinery
- Enhanced tracking and traceability
- Waste reduction and efficient energy use
Smart circuit breakers and transformer monitoring can ensure stable electrical systems at food and beverage plants. Take, for example, Morningstar Foods, which makes and packages highly perishable dairy creamers in Washington D.C. The company uses ABB devices to mitigate the impact of voltage fluctuations on its systems, especially during storm season. The online PCS100 AVC corrects voltage sags, while the offline PCS100 UPS-I covers short-duration outages, keeping machinery running and saving thousands of dollars in lost production time.
Food manufacturers use automation and robotics in their operations to cut cost, enhance production and increase quality. Pancake-maker HoneyTop Foods uses ABB’s FlexPicker to recognize, pick and stack pancakes off conveyor belts, allowing the company to pack 400 pancakes per minute. Confectioner Pepperidge Farm uses robots to sandwich chocolate in its Milano cookies, helping boost quality, reduce worker injuries and maintain hygienic conditions.
Easy-to-install smart sensors can remotely monitor the condition of motors through a smartphone application or web portal. ABB’s Ability Smart Sensor, which is used at global food-trader Olam International’s cocoa and dairy operations, enables predictive maintenance as opposed to the traditional reactive maintenance. This reduces downtime, extends equipment life and, in turn, curbs rising expenses related to operation and maintenance.
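As a rough sketch of what such condition monitoring boils down to, the example below flags a motor when the recent average of a vibration reading drifts above its healthy baseline. The window, threshold and readings are all illustrative; this is not ABB’s actual analytics.

```python
def needs_service(readings, window=5, threshold=1.3):
    """Flag a motor for maintenance when the recent average of a sensor
    reading (e.g., vibration) exceeds a multiple of its baseline.
    The baseline is taken from the earliest readings, assumed healthy;
    window and threshold are illustrative choices."""
    baseline = sum(readings[:window]) / window
    recent = sum(readings[-window:]) / window
    return recent > threshold * baseline

vibration = [1.0, 1.1, 0.9, 1.0, 1.0,   # healthy baseline period
             1.1, 1.2, 1.4, 1.6, 1.8]   # rising trend: a bearing wearing out
print(needs_service(vibration))  # schedule service before the motor fails
```

Even this crude rule captures the shift from reactive to predictive maintenance: the decision is driven by the data trend, not by a failure or a fixed calendar.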
Implementing digital solutions in the food and beverage industry has its share of challenges. One of the biggest issues is the high cost of acquisition. Manufacturers are hesitant to invest in expensive, state-of-the-art technology, fearing a lengthy payback timeline and a lack of technical expertise.
Industry consolidation has brought with it another set of problems as F&B companies now have to control a wide range of facilities with unique setups all under one umbrella. Moving to cloud-based data solutions has also been difficult due in part to cybersecurity concerns. Another challenge is the reliance on suppliers for technical assistance during the integration process.
Some food manufacturers have basic digitalization capabilities such as PLCs (programmable logic controllers) and SCADA (supervisory control and data acquisition) systems, but these are islanded and limit the potential of automation. Instead, companies should look to integrate these islands with manufacturing operations management (MOM) systems, which can evolve as business needs change.
The ever-expanding food and beverage industry is set to grow by 12.5 percent in the next five years, according to market research firm Statista. It is a critical time for F&B companies to evaluate and assess their digital needs to leverage the right solutions for their operations.
Robots and humans teaming up to help Americans stay healthy
Based on the report from Food Processing, "What’s Driving Automation Investments in the Food & Beverage Industry."
In just one generation, eating habits in the United States have undergone a transformation. Millennials who grew up on heat-and-eat mac n’ cheese and sugary colas now prefer to feed their children freshly-cooked meals that are minimally processed. As Americans become more conscious eaters, there is a palpable need for food and beverage companies to invest in robotic automation that offers tailor-made solutions.
Food and beverage companies, which kept costs low by building economies of scale and standardizing production, are now processing smaller batches of a wider variety of products. In this shifting environment, it is not surprising that a survey by trade magazine Food Processing showed an overwhelming number of companies were looking at investing more to modernize their automation systems.
It turns out that the often-used survey question “what keeps you up at night” can give us deep insights into the current state of the F&B industry. The 259 respondents in the Food Processing survey said they were most concerned about major product recalls, disruptions in production and changing consumer trends. Recalls strike at the heart of the F&B manufacturing ethos by eroding consumer trust in brands. Production is often delayed or stalled by a shortage of labor, which in turn hampers companies’ ability to respond to changing consumer tastes.
To be sure, hard automation - the use of machines to perform specific tasks - is well established at most food processing companies in the United States. These machines can mix the right amount of ingredients to make everything from simple candy to intricately decorated cakes, fill thousands of containers within minutes and package finished items at a similar speed. But with changeovers becoming increasingly frequent, companies need more flexible robots that can quickly adapt to shifting consumer tastes.
For all the change taking place, the North American F&B industry continues to be a price-driven market that relies on high productivity and low per-unit cost to remain competitive. It is here that robotic automation can make a significant difference, by helping the industry become more agile. Instead of executing pre-set actions, robots can quickly adapt to a fluctuating factory environment and can be installed with just a few modifications to existing lines. Another emerging field is soft robotics, which represents a paradigm shift in the way machines interact with their environment, especially in their ability to grasp deformable or delicate objects like perishable fruit and meat.
In fact, robotic automation has been around for quite some time, but food companies have been reluctant to harness its powers. The Food Processing survey revealed that financial reservations, which topped the list of concerns, stemmed either from capital expenditure restraints or the uncertainty of promised paybacks from using robotic automation.
However, it is clear that the F&B industry will continue to face a central challenge for the foreseeable future: the shifting demands of a dynamic market. As such, the ability to respond is more a question of long-term competitiveness than of near-term finance. Even though robotic systems carry a 30-40 percent premium over hard-automated alternatives, their low-maintenance features coupled with increased flexibility make a strong business case for an investment in robotics.
With the U.S. government clamping down on unhealthy additives and processing techniques with strict regulations, it is more pertinent than ever that F&B companies invest in future-ready technologies like robotics. Together, in collaboration with skilled workers, robotics will help more Americans stay healthier for longer.
Sustainability to power the future of F&B companies
Just as food is essential for the human body to flourish, so a solid sustainability plan is essential for a company to thrive. The food and beverage industry in the United States is well aware that sustainability is directly linked to profitability. F&B companies have been trying to reduce their carbon footprint while catering to socially and ethically conscious consumers.
Transparency is not just an option, but a requirement in the food industry. Not only do customers expect to know what’s in their food, they want to know how and where it is made. They are increasingly choosing to buy produce from companies that follow sound environmental and ethical practices.
A good ESG (environmental, social and governance) strategy could potentially save a company two-thirds of its EBITDA, according to a McKinsey report. Manufacturers that adopt these practices exhibit low capital costs and high operational performance.
Efficient machinery can save food processing plants millions of dollars in operational and maintenance costs. A German sausage maker halved energy use in its smokehouses by installing ABB’s variable frequency drives (VFD), which helped motors to ramp up and down instead of running at a constant speed.
The day-to-day functioning of factories across industries is also affected by power quality. Companies strive to keep their power factor – a metric that measures the efficiency of electrical energy use – as close to 1.0 as possible. For instance, PepsiCo shaved electricity bills at one of its plants with STATCON, a voltage source converter that stabilizes and supports the electrical network.
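For readers unfamiliar with the metric, power factor is simply real power divided by apparent power, as this small illustration shows. The load figures below are invented for the example.

```python
def power_factor(real_kw, apparent_kva):
    """Power factor = real power / apparent power. Closer to 1.0 means
    less reactive current, lower losses and often lower utility charges."""
    return real_kw / apparent_kva

# Illustrative plant load: 800 kW of real power drawn at 1,000 kVA apparent.
pf = power_factor(800, 1000)
print(f"power factor: {pf:.2f}")  # compensation equipment can raise this toward 1.0

# Apparent power needed for the same real load after correction to 0.98:
print(f"corrected demand: {800 / 0.98:.0f} kVA")
```

The difference between those two apparent-power figures is current the utility must supply (and often bill for) without it doing any useful work in the plant.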
Automation and Robotics
Food and beverage makers are increasingly investing in automation and robotics to shorten production cycles and prevent accidents. A PMMI report said that nearly one-third of these firms employ robotics in food processing and over 90 percent have automated packaging.
Kraft Foods’ robot distribution system greatly reduced material costs at two Cadbury packaging facilities. CKF Systems designed and implemented ABB four-axis IRB 660 robots, which sped up packaging using vision-controlled distribution. The automation enhanced the company’s environmental program and, importantly, decreased the plants’ carbon footprint.
Pick-and-place robots have been used in production and packaging lines, reducing costs and turnaround times. For nearly 20 years, systems like ABB’s FlexPicker have automated picking and sorting for F&B manufacturers in segments ranging from fruit and confectionery to wine and spirits.
Foodborne outbreaks could lead to hefty damages and even cause factory closures. With predictive maintenance systems, F&B manufacturers can save money and keep their reputations intact.
These tools monitor machines by gathering data and analyzing it to predict when servicing is required, preventing unplanned downtime. Scheduling maintenance only for the machines that actually need it means fewer lost production hours and lower costs.
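As a simplified illustration of how such tools work, the sketch below (hypothetical, not any vendor's actual algorithm) flags a machine for service when its latest sensor reading drifts far from its recent baseline:

```python
from statistics import mean, stdev

def flag_for_service(readings, window=20, z_threshold=3.0):
    """Flag a machine for service when its latest sensor reading
    deviates strongly from its recent baseline (a simple z-score test).

    `readings` is a time-ordered list of, e.g., vibration or
    temperature measurements.
    """
    baseline = readings[-window - 1:-1]  # recent history, excluding the latest point
    latest = readings[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > z_threshold

# Stable vibration history, then a sudden spike
history = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 0.9, 1.0] * 2 + [3.5]
print(flag_for_service(history))  # True: the spike trips the threshold
```

Production systems use far richer models (frequency analysis, remaining-useful-life estimation), but the principle – learn a baseline, alert on deviation – is the same.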
Sustainability is vital in the global food industry as it juggles scarce resources and growing demand. Food producers have been evaluating and incorporating environmental, economic and social factors in their production processes. New-age technologies such as easy-to-install robots and intelligent data analytics have the potential to take productivity and sustainability in the food industry to the next level.
Sensors, Blockchain and the No Flies Zone
Advanced digitalization comes to the F&B sector
Fruit flies are very small, making them nearly impossible to see with the naked eye against a variable background like a crate of blueberries. Detecting the presence of fruit flies and removing them is a major challenge for produce growers, but now there’s a technological answer to this problem.
ABB has developed a system using spatial imaging combined with spectroscopy to generate an image that clearly distinguishes flies from fruit. The combined technology is known as hyperspectral imaging and could represent a major improvement in quality monitoring for the produce industry. The system could eventually be coupled with artificial intelligence to not only detect the presence of insects on fruits and vegetables but also trigger alarms and suggest remediation processes.
This was one of the many interesting examples of digital technology in the food and beverage industry presented at ABB Customer World in Houston the week of March 4th. There are several forces driving the digital trend in F&B. These include increased connectivity and internet bandwidth, computers capable of handling data analytics algorithms, and the proliferation of inexpensive sensors to collect data from devices in the field and on the plant floor.
ABB’s Rich Dovi, Jon Rodriguez and Roger Gaemperle showed several real-world examples that demonstrate how the industry has picked up the pace of digital adoption.
One project involved improving batch cycle times, quality and yield. The plant in question had experienced problems on all these fronts due to repeated manual adjustments to manage acid levels and viscosity. They were limited to producing two batches per week due to a highly iterative process. Following analysis using ABB’s reactor fingerprint and optimization service, the company identified a fix: increasing vacuum on the processing line to pull off more water, which in turn reduced batch time to 40 hours. The firm is now producing three batches per week with better consistency.
In another process, ABB used artificial intelligence and machine learning techniques derived from the chemical and pulp and paper industries to predict end-of-batch quality. The analytical model learns the various ways that batches can go wrong (i.e., failure modes) and identifies potential problems in real time. This allows operators to take corrective action immediately rather than waiting until the end of the batch.
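The following toy sketch illustrates the general idea of matching an in-progress batch against historical trajectories. The profiles, labels and distance metric are invented for illustration and are not ABB's actual model:

```python
def closest_failure_mode(partial_batch, reference_profiles):
    """Match an in-progress batch against reference trajectories.

    `reference_profiles` maps a label ("good", "over-acid", ...) to a
    historical time series; we compare only the portion observed so far
    using mean squared deviation, and return the closest label.
    """
    n = len(partial_batch)

    def distance(profile):
        return sum((a - b) ** 2 for a, b in zip(partial_batch, profile[:n])) / n

    return min(reference_profiles, key=lambda label: distance(reference_profiles[label]))

profiles = {
    "good":      [7.0, 6.8, 6.5, 6.2, 6.0],
    "over-acid": [7.0, 6.4, 5.8, 5.2, 4.8],  # pH dropping too fast
}
# Three hours into the batch, the trajectory already looks like the failure mode
print(closest_failure_mode([7.0, 6.5, 5.9], profiles))  # over-acid
```

Because the match is made on the partial trajectory, an operator can intervene mid-batch rather than discovering the problem at final quality testing.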
ABB is also working to integrate factory data into blockchain solutions for end-to-end transparency. Data from the blockchain enables users to do production planning based on real-time demand feedback from retailers, for example. It could also be used to store and share cold-chain integrity data for products in transit.
Blockchain can also dramatically simplify the process of determining the source of a given product or ingredient. A test performed by Walmart to trace the origin of mangoes, for example, showed that while a conventional investigation took more than six days, the blockchain solution provided an answer in 2.2 seconds.
Digitalization in F&B is also being applied to energy optimization. One example of this is in a greenhouse located in the United Arab Emirates. Previously the facility only tracked aggregate energy use, mostly for cooling, but the installation of sub-metering provided data on specific end uses. More data was gathered using sensors to measure CO2 consumption, water flow and treatment, HVAC performance and other factors. The resulting data stream was fed into a cloud-based data visualization system that provides various analytic capabilities such as historical analysis to compare growing seasons.
These examples illustrate the tremendous potential that digitalization holds for the F&B industry, but it’s important to start the digital journey with a clear set of objectives. Whether it’s energy use, equipment availability, yield, quality or other related metrics, once they are established the business can focus on solutions aimed at achieving and exceeding those targets. As noted earlier, F&B can also leverage many of the techniques used in other industries.
As the cost of sensors, cloud storage and communications continues to fall, there will likely be a proliferation of sector-specific tools to address the unique needs of each F&B market segment. The time to get started on the digitalization journey is now.
The Digital Grid of the Future is Now:
How to Build Your Capabilities Wisely
As demands on grids surge, grid operators cannot afford a poor decision about where to focus resources. They are under pressure to maintain reliability, improve resiliency, integrate renewable generation, and ensure sound financial performance – all at the same time. The key to resolving this conundrum? The right digital grid investment.
While these challenges are unique to the tightly regulated electric industry, the best solutions are not. The question is no longer whether to digitize, but how to build capabilities wisely. Utilities can learn from organizations in the financial, retail and industrial control sectors that are further along in their technological evolution.
One of the most compelling opportunities for technology investment is in the digital grid, which combines software with powerful data analytics and connected intelligent grid devices.
“The digital grid is not emerging,” said Rackliffe. “The technology is already here.”
The combination of connected, intelligent devices and analytics can help utilities gain insight into their operations and maximize the investments they’ve already made. It also provides data streams to make decisions on how to best serve customers and position the business moving forward.
This paper examines the utility industry’s path toward the digital grid through investment in connected, controllable devices, communication networks and data analytics. Digitalization is a fundamental component of the digital grid, allowing utilities to leverage data across the enterprise.
These digital grid building blocks enable utilities to better manage the grid and offer several important benefits:
• Integrated, enterprise-wide digital technology can provide unique value for electric utilities via increased reliability and reduced cost.
• Utilities should formulate digitalization technology roadmaps that provide immediate ROI as well as increased value over time, supported by further investments.
• The industrial IoT represents a platform for improved grid management through the use of controllable connected devices.
• Data analytics applications can leverage existing data streams to improve grid performance, outage management, asset health and resource planning.
• The digital substation concept functions as a microcosm of the benefits of digitalization.
• Utilities can manage the complexities of distributed energy resources (DERs) to capitalize on the opportunities they create.
Distributed energy resources:
What’s next for distribution grid management?
The power industry is undergoing its greatest transformation since its inception, and much of the change is being driven by distributed energy resources (DERs). These are generation and storage resources (e.g., solar, batteries) connected to the grid at the distribution level “downstream” from the utility substation.
While DER deployments still account for only a tiny fraction of installed generation capacity, they are growing rapidly and their potential—for better and worse—is undeniable.
On the plus side, the potential benefits abound. Reduced environmental impact, deferred capacity upgrades, optimized distribution operations, expanded demand response capabilities and improved power system resiliency are all on the menu, according to EPRI. You can add customer choice as well since in some cases DERs are located behind the meter (e.g., rooftop solar).
DERs represent a vexing challenge for utilities, though, namely keeping their promise to provide safe, reliable and affordable power despite the introduction of thousands of new supply resources, bi-directional power flows and greater intermittency. DERs are also driving a tectonic shift in the industry’s very structure, which we will address later.
In the case of renewables, the Electric Power Research Institute (EPRI) identifies four potential problems with widespread DER adoption:
• Grid-edge connections producing local over-voltage or loading issues
• Increased risk of significant loss of generation (intermittency)
• System planning disruptions due to variability
• The lack of inertia from large plants to stabilize the network
These challenges mean “utilities must operate the grid in a much more agile manner,” as a recent IDC report observes. In particular, “utilities need to learn how to integrate externally originated asset, market, and grid data.” And that’s really the core of the issue. Industry observers, whether they applaud the widespread adoption of DERs or decry it, agree that much more is coming.
The success of broad DER adoption is predicated on utilities’ ability to monitor and control the assets, link them with SCADA/DMS and other enterprise systems, and interact with market operators. Doing so will allow DERs to realize their potential, maximizing returns for asset owners and optimizing operations for utilities.
The DERMS solution
In its 2014 report entitled “The Integrated Grid,” EPRI notes that “in nearly all settings, the full value of DER requires grid connection to provide reliability... and access to upstream markets.” For this reason, says EPRI, DER and the grid should not be seen as competitors but rather as complementary, but we are not there yet.
As the EPRI report notes, “[s]o far, rapidly expanding deployments of DER are connected to the grid but not integrated into grid operations, which is a pattern that is unlikely to be sustainable.”
Solving the integration problem requires a DER management system (DERMS), which Navigant Research defines as “a control system that enables optimized control of the grid and DER, including capabilities such as Volt/VAR optimization (VVO), power quality management and the coordination of DER dispatch to support operational needs.”
“There are two main use cases for DERMS,” explains ABB’s Rick Nicholson, head of global product management for the company’s Enterprise Software business. The first is operational, he says, simply managing a variety of generation resources, especially renewables and energy storage, that are located at the grid edge. Balancing EV charging with variable generation like wind is one example.
The other DERMS use case is economic, perhaps best illustrated by the concept of a virtual power plant (VPP) where a variety of different resources—including storage—can be aggregated via DERMS and presented to the grid operator as a single, dispatchable resource.
In a 2017 paper, Navigant Consulting expresses the optimistic view that “if designed intelligently... a VPP can help foster a system whereby a full portfolio of grid services (regulation service, voltage management, fast DR, contingency reserve, peak demand management, and renewable firming) can be provided by the same DER components that were once feared to be the primary contributors to grid imbalances.”
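To make the aggregation idea concrete, here is a minimal sketch (asset names and fields are hypothetical, not a DERMS product API) of how a VPP might roll heterogeneous DERs into a single dispatchable figure and allocate a dispatch request pro rata to each asset's headroom:

```python
def vpp_available_capacity(assets):
    """Aggregate heterogeneous DERs into one dispatchable figure (kW).

    Each asset dict carries its currently available headroom; a real
    DERMS would derive this from telemetry and forecasts.
    """
    return sum(a["available_kw"] for a in assets)

def dispatch(assets, request_kw):
    """Allocate a dispatch request across assets pro rata to headroom."""
    total = vpp_available_capacity(assets)
    if total == 0:
        return {a["id"]: 0.0 for a in assets}
    request_kw = min(request_kw, total)
    return {a["id"]: request_kw * a["available_kw"] / total for a in assets}

fleet = [
    {"id": "battery-1", "available_kw": 400},
    {"id": "solar-smart-inv-7", "available_kw": 100},
    {"id": "dr-load-3", "available_kw": 500},
]
print(vpp_available_capacity(fleet))  # 1000
print(dispatch(fleet, 250))
```

To the grid operator, the fleet looks like one 1,000 kW resource; the allocation logic behind the interface is the VPP's concern.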
Who’s in control here?
While DERs hold tremendous potential for utilities, consumers and third parties, the question at the heart of every application is: Who controls the assets? The answer is that it depends on the particulars of the use case.
Most assets behind the meter require approval from the asset owner to allow the utility to control them. Permission might hinge on the nature of contractual agreements between the utility and asset owner and why the utility needs control (e.g., for reliability purposes or capacity needs). The utility might offer an incentive, for example, in exchange for the ability to control a solar-plus-storage installation at an industrial site.
In cases involving an aggregator such as demand response programs or VPPs, the aggregator would likely retain control over the assets in order to ensure they optimize returns.
In any case, control over DER assets comes down to two main functions: “doing the math” in real time to make the system work (i.e., DER monitoring and control) and communications, whether with an aggregator via an automated gateway or directly with the asset owner. This is an area where the DER ecosystem is still evolving as the industry seeks to standardize DER communications around a common set of protocols (e.g., IEEE 2030.5) to replace proprietary ones.
The state of DERMS
The evolution of DERMS and the assets they control is happening rapidly. Presently, the industry stands at an early crossroads, with a substantial majority of utilities indicating plans to implement DERMS in the near future, but with less clarity around how those systems will be integrated with existing distribution management systems.
DERMS developer and ABB technology partner Enbala conducted a survey of attendees at DistribuTECH 2017 and found that while less than a fifth had a DERMS in place, more than three quarters (77%) said they planned to deploy one within the next three years. Interestingly, half of survey respondents cited “meeting grid reliability concerns” as the top reason for DERMS investment.
A broader Newton-Evans survey published in NEMA’s Electroindustry magazine in November 2017 showed that 64% of investor-owned utilities surveyed indicated they had “activities underway or planned for [distribution management] systems to include some level of deployment of DER management tools.” Of those with ADMS either planned or already in place, 82% said they either had DERMS functionality already or would include it in the future.
However, despite the industry’s apparent willingness to move forward with DER integration, utilities still face a disconnect on the technology side. As IDC Research Director John Villali noted in a DERMS webinar last year, 60% of utilities buy DERMS as a completely separate procurement process from their DMS/ADMS while only 10% of DMS/ADMS purchases include DERMS capabilities out of the box.
These findings demonstrate that DERMS has yet to be integrated with the other (formerly stand-alone) applications that now live behind the single user interface of ADMS. That is likely to change as the industry coalesces around a set of must-have capabilities for DER management.
So, standards and capabilities are evolving, but what should utilities do today when looking to implement DERMS?
Shopping for DERMS: What to look for
There are a few broad characteristics that utilities should insist on when it comes to implementing DERMS. These include:
• High performance and scalability. Distributed computing architecture delivers the compute power required to manage thousands of assets on a distribution network in real time.
• Fast network response. DERMS provide real-time control and optimization of assets, so the communications system must be able to keep up.
• Asset agnostic. DERMS manage various categories of DER (e.g., batteries, smart solar inverters, capacitors, controllable loads), so they should accommodate the operating characteristics of each on a level playing field.
There are also a few specific functions that any DERMS should perform, such as registration of assets, forecasting their output and monitoring/control at the grid edge. The ability to support demand management and voltage optimization schemes (e.g., VVO, CVR) is also foundational.
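A minimal sketch of those core functions – registration, telemetry and forecasting – might look like the following (class, method and field names are our own invention, not any DERMS product's API; the forecast is a deliberately naive persistence model):

```python
import datetime

class DerRegistry:
    """Toy sketch of DERMS core functions: register assets,
    record telemetry, and forecast output."""

    def __init__(self):
        self.assets = {}      # id -> static attributes
        self.telemetry = {}   # id -> list of (timestamp, kW) readings

    def register(self, asset_id, kind, rated_kw, feeder):
        self.assets[asset_id] = {"kind": kind, "rated_kw": rated_kw, "feeder": feeder}
        self.telemetry[asset_id] = []

    def record(self, asset_id, timestamp, kw):
        self.telemetry[asset_id].append((timestamp, kw))

    def forecast_kw(self, asset_id, samples=4):
        """Persistence-style forecast: average of the last few readings."""
        recent = [kw for _, kw in self.telemetry[asset_id][-samples:]]
        return sum(recent) / len(recent) if recent else 0.0

reg = DerRegistry()
reg.register("pv-12", kind="solar", rated_kw=250, feeder="F-104")
for hour, kw in enumerate([180, 200, 210, 190]):
    reg.record("pv-12", datetime.datetime(2018, 6, 1, 10 + hour), kw)
print(reg.forecast_kw("pv-12"))  # 195.0
```

Real systems replace the persistence forecast with weather-driven models and tie the registry to the as-operated network model, but the data shapes are broadly similar.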
Integration – the key to value
Perhaps the most consequential capability for any given DERMS implementation is the system’s ability to interface with distribution management systems and other existing utility platforms, whether in operations technology (OT) or information technology (IT). Approaching the issue from the perspective of ADMS, a recent IDC report notes that “a significant component of successful digital transformation is having an ADMS that can integrate with a DERMS that includes a packaged set of capabilities to support DER needs.”
In particular, DERMS should share the same as-operated network model used by ADMS applications. Similarly, the DERMS-ADMS interface should make distribution SCADA telemetry data equally available to each side, and should synchronize state changes and the results of power flow calculations.
Done properly, the integration of DERMS with DMS/ADMS yields benefits that by now are familiar within the realm of integration projects. Grid operators, for example, will now have a single view of DER assets and events that allows them to respond quickly to changes in operating conditions.
From an IT perspective, marrying DERMS with ADMS eliminates the need to maintain data in two separate systems. It also reduces cost and implementation time thanks to a pre-integrated, productized solution that comes ready to plug and play.
Future of DERMS
In the long term, ABB’s Rick Nicholson says the adoption of DERMS “hinges on the development of distribution-level markets for DER. These markets may be structured centrally in a similar fashion to existing wholesale markets, could be structured in a more decentralized mode, or might even include peer-to-peer energy trading enabled by blockchain technology.” In the meantime, centralized wholesale markets will continue to set prices for DER-provided services.
It’s true that the regulatory and market structures under which DERMS will operate are still evolving. However, it’s also clear that while the underlying technologies are here, more work needs to be done, particularly with regard to integration. For DER to realize its potential, DERMS will have to evolve further to deliver resource optimization, market participation and commercial settlement functionality in addition to managing voltage, active power and power quality on the distribution network.
For now, though, these systems already provide utilities, their customers and third-party players the means to derive value from distributed assets in a number of different use cases. As the technology (and regulatory framework) advances, DERs will be able to deliver even more.
Amid growing challenges, how smart is your grid?
Falling power prices, regulatory changes and increasing competition have forced wider adoption of smart grid analytics by utility companies in the United States. Leading electric, natural gas and water utilities are at different stages in the “smartness” of their grids and are beginning to implement analytical strategies as part of their core organization models.
The shift to new-age digital technology requires heavy investment from utility companies in advanced analytics, allowing for proper planning, management and functioning of their grids. Two-way communication and control systems along with smart sensors and meters are being used to optimize grid operations. More than 80 percent of utilities have some degree of confidence that analytics could provide solutions for the key challenges their businesses face today, a survey by research firm Utility Analytics Institute (UAI) showed.
In this Internet of Things (IoT) era, the amount of data collected can be overwhelming. Knowing what to do with this barrage of information is crucial for the evolution of utilities. Not only do these firms require access to the data, they must also dedicate adequate storage space and utilize skilled personnel to unlock real value from the intel.
“Analytics competence, and confidence in one’s ability to leverage data analytics are essential if utilities are to successfully overcome complex business challenges and adapt to the disruptive changes impacting the industry today,” said Mark Johnson, UAI executive director. “Utilities will find it impossible to effectively manage an increasingly complex grid or navigate the rapidly changing expectations of customers, regulators and other stakeholders without robust analytics capabilities.”
Everything – from system modeling to asset and power quality optimization to outage management – is being analyzed to help utilities make better business decisions. The UAI survey suggested that analytics is employed to the greatest extent in system modeling, helping to design and conceptualize new components and systems for grids. It has also helped utilities understand the effect of these changes on their existing networks.
Predictive models uncover problems before they disrupt service. Proactive equipment replacement helps companies better plan their work schedules, thereby reducing cost and minimizing outages. Utilities can also identify loading issues in near real-time and implement operational solutions quickly.
Volt/VAR optimization (VVO) and conservation voltage reduction (CVR) yield improved power quality, loss reduction and substantial savings. Access to and analysis of near real-time grid performance data is increasing the productivity and performance of field service personnel. Faults are analyzed and power is automatically re-routed, minimizing the effect of potential outages. Leveraging smart grid telemetry points, the probable location of the cause of momentary power outages is identified more accurately and much more quickly.
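The savings side of CVR is often estimated with a simple rule of thumb: percentage energy savings ≈ CVR factor × percentage voltage reduction. A sketch, assuming an illustrative (not measured) CVR factor of 0.8:

```python
def cvr_energy_savings_pct(voltage_reduction_pct, cvr_factor=0.8):
    """Estimate % energy savings from conservation voltage reduction.

    Uses the common rule of thumb: savings% = CVR factor x voltage
    reduction%. The CVR factor of 0.8 here is an assumed, illustrative
    value; real feeders must be measured, and factors vary by load mix.
    """
    return cvr_factor * voltage_reduction_pct

def annual_kwh_saved(feeder_annual_kwh, voltage_reduction_pct, cvr_factor=0.8):
    """Translate the percentage estimate into annual energy saved."""
    return feeder_annual_kwh * cvr_energy_savings_pct(voltage_reduction_pct, cvr_factor) / 100

# A 2.5% voltage reduction on a feeder delivering 1 GWh per year
print(cvr_energy_savings_pct(2.5))        # 2.0 (% of energy)
print(annual_kwh_saved(1_000_000, 2.5))   # 20000.0 kWh
```

Because the CVR factor differs between resistive and motor-dominated loads, utilities typically validate it per feeder before projecting savings.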
Utilities must acknowledge the fact that data is critical to make the shift to a smarter grid. To realize the full potential of data analytics, major electricity and gas utilities should take initiatives to define and execute solid strategies, create roadmaps for analysis and set up appropriate support channels to address challenges.
They also need to shift their approach to grid analytics from a case-by-case basis to an organization-wide strategy. Rather than handing issues to the IT department or relying on external support, they need to be able to handle the data in-house, either by hiring analytics professionals or building up expertise themselves.
Grid management is evolving to keep pace with the interconnected, data-centric age of the Fourth Industrial Revolution. Informed customers want more options. While there is much optimism about the current state of smart grid analytics, utilities that want to make the most of technological advances face a demanding yet promising journey.
What is a microgrid?
With the world going through an Energy Revolution, industries, utilities and communities are striving to find new ways to harness renewable power, bring electricity to remote areas, and prepare for climate change. And one solution is proving key: the microgrid.
Taking On The Storm: Grid Management
The United States was battered by three hurricanes – Harvey, Irma and Maria – last year. Together, these superstorms inflicted record levels of devastation.
Even though power companies are well prepared for adverse weather conditions, only “smart” utilities with thorough contingency plans were able to restore electricity to affected areas in a timely manner.
Power companies have come to understand that by investing in modernizing and strengthening their grid, they can not only boost the efficiency of their substations but also reduce downtime - even under unusual circumstances - and save costs.
Utilities have reinforced their power plants by constructing dams and walls to combat rising flood water levels. Elevation has also helped protect substations from water damage. Utilities design elevations for switchyards and control houses at the 100-year-flood elevation, plus 1 foot, according to the Institute of Electrical and Electronics Engineers (IEEE).
As a pre-emptive measure, Florida Power and Light (FPL), the largest power company in the state, installed more than 200 monitors at its substations to alert operators to shut down power. “Those flood monitors saved three or four days of work and millions of dollars’ worth of equipment that would have had to be replaced rather than simply re-energized,” said Eric Silagy, President of FPL.
Moreover, power companies have been increasingly replacing voluminous air-insulated switchgear (AIS) with gas-insulated switchgear (GIS) that provide an additional layer of security to electrical components and reduce exposure to the elements. High-voltage circuit breakers and disconnectors in GIS substations are enclosed in a gas-tight metal casing that takes up just one-tenth of the space of an AIS substation.
New Jersey’s No. 1 utility, Public Service Electric and Gas Company (PSE&G), has been investing heavily in GIS for its substations over the past few years. GIS technology is preferred for its robustness, reliability and significantly smaller footprint compared with conventional AIS, the company said.
Earlier this year, ABB, which has 50 years of experience in GIS technology, won a $40 million contract to strengthen the electric grid in Washington, D.C. With advanced monitoring systems and data analytics, utilities can gauge situations instantly and cut down outage times drastically.
Tools like advanced distribution management systems (ADMS) can help pinpoint the location of a fault, enabling utilities to send repair crews where they matter most. Simultaneously, customers affected by the outage are alerted and updated continuously on the status of power restoration efforts.
Houston-based CenterPoint Energy, which has been using ABB’s ADMS tools since 2008, collects information from 2.4 million smart meters and field sensors, allowing for real-time grid management. This has improved power reliability and accelerated the utility’s ability to identify outages and restore power.
In 2016, ABB integrated its network management system with an outage management software solution, enabling real-time monitoring and control, network analysis and network optimization, and also outage management of National Grid USA’s distribution infrastructure in New York and Massachusetts.
Pre-planning and strategic vendor relationships have helped power companies speed up restoration time, cut equipment repair costs and maintain safe working environments. Though it is often impossible to predict exactly how a storm will affect a substation, having a plan of action for disaster management can prove invaluable. Utilities such as Jersey Central Power & Light (JCP&L) and Con Edison have partnered with ABB response teams to help with reconstruction of substations in the aftermath of disasters.
A sound offensive strategy is key to managing a power grid. Hardening systems, leveraging data and maintaining solid vendor relationships are all important in ensuring a storm-ready network. By investing in the latest technology, utilities can minimize disruption to electricity supply in areas hit by hurricanes and superstorms.
Meeting the challenge of modernizing the grid to stabilize the integration of renewable energy
The electric grid must be, above all else, reliable. That’s why the integration of renewable generation is perceived to be—at the same time—a transformative opportunity and a formidable threat. We are on the brink of adopting renewable energy at a scale never seen before. To allow renewable power to meet its potential and continue its rapid development, we need long-term planning, investment at the right junctures, and smart grid management at scale.
At its heart, renewable generation is a story of contrast.
Renewable generation offers promises of reducing CO2 emissions to mitigate the effects of global climate change, reduced dependence upon fossil fuels, and tantalizing reductions in costs of energy. Not only are the costs associated with building solar and wind capacity falling, but the marginal cost of operating renewable generation is near zero once installed.
Renewable generation also poses difficult challenges that must be addressed through long-term planning and on-target investment. It is highly variable and difficult to forecast. It tends to peak when consumer demand is lowest and ebb when demand is greatest. Remote renewable generation sites often require power to travel across vast distances.
Ideally, when the sun is shining or the wind is blowing, utilities want to capitalize on all available renewable generation in real time. However, as the penetration of renewable generation grows, as one might expect, the challenges also grow. For example, utilities may have to curtail solar generation on a sunny day if demand is not sufficient to utilize the generation. Curtailment, in turn, diminishes return on investment and environmental benefits.
An ABB analysis revealed that for many geographies, curtailment can start when renewable capacity reaches about 25% of annual demand. When approximately half of the generation is renewable, curtailment might reach 5-10%. At this point, adding additional renewable generation capacity would result in increasingly smaller additions in the level of delivered renewable generation. On a grid that faces regional constraints, curtailment may begin even earlier.
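A toy calculation illustrates the mechanics of curtailment (this is not the ABB analysis itself; the profiles below are invented and the model ignores storage, exports and must-run generation, all of which shift the real numbers):

```python
def curtailment_fraction(renewable_gen, demand):
    """Share of available renewable energy that must be curtailed.

    `renewable_gen` and `demand` are matched time series (e.g., hourly
    MW). Whenever generation exceeds demand, the surplus is curtailed.
    """
    curtailed = sum(max(0.0, g - d) for g, d in zip(renewable_gen, demand))
    available = sum(renewable_gen)
    return curtailed / available if available else 0.0

# Midday solar peak against a flat-ish demand profile (hourly MW)
gen    = [0, 20, 60, 90, 110, 90, 60, 20, 0]
demand = [70, 70, 75, 80, 85, 80, 75, 70, 70]
print(round(curtailment_fraction(gen, demand), 3))  # 0.1
```

Note the shape of the problem: adding more midday solar capacity to this profile would raise only the surplus hours, so each increment of capacity delivers a smaller increment of usable energy.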
These saturation effects can be mitigated with strategies, technologies and techniques. We cover some of the most important strategies in this paper. We also cover shifts in perceptions that we believe are inherent to realizing higher levels of centralized renewable generation.
Top among them is this: when most people picture renewable generation, they typically think of rooftop solar panels. That is one component, but while distributed energy resources (DERs) are generating a lot of attention and excitement, rooftop (distributed) solar is expected to account for less than 10% of the total generation mix by 2050.1
As we explore in this paper, significant utility-scale renewable investment is not only happening – it must happen to reach renewable portfolio standards. And adequate transmission capacity will be essential to deliver this energy. As a result, investing in transmission is just as important as – if not more important than – distribution and grid edge investments to enable the integration of renewable generation into the grid.
Finally, we predict that there will be a shift in the traditional thinking of dispatching generation to match load. Over time, we expect to see more aggressive shaping of consumer load to match renewable generation characteristics. Demand management may be done by industrial consumers (e.g., fully automated future factories) as well as smaller consumers.
1. “America's Renewable Electricity Forecast Grows To 2050, Even Under Trump,” Robbie Orvis, Forbes, May 10, 2017.
O&G set to reap benefits from advances in field devices and analytics
The Internet of Things (IoT) and related technologies are making inroads in just about every sector of the economy and the oil and gas industry is no exception. In this paper, we explore how larger players in particular are already finding new ways to extract value from the data flowing from digitized wellheads.
One O&G firm ABB works with has increased its in-house analytics group by a factor of six over just the last few years as more and more operational data has become available. Another, Australia’s QGC, has implemented a cutting-edge system that allows the company to manage thousands of wells spread across 1,100 square miles with just four staff members.
Most firms are still in the early stages of evaluating technologies and developing a business case for investments in digitalization, but the potential is significant. Aside from the industry’s ever-present focus on safety, cost reduction tops the list. There are 2 million wellheads in North America, and the companies that operate them are keen to reduce the amount of time field crews spend traveling between them to perform maintenance checks. Digitalization offers the possibility of remote monitoring coupled with analytics to do just that, and more.
Why the cloud?
Typically, controllers at the wellhead perform a certain level of optimization locally, taking actions, for example with chemical injection, based on measurements taken locally. Now, the advent of reliable, high-bandwidth data communications has made it possible to do more in the cloud. Wellheads don’t have particularly complex processing needs, and latency is not as much of an issue thanks to more advanced field networks. So, what does this new level of data availability do for O&G operators?
First, they can potentially get faster, more conclusive decisions about their operations by gathering data from the field and applying advanced analytics to it via cloud computing. Analytic programs in the cloud can provide a much more detailed picture of what’s going on across a large number of wells, allowing operators to spot trends and take action. For example, an improvement action at one location might be offset by a detrimental reaction at another location. Such a conflict might not be immediately apparent, but if the operator has the benefit of cloud-based analytics to provide a field-wide view, it’s much more likely to be addressed before developing into a serious problem.
This capability is beginning to change the type of equipment in the field. Edge devices are being tested and deployed now to evaluate augmenting proprietary controllers, enabling users to write their own programs and minimizing concerns about obsolescence. It’s also easy to equip wellheads with technology and simply turn on additional features or even rewrite programs later to adapt to changing production conditions.
With data flows established, the real value in digitalization happens when data is fed into analytic tools, for example predictive maintenance programs that eliminate unnecessary travel by field technicians.
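As a minimal illustration of the idea, a screening rule might flag only the wells whose sensor trends warrant a field visit, instead of routine drive-bys. The well names, readings and threshold below are invented for the sketch:

```python
# Hypothetical sketch: flag wells for a field visit only when sensor trends
# warrant it, instead of routine drive-by checks. Thresholds are made up.

def wells_needing_visit(readings, pressure_drop_limit=50.0):
    """readings: {well_id: [daily casing-pressure readings, oldest first]}.
    Returns well IDs whose pressure fell more than the limit over the window."""
    flagged = []
    for well, series in readings.items():
        if len(series) >= 2 and series[0] - series[-1] > pressure_drop_limit:
            flagged.append(well)
    return flagged

field_data = {
    "well-A": [820, 815, 812, 810],   # stable: no trip needed
    "well-B": [790, 740, 700, 655],   # falling fast: schedule a visit
}
print(wells_needing_visit(field_data))  # ['well-B']
```

Real predictive-maintenance models are far richer than a single threshold, but the payoff mechanism is the same: crews drive only to the wells the data singles out.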
Realizing value from digitalization is predicated first on having the domain knowledge to turn raw data into actionable intelligence. This may seem obvious, but it’s often not immediately clear what to do with data being collected or even where it will be used within the organization. What is the data worth? What is the commercial value of analytics? These are still open questions for the industry, but O&G operators needn’t face them alone. Suppliers already offer the means to leverage data flows via cloud-based AI without fear of compromising IP or data ownership.
There is more software-based control, too, which when combined with increased availability of operational data allows actions to be taken based on historical trends rather than simply on real-time readings. It’s likely that O&G firms will find new applications as they move forward on their digital journey.
Going digital – first steps
As noted above, the first step in digitalization is the ability to handle large data flows. That implies secure, high-bandwidth communications. Scalability is also essential. Can a given technology support a 60-well pad? Will it work across thousands of pads? Wellhead control can be done at this level using local controllers of varying aptitudes—the key is ensuring the ability to scale up.
Suppliers are beginning to offer packages for field automation that will provide onshore operators a more economical option than the larger, more sophisticated (and more expensive) systems used in offshore applications. For example, ABB has developed an integrated offering that combines controllers, motors, drives, wireless communications and electrical balance-of-plant along with digital applications to optimize operations. It can support operations of any size from a few wells to thousands.
Of course, not everything is a candidate for cloud computing. Analytical, data-driven applications such as plunger lift and gas lift optimization are well suited to it, but real-time applications are better off residing in the RTU. It’s also important to understand that the O&G industry—and others—may never converge on a single cloud computing environment. Industrial customers have very specific needs vis-à-vis safety, regulatory compliance and operational requirements, among others. Banking and travel firms, for example, have chosen to keep their own transactional systems despite being highly digitalized otherwise.
The four phases of digitalization.
O&G companies are on an evolutionary path to optimize their operations. The digital wellhead is only the first step.
Phase 1: the digital wellhead
Phase 2: digital pipelines
Phase 3: digital terminal solutions (including LNG)
Phase 4: the integration of all of the above
Eventually, it will be more important to “connect the clouds,” similarly to how consumer systems bring myriad products and services under one interface (e.g., Amazon Alexa). Integration is much easier to do at a cloud-to-cloud level where computing, storage and connectivity resources are plentiful and cheap compared to traditional approaches.
Other important considerations
Security will remain a top concern as the O&G industry becomes more and more digitalized, but securing industrial systems is more challenging than securing laptops and mobile phones. Many devices have no screens or keyboards, and they run mission-critical systems that cannot be taken offline. Vendors and users together must make sure programs can’t be tampered with. Protecting devices at start-up, ensuring software updates come from reputable sources and detecting (and acting on) threats as they happen are all vital.
Recently another issue, data ownership, has arisen not just in O&G but in other sectors too. Operational data is valuable, and its owners don’t want it shared with anyone without permission. Some cloud providers currently ask their users to waive patent rights as part of their terms of service, but we believe that companies should not have to forfeit their intellectual property simply because they choose to use cloud-based services.
The onshore oil & gas industry is evolving rapidly. The real challenge will be making sure every development dollar spent generates real ROI. Companies that embrace digitalization will position themselves to uncover real value, but they need not embark on large capex projects to do so. No matter which stage of connectivity a company may be at, modular technologies can deliver on unique application needs today, regardless of region, and scale up as conditions change and new opportunities arise.
Onshore O&G is well situated to benefit from experience from the offshore side of the business, as well as new technologies onshore. Given the rapid pace of innovation in the areas of analytics and cloud services, not to mention field devices, we are likely to see even more options for wellhead digitalization over the coming years. The key to realizing returns and gaining competitive advantage is to start the digitalization journey now.
Where is automation working in the chemicals sector?
Value is clear, but experience varies widely
Recently, ABB conducted a survey with Chemical Engineering magazine of more than 300 chemical industry professionals about their firms’ use of and investment in automation technology. The responses show that while the evolution and proliferation of automation technologies like distributed control systems, PLCs and wireless communication continue, the level of automation investment varies widely between companies. While industry players understand the need for automation, many have yet to realize its full value.
The survey polled mostly engineering professionals but also included finance and management roles. Companies were divided by number of employees into four categories: small (1-50), small-mid (51-300), medium (301-1,000) and large (1,001+).
In terms of overall outlook, there was broad agreement that the future looks bright. Of the professionals polled, more than half saw their business performing better in 2017 than in the previous year and 24% indicated it would remain the same. Similarly, there was a clear convergence of opinion around the major challenges facing the industry.
The challenges… and how to address them
We asked respondents to select the three most pressing factors affecting their business, and the top five answers broke into two distinct tiers. In the top level were “global competition” and “materials costs.” Below these was a second cluster of three answers: “energy costs”, “cyclic or volatile markets” and “skill shortages or training issues.”
One interesting fact about these top responses is that only two—energy costs and skills/training—are directly addressable by the company. The other three are simply facts of life for chemical companies operating in today’s market. That limits what these firms can do to control costs and drive growth, and it puts a spotlight on solutions that address those areas.
When asked how they plan to meet the challenges identified in the previous question, there was a similar cluster of top responses. Survey participants were asked to choose three actions, and the top three were selected significantly more often than the rest:
Develop new products and/or services
Modify our corporate structure or business processes
De-bottleneck, expand or revamp existing plant(s)
New offerings and revamping production lines both imply changes to and/or investment in automation technology. They also would seem to indicate an additional need for training. This sets up a paradox in which the solutions chemicals firms are pursuing to address their top challenges are subject to the same forces that create those challenges.
Where are chemicals producers spending?
Spending on automation is definitely on the rise, with only 17% of survey participants indicating a decline in either opex or capex in 2017. Most saw both categories rising or holding steady. In terms of what’s on companies’ shopping lists, there was no clear leader or group of technologies that rose above the rest. The top five responses were:
Anticipated investments varied widely, however, depending on company size. Respondents working at large firms anticipated spending across the board but mostly on DCS and PLCs. Medium size firms showed a stronger focus on DCS with 60% saying they would buy the technology. This might indicate a sense among second-tier firms that a DCS is essential to support the growth needed to join the top tier whereas larger players perhaps have already made investments in DCS and are now looking to expand system capability, throughput or integration.
Interestingly, the small-mid category had by far the most anticipated investment in terms of volume if not dollar value. A whopping 71% expect to invest in PLCs in the next 18 months, with strong majorities anticipating investments in online instrumentation and variable frequency drives as well. In fact, small-mid companies are set to make more individual investments than their competitors, whether large or small, in each of the top five categories listed above, save only for DCS.
Among small firms, no technology drew a majority of respondents with the top choices being PLCs and DCS.
The future is bright, but it’s not clear
As noted earlier, survey respondents across the board see a bright future, and they also seem to appreciate the vital role automation will play in it. Asked to react to the statement “plant automation has a decisive influence on our profitability,” 79% indicated they agree or strongly agree. Only 6% disagreed.
However, this optimistic view is clouded by the fact that a significant minority (40%) of respondents did not agree that they were “confident and knowledgeable about our place in the world of automation.” Why are so many industry professionals at best unsure about their company’s ability to realize value from investments in automation? Look no further than their own experience.
Only one in five of the professionals we surveyed indicated their automation systems had delivered “all that their vendors promised.” More than twice as many reported bad experiences and another third weren’t sure.
The survey did not attempt to ferret out all of the underlying reasons for these unmet expectations, but there are many possibilities. The control system may not be properly maintained; too many people might have access to make changes to the system; operators might lack adequate training, leading to bad habits like shelving frequent alarms.
Integration seems to be a common challenge. When asked if they’d “had issues integrating legacy systems,” only 16% of respondents disagreed, thus indicating a favorable outcome. More than half agreed or strongly agreed that they had experienced problems. Given how much of any technology’s value is attached to how well it plays with others, this is clearly an area for improvement for vendors and users alike.
Key(s) to success
Once a process is automated, management will look for opportunities to improve further. Objectives might include reducing energy consumption, increasing yield, or eliminating waste. This lets the company move from simply “doing more” to “doing better.”
Process industries are using automation to move beyond simply gathering and analyzing data to begin applying it. The technology has reached the point where it now produces “actionable intelligence” that leads to concrete steps that allow the company to do more and do better. The challenge now lies not in the technology per se, but in how it is used.
As the data reported here shows, the implementation of automation technology is at least as important as choosing and procuring it, probably more so. Applying the best practices of project management is a must, as is a focus on the integration of legacy systems. Take the time to understand the relationships and interdependencies between systems, and invest in training to ensure operators are getting the most out of the technology once it’s installed.
The chemicals industry is enjoying a period of renewed growth, but as automation technology continues to advance, the bar is raised ever higher. To compete in this market, it is no longer enough to gather and analyze data. Chemicals companies must leverage their automation investments to take action. Systems integration, operator training, and a holistic view of the process are some of the keys to successful implementation that closes the sense-analyze-act loop.
Automation - The next frontier of the U.S. chemical industry
The U.S. bulk chemical industry is poised for a period of strong growth as economic prospects across the globe brighten. This optimism is backed by, among other factors, a robust manufacturing sector that feeds demand and a booming shale industry that is likely to foster more investment.
The question then is: for an industry that is so sanguine about the future, where does the need for automation arise?
Companies in the petrochemicals, pigments and industrial gases sectors often miss out on branding their products as there is little differentiation. That, coupled with cut-throat competition across the globe, compels companies to maintain low costs while striving for wider margins.
In this scenario, automation serves a dual purpose of increasing efficiency at every level, and allowing companies to provide more value to customers through product consistency and on-time delivery. A survey of more than 400 industry professionals, conducted by trade magazine Chemical Engineering, revealed that even in this flourishing market, there is a tangible need for automation to stay ahead of the game.
The industry today
More than half of survey respondents expected their business to fare better going forward. But they cited intense global competition and rising raw materials and energy costs as major impediments to their success. The challenge of spurring growth while remaining competitively priced turns the spotlight on solutions such as automation technology, which can bring dramatic improvements even in areas the company does not directly control.
Perhaps without realizing it, survey participants implied that automation technology would be the best solution to the industry’s problems. Most respondents said new offerings and revamped production lines – which more likely than not will include automation – were the right fit to meet the challenges the industry has to tackle today.
Abound with opportunity but also uncertainty
The prospects of automation have been widely recognized in the bulk chemical industry. Almost 80 percent of respondents felt automation would have a decisive impact on profitability, yet many remain uncertain about how it fits into their future.
Only 14 percent of respondents were confident of their place in the world of industrial automation, while 30 percent were unsure of how their behavior would change to fit the shifting landscape. With such uncertainty rife in the market, it is not surprising that nearly half of all respondents were skeptical of the payoffs of automation.
Revamping for a digital future
Integrating legacy systems, standardizing technology and data collection, and providing adequate training to workers are some of the most effective ways to draw the full value out of automation.
While there is no single, clear answer to the question of what technology would take precedence in automation investments, respondents showed a preference for digital solutions. Programmable logic controllers, or industrial digital computers, took the top spot in the survey, followed by distributed control systems, online capabilities and wireless infrastructure. The responses indicate that the U.S. chemical sector is ripe for a Fourth Industrial Revolution where actionable data can take productivity to the next level.
In the competitive world we live in, it does not suffice simply to hasten processes; companies need reliable systems that can quickly adapt to changing market demands. New technologies such as the Industrial Internet of Things (IIoT), which is at the heart of the Fourth Industrial Revolution, can dramatically improve areas such as power quality and reliability, thereby minimizing unplanned downtime that often has costly consequences.
With annual sales of over $800 billion, the U.S. chemical industry is vital to the country's economic health, but optimism of its growth is tempered by heightened global competition that puts a premium on margins and cost. Still, a price-driven strategy can only take companies so far if they fail to harness automation capabilities that have the potential to redefine productivity at every level.
Innovation to transform the energy future
Digitalization and the Chemical Plant of the Future
From device to cloud, digital solutions promise an
answer to some of the industry’s biggest challenges.
The chemicals sector has seen a wave of investment in recent years, particularly in China (a continuation of the industry’s development there) and the U.S. (a result of persistently low prices for natural gas). However, the industry also faces serious challenges on multiple fronts. Digitalization, a long-term trend affecting every segment of the economy, represents both a challenge in itself and a means of addressing many others.
Elements of digitalization
Sensors and devices. The “things” where information originates, such as a pressure or temperature sensor, but also radio frequency identification (RFID) technology that uniquely identifies an object.
Edge computing. Often data needs to be processed at the edge to achieve speed or safety, as with a compressor anti-surge loop, a safety instrumented loop rated to a safety integrity level (SIL) or an electronic lock. Edge computing may happen in the device itself or across multiple devices. In process control, this is the distributed control system.
Connectivity is what ties the devices, edge and cloud together across many standards and systems into one homogeneous system. It may be integrated with the cloud such as with the ABB Ability™ cloud based on Microsoft Azure.
Analytics are the many applications that process data to deliver information about equipment diagnostics, logistics, inventory, and trends.
The cloud is the secure but open central repository where all information is stored, accessible to users and applications.
A 2017 report by Accenture and the World Economic Forum projects digitalization to deliver approximately $310 billion to $550 billion in value between 2016 and 2025. Benefits are anticipated across the enterprise, from R&D to plant operations, supply chain management and workforce performance.
“The chemical industry has an intrinsically sound business model,” McKinsey researchers observed in a March, 2017 report. “The industry as a whole is… positioned to profit from a wide range of trends, from sustainability to e-mobility, from commodity demand surges to major changes in consumer behavior.”
This paper examines some of the mega-trends impacting the chemicals industry and how leading producers are applying digital technologies to gain competitive advantage.
Power, charging and the electric bus business case
As EVs approach cost parity with gasoline-powered cars, in public transit, electric power is already there. A 2016 study conducted by New York’s Metropolitan Transit Authority and Columbia University found that while electric buses cost about $300,000 more than the diesel alternative, “savings are estimated at $39,000 per year over the 12-year lifetime of the bus.”
When health impacts (i.e., reduced respiratory problems from improved air quality) are considered, “the resulting health benefit… from the reduction of respiratory and other diseases is estimated at $150k per bus based on EPA data.”
That puts the overall lifecycle cost for diesel at $2.5m compared to $1.1m for electric.
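Those lifecycle figures are consistent with simple arithmetic on the per-bus numbers quoted from the MTA/Columbia study. The dollar amounts below come from the study as cited above; only the layout is ours:

```python
# Back-of-envelope check of the MTA/Columbia per-bus figures cited above.

upfront_premium = 300_000       # eBus costs ~$300k more than diesel
annual_savings  = 39_000        # operating savings per year
lifetime_years  = 12
health_benefit  = 150_000       # estimated per-bus health benefit (EPA data)

operating_savings = annual_savings * lifetime_years      # $468,000
net_advantage = operating_savings - upfront_premium      # $168,000 before health
net_with_health = net_advantage + health_benefit         # $318,000

print(f"Operating savings over life: ${operating_savings:,}")
print(f"Net advantage (financial only): ${net_advantage:,}")
print(f"Net advantage incl. health benefit: ${net_with_health:,}")
```

Even before the health benefit is counted, the operating savings more than recover the purchase premium over the bus’s 12-year life.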
King County, Washington, where Seattle is located, conducted a similar study on converting to an all-electric fleet and found “the societal costs from GHG and air pollutant emissions and noise are three times higher for a diesel-hybrid fleet than for a zero-emission fleet powered by renewable energy.”
The New York and Washington studies did not attempt to account for carbon emissions avoidance, but obviously putting a price on carbon would enhance the eBus business case even more. But as noted in the MTA/Columbia report, the financials already favor electric drive even without accounting for public health.
So, what’s not to love about the eBus?
In the long run, probably nothing, but it’s the transition that’s the trick. For starters, eBuses are still an emerging technology. The National Renewable Energy Laboratory gives eBus a 7 out of 9 on the “technology readiness level,” indicating it is being validated at full scale in a “relevant environment.” By comparison, diesel receives a TRL of 9 as a mature technology.
The eBus has a few important hurdles to overcome, perhaps the most critical of which is the issue of charging standards. Currently, there are two main ones for electric cars, but bus and charger manufacturers offer a variety of designs, some of which are proprietary. The industry is currently working to agree on and create standards.
In the meantime, there are other challenges faced by transit operators looking to go electric. Operational issues related to charging, complexities in operating a mixed fleet, resiliency during blackouts, and even real estate costs all weigh on the minds of would-be eBus operators.
From a charging standpoint, there are a few interrelated issues at work. First, there is a tradeoff between the capacity of the onboard battery, charging time and range. Any given transit system will have multiple options to achieve a desired level of service based on some combination of these. It’s just a question of optimizing how the pieces fit together.
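A back-of-envelope sketch of that optimization helps make the tradeoff concrete. The consumption and efficiency figures below are assumptions for illustration, not figures from this article:

```python
# Illustrative tradeoff among battery size, charger power and range.
# Consumption and efficiency values are assumed, not article-sourced.

def range_km(battery_kwh, consumption_kwh_per_km=1.3):
    """Typical transit eBus consumption is on the order of 1-1.5 kWh/km."""
    return battery_kwh / consumption_kwh_per_km

def charge_minutes(battery_kwh, charger_kw, efficiency=0.95):
    """Time to charge from empty, ignoring taper near full charge."""
    return battery_kwh / (charger_kw * efficiency) * 60

# A big battery gives long range but ties the bus to lengthy depot charging...
print(f"{range_km(350):.0f} km range, "
      f"{charge_minutes(350, 150):.0f} min on a 150 kW depot charger")
# ...while a small battery plus high-power opportunity charging tops up en route.
print(f"{range_km(80):.0f} km range, "
      f"{charge_minutes(80, 450):.0f} min on a 450 kW overhead charger")
```

The same service level can be met by either extreme, or by combinations in between; the planning problem is choosing the mix that minimizes total cost for a given route network.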
High-power charging is clearly the way of the future as it delivers more range in a shorter time. So, it’s advisable to plan for high-power infrastructure. For example, at present it’s difficult to retrofit a bus for overhead fast charging if the vehicle was not built with it. However, the cost per vehicle is negligible, so absent a compelling argument for opting out, it’s good practice to buy overhead-capable buses in order to take advantage of opportunity charging.
Similarly, scalability is vital as bus networks transition from pilot programs to fully electrified fleets. Charging systems using a modular design offer an easy way to increase charging capability by simply adding cabinets.
Efficient deployment of fast charging depends on the identification of charging locations that can serve multiple routes and charge multiple buses at the same time. It’s also good practice to distribute charging infrastructure to accommodate backup power, eliminate bottlenecks and reduce costs. (Don’t underestimate the impact that utility demand charges can have on operating costs.) That might include energy storage, which can help to reduce peak demand and ease the integration of bus charging with the surrounding grid.
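To see why demand charges matter, here is a simplified illustration of peak shaving with on-site storage. The tariff rate and depot load profile are hypothetical, and the sketch only caps the instantaneous peak, ignoring the battery's energy limit:

```python
# Simplified sketch of how on-site storage trims the demand-charge peak.
# Tariff rate and load profile are hypothetical.

def monthly_demand_charge(load_kw_by_hour, rate_per_kw=15.0):
    """Demand charge is billed on the single highest hourly draw."""
    return max(load_kw_by_hour) * rate_per_kw

def shave_peak(load_kw_by_hour, battery_kw):
    """Battery discharges to cap grid draw at (peak - battery_kw)."""
    cap = max(load_kw_by_hour) - battery_kw
    return [min(kw, cap) for kw in load_kw_by_hour]

# Three buses charging at once produce a short 900 kW spike.
depot_load = [120, 150, 900, 880, 300, 140]
print(f"Without storage: ${monthly_demand_charge(depot_load):,.0f}")
print(f"With 400 kW storage: "
      f"${monthly_demand_charge(shave_peak(depot_load, 400)):,.0f}")
```

Because the charge is set by a single peak hour, even a modest battery that clips brief charging spikes can cut the bill substantially.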
All of this good advice is in line with paying closer attention to total cost of ownership rather than upfront capital costs. That is really the core of the eBus business case, and it extends from the bus itself to the charging infrastructure that supports it.
Interview with Sebastien Buemi
Formula E champion driver
ABB became the title sponsor of what is now the ABB FIA Formula E Championship in 2018, the fourth season of the racing series. Sebastien Buemi took the driver’s championship at the end of Season 3, and talks about the technology and why he made the jump from Formula One.
“The main difference between Formula E and Formula One is down to the engine,” he says. “We have a battery that replaces the fuel tank, and we have an electric motor that replaces the combustion engine.”
Formula One cars are still a bit faster than their electric counterparts, but Formula E is evolving rapidly.
“If you look at the performance of the car, from zero to 100kph [in] around three seconds,” Buemi explains, “it’s quite impressive knowing we are just at the start of the electric revolution.”
The technology is indeed advancing rapidly. The cars Buemi and his fellow drivers used this year will be replaced next season with a Gen 2 car featuring new body geometry, greater battery capacity and more power. Most of the components on the cars, including the battery, chassis and aerodynamics, are standardized. However, the race teams are allowed to use their own designs with regard to the rest of the powertrain (i.e., motor, inverter and transmission).
This has led to a wide variety of designs as each team tries to find the winning formula. Some cars utilize a single motor, some two, and transmissions vary from a single gear up to as many as five. One of the reasons for leaving the powertrain open to individual development is a conscious decision on Formula E’s part to encourage development of technologies that will eventually be translated into passenger vehicles.
The transfer of technology developed within Formula E to consumer vehicles comes at a time when, as some motorsport commentators have observed, the flow of new technology from Formula One to road cars has slowed. New innovations are fewer and farther between while things like anti-lock brakes, traction control and stability control have by now made their way into even the most affordable models.
“The engineers [in Formula E] are spending lots of time and energy on developing the powertrain… and that’s where all the effort is going,” says Buemi. “We are not putting any effort into wings, trying to put more downforce on the car. It would make the car quicker, but it wouldn’t serve the normal automotive world.”
The race series, then, is like a big R&D lab, with teams competing to find the best solution to the challenges of racing.
As for the “user experience” of electric racing, Buemi says it requires a transition, coming from F1.
“When I first drove the car, it was really… kind of… special,” Buemi recalls. “You hear some mechanical noise when you drive very slowly but as soon as you go over 100kph you only hear the wind. That’s quite… stressful, I would say, at the start. But then when you get used to it it’s very peaceful.”
The absence of a roaring gasoline engine directly behind one’s head means that “your ears are fine after an entire race,” observes Buemi.
The Season 3 champion believes strongly not only in the future of electric racing but in the future of electric cars in general, with Formula E leading the way.
“It pushes the boundaries of electric cars,” says Buemi. “All the manufacturers are spending a lot of money developing electric powertrains, and in the next few years everyone will have electric cars. Clearly it’s a good tool to develop the technology.”
Season 5 of the ABB FIA Formula E Championship kicks off December 15 in Ad Diriyah, Saudi Arabia.
High-power electric vehicle (EV) chargers are coming to a station near you
The first 350 kW EV chargers are poised to hit the U.S. market
Today’s EV owners do most of their charging at home, but imagine a future where charging an EV is as convenient and commonplace as pulling up to a gas station. That’s the vision taking shape now as initiatives like Electrify America build national networks of fast-charging stations.
Re-charging your EV may not be as fast as re-fueling your gasoline vehicle—but it will soon be close. Electrify America is rolling out EV charging stations that will feature ABB’s Terra HP high-power chargers, the first 350 kW product on the U.S. market, which can refresh even the largest electric vehicle battery in under 15 minutes. Charging time for a range of 124 miles (200 km) is just eight minutes.
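A quick sanity check of those figures, assuming the charger sustains its full 350 kW for the whole eight-minute session (real sessions taper as the battery fills):

```python
# Energy delivered by a 350 kW charger in eight minutes, and the
# consumption implied by the quoted 200 km (124 mi) of added range.
# Assumes full power for the whole session, with no taper.

charger_kw = 350
minutes = 8

energy_kwh = charger_kw * minutes / 60          # ~46.7 kWh delivered
consumption = energy_kwh / 200 * 100            # ~23 kWh per 100 km

print(f"Energy delivered: {energy_kwh:.1f} kWh")
print(f"Implied consumption: {consumption:.1f} kWh/100 km")
```

The implied consumption of roughly 23 kWh per 100 km is in line with typical passenger EVs, so the quoted eight-minute figure is plausible at full charger power.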
ABB’s Terra HP chargers are compatible with both the CCS and CHAdeMO DC fast charging standards, allowing drivers to buy the EV of their choice with confidence that it will work at all of Electrify America‘s charging sites.
Electrify America will be deploying hundreds of these non-proprietary electric vehicle chargers within and around 17 metropolitan areas and along multiple nationwide highway corridors in an ambitious ten-year plan. The rollout is part of Electrify America’s $2 billion investment in Zero Emission Vehicle (ZEV) infrastructure, education/outreach, and access/exposure. It’s the largest investment of its kind to date and will include:
• Electric charging infrastructure in 17 metro areas including Boston, Chicago, Denver, Fresno, Houston, Los Angeles, Miami, New York City, Philadelphia, Portland, Raleigh, Sacramento, San Diego, San Francisco, San Jose, Seattle, and Washington, D.C.
• Charging stations placed at workplaces, retail centers, municipal parking lots, and office centers plus more than 100 Walmart locations across 34 states.
• The establishment of high-traffic highway corridors connecting metropolitan areas to facilitate long-distance EV travel. The charging sites in the corridors will be no more than 120 miles apart and will average 70 miles apart.
Building out the Electrify America charging system has the potential to ease consumers’ fears about owning an EV. Although modern EVs have been on U.S. roads for a decade, only about 200,000 were sold in the U.S. in 2017. A major hurdle is that the charging infrastructure hasn’t kept pace with what’s required to spur mass adoption of electric transportation. “Range anxiety”—wondering if there will be someplace to recharge batteries during a road trip—continues to be a deterrent.
Currently, about 15 percent of the 48,472 total public charging stations in the U.S. are DC fast-charging stations, reports EVAdoption. Tesla’s Superchargers run at up to 120 kW, but most publicly available chargers use the CHAdeMO and CCS standards and operate at 50 kW or less.
This makes sense given that until this past December, no electric vehicles on sale in the U.S. market could handle a faster charging rate. However, this picture is rapidly changing as the market races to catch up to demand. The Hyundai Ioniq Electric can accept charge rates of up to 100 kW, and the Chevrolet Bolt EV can charge at up to 80 kW over a portion of its capacity. And that’s just the tip of the iceberg. GM and Honda, for example, just announced a multi-year partnership to develop next-generation EV batteries that will “deliver higher energy density, smaller packaging and faster charging capabilities for both companies’ future products, mainly for the North American market.”
Bottom line, while electric vehicles may not be posting big sales numbers yet, auto companies are making significant investments in them. Some of the emerging vehicles will offer not only faster-charging capacities but a range of specs and tech features designed to exceed most gas-powered cars. To meet the expected wave of demand for EVs, the charging infrastructure is rapidly moving into position.
eBus Charging: a (very brief) primer
Understanding the many tradeoffs in charging system design is essential for any transit operator contemplating a shift to electric drive.
It’s an exciting time to be in the transport business. With battery costs coming down and energy densities going up, Bloomberg New Energy Finance predicts electric vehicles will reach cost parity with conventional cars within the next five years. Amid all the hype around EVs and driverless cars, though, it’s easy to overlook the impact that electrification can have on public transit.
Buses are just another type of EV, but with battery packs ten times the size of a typical passenger car (or more), and as part of a larger transit system, they have unique requirements.
First, let’s define the three types of charging for buses:
Depot charging happens… well, at the depot, usually overnight. Chargers in these applications can be either AC or DC and they run at relatively low power levels (10 kW–150 kW) because they have 5-8 hours to fully charge the bus’s battery.
Terminal charging happens during the day at a depot or remote terminal location, and is typically defined as taking no more than two hours.
En-route charging, as the name implies, occurs when the bus stops momentarily to pick up and drop off passengers. These applications require higher power to deliver a useful charge in such a short time, so to date all en-route chargers use DC and operate between 150 kW and 600 kW.
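The tradeoff among the three schemes is essentially dwell time versus power. A minimal sketch of the arithmetic, where the pack size, charger powers and top-up fraction are illustrative assumptions rather than vendor specifications:

```python
def hours_to_charge(battery_kwh: float, charger_kw: float) -> float:
    """Ideal charge time at constant power, ignoring losses and taper."""
    return battery_kwh / charger_kw

BUS_BATTERY_KWH = 300  # illustrative pack, roughly 10x a typical passenger car

# Depot charging: low power works because the dwell window is long
depot_hours = hours_to_charge(BUS_BATTERY_KWH, 50)  # 6.0 h fits a 5-8 h window

# En-route charging: a small (~5%) top-up per stop demands high power
topup_minutes = hours_to_charge(BUS_BATTERY_KWH * 0.05, 450) * 60  # ~2 minutes
print(depot_hours, topup_minutes)
```

The same pack that charges comfortably overnight at 50 kW needs roughly nine times that power to take a meaningful top-up during a passenger stop, which is why all en-route chargers to date are high-power DC.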
We should note here that while several en-route charging designs use overhead connections, retrofitting overhead charging on a bus that was manufactured without it is complex and adds cost. This is one of a number of challenges—standardization of charging connections is another—that suppliers of eBus rolling stock and charging infrastructure alike are working to overcome.
Transit operators are similarly in an early stage of adoption with regard to electric bus fleets. Choosing the right mix of charging schemes, buses, maintenance schedules and support systems is a daunting task, especially during the transitional phase when you might be running diesel, hybrid and all-electric vehicles simultaneously. In planning a charging approach, though, there are some important things to consider.
The system itself should have as small a footprint as possible, with extra points awarded for modular units that can scale easily by simply adding cabinets. An automated connection system, remote management capability and built-in redundancy are all highly recommended.
Something else to consider is energy storage. The batteries in the buses aren’t the only ones benefitting from the rapid advancement of technology, and strategic use of wayside energy storage systems can help address peak demand and ease integration with the surrounding grid.
Ultimately, though, having the right charging technology (or technologies) in place is something that transit operators will likely revisit often as they navigate the transition from fossil fueled vehicles to electric ones. Service needs will change, and technologies will continue to advance. And there’s always regulation…
The best advice for transit operators is to take a holistic approach and account for the full range of forces that impact system performance and cost. Pulling together engineering assessments, utility rate structures and even the topography of the service territory won’t be easy, but this too should get easier as the sector advances.
Why e-mobility needs the digital revolution
EVs are Inevitable, but Face a Few Key Hurdles
If you survey the field of punditry regarding electric vehicles, you might conclude that a battery-powered future is still a long way off and may not materialize at all. But ask anyone in the auto industry and the inevitability of electrified transport becomes clear. It all comes down to economics. EVs are already cheaper to operate than gas and diesel cars, and may soon be cheaper to buy, even without government subsidies.
We’re still relatively early on the adoption curve, though, and there are a few obstacles standing in the way. One, surprisingly, is the too-frequent unavailability of existing chargers.
As Ram Ambatipudi, VP of Business Development and Utility Engagement at EV Connect, observed at ABB Customer World (held in Houston March 4-7), the problem isn’t the charger itself. That equipment will last for decades with minimal maintenance. Far more often it’s a broken connector “nozzle,” some kind of IT failure in the charger’s communications, or maybe a burned-out display.
“The user experience for public charging today can be terrible,” he explained.
Even putting technical failures aside, the patchwork of charging network operators means dealing with multiple vendors’ apps and entering your payment info multiple times to use chargers on different networks.
On the supply side, meanwhile, the challenges facing utilities are building. While EVs represent a tantalizing new source of demand, the influx of large loads at the grid edge driven by imminent fleets of EVs presents a host of operational challenges.
Karen Hsu, Senior Director of Business Development for Utilities at eMotorWerks, an Enel X company, pointed out that a typical home sees its consumption shoot up by as much as 200% during peak times of electricity usage when an EV joins the family. The impact on the local distribution system can be substantial, but she also noted that while most residential chargers will be Level 2 going forward, the growth in high-power DC fast charging represents an even greater challenge for grid operators.
DC fast chargers operate at up to 350 kW today and could go higher. The impact is already visible in the “dragon curve,” a new term that describes the spiky top of a utility’s load curve that is created by DC fast chargers turning on and off over the course of the day. Managing the rapid swings in demand from EV charging, then, will require a more agile approach than the utility industry has historically used to match supply with demand. Grid storage—perhaps co-located with high-power chargers—demand response, and smart charging will all play a role.
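Co-locating storage with fast chargers smooths the dragon curve by capping what the site draws from the grid: the battery covers demand above the cap and recharges when demand dips. A toy peak-shaving sketch (the grid cap, battery size and load profile are made-up values, not a real site design):

```python
def peak_shave(load_kw, grid_cap_kw, battery_kwh, step_h=0.25):
    """Serve a spiky charger load while limiting grid draw to grid_cap_kw.

    The battery discharges to cover demand above the cap and recharges
    with spare headroom when demand is below it. Returns grid draw per step.
    """
    soc = battery_kwh  # state of charge; start full
    grid = []
    for demand in load_kw:
        if demand > grid_cap_kw:
            # discharge to cover the excess, limited by remaining energy
            discharge = min(demand - grid_cap_kw, soc / step_h)
            soc -= discharge * step_h
            grid.append(demand - discharge)
        else:
            # recharge with whatever headroom the cap leaves free
            charge = min(grid_cap_kw - demand, (battery_kwh - soc) / step_h)
            soc += charge * step_h
            grid.append(demand + charge)
    return grid

# Two 350 kW chargers cycling on and off over two hours (15-minute steps)
spiky = [0, 700, 700, 0, 350, 700, 0, 0]
smoothed = peak_shave(spiky, grid_cap_kw=300, battery_kwh=250)
print(max(spiky), max(smoothed))  # the 700 kW peak is held to the 300 kW cap
```

The grid sees a flat draw near the cap instead of 700 kW spikes, which is exactly the agility the dragon curve demands.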
The good news is that residential charging is a great candidate for load control. Most EV charging (80%) happens at home, and most of that at night. This offers the utility some flexibility in managing charging demand without encroaching on the EV owner’s expectations for vehicle availability. Furthermore, Hsu asserted that regions across the world are making ambitious plans to decarbonize both energy and transportation systems, creating an immense opportunity for EVs to serve as energy storage and enable a more cost-effective, cleaner electricity grid at a fraction of the cost of using only stationary storage.
Today, the average EV stays plugged into a home charging station up to 90 percent longer than the time needed to fully charge the vehicle, and smart charging can dynamically manage EV charging loads to balance grid demand, ultimately reducing wholesale energy costs and mitigating the intermittency of renewables. That’s going to be important because, as ABB’s Steve Bloch noted, industry analysts estimate EV charging demand could be as high as 733 TWh by 2030.
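Because the car sits plugged in far longer than it needs to charge, a smart charger can shift the session to the cheapest (or lowest-demand) hours of the plug-in window. A minimal greedy scheduling sketch; the hourly prices are invented for illustration:

```python
def schedule_charging(prices, hours_needed):
    """Pick the cheapest hours in the plug-in window to run the charger.

    prices: $/kWh for each hour the car is plugged in (illustrative values).
    Returns a list of 0/1 on-off flags, one per hour.
    """
    cheapest = sorted(range(len(prices)), key=lambda h: prices[h])[:hours_needed]
    return [1 if h in cheapest else 0 for h in range(len(prices))]

# Plugged in 10 pm - 6 am, but only 3 hours of charging actually needed
overnight_prices = [0.18, 0.15, 0.09, 0.07, 0.07, 0.10, 0.14, 0.20]
plan = schedule_charging(overnight_prices, hours_needed=3)
print(plan)  # charging lands on the three cheapest (off-peak) hours
```

A real implementation would also respect departure time and demand-response signals, but the core idea is this simple: the slack between plug-in time and charge time is the flexibility the utility can use.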
The future is here, and it’s electric.
New York sets an example for shift to EVs
Head of ABB Power Grids North America
Auto shows are invariably about the future, whether it is consumers contemplating their next dream machine or the carmakers showcasing the rides of tomorrow. But at this year's New York International Auto Show, the future has arrived in the form of the electric car, along with all the supporting parts of the electric car ecosystem.
In fact, with a team of New York state agencies and industry partners promoting electric vehicles at the auto show, in a city that has just agreed to "congestion pricing" to discourage gasoline-powered traffic in Midtown Manhattan, New York is poised to emerge as a national leader in the e-mobility revolution.
For proponents of e-mobility, the New York approach can be a model for speeding the local, state and national adoption of EVs with all the benefits in cleaner air and quieter streets this revolution can provide.
Forecasters, including the International Energy Agency, predict that by 2035 the share of EVs on U.S. roads could exceed 50 percent. And as ride-hailing services and fleet owners move to take advantage of the lower costs of operation and ownership that EVs can provide, the market share of electric vehicles seems destined to expand.
For electric cars, this new future is historical vindication. At the beginning of the 20th century, electric cars and gasoline-powered ones competed for market dominance. But battery technology back then didn't provide much cruising range, and there was no convenient system for recharging. With petroleum companies eager to set up networks of filling stations, gasoline-engine vehicles eventually dominated the market and the roadways.
But these days, the evidence of the momentum behind e-mobility includes the alluring array of showroom-ready electric cars on display at the show, which opens to the public on April 19. Many of the newest EVs can go 200 miles or more between charges.
Another leading indicator at the auto show: the high-power public fast-charging stations that will be on display there and are now being installed around the country. These stations can charge an EV in a matter of minutes. Being able to quickly recharge the battery away from home and even on long trips can eliminate the “range anxiety” that impedes mass adoption of EVs. Combine these new long-range EVs with an expanding network of high-power charging stations and the race toward electric mobility is on.
Attractive electric cars and fast chargers alone won't suffice, though. The e-mobility revolution also depends on the full support of public policy makers to speed the adoption of environmentally friendly transportation in ways that make economic sense for all involved. And it will depend on participation by power companies, supporting investment in grid modernization and transmission technologies to support the increased load.
Teaming up at the auto show to promote EVs are the major utility ConEdison, the New York Power Authority and the New York State Energy Research and Development Authority. This team's efforts reflect the broader utility industry support of EVs in New York and include state programs such as EVolve NY, a $250 million infrastructure initiative Gov. Andrew M. Cuomo announced last May to "aggressively accelerate the adoption of electric vehicles throughout New York state."
Another state program, Charge NY, is promoting installation of public-access chargers across the state, including along highway corridors and at airports, with a goal of 10,000 additional charging stations by 2021. These initiatives complement the growing networks of charging stations around the region and the country being installed by larger commercial stakeholders such as Electrify America and EVgo.
Vehicle owners — whether individual consumers or fleet operators — are more likely to choose EVs when they realize that the overall cost of owning and operating an electric car can be lower than owning and driving one with an internal combustion engine. And their range anxiety disappears once they know a thriving, affordable charging ecosystem will be available.
Utility companies will continue to invest in updating the grid to accommodate EV charging infrastructure if they see sufficient demand — and if utility regulators let them make the investments. At the same time, the owner-operators of charging networks can continue extending them around the country if they know the utilities will have the regulatory flexibility to price the electricity at levels that make the networks economically viable.
Meanwhile, automakers will continue investing in EV research, development and production to produce models that are increasingly affordable, long-range and fun to drive, if they know there will be ready buyers for the vehicles.
That's why it matters that in New York, the key players are demonstrating leadership with their commitment to electric transportation at this year's auto show.
The EVs are coming. But only with a comprehensive, collaborative approach to the EV infrastructure — from grid to wheels — can the e-mobility revolution reach its full potential.
YuMi takes center stage in Pisa
YuMi, the collaborative robot, takes center stage in Pisa, conducts Andrea Bocelli and Lucca Symphony Orchestra
Help Wanted, and Lots of It, on the Modern Factory Floor
Robotics maker ABB has 24,000 employees in the U.S., and more job openings than people ready to fill them.
President of the Americas region, ABB
As the U.S. economy continues its long recovery from the Great Recession, one bright spot has been a sector that, to listen to the cynics, you might think had packed up and fled the country. I’m talking about manufacturing.
The truth is, U.S. manufacturing is a growth business. Since the end of the recession in 2009, manufacturing employment in this country has risen by nearly 1.3 million workers, to a total of more than 12.7 million jobs. And a quarter-million of those manufacturing jobs were added in the last 12 months alone, according to the latest government employment report.
That’s why this week, in celebration of National Manufacturing Day, many of the 14,000 member companies of the National Association of Manufacturers are holding public events all over the country and even opening their factory doors to visitors, to help spread the news: Manufacturing, powered by sophisticated digital technology, has made a comeback in the United States and is providing some of the country’s best, and best-paying, jobs.
In fact, U.S. manufacturing – everything from making precision parts and electric motors, to building automobiles, airliners and industrial robots – is so robust that one of the biggest challenges for employers is finding enough people to fill these good jobs. The latest government figures indicate there may be as many as a half-million unfilled manufacturing job openings in the United States right now.
Take Greenville, S.C., where my company, ABB, and a number of other global manufacturing giants have modern factories. That includes General Electric, BMW, Michelin and Bosch Rexroth. In Greenville, all these companies and others are engaged in a continuous, spirited competition to recruit, train and retain the employees we need to run our sophisticated, highly computerized factories.
It’s a challenge faced all over the country for companies like ABB, which though based in Switzerland employs 24,000 people at more than five dozen manufacturing or research-and-development sites around the United States. At virtually all of our U.S. locations, we have more job openings than people ready and able to fill them.
Why do so many of those jobs go unfilled – despite the fact that manufacturing pays more in hourly earnings on average than in any other industry in America?
We think part of the problem is a public misconception of what modern manufacturing involves. And so, part of Manufacturing Day’s mission is to help middle-school and high-school students – and their teachers and parents and even grandparents – understand how profoundly things have changed in U.S. factories from a generation or two ago. The old images of manufacturing work – dirty, dark and dangerous – no longer hold.
Today’s factories are clean, well-lit, air-conditioned and highly computerized workplaces. And because the sophisticated manufacturing that is done in these plants is not the sort of work that can be done by cheap, unskilled labor, it is not susceptible to the “offshoring” that devastated so many American manufacturing communities a generation ago.
That’s what our industry wants people to see firsthand with this week’s open-house events. But raising public awareness is only the beginning of the educational challenge.
More crucial is supporting the education and skills-training opportunities that can enable more people to enter the manufacturing professions as a firm pathway to the American middle class. And it means creating a work culture in which life-long learning and retraining are not only expected but encouraged – and enabled by supportive employers.
A recent study of 25 industrialized nations, “The Automation Readiness Index,” found that too few countries are adequately preparing the workers of tomorrow for careers in an economy where robotics-driven automation and artificial intelligence will augment human work. The good news from the report is that, contrary to the scare-mongering headlines, the robots are not taking our jobs. What the robots and other forms of automation are doing is changing the nature of human work.
Today’s automation means that the highly repetitive or physically arduous tasks that a generation ago went to cheap labor outside the United States can now be done by machines. What’s needed these days are the human skills of creativity, critical thinking and decision-making that only people – educated, trained people – possess.
We still need people, and lots of them, to design, program, operate and maintain the machines of modern manufacturing. We also need people who are capable of building the machines – as 600 ABB employees are doing at our robotics plant near Detroit in Auburn Hills, Mich.
The Automation Readiness Index study from the Economist Intelligence Unit and ABB earlier this year found that the countries doing the most to prepare their workforces – South Korea, Germany and Singapore – are combining government programs and corporate initiatives to update school curriculums, provide occupational training and support a system of continuous learning throughout workers’ careers. By such measures, the United States ranked only 9th in the Readiness Index.
But American manufacturers aren’t sitting around waiting for federal programs to solve the problem. Many of us are working with educators in our local communities to give people the skills they need for modern manufacturing and to encourage them to continue learning and advancing.
In Greenville, for instance, our company and others have developed joint programs with the local community college, Greenville Tech, to provide training and certification. One of the most successful of these is a 150-hour course that qualifies Level 1 machinists to operate the CNC – Computerized Numeric Control – machines that do much of the precision work in today’s factories. ABB is eager to hire the best graduates from this program, and so are all our local competitors. And we also pay the tuition for our employees who want to pursue their two- and four-year degrees or beyond.
U.S. manufacturers are engaged in similar efforts in their communities all across the country, knowing that the workers of today and tomorrow are looking for jobs that involve leading-edge technologies, continual on-the-job advancement and meaningful careers.
Together, as companies and communities, we can embrace the new economic opportunities that modern manufacturing offers the people of the United States. That’s a message we’re eager to spread and proud to share – not only on Manufacturing Day, but every day.
Workforce readiness in the 21st century
The convergence of robotics, traditional automation and AI is rewriting the rules for manufacturers and their employees.
There are nearly 12.5 million manufacturing workers in the United States, accounting for 8.5 percent of the workforce and 11.7 percent of GDP in 2016. If the U.S. manufacturing sector were a country, it would rank as the ninth largest economy in the world. Manufacturing remains a vital part of the U.S. economy, but the sector is beginning a dramatic change.
In a December 2017 report entitled “Jobs Lost, Jobs Gained,” McKinsey estimates that 60 percent of all occupations today are susceptible to nearly one-third of their work activities being automated. The report also projects that 8 to 9 percent of the labor force in 2030 will be employed in new occupations that have not previously existed. Meanwhile, employment data provider Paysa estimates that U.S. companies spent more than $650 million in 2017 on salaries for 10,000 new jobs, all within artificial intelligence (AI). 
The net effect of automation on employment in the coming years will be profound, but not in the way many people think. The World Economic Forum reports that while automation will displace 75 million jobs globally by 2022, it will create 133 million new ones. 
The challenge, then, lies not in the number of jobs but in the nature of work. Connecting workers with jobs, though, will require significant effort from all stakeholders.
With a high rate of automation adoption, 14 percent of the global workforce will likely need to transition to new occupational categories and learn new skills, though the effects of automation on workers will vary by sector and region.  Without sufficient and accessible training and placement assistance, the new jobs of the future will remain unfilled and unemployment could rise, despite job openings.
We’ve seen this before
There are many historical examples of technological disruption leading to economic disruption. In most cases, initial fears about workers being replaced by machines were tempered by the increases in both productivity and employment that these innovations brought about. For example, despite fears of tremendous job loss, the arrival of the personal computer is estimated to have created 15.8 million net new jobs in the U.S. since 1980. 
More recently, automation has started to shift toward making workers more productive in the jobs they already have. This is perhaps the most fundamental difference between the automation of the past and that of the future.
Robots, for example, are still assembling cars, but today they are doing much more. Pharmaceutical firms, for example, use robots to maintain hygiene in the packaging process, using them even to clean other equipment.
In 2018, ABB introduced Txplore (Fig 1), a submersible robot about the size of a football designed to perform internal inspections of large power transformers without draining the insulating oil. The unit streams video in real time for analysis using cloud-based applications. The robot displaces the labor needed to drain the insulating oil, and lowers environmental and safety risks. It also creates a need for trained technicians and engineers to interpret the data and support advanced maintenance programs.
Automation vs AI
Artificial intelligence and automation, strictly speaking, are two separate things, but AI underlies the fundamental shift in automation currently underway. Where automation makes processes faster and more consistent, AI allows machines to make evaluations, provide actionable intelligence for decision support, and even make decisions automatically in pre-defined contexts.
As a recent Accenture report demonstrates, the most significant impact of AI won’t be on the number of jobs but on the content of those jobs. The report estimates that human/AI cooperation could boost business revenues by 38 percent in the next five years, generating higher levels of profitability and employment. For the average S&P 500 firm, that equates to $7.5 billion in new revenue, $880 million in profits, and a 10 percent increase in employment.
Reaching this potential, however, will require major changes in how we view, fund and prioritize education and worker training.
Education, (re)training and the skills gap
To fully realize the opportunities being created by today’s rapidly developing automation technology, we need a concerted approach to re-think education and training at every level. That is one of the findings in the Automation Readiness Index (ARI), an ABB-sponsored study conducted by the Economist Intelligence Unit in 2018.
The study assesses the position of 25 countries with regard to their ability to adapt to the new realities of a digitalized, automated workplace. It notes that education should be a continuous lifelong process, that more of it should focus on vocational programs, and that the development of “soft skills” is just as important as STEM education.
“Soft skills” refers to capabilities that are difficult to automate, and as the ARI report observes, the workers who possess and develop these skills (e.g., interpersonal communication, analysis, creativity) will be better positioned for employment opportunities in the workplace of the future. Educators, however, often lack the training, experience, classroom technology or curriculum to best serve their students’ needs with regard to future employment.
So, what about employers? According to a 2018 McKinsey study, American employers are aware they have a training mandate. Of the more than 300 executives surveyed (all at companies with more than $100m in revenue), 64 percent said they will need to retrain or replace more than a quarter of their workforce between now and 2023 due to advancing automation and digitization. 
The same percentage said they believe corporations, not governments, educators or individual workers, should take the lead in trying to close the looming skills gap. The government still has a vital role to play, however, in working with employers to fund training programs and structure their curriculum. The key is partnership.
Nevertheless, there is a significant gap between the investment required and what is being done. Between 1993 and 2015, U.S. spending on workforce training programs as a percent of GDP fell by 62 percent.  This raises questions about why spending doesn’t seem to be keeping up with what the business community agrees is necessary in this time of tremendous change.
A report in the Atlantic magazine points out that publicly funded training programs, as currently implemented, rarely succeed in moving large numbers of workers into new, better jobs. This is often due to a misalignment with employer needs. Meanwhile, many workers aren’t aware of available programs, or are excluded from them either by rule (e.g., if they worked in an industry not targeted for re-training) or by circumstance (e.g., due to a lack of transportation).
The U.S. has met workforce challenges before. The high school movement in the early 20th century drove investment in expanding secondary education and for the first time required all students to attend. The results were dramatic: enrollment of 14- to 17-year-olds rose from 18 percent in 1910 to 73 percent in 1940.  The returns to a worker for having achieved even one year of high school or college were substantial. As early as 1915, this amounted to around 11 percent higher income for men and 12 percent for women. 
In the post-war period, the GI Bill had a similar impact at the college level, allowing millions of veterans to earn bachelor’s and advanced degrees, and it’s still working. The post-9/11 update to the GI bill boosted college enrollment by three percentage points, according to a 2017 study conducted by New York University.  It will take a commensurate effort to address the skills gap challenge of the early 21st century.
There are success stories at the local level, but these will need to be scaled up and magnified. Pittsburgh, for example, remade itself after the collapse of the U.S. steel industry through a transition to a knowledge-based economy and now is home to a variety of firms in biotech and other sectors. Other regions have made similar transitions by harnessing local intellectual capital, often found in universities, coupled with private sector R&D and local governments willing to ensure workforce training meets the new demand.
Following are policy recommendations centered on re-thinking and expanding our notions of training and education to address the employment skills gap challenge.
• Broaden funding—including employer tax credits—to include more technical education, two-year programs, professional certification programs and apprenticeships.
• Increase outreach to younger (e.g., middle school) students to encourage awareness of and interest in STEM-related programs when they reach high school.
• Involve industry in curriculum development, especially at the high school and post-secondary levels, to ensure it addresses employers’ needs.
• Increase funding for apprenticeship programs for both high school and community college students.
The challenge U.S. workers and employers face is significant, but we have successfully navigated similar technological disruption before. We are well-positioned to succeed again as there is already broad agreement on what the problem is and how to address it. We need only exercise the will—political and otherwise—to put proven solutions into action.
“Manufacturing Facts,” NAM website, August 2018.
McKinsey, “Jobs Lost, Jobs Gained,” December 2017.
Ellyn Shook and Mark Knickrehm, Accenture, “Reworking the Revolution,” 2018.
Lolade Fadulu, “Why Is the US So Bad at Worker Retraining?” The Atlantic, January 4, 2018.
Pablo Illanes, Susan Lund, Mona Mourshed, Scott Rutherford and Magnus Tyreman, McKinsey, “Retraining and Reskilling Workers in the Age of Automation,” January 2018.
“Steinhardt Study Finds More Veterans Have Enrolled with Post-9/11 G.I. Bill,” NYU website, August 9, 2017.
Economist Intelligence Unit, “Automation Readiness Index: Who is ready for the coming wave of automation?” 2018.
WEF press release, September 17, 2018.
“Automation Jobs Will Put 10,000 Humans to Work, Study Says,” fortune.com, May 1, 2017.
The ABB Ability™ technology powering smart cities
How ‘It just works’ simplicity is coming to the Industrial Internet of Things
Guido Jouret, Chief Digital Officer, ABB
One common dramatic meme since the dawn of the Digital Age is an executive picking up a telephone handset from a desk and telling the product team, “See this? It just works. That’s the kind of simplicity our customers want.”
Here in the early days of Industry 4.0 – optimizing business by converging the physical and digital worlds – “it just works” is a someday dream. Complexity grows with every new layer of robotics, automation, AI, machine learning, sensors, analytics, cloud, blockchain, 5G, edge technologies, etc., and the necessary but occasionally bumpy harmonizing of Operational Technology with Information Technology.
It’s fair to say that in the complexity-to-simplicity wave cycle, IIoT right now is in a complexity trough. But things are happening that will bring us to the peaceful shore of “it just works.” In fact, the complexity-additives I listed in the preceding paragraph will provide the eventual platform for simplicity.
Recently I met with leaders of a global energy company for which ABB builds IIoT solutions. They turned my theoretical understanding of customers’ hatred of complexity into a very real gut punch: “We operate really complicated factories and plants with the machinery you sell us, and we’ve got a problem,” an executive told me. “The day we know the most about everything you’ve sold and deployed for us is the first day it’s turned on. Then it all goes downhill. Industrial Alzheimer’s sets in: people retire, we forget the right procedures, know-how erodes, and mean time to repair goes up.”
They weren’t happy, and I was thinking, “You think this machinery is hard to operate now? You haven’t seen anything yet. Wait ’til we start adding the digital technology and sophisticated devices you need to stay competitive. We’re going to add a lot of knobs and dials and dashboards to this equipment, and then the robots are going to show up…”
It’s not just the energy company that’s worried about complexity. IIoT is expanding across all industries as business undergoes digital transformation on a global basis. According to Gartner, there are currently more than four billion connected business devices worldwide, a number that will grow to 7.55 billion by 2020. When you add connected consumer devices (12.9 billion by 2020), there will be more connected devices on Earth than people.
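The arithmetic behind that last claim is easy to check. A quick sketch using the Gartner estimates quoted above (the roughly 7.7 billion world-population figure for 2020 is my own assumption, not from the article):

```python
# Gartner estimates quoted above, in billions of devices by 2020.
business_devices = 7.55
consumer_devices = 12.9
# Approximate 2020 world population in billions (my own assumption).
world_population = 7.7

total_devices = business_devices + consumer_devices  # roughly 20.45 billion
print(total_devices > world_population)  # True: more devices than people
```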
We’re facing a growing skills and expertise gap as the machinery of the modern economy gets more complex. By 2030, workers in developed countries will need to spend 55 percent more hours than today on digital and advanced IT tasks, according to McKinsey. Workers are not the only ones unprepared: 20 percent of companies told McKinsey their executive teams lack sufficient knowledge to lead adoption of automation and artificial intelligence (most of the other 80 percent are in denial), and more than a third of companies fear the shortage of workers with requisite digital skills will hurt their financial performance (the other two thirds: denial).
Dire? Yes. But here’s some good news: aware of both the growing global expertise gap and how complexity hobbles productivity and profitability, innovators have charted a course to IIoT simplicity. That journey has begun. Here are four developments – listed, appropriately, in ascending order of complexity – that will bring us ever-closer to “it just works” in IIoT:
1) Adoption of universal standards
Remember when the idea of streaming TV was far too complex for most people? Netflix was the pioneer in bringing simple streaming by using standard Internet Protocols instead of building complex new or proprietary technology. Netflix said, “We already know how to get Internet service to your house. We’re just going to add video streaming on top of all the existing underlying technology that delivers the Web.”
In the same way, IIoT companies are wisely adopting such standards as IEC 61850 for defining communications protocols for intelligent electronic devices in utility grids; OPC UA for machine-to-machine communication in factories; TCP/IP for network devices; and 5G (see below for exciting IIoT 5G applications). Adopting universal standards is the digital equivalent of making sure one screwdriver fits all the screws.
2) Modularity

The shipping industry pioneered modularity with the shipping container, which allowed trucks, railroad cars, and cranes to be designed exactly to its dimensions. The result was a new, modular generation of efficient logistics.
Now modularity has come to multiple industries. We can put all the complex machinery of an electric utility in a box to build micro-grids, and if customers need more capacity, they simply add more boxes. We’ve also put a refinery inside a shipping container.
Jay Rogers is CEO of modularity pioneer Local Motors, which operates microfactories in the U.S. and Europe to produce crowdsourced electric autonomous vehicles, including Strati, the world’s first 3D-printed car. He argues that modular microfactories let manufacturers achieve long-sought micro-customization: running through multiple versions of products, testing them quickly with customers, and making more of what sells while quickly abandoning unpopular mistakes.
Rogers told me he experienced his modularity ah-ha moment while working with the U.S. military in Iraq and Afghanistan, when he realized it would cost more money and time to wait for shipment of sorely needed spare parts to military bases than it would for onsite modular microfactories to produce a wide range of spare parts with a robot, a 3D printer, and a CNC (computer numerical control) machine for cutting, carving, machining, and milling, from prototyping through full production.
There are modular pharmaceutical, chemical, even cement microfactories being sited in – and delivering economic boosts to – remote geographic areas around the world where large conventional factories would never be built.
Beyond the boon of simplicity, modularity is demonstrably a more agile, profitable way to do business. Modular units – often built using an ingenious breakthrough concept called generative design – are faster to deploy and cheaper to operate because everything is pre-thought-out, prefabricated, and pre-integrated. Complexity is collapsed into repeatable building blocks.
Data compiled from industrial use cases across many of our global-brand customers point to modularity’s enormous transformational benefits:
- Operating expenses (OPEX): -20 percent
- Downtime: -50 percent
- Energy use: -30 percent
- Logistics costs: -30 percent
- Capital expenses (CAPEX): -40 percent
- Engineering: -50 percent
- Commissioning: -50 percent
- Time to market: 50 percent faster
- Real estate footprint: 50 percent smaller
Since complexity is inside the box, all workers really need to know is where to plug it in. Want more capacity? Just add another box. Modular micro-whatevers don’t need to be designed and built from scratch every time, and customers appreciate the reliable, efficient design that…just works.
3) Tele-operation

Here is where the near-zero latency of 5G comes in. Combined with cameras and augmented reality technology, call centers will monitor equipment and beam in an expert instantly to guide onsite workers through repairs or operations. This both works around the expertise gap and helps close it: remote experts working through 5G and AR can “train” onsite workers as they team up to head off or resolve problems.

Tele-operation is also useful with relatively unintelligent autonomous vehicles such as the Automated Guided Vehicles (AGVs) now roaming Amazon warehouses picking and packing, and Walmart aisles restocking shelves. AGVs are relatively simple and get stuck easily; an AGV will come to a halt if it encounters a cardboard box someone left on the floor. Tele-operators are alerted, make sure there’s no real danger, and remotely guide the AGVs back to work.
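The hand-off just described, in which a halted vehicle alerts a remote operator who then clears it to resume, can be sketched as a tiny state machine. The class and method names below are my own illustration, not any real AGV API:

```python
class AGV:
    """Minimal sketch of a simple automated guided vehicle."""

    def __init__(self, vehicle_id: str):
        self.vehicle_id = vehicle_id
        self.state = "running"

    def on_obstacle(self) -> str:
        # A simple AGV halts on any unexpected obstacle...
        self.state = "halted"
        # ...and raises an alert for a remote tele-operator.
        return f"alert:{self.vehicle_id}"

    def remote_resume(self, path_is_safe: bool) -> None:
        # The tele-operator confirms there is no real danger and,
        # if the path is safe, guides the vehicle back to work.
        if path_is_safe:
            self.state = "running"

agv = AGV("agv-17")
alert = agv.on_obstacle()           # vehicle halts and sends "alert:agv-17"
agv.remote_resume(path_is_safe=True)  # operator clears it; state is "running"
```

The point of the sketch is that all the judgment lives with the human: the vehicle itself only knows how to stop and ask for help.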
4) Autonomy/Artificial Intelligence & the Autonomy Ratio™
Tele-operation of increasingly intelligent autonomous vehicles in IIoT – trucks, cars, ships, mining machines, etc. – leads to something I’ve started to think of as a Moore’s Law of autonomy: the Autonomy Ratio™. (I put the ™ symbol on jokingly, but if the concept catches on, remember: you read about Jouret’s Autonomy Ratio here first.)
Going back to Number 3 for a moment, take the example of simple-minded AGVs roaming a Walmart that can be put out of service by an empty cardboard box lying on the floor. Because all the “intelligence” resides in the tele-operators, it might take 10 tele-operators to help Walmart operate 100 AGVs, so the Autonomy Ratio is 10:1.
But now let’s think about intelligent autonomous vehicles enhanced by AI – industrial robots, trucks, ships, city buses, drones, cars, etc. These smarter autonomous vehicles would have a higher Autonomy Ratio – fewer remote people needed to operate each machine – because AI imbues them with onboard intelligence.
For example, one challenge with early autonomous cars and trucks: if they come to a parked vehicle in their lane – say, a UPS truck making a delivery, or a beer truck whose driver has gone into a sandwich shop for lunch – they just stop and won’t proceed. A “simple-minded” autonomous vehicle will at this point call for a tele-driver, who can assess the situation remotely and move the autonomous vehicle around the stopped truck. Autonomy Ratio: 10:1 again.
But if AI teaches the autonomous vehicle a few rules – say, “wait two or three minutes, and then proceed cautiously into the next lane,” or “use your sensors to detect the body heat of a human driver in the vehicle blocking yours, and if you don’t detect one, then cautiously begin to pass” – then the company would need fewer tele-drivers because the autonomous vehicles themselves carry more intelligence. In this case, a delivery company could operate 100 autonomous trucks with just one tele-driver, raising its Autonomy Ratio to 100:1.
The higher the Autonomy Ratio, the greater both a company’s autonomy prowess and the simplicity (and lower cost) of operating its IIoT machines.
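As arithmetic, the Autonomy Ratio is simply machines per remote operator. A minimal sketch (the function name and interface are my own illustration, not a published definition):

```python
def autonomy_ratio(machines: int, tele_operators: int) -> float:
    """Machines operated per remote human operator; higher is better."""
    if tele_operators <= 0:
        raise ValueError("need at least one tele-operator")
    return machines / tele_operators

# Simple-minded AGVs: 10 tele-operators for 100 vehicles.
print(autonomy_ratio(100, 10))  # 10.0, i.e. a 10:1 ratio
# AI-enhanced trucks: 1 tele-driver for 100 vehicles.
print(autonomy_ratio(100, 1))   # 100.0, i.e. a 100:1 ratio
```

Tracking this single number over time would show whether added onboard intelligence is actually reducing the remote-operations headcount per machine.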
The four steps I just outlined will help today’s workers and those coming behind them bridge the global expertise gap by educating and training workers to do the increasingly technical work required at every level of Industry 4.0. They will not replace human workers, but augment their work. The jobs IIoT will eliminate are, for the most part, dirty, dangerous, and often demeaning – jobs people would prefer not to do.
Tele-operation, for example, provides hands-on, applied job-site training: the worker is standing in front of a problem or potential problem, and the tele-operator, even half a world away, is teaching the worker how to solve the problem by transferring expertise at the human/machine interface.
The technology enabling a cell phone call is bewildering to all but deep domain experts. But the rest of us don’t even have to remember people’s 10-digit phone numbers any more. We just click on someone’s name to call…because the industry wisely created simplicity atop the complexity.
In the same way, the four steps above – now underway and/or in testing phases – are building simplicity atop the witheringly complex technologies needed to integrate the physical and digital worlds of IIoT – the billions of people, systems, and machines that produce and ship things, refine and create energy, and provide transportation more efficiently, reliably, and sustainably than ever before.
Though we can see this simpler, more productive, and rewarding future for IIoT clearly, it’s impossible to say exactly when it will arrive. But it will. I’m confident that someday soon our customers will look across their complex global operations and observe reassuringly to themselves, “It just works.”
Coming to a Site Nowhere Near You: Virtual & Augmented Reality
What do you do if a vital piece of equipment fails in a remote facility and there’s only one factory in the world that services it? You bring the factory to the site.
That’s the value prop for various applications of augmented reality (AR) that ABB showed last week in Houston at ABB Customer World 2019. One uses HoloLens glasses to equip field technicians with a broad arsenal of tools. The device can identify equipment by its size and shape, provide technical drawings and maintenance information on-screen on mobile devices (and on-lens in the glasses), and facilitate real-time collaboration with experts anywhere in the world.
“You can’t be an expert in everything,” says ABB’s Craig Stiegemeier. “We work with 109 different designs of tap changers [a device found in power transformers] so this technology allows us to deliver all the information for the customer’s specific unit to that worker in the field.”
AR-equipped field devices are also useful even if the manufacturer sends service personnel out to the site.
“It amounts to ‘we’re coming, but show me what you have,’” says Kim Fenrich, Industrial Automation, Simulation Services Product Manager. “That way our people know what equipment to bring and what to expect when they get there.”
While remote collaboration might be the killer app for industrial AR and VR applications, safety is perhaps the most compelling benefit. At the same event two years ago, ABB introduced a research and development project that produced a small submersible robot equipped with high-definition video cameras to perform inspections of power transformers.
Instead of draining the insulating oil and sending a person into the transformer tank—a time-consuming and dangerous process—the technician pilots the robot using virtual-reality goggles to see what the robot sees. The whole process takes a few hours, compared to multiple days using the traditional approach. More importantly, it eliminates the need to send human workers into a dangerous, potentially deadly environment.
This application is now available as TXplore™, ABB’s transformer inspection service. A demo on the exhibit floor invited attendees to try their hand at piloting the sub through a mockup of a transformer submerged in a tank filled with oil. It takes some practice, especially if you use the goggles instead of just looking through the clear walls of the tank (transformer tanks are made of steel, after all).
These are just a couple of examples of the kinds of products and services that VR and AR make possible. Expect to see many more in the coming years as the underlying component technologies evolve.
Paying Attention to the Ridiculous
Lessons on innovation, technology and disruption in the digital age
One of the more thought-provoking discussions at ABB Customer World (held in Houston Mar 4-7) was a panel featuring ABB Chief Technology Officer Bazmi Husain, Chief Digital Officer Guido Jouret and EVgo Chief Technology Officer Ivo Steklac. The trio discussed technology, innovation, industry disruption and the value of low-dose paranoia during a lively hour moderated by ABB’s Allen Burchett. The session covered a lot of ground, but several key themes emerged.
The migratory patterns of technology have changed
Those of us of a certain age remember a time—it wasn’t that long ago, really—when exciting new technologies developed for industry would eventually make their way into consumer products. The rise of digital has inverted that process.
“Everything comes to our wrist before it goes anywhere else,” Husain observed.
Jouret added that the consumer tech industry has taken over what used to be the domain of large companies, and government before that. R&D at companies like Bell Labs, IBM and Xerox—whether on behalf of the government or their own product development efforts—produced a fantastical parade of technological innovation. Then things started to change.
On a visit to Xerox’s Palo Alto Research Center in 1979, Steve Jobs is famously reported to have seen prototypical versions of networked computing, the graphical user interface and the mouse. The experience blew his mind, as the story goes, and shaped the development of the Macintosh into the machine that made its debut in that iconic Super Bowl commercial years later.
It’s a great Promethean story, stealing fire from the tech gods and giving it to the people, and even if it’s not exactly true it illustrates the emergence of consumer industries as the new wellspring of tech innovation. Xerox had developed all of what Jobs saw but failed to see the potential in the technology they’d created and put it on the laboratory shelf.
In 2019, forty years on from Jobs’ pilgrimage to Xerox PARC, the flow of tech from consumer applications to industry is manifest in more ways than we could have imagined. ABB’s TXplore™ service, for example, uses a submersible cloud-connected robot equipped with multiple cameras to inspect the oil-filled inside of large power transformers. It’s a dramatic improvement in downtime, cost and safety from the traditional method of draining the oil and sending a person into the tank. And the hardware used to pilot the robot? It’s a video game controller—an inexpensive, reliable and easily sourced device that is already familiar to many a field technician.
From handheld devices to cloud-based services, technology first commercialized for consumer applications is being adapted to industrial purposes at a dizzying pace, and this is likely to continue because some industries in particular have a lot of catching up to do.
We are living in the golden age of industrial digitalization
If you plot various industries along the classic technology adoption S-curve, you’ll find sectors like consumer finance, media and telecom well into the upper right quadrant of the graph. These industries have been the early adopters, and have seen the most disruption (think online banking, mobile phones, Netflix…).
So why hasn’t digital rewritten other industries like mining or manufacturing in the same way?
Jouret answered this question by pointing out that connectivity challenges, the need for much larger computing capacity and the fact that these industries are more complex than consumer markets have all conspired to slow the pace of digitalization in many of ABB’s client industries.
Now the innovations in the consumer world are the building blocks for “old economy” industries to realize the same potential that digital has brought to other sectors, but there is also danger ahead.
Be concerned about the ridiculous
History is littered with the wreckage of companies and entire industries that didn’t see their disrupters coming. The cardinal sin in most of these cases was a kind of myopia, seeing competition as coming only from existing competitors. The message for digital latecomers is that the real threat lies not in the offices of your rival but in a suburban garage or college dorm room.
“You will be disrupted by someone outside your market, by companies you’re not even aware of,” warned Jouret. “You should be concerned about the ridiculous.”
The problem is that we humans aren’t wired to see disruption coming. By the time we recognize the potential of a new technology, business model or industry, it’s already too late.
“Humans don’t understand exponential change,” Jouret continued. “It takes the same amount of time to go from zero to 1% adoption [of a new technology] as it does to go from 1% to 80%. We need to look at the rate of change.”
He also noted it’s useful to maintain a certain amount of paranoia because digital works in mysterious ways. It can make small things big, for example by aggregating the ROI of thousands of rooftop solar installations and serving them up to investors under a single instrument. And there are no sacred cows.
Jouret asked the audience members if they could envision a manufacturing plant that could switch from making one type of product to another, or even a completely different one. A sedan factory switching to SUVs? Sounds… ridiculous, doesn’t it? But if you extrapolate the potential for 3D printing from where we are today—and remember the exponential nature of the change we’re talking about—it begins to look more and more plausible for a manufacturing plant to quickly re-tool for a totally different product.
Certainly, there will be carnage. Some business models will collapse under the pressure of technological disruption, but even in the age of Expedia we still have travel agencies to organize walking tours of Provence, cooking classes in Thailand and other individually tailored offerings. Technology is both a threat and an opportunity, but the threats are mostly to complacent incumbent firms—people are more resilient.
Disruption destroys jobs, but it usually creates many more. McKinsey recently analyzed the jobs lost and gained by the introduction of the personal computer, beginning in 1980. They found that while the loss of 3.5 million jobs could be attributed to the arrival of the PC, the creation of 19 million more could too, more than a 5-to-1 ratio.
Still, we would be well advised to not lose sight of the human element in the excitement of all of this creative destruction.
It’s still all about people
Isaac Asimov, the prolific author of books ranging from science fiction to saucy limericks, set down three fundamental rules for robots that, though conceived in the context of fiction, are worth considering sincerely as we move into the age of artificial intelligence.
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
That sounds good for the robots, but what about the humans?
“More digital means more exposure to bad actors,” said Steklac, observing that security and safety must remain paramount in industrial systems.
Speaking about the rise of autonomous vehicles, he presented a vexing question: how much harm should an autonomous vehicle cause in order to avoid greater harm? Given the choice of hitting a pedestrian or an oncoming vehicle, what should the car do?
The answer is anything but clear, but the question puts in stark terms the challenge of turning raw data into something more useful, like a decision.
The currency of digitalization is data but the “money” is domain expertise
Toward the end of the panel, an audience member (ok, this audience member) asked Jouret what firms like ABB could build value on once data and digital tech were fully democratized. After all, our phones now contain innumerable tools and capabilities, some of which are ostensibly free. So, what’s left when everyone has everything they need in their pocket?
“There’s always another layer,” he replied.
Operating systems begat databases, which begat analytics packages, which begat software as a service, and so on, Jouret explained. Basically, every time we think we’ve reached the top of the hierarchy, someone puts another pancake on the stack.
“Hardware incumbency is actually an advantage,” Jouret continued. “Like having EV chargers located in the best locations. Competitiveness is a combination of data, analytics and the physical world.”
So, the future is bright for digitalization. Industries just starting on their digital journey can learn from those that have gone before. They can leverage technologies developed for consumer markets. They can use their specific expertise to deliver value even when all their competitors have access to the same tools. They just need to keep an eye on the rearview mirror for something that looks ridiculous.