Josep Vilarrasa, Jorge Vidal-Ribas, ABB Robotics, Final Trim & Assembly, Barcelona, Spain, firstname.lastname@example.org, email@example.com; Jordi Artigas, ABB Robotics, Consumer Segment & Service Robotics, Barcelona, Spain, firstname.lastname@example.org; Tomas Groth, ABB Robotics & Discrete Automation, Technology, Västerås, Sweden, email@example.com; Biao Zhang, ABB Robotics, Raleigh, NC, United States, firstname.lastname@example.org
Now, thanks to advances in ABB robotics and visual tracking technologies, these roadblocks are giving way to pilot applications. What is more, the technologies described in this article are expected to eventually be applicable in other fields that focus on unstable targets, such as logistics operations based on the use of automated guided vehicles.
Historically, automation of many automotive final assembly operations was considered practically impossible. As a result, the associated components were never conceived of or designed with current automation capabilities in mind, and today’s automotive final assembly lines →01 still rely overwhelmingly on dexterous human manipulation.
Fully automated lines in areas such as body-shop welding operate in “stop & go” mode, which facilitates the automation of different cells. Manually operated lines, on the other hand, move slowly and continuously, with car bodies transported by different conveyor types or, in advanced cases, on automated guided vehicles (AGVs). These systems, which are typically used in factories and warehouses, follow marked lines or wires on floors or use radio waves, vision cameras, magnets, or lasers for navigation. Regardless of whether the transport medium is a conveyor system or an AGV, movement takes place at a moderate speed of around 100 mm/s, which allows human operators to perform assembly tasks with the required level of safety.
For robots, however, such environments present a series of difficult challenges. Vehicle transport systems tend to move irregularly, and floors can be uneven, resulting in shaking and vibrations. If a robot is to emulate human behavior in such an environment, it must therefore be equipped with artificial vision. Conventional vision systems are normally based on captures of static reference positions to locate targets for assembly tasks. Because car bodies move along the line, an additional capability is required: continuous vision tracking to cope with movements, irregularities and vibrations. This allows robots to continuously adapt their motion to a sequence of vision-captured reference images, at capture rates of 20 to 50 frames per second.
Vision tracking is based on a technique known as Visual Servoing →02, which uses feedback information extracted from a vision sensor to compensate for movements and vibrations. Here, instead of moving according to a programmed path as a conventional robot would, the robot’s movements are guided by the information provided by vision sensor(s). With regard to ABB robots, this functionality, which is called External Guided Motion (EGM), can update guidance inputs every 4 ms, thus resulting in very fast responses.
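The interplay between the slower camera rate (20 to 50 captures per second) and the 4 ms guidance cycle can be pictured as a two-rate control loop: the vision measurement refreshes a few dozen times per second, while corrections are issued every control period using the most recent measurement. The following pure-Python sketch is an illustration only, not ABB's EGM interface: the rates, gain, and the stand-in `camera_offset` measurement function are all invented for the example.

```python
from dataclasses import dataclass

CONTROL_PERIOD = 0.004   # guidance update every 4 ms (EGM-like rate)
CAMERA_PERIOD = 0.040    # vision capture at 25 Hz, within the 20-50/s range

@dataclass
class Pose2D:
    x: float
    y: float

def camera_offset(t: float) -> Pose2D:
    """Stand-in for a vision measurement: target position in metres
    (here a slow drift in x plus a constant offset in y)."""
    return Pose2D(x=0.010 - 0.001 * t, y=0.002)

def visual_servo(duration: float, gain: float = 0.5) -> Pose2D:
    """Two-rate loop: vision refreshes every CAMERA_PERIOD; guidance
    corrections are applied every CONTROL_PERIOD using the most
    recent measurement (zero-order hold between captures)."""
    tool = Pose2D(0.0, 0.0)
    last_meas = camera_offset(0.0)
    next_capture = CAMERA_PERIOD
    t = 0.0
    while t < duration:
        if t >= next_capture:          # a new camera frame is available
            last_meas = camera_offset(t)
            next_capture += CAMERA_PERIOD
        # proportional correction toward the last seen target position
        tool.x += gain * (last_meas.x - tool.x)
        tool.y += gain * (last_meas.y - tool.y)
        t += CONTROL_PERIOD
    return tool
```

Because the control loop runs an order of magnitude faster than the camera, the tool settles onto each new measurement well before the next frame arrives, which is what makes the fast 4 ms update rate valuable.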
In addition to EGM, ABB robots also use a force-torque sensor: ABB Integrated Force Control. This technology allows a robot to adapt its movements based on force and torque inputs resulting from contact with a car body (compliant mechanical behavior). The sensor is normally installed between a tool and a robot’s wrist →02. The combination of Visual Servoing with compliant mechanical behavior is an example of sensor fusion, a technology in which data coming from different sensor sources is combined in real time.
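Conceptually, the sensor fusion described above amounts to summing two velocity contributions: one that closes the visually measured position gap and one that yields to excess contact force. A minimal sketch follows; the gains and units are illustrative assumptions, not ABB Integrated Force Control parameters.

```python
def fused_command(vision_err_m: float, force_meas_n: float,
                  force_ref_n: float, k_vision: float = 2.0,
                  compliance: float = 0.002) -> float:
    """Fuse a vision position error (m) and a contact force error (N)
    into one velocity command (m/s) along a single axis."""
    v_vision = k_vision * vision_err_m                    # close the visual gap
    v_force = compliance * (force_ref_n - force_meas_n)   # yield to excess force
    return v_vision + v_force
```

With the target visually centered and contact force at its reference, the command is zero; excess force produces a retreating (compliant) motion even while vision still pulls toward the target.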
As these technologies mature, it is worth considering what a completely automated final assembly process might look like. Basically, as outlined in →03, it would boil down to the following tasks.
1. Line movement tracking
This means tracking the main movement of an assembly line to derive a pseudo-static environment. Typically, robots have tracked parts on a conveyor system using what is known as a conveyor tracking function, which relies on an encoder mechanically coupled to the conveyor movement. In the case of car bodies travelling on AGVs, however, this form of tracking is not easy to implement because of the mechanical adaptations it requires.
ABB’s solution to this challenge has been to apply its Visual Servoing technology, tracking an AGV either by means of an AprilTag or by means of a visual characteristic of the AGV itself. The advantage of using an AprilTag is that it is not only easy to install but also more robust to detect. Developed at the University of Michigan, AprilTags are two-dimensional bar codes conceptually similar to QR codes. The difference is that AprilTags encode much smaller amounts of data, which makes them easier to detect reliably, resulting in improved localization accuracy and faster processing.
Visual tracking itself involves the use of floor-mounted cameras, or a camera mounted on the robot foot if the robot is mounted on a linear axis. A robot linear axis, known as “track motion” in ABB’s product portfolio, is a servo-controlled linear unit that extends the robot’s reach. It is required in assembly processes characterized by significant contact time between the component to be mounted and the car body.
The technology is not limited to tracking car bodies on AGVs; it can also be used with conventional conveyors.
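One way to derive the pseudo-static environment from periodic tag detections is to estimate the line velocity from successive tag positions and then extrapolate over the sensing latency. The sketch below is a hypothetical illustration, not ABB's implementation: the least-squares fit and the latency-compensation step are assumptions made for the example.

```python
def estimate_line_velocity(tag_positions: list, timestamps: list) -> float:
    """Least-squares slope of tag position (m) vs. time (s): an
    estimate of the AGV/conveyor speed from noisy per-frame detections."""
    n = len(tag_positions)
    mean_t = sum(timestamps) / n
    mean_x = sum(tag_positions) / n
    num = sum((t - mean_t) * (x - mean_x)
              for t, x in zip(timestamps, tag_positions))
    den = sum((t - mean_t) ** 2 for t in timestamps)
    return num / den

def pseudo_static_target(tag_x_now: float, line_velocity: float,
                         latency: float) -> float:
    """Extrapolate the tag position over the sensing/actuation latency,
    yielding a quasi-static reference the robot can follow smoothly."""
    return tag_x_now + line_velocity * latency
```

At the typical line speed of around 100 mm/s, even a 50 ms latency corresponds to 5 mm of travel, which is why feed-forward of the estimated velocity matters.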
2. Target tracking and robot guidance
Once this pseudo-static environment has been achieved, a robot still needs to cope with residual tolerances, irregularities and vibrations to reach its assembly target. To this end, a camera is mounted on the robot tool. Because the robot knows the position of the camera relative to the tool’s force-torque controller (FTC), it can focus on the target position. In contrast to AprilTag-based line movement tracking, the camera here focuses on an image of a real feature →03 of the tracked car body whose position relative to the assembly target is known. AprilTags are not used at this stage because of the added complexity of mounting and subsequently removing them at each station. Tracking visual features on car bodies moving at speed remains a challenge, however, especially when coping with different colors and variations in lighting conditions. →04 illustrates robot movements as they relate to tracking a target using real-time vision.
3. Compliant physical interaction
Once the target has been identified, physical contact is initiated, and the robot carries out its assembly task →05 by following a combination of visual and force-torque sensor inputs. This means maintaining visual tracking throughout contact and behaving in a compliant way by using the continuous feedback provided by the Force Control sensor installed between its wrist and tool. This combination of inputs from a vision sensor and a force-torque sensor (sensor fusion) is the key to successfully performing an assembly process.
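The contact phase can be pictured as a guarded move: advance along the insertion axis in small steps while monitoring the force-torque sensor, and stop once the reaction force indicates the part is seated. The threshold, step size, and spring-like contact model in this sketch are illustrative assumptions, not ABB Force Control behavior.

```python
def insert_until_seated(read_force, max_force: float = 20.0,
                        step: float = 0.0005, max_steps: int = 200):
    """Guarded insertion along one axis: advance in small increments
    until read_force(z) reports at least max_force newtons (the part
    is seated), then stop. read_force stands in for the F/T sensor."""
    z = 0.0
    for _ in range(max_steps):
        if read_force(z) >= max_force:
            return z, True           # seated: reaction force reached
        z += step
    return z, False                  # ran out of travel without seating
```

In the real system this guard runs concurrently with the visual tracking corrections, so the insertion axis stays aligned with the moving car body throughout contact.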
In some application cases, skills used in steps 2 and 3 may not be required. For example, flawless accuracy may not be necessary when a task is followed by a manual operation in a downstream station. In other cases, step 1 might be skipped because of the use of traditional conveyor tracking.
In the future, new concepts may make it possible to perform assembly in static cells, obviating the need for technologies designed for moving lines. One concept, for example, groups the cells that can easily be automated into a fully automated sub-line operating in stop & go mode. Other concepts envision static assembly cells supported by logistics systems that move both cars and components between them.
However, demand for automating assembly tasks on moving lines will continue to play a key role in the short and medium term. Although this demand currently comes from application needs in the automotive final assembly environment, once the above-outlined technologies have been fully developed, there is no doubt that they will be applicable in other fields that focus on unstable targets, such as logistics operations involving AGVs.
For now, however, ABB is focusing on applying this technology to the automotive sector, where its customer-based pilot applications already include cockpit assembly (currently in production ramp-up), carpet placement, seat placement, door assembly and more. Experience gained through these and other applications will be carried over to further automotive assembly tasks characterized by similar technical challenges.