Conference Agenda

Overview and details of the sessions of this conference.

 
Session Overview
Session
SES 9.2: Collaborative Robotics in Smart Manufacturing
Time:
Thursday, 29/Jun/2017:
11:20am - 1:00pm

Session Chair: Pedro Neto
Location: Aula N (first floor)

Presentations

128. Towards shared autonomy for robotic tasks in manufacturing

Andreas Pichler, Sharath Chandra Akkaladevi, Markus Ikeda, Michael Hofmann, Matthias Plasch, Christian Wögerer, Gerald Fritz

Profactor GmbH, Austria

In recent years, the concept of robots cooperating with humans has gained considerable interest in both domestic and industrial settings. In industrial environments, combining the cognitive capabilities of humans with the physical strength and efficiency of robots and machines can substantially reduce fixed production costs relative to variable costs. Such robot systems are also seen as a suitable means of addressing demographic change and the shortage of skilled labor in the production of material goods. Furthermore, they provide greater flexibility in automation and ensure consistent product quality, both of which are already challenging companies today.

Setting up and operating a system in a fenceless environment, as well as responding to human interactions, requires new sensing capabilities to be integrated into the human-robot system. Such a flexible system has to be embedded in a smart manufacturing system.

Human-robot interaction can be differentiated into levels with varying degrees of shared autonomy: human-robot coexistence, cooperation, and collaboration.

In this paper we present a platform called XROB, which builds and utilizes models of human-robot interactions in an intuitive way. Using this platform, the operator is enabled to carry out different kinds of task-sharing operations in applications requiring customized patterns of interaction. For each kind of shared autonomy we give examples of how processes can be implemented in industrial settings. The processes address key issues in manufacturing such as fast ramp-up, zero-defect inspection, and increased flexibility in the automation of assembly processes.
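As an illustration only (not part of the XROB platform or the paper), the following minimal Python sketch shows how the three levels of shared autonomy named above might be modelled and attached to example processes; all class and process names here are hypothetical.

from dataclasses import dataclass
from enum import Enum, auto

class SharedAutonomy(Enum):
    COEXISTENCE = auto()    # human and robot share a space, no common task
    COOPERATION = auto()    # same workpiece, separate sub-tasks
    COLLABORATION = auto()  # joint task requiring mutual task understanding

@dataclass
class ProcessModel:
    name: str
    level: SharedAutonomy

processes = [
    ProcessModel("mobile quality inspection", SharedAutonomy.COEXISTENCE),
    ProcessModel("engine assembly: robot screwing, human part placement",
                 SharedAutonomy.COOPERATION),
    ProcessModel("assisted assembly with human action recognition",
                 SharedAutonomy.COLLABORATION),
]

for p in processes:
    print(f"{p.name}: {p.level.name}")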

The coexistence scenario describes a robot assistant system focused on quality control tasks. The mobile platform features a flexible quality inspection system that can be extended with a variety of sensors and offers intuitive configuration capabilities. Working side by side in the same workspace is illustrated by the assembly of automotive combustion engines. Besides rapid reconfiguration of the system, safety issues also have to be taken into consideration. This assembly case demonstrates a cooperation scenario in which robots carry out screwing operations while humans attach parts to the same workpiece.

The third example shows the collaboration of human-robot teams. The close interaction between human and robot requires a mutual understanding of the task at hand. Specifically, for the robot to assist the human operator with a given task, it must understand the actions performed by the human, interpret the activity, and eventually interact with the human. This is a prerequisite for seamless interaction.

Finally, the paper compares the different levels of shared autonomy and remarks on the requirements for successful implementation in industry.


256. On Autonomous Robotic Cooperation Capabilities Within Factory and Logistic Scenarios

Giuseppe Casalino, Enrico Simetti, Francesco Wanderlingh, Kourosh Darvish, Barbara Bruno, Fulvio Mastrogiovanni

University of Genoa, Italy

The paper presents the development of a unified functional, algorithmic, and software (SW) architecture that can be adopted as a standard for controlling, at the action level only, any robotic structure within a given wide class, including reconfigurable structures within that class. Such a control architecture is therefore deemed very suitable for operating within factory and/or logistic scenarios, possibly reconfigurable ones. Moreover, for the few cases of cooperative activities to be established between agents that cannot be cable-connected, an effective coordination policy is developed, based on the exchange of a reduced information set concerning only the cooperation goals; relevant simulation and experimental trials are briefly outlined. Finally, the possibility of commanding the involved structures, in any operating condition, solely in terms of the ultimate goals of each action also appears to be the right basis for non-negligible improvements in their integration with automated action planning, and even learning, techniques.
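As a hedged illustration of the "reduced information set" idea (an assumption, not the paper's actual architecture or message format), the following Python sketch shows agents exchanging only a small goal-level message instead of full robot state; all names are hypothetical.

import json
from dataclasses import dataclass, field

@dataclass
class CooperationGoal:
    action: str            # e.g. "transport" or "handover"
    target_pose: tuple     # goal pose (x, y, z, roll, pitch, yaw)
    deadline_s: float      # soft timing constraint
    meta: dict = field(default_factory=dict)

def encode_goal(goal: CooperationGoal) -> bytes:
    """Serialize only the cooperation goal: a small payload, no full state."""
    return json.dumps({
        "action": goal.action,
        "target_pose": goal.target_pose,
        "deadline_s": goal.deadline_s,
        "meta": goal.meta,
    }).encode()

goal = CooperationGoal("transport", (1.2, 0.4, 0.0, 0.0, 0.0, 1.57), 30.0)
print(len(encode_goal(goal)), "bytes exchanged instead of full robot state")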


190. Integration of a Skill-based Collaborative Mobile Robot in a Smart Cyber-Physical Environment

Rasmus Andersen1, Emil Blixt Hansen1, David Cerny1, Steffen Madsen1, Biranavan Pulendralingam1, Simon Bøgh2, Dimitrios Chrysostomou2

1Dept. of Mechanical and Manufacturing Engineering, Aalborg University, Fibigerstræde 16, Aalborg Øst, DK-9220, Denmark; 2Robotics & Automation Group, Dept. of Mechanical and Manufacturing Engineering, Aalborg University, Fibigerstræde 16, Aalborg Øst, DK-9220, Denmark

The goal of this paper is to investigate the benefits of integrating collaborative robotic manipulators with autonomous mobile platforms for flexible part-feeding processes in an Industry 4.0 production facility. The paper presents Little Helper 6 (LH6), consisting of a MiR100, a UR5, a Robotiq 3-Finger Gripper, and a task-level software framework called Skill Based System (SBS). The preliminary experiments performed with LH6 demonstrate that skill-based programming, 3D QR-based calibration, part feeding, mapping, and dynamic collision avoidance are successfully executed, and strategies for further expanding the operational capabilities of the system are discussed.
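For illustration, a minimal Python sketch of a skill-based task sequence in the spirit of SBS; the class names, skills, and stub implementations below are hypothetical and are not the actual SBS or robot driver APIs.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Skill:
    name: str
    execute: Callable[[], bool]   # returns True on success

def run_task(skills: List[Skill]) -> bool:
    """Run skills in order; stop at the first failure."""
    for skill in skills:
        print(f"executing skill: {skill.name}")
        if not skill.execute():
            print(f"skill failed: {skill.name}")
            return False
    return True

# Part-feeding task roughly matching the LH6 experiment: drive to a station,
# calibrate against a 3D QR marker, pick a part, place it at the machine.
task = [
    Skill("drive_to_station",  lambda: True),
    Skill("qr_calibrate",      lambda: True),
    Skill("pick_part",         lambda: True),
    Skill("place_at_machine",  lambda: True),
]
run_task(task)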


210. 3D metrology using a collaborative robot with a laser triangulation sensor

Gil Boyé De Sousa, Adel Olabi, Jorge Palos, Olivier Gibaru

Arts et Métiers ParisTech - Lille, France

Industrial robots are a key element of Smart Manufacturing systems. They can perform many different tasks such as assembly, pick-and-place, or even 3D metrology operations. In order to perform 3D metrology, the robot is equipped with a 2D laser triangulation sensor. The accuracy of the measurements made by this system depends on an accurate TCP (Tool Centre Point) calibration and on the accuracy of the robot. In this paper, a TCP calibration method is applied to a collaborative robot. The hand-guiding feature of this kind of robot is used for human-robot interaction to obtain the laser sensor TCP using a calibration sphere. Experimental results are presented to validate the procedure and evaluate the quality of the measurements.
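As a hedged sketch of one common way to perform such a calibration (an assumption, not necessarily the paper's exact procedure), the following Python/NumPy snippet estimates the translational TCP offset of the sensor by requiring that the fixed calibration sphere centre, measured in the sensor frame at several hand-guided robot poses, maps to the same point in the robot base frame; the sensor orientation relative to the flange is assumed known.

import numpy as np

def calibrate_tcp_offset(flange_poses, sphere_in_sensor, R_sensor):
    """flange_poses: list of 4x4 base<-flange transforms (one per pose).
    sphere_in_sensor: list of sphere centres measured in the sensor frame.
    R_sensor: assumed 3x3 rotation of the sensor frame w.r.t. the flange.
    Solves R_i @ (R_sensor @ c_i + t) + p_i = p_sphere for t and p_sphere."""
    A, b = [], []
    for T, c in zip(flange_poses, sphere_in_sensor):
        R_i, p_i = T[:3, :3], T[:3, 3]
        # Unknowns x = [t (3), p_sphere (3)]:
        #   R_i @ t - p_sphere = -(R_i @ R_sensor @ c + p_i)
        A.append(np.hstack([R_i, -np.eye(3)]))
        b.append(-(R_i @ R_sensor @ c + p_i))
    x, *_ = np.linalg.lstsq(np.vstack(A), np.hstack(b), rcond=None)
    return x[:3], x[3:]   # TCP translation offset, sphere centre in base frame

Several poses with sufficiently different flange orientations are needed for the least-squares system to be well conditioned.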


92. Pose estimation and object tracking using 2D images

Fernando Casado García, Yago Luis Lapido, Diego P. Losada, Alejandro Santana-Alonso

AIMEN technology centre, Spain

Several factors are driving change in the logistics market, most notably e-commerce and the manufacturing of custom-made products. Customers increasingly look for personalized products, and mass customization is pushing the industry to reduce time to market and to enhance production flexibility, with batch sizes tending towards one. This is closely linked to warehouse management, where operating costs increase with value-added tasks and third-party logistics providers (TPLs) must raise service quality while maintaining operating costs.

Thus, logistics is one of the most important links in the manufacturing chain. For this reason, automating intralogistics processes is a priority for improving their performance. Goods are usually placed on pallets for transportation and stacking, so pallet handling becomes a necessity.

In this work, we present a detection system based on 2D pattern recognition for localizing and obtaining the pose of pallets in the working environment of autonomous mobile forklifts. The detection method is part of a novel automation solution designed to retrofit manually operated logistics vehicles with a new autonomous working mode. With this mode, the forklifts can be operated autonomously or manually, yielding a highly flexible pallet-handling system that can be used in spaces shared with humans.

We use two industrial HD cameras: an RGB camera installed on top of the forklift and an NIR camera installed between the forks. The detection system has two working modes: (a) initial pallet identification and (b) pallet tracking for visual servoing.

To identify and locate pallets in the working area of the autonomous forklifts, the detection system scales the 2D pattern and applies homographic transforms to it. This method makes it possible to obtain the pallet pose from 2D images while the forklift is motionless, but at a high computational cost, even with low-resolution images. To detect and track a pallet, the second mode uses a full-resolution ROI (Region Of Interest), which restricts the computational needs, based on the pattern scale and transform values obtained in the first mode. Using the detected position in the 2D image and applying geometric transformations, we obtain the pallet pose relative to the forklift. To further improve robustness and reduce computational time, the detection system uses the vehicle odometry to perform visual servoing during pallet handling.
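As an illustrative sketch only (not the paper's implementation), one common way to turn detected 2D pattern points into a pallet pose is a planar PnP solve against the known geometry of the pallet front face; the corner coordinates, camera intrinsics, and OpenCV-based approach below are assumptions.

import cv2
import numpy as np

# Known 3D coordinates (metres) of four pallet front-face corners in the
# pallet frame; values are illustrative, roughly matching a EUR pallet face.
PALLET_POINTS_3D = np.array([
    [0.0, 0.0,   0.0],
    [0.8, 0.0,   0.0],
    [0.8, 0.144, 0.0],
    [0.0, 0.144, 0.0],
], dtype=np.float64)

def pallet_pose_from_detection(image_points, K, dist):
    """image_points: 4x2 detected corner pixels matching PALLET_POINTS_3D;
    K, dist: camera intrinsics from a prior calibration."""
    ok, rvec, tvec = cv2.solvePnP(
        PALLET_POINTS_3D, np.asarray(image_points, dtype=np.float64),
        K, dist, flags=cv2.SOLVEPNP_IPPE)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)            # pallet -> camera rotation
    T_cam_pallet = np.eye(4)
    T_cam_pallet[:3, :3], T_cam_pallet[:3, 3] = R, tvec.ravel()
    return T_cam_pallet                    # compose with camera->forklift extrinsics

The returned pose is in the camera frame; composing it with the (assumed known) camera-to-forklift extrinsic transform gives the pallet pose relative to the vehicle.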

The first camera is used to locate pallets on the floor and the second camera is used to locate pallets on shelves. In the latter case, only a scale transform is applied to disambiguate the distance to the pallet, as load and unload operations are always performed with a fixed orientation.



 
Conference: FAIM 2017