Conference Agenda

Overview and details of the sessions of this conference. Select a date or location to show only the sessions on that day or at that location, or select a single session for a detailed view (with abstracts and downloads, where available).

Session Overview
SES 6.1: Collaborative Robotics in Smart Manufacturing
Wednesday, 28/Jun/2017:
1:50pm - 3:10pm

Session Chair: Pedro Neto
Location: Aula Convegni (first floor)


183. Hand/arm gesture segmentation by motion using IMU and EMG sensing

João Lopes, Miguel Simão, Nuno Marques Mendes, Mohammad Safeea, José Afonso, Pedro Neto

University of Coimbra, Portugal

Gesture recognition is more reliable when preceded by a proper motion segmentation process, which distinguishes static gesture patterns from dynamic ones. This study proposes a gesture segmentation method to distinguish dynamic from static gestures using Inertial Measurement Unit (IMU) and electromyography (EMG) sensors. The performance of each sensor, individually and in combination, was evaluated with different users. For gestures containing only arm movement, the lowest error was obtained with the IMU alone; as expected, for gestures containing only hand motion, the combination of the two sensors achieved the best performance. Results of the sensor-fusion modality varied greatly from user to user. Applying different filtering methods to the EMG data, as a solution to the limb-position effect, resulted in a significant reduction of the error.
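The segmentation idea described above can be illustrated with a minimal sketch: threshold a smoothed IMU angular-velocity magnitude and a rectified EMG envelope, and label a sample as dynamic if either modality is active. This is not the authors' implementation; the thresholds, window length, and fusion rule are illustrative assumptions.

```python
import numpy as np

def segment_motion(gyro, emg, gyro_thresh=0.5, emg_thresh=0.1, win=25):
    """Label each sample as dynamic (True) or static (False).

    gyro: (N, 3) angular velocity [rad/s]; emg: (N, C) raw EMG channels.
    Thresholds and window length are illustrative assumptions.
    """
    kernel = np.ones(win) / win

    # IMU feature: angular-velocity magnitude, smoothed by a moving average
    gyro_energy = np.convolve(np.linalg.norm(gyro, axis=1), kernel, mode="same")

    # EMG feature: rectified signal envelope, averaged over channels
    emg_env = np.convolve(np.abs(emg).mean(axis=1), kernel, mode="same")

    # Fusion: a sample is "dynamic" if either modality exceeds its threshold
    return (gyro_energy > gyro_thresh) | (emg_env > emg_thresh)
```

A per-user calibration of the two thresholds would be one way to address the user-dependent variability reported above.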

257. Teaching Assembly by Demonstration using Advanced Human Robot Interaction and a Knowledge Integration Framework

Mathias Haage1, Grigoris Piperagkas2, Christos Papadopoulos2, Ioannis Mariolis2, Jacek Malec1, Yasemin Bekiroglu3, Mikael Hedelind3,4, Dimitrios Tzovaras2

1Lund University, Sweden; 2Centre of Research & Technology – Hellas, 6th km Charilaou - Thermi, 57001, Thessaloniki, Greece; 3ABB AB Corporate Research, Sweden; 4VINNOVA, Sweden

Industrial robots have been used successfully in manufacturing, reducing production cost and eliminating unsound manual work. However, their use in assembly applications still suffers from complex, time-consuming programming and the need for dedicated hardware. This work presents a novel system built around a teaching-by-demonstration methodology that significantly reduces the time and expertise required to set up a robotized assembly station. Teaching by demonstration has long been sought after in the robotics community, and recent developments in perception and cognition systems now make it an achievable approach. Three key components of the proposed system are described: a portable human-robot interaction (HRI) interface, a perception module, and a knowledge integration framework. An experimental setup and a teaching-by-demonstration experiment utilizing these components are presented. The setup targets assembly of small parts, e.g. cell-phone components, using a collaborative industrial robot, the ABB YuMi. The experiment targets the insertion of one mobile-phone component into another through a folding movement, taught by human demonstration. The HRI interface guides the human instructor in teaching the robotic system a new assembly. The experiment considers a single two-part assembly in each demonstration: the parts are placed in front of the system's camera and the instructor performs the assembly of the parts. Using image analysis and machine learning methods, the perception module processes the demonstration data to track the pose of the parts throughout the assembly and to select basic snapshots of the demonstration called key-frames. The key-frames contain visual information about the scene, the information extracted by the perception module, and semantic information that the user can edit through the HRI interface.
The user can also inspect the automatically extracted key-frames and add or remove entries from the extracted list. Once the list of semantically annotated key-frames is created, the system provides it to the Assembly Program Generator module, which generates a new assembly program for the robot using a Knowledge Integration Framework. Automatic generation of the assembly program is based on the semantic information of the key-frames: each semantic annotation corresponds to a state of a Sequential Function Chart (SFC), and the order of the key-frames in the list also defines the order of the state sequence. Transitions between states are predefined based on the type of the assembly and the skills the system has already acquired. The folding insertion movement is characterized by an SFC sequence comprising the sub-states grasping, picking, aligning, establishing contact, and folding. Qualitative evaluation indicates the usefulness of the presented approach. Planned efforts include the use of physical human-robot interaction during execution of the assembly to fine-tune the demonstrated operation in a learning-by-doing approach.
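The key-frame-to-SFC mapping described above can be sketched in a few lines: each annotation becomes one state, list order fixes the state sequence, and consecutive states get a predefined transition. The `KeyFrame` schema and the `step_done` condition name are assumptions for illustration, not the paper's data model.

```python
from dataclasses import dataclass

@dataclass
class KeyFrame:
    annotation: str                 # semantic label, e.g. "grasping" (assumed schema)
    pose: tuple = (0.0, 0.0, 0.0)   # tracked part pose, placeholder only

def keyframes_to_sfc(keyframes, condition="step_done"):
    """Turn an ordered, annotated key-frame list into SFC states and transitions.

    Each annotation becomes one state; the list order defines the state
    sequence, and every consecutive pair gets a predefined transition.
    """
    states = [kf.annotation for kf in keyframes]
    transitions = [(a, b, condition) for a, b in zip(states, states[1:])]
    return states, transitions

# The folding-insertion sequence from the abstract:
demo = [KeyFrame(a) for a in
        ["grasping", "picking", "aligning", "establishing contact", "folding"]]
```

Editing the key-frame list through the HRI interface then amounts to inserting or deleting entries of `demo` before generating the program.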

81. Walk-through programming for industrial applications

Federica Ferraguti1, Chiara Talignani Landi1, Cristian Secchi1, Cesare Fantuzzi1, Marco Nolli2, Manuel Pesamosca2

1Università di Modena e Reggio Emilia, Italy; 2Gaiotto Automation, via Toscana 1, 29122 Piacenza, Italy

Collaboration between humans and robots is increasingly desired in several application domains, including manufacturing. This paper describes a software control architecture for industrial robotic applications that allows human-robot cooperation during the programming phase of a robotic task. The control architecture is based on admittance control and tool dynamics compensation for implementing walk-through programming and manual guidance. Further steps to integrate this system into a real industrial setup include incorporating the robot kinematics and a socket communication layer that sends a binary file to the robot.
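The admittance-control idea behind walk-through programming can be sketched for a single axis: the measured wrench, with the estimated tool contribution subtracted, drives a virtual mass-damper system whose velocity becomes the motion command. The gains and time step below are illustrative assumptions, not the values used in the paper.

```python
def admittance_step(f_meas, v_prev, tool_wrench, M=2.0, D=15.0, dt=0.002):
    """One discrete step of a 1-DOF admittance law  M*a + D*v = f_ext.

    f_meas:      force/torque sensor reading along this axis [N]
    tool_wrench: estimated tool gravity/inertia contribution, subtracted so
                 that only the human's force drives the motion
                 (tool dynamics compensation)
    Returns the new velocity command. Gains M, D and dt are assumptions.
    """
    f_ext = f_meas - tool_wrench    # tool dynamics compensation
    a = (f_ext - D * v_prev) / M    # virtual mass-damper dynamics
    return v_prev + a * dt          # integrate to a velocity command
```

With this law the robot is stationary when only the tool weight acts on the sensor, and a sustained human push of `f` converges to the steady-state velocity `f / D`, which gives the compliant "lead-through" feel used for manual guidance.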

274. Assisted Hardware Selection for Industrial Collaborative Robots

Casper Schou, Michael Natapon Hansson, Ole Madsen

Aalborg University, Denmark

This paper presents a configuration framework for assisting shop-floor operators in selecting a suitable hardware configuration from commercially available components. The primary focus of this work is the modeling of process, product, and equipment knowledge, and the design of a configurator tool implementing this knowledge. The configurator takes process and product information as input and derives a list of suitable components for the operator to choose from. A preliminary study indicates the feasibility of the approach.
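The derivation step described above, from process and product information to a list of suitable components, can be sketched as a constraint filter over an equipment catalogue. The catalogue entries, field names, and matching rules below are invented for illustration and do not reflect the paper's knowledge model.

```python
# Hypothetical equipment catalogue; names, fields, and values are assumptions.
CATALOGUE = [
    {"name": "parallel gripper A", "processes": {"pick", "place"}, "payload_g": 5000},
    {"name": "vacuum gripper B",   "processes": {"pick", "place"}, "payload_g": 500},
    {"name": "screwdriver tool C", "processes": {"fasten"},        "payload_g": 100},
]

def suitable_components(process, part_weight_g):
    """Return catalogue entries that support the process and can carry the part.

    process:       required process capability, e.g. "pick"
    part_weight_g: product information used as a payload constraint [g]
    """
    return [c for c in CATALOGUE
            if process in c["processes"] and c["payload_g"] >= part_weight_g]
```

The operator would then choose among the returned entries, e.g. `suitable_components("pick", 1000)` rules out the vacuum gripper on the payload constraint.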

Conference: FAIM 2017