201. Using simulation to analyze picker blocking in manual order picking systems
Ghent University, Belgium; Department of Industrial Systems Engineering and Product Design
The rise of e-commerce confronts warehouses with ever smaller orders that must be fulfilled ever faster, often within a 24-hour period. This puts pressure on the order picking process, as the order pickers' workload becomes higher and higher, subsequently leading to congestion in the warehouse and hurting its productivity. It is therefore crucial to determine which order batching and picking policies enhance the performance of order picking activities. This paper carries out an extensive simulation study to examine the performance of different order picking policies with batching in a wide-aisle warehouse with a low-level picker-to-parts system. The performance of the system is measured in terms of total traveled distance, number of collisions between operators (congestion) and order lead times. A full factorial design is set up and the simulation output is statistically analyzed. The results are reported and thoroughly discussed.
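The congestion measure above (collisions between pickers) can be illustrated with a minimal discrete-time sketch. The single-aisle model, picker behavior and function name below are our own simplifying assumptions for illustration, not the paper's simulation model: each picker advances one slot per tick toward its next pick location and is counted as "blocked" when the next slot is occupied by another picker.

```python
def simulate_aisle(pick_lists, start_positions):
    """Toy single-aisle picker model (illustrative assumption, not the paper's).

    pick_lists[i]: ascending slot numbers picker i must visit.
    Pickers move one slot per tick; picking also takes one tick."""
    pickers = [{"pos": s, "picks": list(p)}
               for s, p in zip(start_positions, pick_lists)]
    blocks = distance = ticks = 0
    while pickers:
        ticks += 1
        occupied = {p["pos"] for p in pickers}
        for p in list(pickers):
            if p["pos"] == p["picks"][0]:
                p["picks"].pop(0)          # this tick is spent picking
                if not p["picks"]:
                    pickers.remove(p)      # all picks done: leave the aisle
                    occupied.discard(p["pos"])
                continue
            nxt = p["pos"] + 1
            if nxt in occupied:            # blocked behind another picker
                blocks += 1
            else:
                occupied.discard(p["pos"])
                p["pos"] = nxt
                occupied.add(nxt)
                distance += 1
    return {"blocks": blocks, "distance": distance, "ticks": ticks}
```

Batching and routing policies change the pick lists each picker receives, which in a model like this directly changes the blocking count and total traveled distance.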
130. Job shop flow time prediction using neural networks
University of Coimbra, Portugal
Due date assignment is a complex process of major importance for shop floor control. Quoting realistic due dates and delivering the goods on time enhances customer service and improves resource utilization by making it more efficient.
The due date assignment problem is intimately related to the problem of flow time prediction. If flow time prediction were perfect, the due date assignment problem would be greatly simplified because job completion dates would be known. Unfortunately, flow time prediction is a challenging task, especially in dynamic job shops in which jobs arrive at the shop over time. In this case, each arriving job has its own processing needs on different machines and will experience different congestion levels. Furthermore, if the shop dispatching rule is not "first in, first out", the arrival of a new job can change the processing sequence and thus the expected completion dates of jobs already in the system.
Flow time prediction is a difficult task due to the number of non-linear related aspects that can affect it. In this study we investigate how Artificial Neural Networks (ANN) can be used as a flow time prediction method.
To evaluate the ability of the proposed model to correctly predict job flow times, a simulation model of a dynamic job shop was developed to generate the data needed to train and test the ANN. Since the dispatching rule adopted in the shop can affect order flow times, two scenarios are considered: (1) jobs are prioritized according to the First In First Out rule and (2) jobs are prioritized according to the Shortest Processing Time rule.
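The effect of the dispatching rule alone can be seen on a single machine. This toy example (our simplification; the paper uses a full dynamic job-shop simulation) computes completion times, and hence flow times for jobs arriving at time zero, under both rules:

```python
def completion_times(processing_times, rule):
    """Completion time of each job on one machine, all jobs queued at t = 0.

    Toy illustration of FIFO vs. SPT dispatching (not the paper's model)."""
    if rule == "SPT":
        # Shortest Processing Time: run the quickest jobs first.
        order = sorted(range(len(processing_times)),
                       key=lambda j: processing_times[j])
    else:
        # FIFO: keep the arrival (index) order.
        order = list(range(len(processing_times)))
    finish, clock = {}, 0
    for j in order:
        clock += processing_times[j]
        finish[j] = clock
    return [finish[j] for j in range(len(processing_times))]
```

For processing times [5, 1, 3], FIFO yields completion times [5, 6, 9] while SPT yields [9, 1, 4]: the same jobs get very different flow times, which is why a predictor must be evaluated under each rule.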
Results obtained with the proposed ANN flow time prediction model are compared with results obtained by two dynamic due date setting rules proposed in the literature: (1) the Dynamic Total Work Content (DTWK) rule and (2) the Dynamic Processing Time Plus Waiting (DPPW) rule. Results are compared considering three performance measures commonly used to evaluate flow time prediction accuracy: Mean Absolute Lateness (MAL), Percentage of Tardy Jobs (PT) and Mean Tardiness (MT).
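The three accuracy measures can be stated directly in code. This is a generic restatement of their standard definitions (lateness = actual minus predicted completion time), not code from the paper:

```python
def lateness(actual, predicted):
    # Lateness of each job: positive if it finishes after the predicted date.
    return [a - p for a, p in zip(actual, predicted)]

def mean_absolute_lateness(actual, predicted):
    # MAL: average magnitude of the prediction error, early or late.
    errs = lateness(actual, predicted)
    return sum(abs(e) for e in errs) / len(errs)

def percentage_tardy(actual, predicted):
    # PT: share of jobs finishing after their predicted date, in percent.
    errs = lateness(actual, predicted)
    return 100.0 * sum(1 for e in errs if e > 0) / len(errs)

def mean_tardiness(actual, predicted):
    # MT: average lateness counting only tardy jobs (early jobs count as 0).
    errs = lateness(actual, predicted)
    return sum(max(0.0, e) for e in errs) / len(errs)
```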
Results show that the proposed ANN model outperforms the DTWK and DPPW rules for all the performance measures considered, independently of the dispatching rule used. Thus, if shop information is easy to obtain, ANN can be used to predict job flow times and consequently to assign reliable due dates.
68. Process planning in Industry 4.0 environment
Faculty of Mechanical Engineering and Naval Architecture, Croatia
The world is currently facing the fourth industrial revolution. Working environments are required to change rapidly, in the hope that this will bring significant benefits in the future. Conventional manufacturing processes are being automated and connected to other activities within the company. One of the most important factors in the Industry 4.0 environment is data management, big data management to be precise. It is realized through cyber-physical systems, the Internet of Things and cloud computing. Professions are obliged to adapt and change, so familiar roles are expected to take on a different structure in the future. Workers have to learn to deal with the new situation and embrace lifelong learning, constantly improving their performance. In the end, by combining technological and human improvements, higher productivity, product quality and income are expected, together with shorter product delivery (manufacturing) times and lower product prices.
This paper deals with the changing role of the process planner, who will be presented as a "product planner" in the Industry 4.0 environment. The product planner will use advanced process planning methods and will manage a product database and a feedback database. The product database is an archive of previously manufactured products, while the feedback database is a collection of data from various sources within the product supply chain. The feedback database helps improve process planning, resulting in products of higher quality. It also allows the product planner to connect with other parts of the company and to be informed of customer feedback. In addition to the above, the overall working sphere of the new role will be presented in the paper.
96. Mapping the conceptual relationship among data analysis, knowledge generation and decision-making in industrial processes
Pontifical Catholic University of Parana, Brazil
Due to the development of information technology, monitoring and control systems have been extended to increase their ability to collect, process and manage data in industrial processes. Increasing information complexity makes it difficult to organize and understand the large volume of data created under different operating and maintenance decision perspectives. The information flow and the integration of the systems involved are themes in the recent development of Industry 4.0, since many records are being generated but little knowledge is being extracted from them. In this scenario, Data Engineering and Analytics concepts stand out, aiming to analyze and convert stored data into knowledge through different techniques, such as Knowledge Discovery in Databases (KDD), Data Mining (DM) and Process Mining (PM). Although it is possible to extract knowledge from databases (quantitative knowledge) through these techniques, the decision-making involved in industrial processes is still very dependent on the tacit knowledge of the operator (qualitative knowledge). In this environment characterized by information heterogeneity and complexity, Multi-Criteria Decision Making/Analysis (MCDM/A) methods offer an appropriate approach to assist operators in information processing and standardization for more assertive and effective decision-making. These methods include the Analytic Hierarchy Process (AHP), the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), ELimination and Choice Expressing Reality (ELECTRE TRI) and the Preference Ranking Organization METHod for Enrichment Evaluations (PROMETHEE). All of them aim to suggest one alternative among the several available, based on multiple conflicting criteria in the decision-making process. In the literature, quantitative and qualitative approaches are dissociated in the process of knowledge extraction and decision-making, revealing an important gap.
Therefore, this paper aims to develop a conceptual map that relates these major areas of knowledge: data analysis, knowledge generation and decision-making. The analysis focuses on MCDM/A and Process Mining methods, facilitating a decision-making process supported by factory-floor data through process models. Through process mining, it is possible to identify and analyze process models considering the moment at which each record occurred (timestamp) and its frequency (weight), extracting information and performance metrics. This quantitative information, together with qualitative evaluation criteria, finds in MCDM/A methods a basis for conciliation and informational treatment for analysis and decision-making. The objective is to highlight how this conciliation and the decision-making process can occur, through conceptual maps that organize the knowledge involved. The results of this analysis will provide important clues for the design of information systems that support an industrial management adhering to the requirements of Industry 4.0.
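As a concrete instance of the MCDM/A methods named above, TOPSIS ranks alternatives by their relative closeness to an ideal solution. The sketch below is a minimal, generic TOPSIS implementation; the decision matrix, weights and criterion directions are hypothetical inputs for illustration, not data from the paper:

```python
import math

def topsis(matrix, weights, benefit):
    """Closeness score in [0, 1] for each alternative (row of `matrix`).

    benefit[j] is True when criterion j (column j) should be maximized."""
    m, n = len(matrix), len(matrix[0])
    # 1. Vector-normalize each criterion column, then apply the weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m)))
             for j in range(n)]
    V = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    # 2. Ideal and anti-ideal points, criterion by criterion.
    col = lambda j: [V[i][j] for i in range(m)]
    ideal = [max(col(j)) if benefit[j] else min(col(j)) for j in range(n)]
    anti = [min(col(j)) if benefit[j] else max(col(j)) for j in range(n)]
    # 3. Euclidean distances to both points; higher closeness is better.
    d_pos = [math.sqrt(sum((V[i][j] - ideal[j]) ** 2 for j in range(n)))
             for i in range(m)]
    d_neg = [math.sqrt(sum((V[i][j] - anti[j]) ** 2 for j in range(n)))
             for i in range(m)]
    return [d_neg[i] / (d_pos[i] + d_neg[i]) for i in range(m)]
```

In the setting of the paper, the rows could mix quantitative criteria mined from event logs (e.g. frequency, throughput time) with the operator's qualitative ratings, which is exactly the conciliation the conceptual map describes.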
33. Evolutionary Algorithms for Programming Pneumatic Sequential Circuit Controllers
Indian Institute of Technology Madras, India
Sequential actuation of pneumatic cylinders is a common form of automation in small and medium scale industries. Changing the sequence of actuation of a given set of cylinders, by changing the actuation logic program according to the type of product being produced, is an economical technique to implement flexible automation in such industries. Even though the logic program for sequential actuation of cylinders based on the state of end-position sensors is stored in programmable logic controllers (PLC), the techniques for deriving logic equations (and corresponding PLC programs) have largely remained manual, which has hindered the implementation of flexible automation. Recently, the authors have published techniques to automatically convert a sequence of cylinder actuations into a truth table using a Genetic Algorithm and have suggested using algorithms such as Quine-McCluskey for converting truth tables into logic equations. However, the Quine-McCluskey algorithm is NP-complete and can adversely affect the changeover time in a large flexible automation setup. Hence, in this paper, the authors address the problem of converting a truth table into the corresponding logic equations using Genetic Programming, which promises much quicker solutions for difficult problems. A modified Genetic Programming with elitism was developed specifically for this application and its capabilities have been demonstrated through case studies. Further, a methodology for optimizing the parameters of the Genetic Algorithm that converts an actuation sequence into a truth table has also been proposed, which can further reduce the changeover time. Together, these techniques will provide an efficient way to modify existing fixed automation setups and implement flexible automation in small and medium scale industries.
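At the core of such a Genetic Programming approach is a fitness function scoring how well a candidate Boolean expression reproduces the target truth table. The sketch below illustrates that idea; the nested-tuple expression tree and function names are our own assumptions, not the authors' encoding:

```python
def evaluate(tree, env):
    """Evaluate a Boolean expression tree against sensor values in `env`.

    A tree is either a variable name (end-position sensor) or a tuple
    ("NOT"|"AND"|"OR", operand, ...). Illustrative representation only."""
    if isinstance(tree, str):
        return env[tree]
    op, *args = tree
    if op == "NOT":
        return not evaluate(args[0], env)
    vals = [evaluate(a, env) for a in args]
    return all(vals) if op == "AND" else any(vals)

def fitness(tree, variables, truth_table):
    """Fraction of truth-table rows the candidate expression reproduces.

    truth_table maps input tuples (one bit per variable) to the desired
    output bit; a GP run would maximize this over a population of trees."""
    hits = 0
    for inputs, target in truth_table.items():
        env = dict(zip(variables, inputs))
        hits += evaluate(tree, env) == target
    return hits / len(truth_table)
```

A GP run would repeatedly mutate and recombine such trees, keeping the elite (highest-fitness) candidates, until some tree matches every row, i.e. reaches fitness 1.0.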