CERES Library Services

Browsing by Author "Thrower, John"

Now showing 1 - 4 of 4
  • The development and evaluation of Robot Light Skin: A novel robot signalling system to improve communication in industrial human–robot collaboration (Open Access)
    (Elsevier, 2018-09-12) Tang, Gilbert; Webb, Phil; Thrower, John
    In a human–robot collaborative production system, the robot could make requests for interaction or notify the human operator if an uncertainty arises. Conventional industrial tower lights were designed for generic machine signalling purposes and may not be the ideal solution for robot signalling in a collaborative setting. In this type of system, human operators could be monitoring multiple robots while carrying out a manual task, so it is important to minimise the diversion of their attention. This paper presents a novel robot signalling solution, the Robot Light Skin (RLS), which is an integrated signalling system that could be used on most articulated robots. An experiment was conducted to validate this concept in terms of its effect on improving the operator's reaction time, hit-rate, awareness and task performance. The results showed that participants reacted faster to the RLS and achieved a higher hit-rate. An eye tracker was used in the experiment, which showed a reduction in diversion away from the manual task when using the RLS. Future studies should explore the effect of the RLS concept on large-scale systems and multi-robot systems.
  • Digitisation of a moving assembly operation using multiple depth imaging sensors (Open Access)
    (Springer, 2015-10-09) Prabhu, Vinayak Ashok; Song, Boyang; Thrower, John; Tiwari, Ashutosh; Webb, Philip
    Several manufacturing operations continue to be manual even in today’s highly automated industry because the complexity of such operations makes them heavily reliant on human skills, intellect and experience. This work aims to aid the automation of one such operation: the wheel loading operation on the trim and final moving assembly line in automotive production. It proposes a new method that uses multiple low-cost depth imaging sensors, commonly used in gaming, to acquire and digitise key shopfloor data associated with the operation, such as the motion characteristics of the vehicle body on the moving conveyor line and the angular positions of alignment features of the parts to be assembled, in order to inform an intelligent automation solution. Experiments are conducted to test the performance of the proposed method across various assembly conditions, and the results are validated against an industry-standard method using laser tracking. Some disadvantages of the method are discussed, and improvements are suggested. The proposed method has the potential to be adopted to enable the automation of a wide range of moving assembly operations in multiple sectors of the manufacturing industry.
  • 'Robot Light Skin' paper data for Figures 5-11 (Open Access)
    (Cranfield University, 2018-11-08) Tang, Chi Chun Gilbert; Webb, Phil; Thrower, John
    Experiment data used in the publication "The development and evaluation of Robot Light Skin: A novel robot signalling system to improve communication in industrial human–robot collaboration" for the concept validation of the Robot Light Skin (RLS). The data correspond to the following figures in the publication: reaction time to light signals (Fig. 5), hit-rate (Fig. 6), task performance (Fig. 7), ease of monitoring (Fig. 8), level of tiredness (Fig. 9), and eye-tracking fixation time (Figs. 10 and 11).
  • A study to trial the use of inertial non-optical motion capture for ergonomic analysis of manufacturing work (Open Access)
    (Sage, 2016-08-26) Fletcher, Sarah R.; Johnson, Teegan L.; Thrower, John
    It is going to be increasingly important for manufacturing system designers to incorporate human activity data and ergonomic analysis with other performance data in digital design modelling and system monitoring. However, traditional methods of capturing human activity data are not sufficiently accurate to meet the needs of digitised data analysis; qualitative data are subject to bias and imprecision, and optically derived data are hindered by occlusions caused by structures or other people in a working environment. Therefore, to meet contemporary needs for more accurate and objective data, inertial non-optical methods of measurement appear to offer a solution. This article describes a case study conducted within the aerospace manufacturing industry, where data on the human activities involved in aircraft wing system installations were first collected via traditional ethnographic methods and found to have limited accuracy and suitability for digital modelling; similar human activity data subsequently collected using an automatic non-optical motion capture system in a more controlled environment showed better suitability. The results demonstrate not only the potential benefits of applying the inertial non-optical method in future digital modelling and performance monitoring, but also the value of continuing to include qualitative analysis for richer interpretation of important explanatory factors.
