Browsing by Author "Sibson, Jim"
Now showing 1 - 5 of 5
Item Open Access
Conceptual framework of a digital twin to evaluate the degradation status of complex engineering systems (Elsevier, 2019-02-18)
D’Amico, Davide; Ekoyuncu, John; Addepalli, Sri; Smith, Christopher; Keedwell, Ed; Sibson, Jim; Penver, Steven

Degradation of engineering structures and systems often takes the form of wear, corrosion, and fracture. These factors bring about progressive performance decay until the system fails to function satisfactorily. Complex engineering systems (CES) need regular maintenance throughout their operation, along with continuous checks on the health status of components and equipment, within regulatory frameworks. A digital twin paradigm can continuously monitor a CES, use the resulting data to update a virtual model of the CES, and thus make real-time predictions about future functionality. The purpose of this paper is to introduce a conceptual framework of a digital twin to be applied within the degradation assessment process of a CES. The digital twin framework aims to gather digital data through a network to plan the through-life requirements of the system. Data-driven approaches can be used to predict how degradation evolves over time. The proposed framework will help the decision-making process to better handle maintenance operations and achieve targets such as asset availability and minimised cost.

Item Open Access
Developing an ontological framework for effective data quality assessment and knowledge modelling (Cranfield University, 08/11/2022)
Latsou, Christina; Garcia I Minguell, Marta; Sonmez, Ayse Nur; Orteu I Irurre, Roger; Palmisano, Martin Mark; Landon-Valdez, Suresh; Erkoyuncu, John Ahmet; Addepalli, Pavan; Sibson, Jim; Silvey, Olly

Big data has become a major challenge in the 21st century, with research being carried out to classify, mine and extract knowledge from data obtained from disparate sources.
Abundant data sources with non-standard structures further complicate the arduous process of data integration. Currently, the major requirement is to understand the data available and detect data quality issues, with research being conducted to establish data quality assessment methods. Further, the focus is to improve data quality and maturity so that the early onset of problems can be predicted and handled effectively. However, the literature highlights that comprehensive analysis and research of data quality standards and assessment methods are still lacking. To address these challenges, this paper presents a structured framework to standardise the process of assessing the quality of data and modelling the knowledge obtained from such an assessment by implementing an ontology. The main steps of the framework are: (i) identify the user’s requirements; (ii) measure the quality of data considering data quality issues, dimensions and their metrics, and visualise this information in a data quality assessment (DQA) report; and (iii) capture the knowledge from the DQA report using an ontology that models the DQA insights in a standard, reusable way. Following the proposed framework, an Excel-based tool to measure the quality of data and identify emerging issues is developed. An ontology, created in Protégé, provides a standard structure to model the data quality insights obtained from the assessment, and is frequently updated to enrich the captured knowledge, reducing time and costs for future projects.
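The measurement step (ii) can be illustrated with a minimal sketch. This is not the paper's Excel-based tool; the metric definitions (completeness as the share of non-missing values, uniqueness as distinct non-missing values per record) are common data-quality dimensions assumed here for illustration:

```python
def dq_report(records, fields):
    """Compute illustrative per-field data-quality metrics for a list of dict records."""
    report = {}
    n = len(records)
    for f in fields:
        values = [r.get(f) for r in records]
        present = [v for v in values if v is not None]  # non-missing values only
        report[f] = {
            "completeness": len(present) / n if n else 0.0,
            "uniqueness": len(set(present)) / n if n else 0.0,
        }
    return report

# Hypothetical operational records with missing and duplicated entries
records = [
    {"asset_id": "A1", "temp_c": 20.5},
    {"asset_id": "A2", "temp_c": 21.0},
    {"asset_id": "A2", "temp_c": None},
    {"asset_id": None, "temp_c": 19.8},
]
report = dq_report(records, ["asset_id", "temp_c"])
```

In a full pipeline, a report like this would then be serialised into the DQA ontology so the insights stay queryable across projects.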
An industrial case study in the context of Through-life Engineering Services, using operational data of high-value engineering assets, is employed to validate the proposed ontological framework and tool; the results show a well-structured guide that can effectively assess data quality and model knowledge.

Item Open Access
An optimisation framework for improving supply chain performance: case study of a bespoke service provider (Elsevier, 2020-07-14)
Farsi, Maryam; Bailly, Adrien; Bodin, David; Penella, Victor; Pinault, Pierre-Ly; Nghia, Elodie Thai Thien; Sibson, Jim; Erkoyuncu, John Ahmet

A service supply chain can be described as a system of systems with a highly interactive and complex network of suppliers, service providers, OEMs and customers. Supply chain management can create value for bespoke service providers, customers and stakeholders cooperating through the supply chain. Bespoke service provider companies are responsible for managing their assets under different service contracts, potentially through to the end of the asset lifetime. Providing a through-life service requires tailored strategic dimensions to measure supply chain performance. The performance can be evaluated with regard to several supply chain elements such as demand management, procurement and logistics. This article takes a different angle to current supply chain performance frameworks by discussing performance through the DMAIC (Define, Measure, Analyse, Improve, Control) cycle. Considering a through-life service, this paper presents a performance optimisation framework to improve supply chain performance in terms of asset or component availability and cost of service. Moreover, an exhaustive list of KPIs to evaluate supply chain performance is identified. A case study of fleet management for a bespoke service provider is used to test the validity of the framework.
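The two headline KPIs named above, asset availability and cost of service, can be sketched as in the Measure phase of a DMAIC cycle. The formulas and fleet figures below are illustrative assumptions, not the paper's actual KPI definitions:

```python
def availability(uptime_h, downtime_h):
    """Operational availability = uptime / (uptime + downtime)."""
    total = uptime_h + downtime_h
    return uptime_h / total if total else 0.0

def cost_of_service(maintenance_cost, spares_cost, logistics_cost):
    """Total through-life service cost for an asset (illustrative breakdown)."""
    return maintenance_cost + spares_cost + logistics_cost

# Hypothetical fleet of two assets under a through-life service contract
fleet = [
    {"uptime_h": 8400, "downtime_h": 360, "costs": (12000.0, 4500.0, 1500.0)},
    {"uptime_h": 8000, "downtime_h": 760, "costs": (15000.0, 6000.0, 2000.0)},
]
avg_availability = sum(availability(a["uptime_h"], a["downtime_h"]) for a in fleet) / len(fleet)
total_cost = sum(cost_of_service(*a["costs"]) for a in fleet)
```

An optimisation framework would then trade these two quantities off, e.g. choosing maintenance intervals that raise fleet availability without inflating the cost of service.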
The DMAIC technique has proved to be an effective method for improving supply chain strategies and performance.

Item Open Access
Transforming industrial manipulators via kinesthetic guidance for automated inspection of complex geometries (MDPI, 2023-04-05)
Loukas, Charalampos; Vasilev, Momchil; Zimmerman, Rastislav; Vithanage, Randika K. W.; Mohseni, Ehsan; MacLeod, Charles N.; Lines, David; Pierce, Stephen Gareth; Williams, Stewart; Ding, Jialuo; Burnham, Kenneth; Sibson, Jim; O’Hare, Tom; Grosser, Michael R.

The increased demand for cost-efficient manufacturing and metrology inspection solutions for complex-shaped components in High-Value Manufacturing (HVM) sectors requires increased production throughput and precision. This drives the integration of automated robotic solutions. However, current manipulators that use traditional programming approaches demand specialized robotic programming knowledge, make it challenging to generate complex paths, and adapt poorly to the unique specifications of each component, resulting in an inflexible and cumbersome teaching process. Therefore, this body of work proposes a novel software system to realize kinesthetic guidance for path planning in real time at 250 Hz, utilizing an external off-the-shelf force–torque (FT) sensor. The proposed work is demonstrated on a 500 mm² near-net-shaped Wire–Arc Additive Manufacturing (WAAM) complex component with embedded defects, by teaching the inspection path for defect detection with a standard industrial robotic manipulator in a collaborative fashion and adaptively generating the kinematics, resulting in uniform coupling for ultrasound inspection. The method proves superior in performance and speed, reducing programming time by an estimated 88% to 98% compared with online and offline approaches.
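Hand-guiding a manipulator from FT-sensor readings is commonly done with an admittance-control loop. The sketch below is a minimal single-axis illustration of that idea at the paper's 250 Hz rate; the damping gain, deadband, and simulation are assumptions for illustration, not the authors' implementation:

```python
DT = 1.0 / 250.0   # 250 Hz control interval, as stated in the paper
DAMPING = 50.0     # virtual damping [N*s/m] -- illustrative tuning value
DEADBAND = 2.0     # ignore forces below 2 N to reject sensor noise (assumed)

def admittance_step(force_n):
    """Map a measured hand-guiding force [N] to a Cartesian velocity command [m/s]."""
    if abs(force_n) < DEADBAND:
        return 0.0
    return force_n / DAMPING

def simulate(forces):
    """Integrate the commanded velocity over successive 4 ms cycles; returns displacement [m]."""
    x = 0.0
    for f in forces:
        x += admittance_step(f) * DT
    return x

# A constant 10 N push held for one second (250 cycles)
displacement = simulate([10.0] * 250)
```

A real system would run this per Cartesian axis, transform the FT readings into the tool frame, and stream the resulting velocity commands to the robot controller, with the taught poses recorded as the inspection path.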
The proposed work is a unique development, retrofitting current industrial manipulators into collaborative entities, preserving human job roles, and achieving flexible production.

Item Open Access
A unified framework for digital twin development in manufacturing (Elsevier, 2024-05-04)
Latsou, Christina; Ariansyah, Dedy; Salome, Louis; Erkoyuncu, John Ahmet; Sibson, Jim; Dunville, John

The concept of the digital twin (DT) is undergoing rapid transformation and attracting increased attention across industries. It is recognised as an innovative technology offering real-time monitoring, simulation, optimisation, accurate forecasting and bi-directional feedback between physical and digital objects. Despite extensive academic and industrial research, DT has not yet been properly understood and implemented by many industries, due to challenges identified during its development. The existing literature shows a lack of a unified framework for building DTs, a lack of standardisation in the development process, and difficulties in aligning the goals of multi-disciplinary teams engaged in the design, development and implementation of DTs for larger-scale systems. To address these challenges, this study introduces a unified framework for DT development, emphasising reusability and scalability. The framework harmonises existing DT frameworks by unifying concepts and process development. It facilitates the integration of heterogeneous data types and ensures a continuous flow of information among data sources, simulation models and visualisation platforms. Scalability is achieved through ontology implementation, while an agent-based approach monitors physical asset performance, automatically detects faults, checks repair status and offers operators feedback on asset demand, availability and health conditions.
The effectiveness of the proposed DT framework is validated through its application to a real-world case study involving five interconnected air compressors located at the Connected Facility at Devonport Royal Dockyard, UK. The DT automatically and remotely monitors the performance and health status of the compressors, providing guidance to humans on fault repair. This guidance dynamically adapts based on feedback from the DT. Analysis of the results demonstrates that the proposed DT increases the facility’s operational availability and enhances decision-making by promptly and accurately detecting faults.
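The agent-based monitoring idea can be sketched as one lightweight agent per compressor, each checking its own telemetry against limits and flagging faults. This is a minimal illustration only; the class name, thresholds, and telemetry values are assumptions, not the Devonport DT's actual logic:

```python
from dataclasses import dataclass, field

@dataclass
class CompressorAgent:
    """Watches one compressor's telemetry and records threshold breaches as faults (illustrative)."""
    name: str
    max_temp_c: float = 90.0       # assumed over-temperature limit
    min_pressure_bar: float = 6.0  # assumed minimum delivery pressure
    faults: list = field(default_factory=list)

    def observe(self, temp_c, pressure_bar):
        # Each telemetry sample is checked against the agent's own limits
        if temp_c > self.max_temp_c:
            self.faults.append(f"{self.name}: over-temperature ({temp_c} C)")
        if pressure_bar < self.min_pressure_bar:
            self.faults.append(f"{self.name}: low pressure ({pressure_bar} bar)")

    @property
    def healthy(self):
        return not self.faults

# Five interconnected units, mirroring the case study's compressor count
agents = [CompressorAgent(f"compressor-{i}") for i in range(1, 6)]
agents[2].observe(temp_c=95.2, pressure_bar=6.5)  # hypothetical fault on one unit
agents[0].observe(temp_c=72.0, pressure_bar=7.1)  # nominal reading
available = sum(a.healthy for a in agents)
```

In a full DT, each agent's fault list would feed the repair-guidance loop described above, and the availability figure would be reported back to operators alongside demand and health-condition feedback.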