
11/04/2024 – In recent developments within the Dig_IT project, Task 5.5, led by CORE, has made significant strides in the implementation of a predictive maintenance system. This task aims to accurately assess the health of assets and predict their future states in near-real time, enabling early detection of potential failures through anomaly detection.

By implementing anomaly detection models tailored to specific asset types and data, ensuring input consistency and model adaptability, and securing adequate data collection through edge devices where applicable, the task aims to significantly enhance operational efficiency, reduce downtime, and preemptively address maintenance needs, ensuring the longevity, reliability, and availability of critical machinery in the field.

CORE, in collaboration with Marini Marmi, Tampere University, Titania, and Kemi, and with Marini (Milling Machine), Titania (Heavy Trucks), and Kemi (Heavy Trucks) as end users, has obtained strong results that will be visualized in the Decision Support System.

In Task 5.5, partners focused on three distinct Use Cases involving the Marini, Kemi, and Titania mines, each presenting its own set of assets and requiring an approach that, while uniform in methodology, was customized to fit each scenario.

A key achievement of this task was the data collection phase, which involved gathering data from diverse sources. This included sensor-monitored operational parameters from trucks in the Kemi Mine, such as engine speed, throttle position, and diesel fuel consumption, as well as digitally simulated data from Titania’s truck digital twin developed within the project.

For the Marini Use Case, we proposed the implementation of edge devices to monitor the asset’s operation, included it in the task’s scope, and installed the devices. Placed on critical parts of the Gangsaw milling machine, the edge devices’ sensors measure ambient temperature and vibrations along the three axes (x, y, z) of the different components. The collected data was subjected to an extensive exploratory data analysis (EDA) to identify trends, patterns, outliers, and anomalies, offering insights into its intrinsic qualities.
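As an illustration of this kind of exploratory pass, the minimal Python (pandas) sketch below assumes a hypothetical CSV export from the edge devices with a timestamp column and signals named ambient_temp, vib_x, vib_y, and vib_z; the actual file layout and column names used in the project may differ.

```python
import pandas as pd

# Hypothetical export from the Gangsaw edge devices; file name and columns are assumptions.
df = pd.read_csv("gangsaw_edge_readings.csv", parse_dates=["timestamp"])
signals = ["ambient_temp", "vib_x", "vib_y", "vib_z"]

# Summary statistics and missing-value counts give a first view of data quality.
print(df[signals].describe())
print(df[signals].isna().sum())

# Simple z-score screening highlights candidate outliers per signal.
for col in signals:
    z = (df[col] - df[col].mean()) / df[col].std()
    print(col, "samples beyond 3 sigma:", int((z.abs() > 3).sum()))

# Hourly resampling exposes trends and recurring operating patterns.
hourly = df.set_index("timestamp")[signals].resample("1h").mean()
print(hourly.head())
```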

After this preliminary examination, we moved on to a data preprocessing stage designed to rectify issues such as missing values and outliers, and to normalize the data. Using domain knowledge from field operations, we cleansed the datasets of anomalous records. Since the actual anomalies (faults and damages) may not be known in advance, we constructed unsupervised training datasets containing no known anomalies. This process ensured a structured data form, free of anomalies that could bias the anomaly detection models, and transformed the data into an interpretable format.
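A condensed sketch of such a preprocessing step is shown below. It continues from the EDA example and again relies on assumed column names, interpolation limits, and clipping percentiles, not the rules actually applied in the project.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler

SIGNALS = ["ambient_temp", "vib_x", "vib_y", "vib_z"]  # assumed feature set

def preprocess(df: pd.DataFrame) -> pd.DataFrame:
    df = df.sort_values("timestamp").copy()
    # Interpolate short gaps; drop rows that remain incomplete.
    df[SIGNALS] = df[SIGNALS].interpolate(limit=5)
    df = df.dropna(subset=SIGNALS)
    # Clip extreme values to the 1st/99th percentiles to tame outliers.
    for col in SIGNALS:
        lo, hi = df[col].quantile([0.01, 0.99])
        df[col] = df[col].clip(lo, hi)
    return df

df = preprocess(pd.read_csv("gangsaw_edge_readings.csv", parse_dates=["timestamp"]))

# Keep only periods known (from field expertise) to be normal operation,
# yielding an unsupervised training set with no known anomalies.
train = df[df["known_fault"] == 0] if "known_fault" in df.columns else df

# Standardize features so they enter the model on comparable scales.
scaler = StandardScaler()
X_train = scaler.fit_transform(train[SIGNALS])
```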

We developed models based on architectures well established for their effectiveness in anomaly detection. These models are adept at identifying deviations from normal operational conditions, thereby generating early warnings for potential equipment failures. Following an unsupervised anomaly detection approach, the models are trained on normal operating data with the objective of minimizing reconstruction error. This process involves encoding the input data to capture its essential features and then decoding it to reconstruct the input.
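The sketch below shows what such a reconstruction-based model can look like in Python (Keras). The layer sizes, epochs, and other settings are illustrative choices rather than the project’s configuration, and X_train is the standardized normal-operation matrix from the preprocessing sketch above.

```python
import numpy as np
import tensorflow as tf

# Minimal dense autoencoder; layer sizes are illustrative only.
n_features = X_train.shape[1]
autoencoder = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(n_features,)),
    tf.keras.layers.Dense(8, activation="relu"),   # encoder: capture essential features
    tf.keras.layers.Dense(3, activation="relu"),   # bottleneck
    tf.keras.layers.Dense(8, activation="relu"),   # decoder
    tf.keras.layers.Dense(n_features),             # reconstruct the input
])
autoencoder.compile(optimizer="adam", loss="mse")

# Train on normal operating data only: the model learns to reproduce its input.
autoencoder.fit(X_train, X_train, epochs=50, batch_size=64,
                validation_split=0.1, verbose=0)

# The per-sample reconstruction error serves as the anomaly score.
reconstruction = autoencoder.predict(X_train, verbose=0)
train_errors = np.mean((X_train - reconstruction) ** 2, axis=1)
```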

Anomalies are flagged when the reconstruction error between input and output exceeds a predetermined threshold, established through statistical methods to differentiate between normal, warning, and critical states. In the concluding stage, the models were integrated into the project’s existing infrastructure, enabling their operation as microservices.
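One simple way to derive such thresholds from the distribution of errors on normal data is sketched below; the percentile and sigma values are placeholders, not the statistical criteria actually used in the task, and train_errors, scaler, df, and autoencoder come from the previous sketches.

```python
import numpy as np

# Placeholder thresholds derived from the training-error distribution.
warning_threshold = np.percentile(train_errors, 99)                  # e.g. 99th percentile
critical_threshold = train_errors.mean() + 5 * train_errors.std()    # e.g. mean + 5 sigma

def classify(error: float) -> str:
    """Map a reconstruction error to an operational state."""
    if error >= critical_threshold:
        return "critical"
    if error >= warning_threshold:
        return "warning"
    return "normal"

# Score a batch of new readings (here, simply the most recent samples).
X_new = scaler.transform(df[SIGNALS].tail(100))
new_errors = np.mean((X_new - autoencoder.predict(X_new, verbose=0)) ** 2, axis=1)
states = [classify(e) for e in new_errors]
```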

This integration involved monitoring the input data and relaying anomaly detection results to the project’s data warehouse. This configuration allows for seamless, near real-time integration, making both the anomaly detection input data and the models’ results available to the interested parties.
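As a purely illustrative example of how a detection result could be relayed as a small JSON document, the snippet below posts to a hypothetical warehouse endpoint; the actual interface, field names, and identifiers used in Dig_IT are not reproduced here.

```python
import requests

# Hypothetical endpoint and payload fields; the project's real interface may differ.
WAREHOUSE_URL = "https://warehouse.example.local/api/anomaly-results"

result = {
    "asset_id": "gangsaw-01",            # illustrative identifier
    "timestamp": "2024-04-11T10:15:00Z",
    "reconstruction_error": 0.042,
    "state": "warning",                  # normal / warning / critical
}

# Relay the detection result to the data warehouse as JSON.
response = requests.post(WAREHOUSE_URL, json=result, timeout=10)
response.raise_for_status()
```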

15/04/2024 – Our partners Tapojärvi, Titania, Tampere University, Schneider Electric, and the CORE Innovation and Technology team have been collaborating on Task 2.1, Development of data acquisition system and hardware prototype. Together, they have been working to make the entire communication chain operate end-to-end, from the assets via the edge server and aggregator to the big data server.

The raw field data collected through our IIoT framework is integral to the analytics under development. The efficient data flow architecture we’ve implemented facilitates the synthesis and utilization of big data analytics. Recent developments include minor behavioral fixes in asset gateways to enhance reliability, the installation of accelerometers in assets at the Kemi mine, and the migration of edge server services to a new server with merged databases.

Additionally, adjustments have been made to the data model for edge server-to-aggregator communication, alongside modifications to data formats and packet structures to ensure a seamless connection. Throughout each phase, troubleshooting and testing have been conducted to guarantee the effectiveness and reliability of our systems.
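To make the idea of such a packet structure concrete, here is a hedged Python sketch of an edge server publishing a JSON telemetry packet to the aggregator over MQTT; the topic layout, field names, and broker address are assumptions for illustration, not the actual Dig_IT data model.

```python
import json
import paho.mqtt.publish as publish

# Illustrative packet; field names and identifiers are assumptions, not the real data model.
packet = {
    "source": "edge-server-kemi-01",
    "asset": "heavy-truck-07",
    "timestamp": "2024-04-15T08:30:00Z",
    "measurements": {
        "engine_speed_rpm": 1450,
        "throttle_position_pct": 62.5,
        "fuel_rate_l_per_h": 18.3,
    },
}

# Publish a single message from the edge server to the aggregator's broker.
publish.single(
    topic="digit/edge/kemi/heavy-truck-07/telemetry",  # assumed topic naming
    payload=json.dumps(packet),
    hostname="aggregator.example.local",               # assumed broker address
    port=1883,
    qos=1,
)
```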

Our software and tools, including Yocto/Ångström Linux, Python 3.9, NodeJS with MQTT and Mongoose, and MongoDB, have played a key role in facilitating these developments. These advancements represent a significant step forward in achieving Dig_IT’s objectives.

3rd Dig_IT Technical Meeting


On January 28, 2021, the third Dig_IT virtual technical meeting took place.


The main objective of the conference was to review the activities carried out during the previous three months.

WP1 was completed last month, and all other WPs are currently proceeding on schedule.


Following this review of the work packages, the action plan ahead of the next technical meeting was discussed.

2nd Dig_IT Technical Meeting

On 19 October 2020, the second Dig_IT virtual technical meeting took place.

The main objective of the meeting was to monitor the progress of the deliverables and the review process.


The conference started with a summary of the project’s objectives for the first quarter, with particular attention to WP1 and WP2 and a focused analysis of the project’s specific needs and open points.

Moreover, at the end of the event, participants discussed further actions to put in place to ensure the quality of the deliverables and set new key dates for document delivery.

Dig_IT at Final SLIM Event!

Today the Dig_IT project was presented by Technical Manager David de Paz Ruiz and Project Coordinator María García Camprubí during the final event of the closely related SLIM project, which is now coming to an end.

Conceived as a clustering event, the meeting also presented other projects in the mining sector, such as Re-sourcing, Intermin, Mireu, Robominers, Biorecover, and Tarantula.