Newsletter tasks

11/04/2024 – In recent developments within the Dig_IT project, Task 5.5, led by CORE, has made significant strides in the implementation of a predictive maintenance system. This task aims to accurately assess the health of the assets and predict their future states in near real time, enabling early detection of potential failures through anomaly detection.

By implementing anomaly detection models tailored to specific asset types and data, to ensure input consistency and model adaptability, and by installing edge devices where applicable to ensure adequate data collection, the task aims to significantly enhance operational efficiency, reduce downtime, and preemptively address maintenance needs, ensuring the longevity, reliability and availability of critical machinery in the field.

CORE, in collaboration with Marini Marmi, Tampere University, Titania, and Kemi, with Marini (milling machine), Titania (heavy trucks), and Kemi (heavy trucks) as end users, has obtained strong results to be visualized in the Decision Support System.

In Task 5.5, partners focused on three unique Use Cases involving the Marini, Kemi, and Titania mines, each presenting its own set of assets and necessitating an approach that, while uniform in methodology, was customized to fit each scenario.

A key achievement of this task was the data collection phase, which involved gathering data from diverse sources. This included collecting sensor-monitored operational parameters from trucks in the Kemi mine, such as engine speed, throttle position, and diesel fuel consumption, as well as digitally simulated data from Titania’s truck digital twin developed within the project.

For the Marini Use Case, we proposed, installed, and included in the task’s scope an implementation of edge devices to monitor the asset’s operation. Placed on critical parts of the Gangsaw milling machine, the edge devices’ sensors measure the ambient temperature and the vibrations along three axes (x, y, z) on the different components. The collected data was subjected to an extensive exploratory data analysis (EDA) to identify trends, patterns, outliers, and anomalies, thus offering insights into the intrinsic qualities of the data.

After this preliminary examination, we moved on to a data preprocessing stage designed to rectify issues such as missing values and outliers, as well as to normalize the data. Using specific knowledge from the operations’ field, we cleansed the datasets of anomalies. Since anomalies (faults and damages) may not be known in advance, we proceeded with constructing unsupervised datasets with no known anomalies. This process ensured a structured data form, free of anomalies that could affect the anomaly detection models, and transformed the data into an interpretable format.

We developed models based on architectures renowned for their effectiveness in the anomaly detection niche. These models are adept at identifying deviations from normal operational conditions, thereby generating early warnings for potential equipment failures. Utilizing an Unsupervised Anomaly Detection approach, the models are trained on normal operating data, aiming to minimize reconstruction errors. This process involves encoding input data to capture essential features and then decoding it to reconstruct the input.

Anomalies are flagged based on a predetermined threshold of reconstruction error between input and output, established through statistical methods to differentiate between normal, warning, and critical states. In the concluding stage, the models were assimilated into the project’s existing infrastructure, enabling their operation as microservices.
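As an illustration of this reconstruction-error thresholding, the following minimal sketch uses a linear (PCA-style) encoder/decoder in place of the project’s actual models, with warning and critical thresholds derived from the statistics of the training error. All data and threshold multipliers are illustrative assumptions, not the task’s real parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "normal operating data": two correlated channels.
train = rng.normal(size=(500, 2)) @ np.array([[1.0, 0.8], [0.0, 0.3]])

# Linear encoder/decoder (PCA) standing in for the trained autoencoder.
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
components = vt[:1]                        # keep one latent feature

def reconstruction_error(x):
    z = (x - mean) @ components.T          # encode: capture essential features
    x_hat = z @ components + mean          # decode: reconstruct the input
    return np.linalg.norm(x - x_hat, axis=1)

# Thresholds set statistically from the training reconstruction error.
err = reconstruction_error(train)
warning = err.mean() + 3 * err.std()
critical = err.mean() + 6 * err.std()

def state(x):
    e = reconstruction_error(np.atleast_2d(np.asarray(x, dtype=float)))[0]
    return "critical" if e > critical else "warning" if e > warning else "normal"

print(state(mean))           # a typical point reconstructs well -> "normal"
print(state([50.0, -40.0]))  # far off the training manifold -> "critical"
```

In the deployed models the encoder/decoder is learned rather than linear, but the flagging logic (reconstruction error against statistically derived thresholds) is the same idea.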

This integration involved monitoring data related to inputs and relaying anomaly detection results to the project’s data warehouse. Such a configuration allows for seamless, near real-time integration, making the anomaly detection input data and the models’ results available to the interested parties.

15/04/2024 – Our partners Tapojärvi, Titania, Tampere University, Schneider Electric, and the CORE Innovation and Technology team have been collaborating in Task 2.1, Development of data acquisition system and hardware prototype. Together, they have been working to get the entire communication chain running end-to-end, from the assets via the edge server and aggregator to the big data server.

The raw field data collected through our IIoT framework is integral for the analytics under development. The efficient data flow architecture we’ve implemented facilitates the synthesis and utilization of big data analytics. Recent developments include minor behavioral fixes in asset gateways to enhance reliability, the installation of accelerometers in assets at the Kemi mine, and the migration of edge server services to a new server with merged databases.

Additionally, adjustments have been made to the data model for edge server-to-aggregator communication, alongside modifications to data formats and packet structures to ensure a seamless connection. Throughout each phase, troubleshooting and testing have been conducted to guarantee the effectiveness and reliability of our systems.
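As a sketch of what such an edge-server-to-aggregator data format might look like, the snippet below defines a hypothetical JSON-serializable packet; the field names and layout are illustrative assumptions, not the project’s actual data model.

```python
import json
import time
from dataclasses import dataclass, asdict

# Hypothetical packet layout for edge-server-to-aggregator messages;
# field names are illustrative, not the Dig_IT data model.
@dataclass
class SensorPacket:
    asset_id: str      # e.g. a truck or gateway identifier
    channel: str       # e.g. "acceleration_x"
    timestamp: float   # Unix epoch seconds
    value: float
    unit: str

    def to_json(self) -> str:
        return json.dumps(asdict(self), separators=(",", ":"))

    @staticmethod
    def from_json(raw: str) -> "SensorPacket":
        return SensorPacket(**json.loads(raw))

pkt = SensorPacket("truck-07", "acceleration_x", time.time(), 0.42, "g")
wire = pkt.to_json()                        # what travels over the link
assert SensorPacket.from_json(wire) == pkt  # round trip is lossless
print(wire)
```

Agreeing on such a schema on both ends is what the data-format and packet-structure adjustments described above make possible.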

Our software and tools, including Yocto/Ångström Linux, Python 3.9, NodeJS with MQTT and Mongoose, and MongoDB, have played a key role in facilitating these developments. These advancements represent a significant step forward in achieving Dig_IT’s objectives.

Partners involved

All partners have been involved in Task 8.3. End users involved are Marini Marmi, Titania and Kemi.

Objective and outcomes

The main goal of Task 8.3 is to assist partners in bringing their results closer to the market, thanks to Market Research and Business Model Design.

Task 8.1 Development of the Plan for the Exploitation and Dissemination of Results, and T8.4 Innovation Management have used the outcomes of this task.

What has been done in Task 8.3?

As part of Task 8.3, CORE has developed a business model for the Dig_IT IoT platform. In this task, CORE has focused on the mining industry as the initial customer, leveraging edge technologies such as machine learning and predictive maintenance. Mining operations will deliver great value to their communities and markets via the optimization of their operations, sustainability goals, and advanced health and safety measures. Such business models attract substantial investor financing due to their ESG-focused outputs. CORE analyzed all these parameters to ensure that the necessary activities and resources will be available for the post-project period.

Partners involved

Partners involved in Task 5.1 are BRUNEL, ICCS, and LIBRA, with TAPO joining for Task 5.7. Moreover, the end users involved in Task 5.7 are TITANIA and KEMI.

Objectives and outcomes

The main objective is to develop a smart scheduling tool for asset scheduling in a sustainable digital mine of the future. The tool is based on the ‘Phase-Wise, Constrained, Multi-Objective Optimization’ concept, which stems from mine operation practice. The target of the smart scheduling tool is to set the best strategy for operating the trucks and loaders by considering multiple goals that need to be achieved simultaneously. Three objectives have been applied, including (1) minimising the queuing time, (2) maximising the production, and (3) minimising the financial cost. Recently, the fourth objective, (4) minimising the risk of machine downtime, has been incorporated to extend the capabilities of the smart scheduling tool. While the goals may conflict, this tool will help mining operators plan their operation activities smoothly and efficiently.

A Smart Scheduling Tool within the Dig_IT Decision Support System (DSS) is developed to assist mining operators in deciding their mining operation strategy. The tool will provide a list of working sequences with time stamps to guide the mining operator in performing their shift operation. The tool is equipped with a customisation function, which allows the mine operator to adjust the operation parameters such as truck availability, speed, and shift selection. As a result, a graphical representation of the pre-trained set of solutions is sorted according to the competing objectives and displayed to help the mining operator make a quick decision.

The outcome of this task will incorporate real-time data exchange between Brunel (Data Model) and ICCS (Data Warehouse). The production and asset data from the mine are subscribed from the Data Warehouse (near real time) and used to further improve the MOO solutions. Once a new set of solutions is available, it will be published to the Data Warehouse using MQTT and consequently updated in the smart scheduling tool hosted by the Decision Support System (LIBRA) Business Intelligence website.

What has been done in Task 5.1?

The Smart Scheduling Tool, which is part of the Dig_IT Intelligent layer, will:

  1. Perform the mining optimisation using the selected phase-based evolutionary algorithm that balances convergence and diversity among different phases.
  2. Suggest a set of optimal solutions that work best with the mining operation after analysing the phase-wise optimisation problem (POP).
  3. Apply the phase-based optimisation algorithms to optimise the management and operation of digital mines in near real-time applications.
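To illustrate the multi-objective trade-off the tool navigates, the sketch below filters a handful of hypothetical candidate schedules down to the non-dominated (Pareto) set over the four objectives; the scores are invented for illustration, and production is negated so that every objective is minimised.

```python
# Each candidate schedule is scored on (queue time, -production, cost,
# downtime risk); production is negated so all objectives are minimised.
# All values are invented for illustration.
candidates = {
    "schedule_A": (12.0, -950.0, 4000.0, 0.10),
    "schedule_B": (15.0, -990.0, 4200.0, 0.08),
    "schedule_C": (14.0, -940.0, 4300.0, 0.12),  # dominated by schedule_A
    "schedule_D": (11.0, -900.0, 3900.0, 0.15),
}

def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Non-dominated set: the solutions shown to the mining operator.
pareto = [name for name, obj in candidates.items()
          if not any(dominates(other, obj)
                     for o_name, other in candidates.items() if o_name != name)]
print(pareto)   # -> ['schedule_A', 'schedule_B', 'schedule_D']
```

The evolutionary algorithm in the tool searches a far larger space, but its output is exactly such a set of mutually non-dominated schedules for the operator to choose from.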

Software and tools used

The “PlatEMO” software package, also known as the Evolutionary Multi-Objective Optimization Platform, has been used to find the optimal solutions for the problem. PlatEMO provides a variety of algorithms for solving optimisation problems in a black-box manner. This software package belongs to the BIMK Group and is copyrighted for research and educational purposes only. The set-of-solutions model output will be updated in the Decision Support System (DSS) via the MQTT protocol that has been established between the CPU and the Dig_IT Platform.

Task 4.4 Reduced Order Models Layers for digital twins

Partners involved

In this Task, ITAINNOVA has participated as Task leader, adapting and developing the Twinkle library. In addition, SCHNEIDER and LIBRA have participated by integrating the library into the final IoT platform. Finally, SUBTERRA has provided support by supplying input to build the geotechnical Digital Twins.

The end-users involved in this task are the ones that will have a real-time Digital Twin at the end of the project: MARINI, TAPO and TITANIA.

Objectives and outcomes

The main objective of this Task is to adapt the existing Twinkle library (developed by ITAINNOVA) to build the real-time digital twins. The library will be adapted for integration in the Dig_IT IoT platform as a layer that evaluates the resulting real-time digital twins, enabling the visualization of real-time risk maps in the DSS. These real-time digital twins will be the base for process control and quality assessment during operation.

This task plays a very important role within WP4, as the Reduced Order Models of geotechnical and fluid dynamic simulations developed in tasks 4.2 and 4.3 are built based on the library developed in this task.

What has been done in Task 4.4?

In this Task, the existing Twinkle library was adapted to generate real-time digital twins for geotechnical and fluid dynamics simulations. The resulting models act as virtual sensors that can predict in real time the geotechnical or fluid-dynamics risks, mainly related to safety (slope stability, pollutant concentrations in air and water). An exhaustive and automated pre- and post-processing workflow was developed and implemented in the final tool according to each end-user’s requirements and needs. A user manual was produced to ensure understanding of the use of this tool.
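While Twinkle’s internals are not detailed here, the reduced-order-model idea behind such virtual sensors can be sketched as follows: a truncated SVD of a snapshot matrix (one full-order simulation result per DoE scenario) yields a small basis onto which fields can be projected and reconstructed cheaply. The data and mode counts below are synthetic assumptions; a real ROM would also map operating parameters to the modal coefficients.

```python
import numpy as np

rng = np.random.default_rng(1)

# Snapshot matrix: each column is one full-order simulation result
# (e.g. 10_000 nodal values) for one DoE scenario. Synthetic, low-rank data.
n_nodes, n_snapshots = 10_000, 40
modes_true = rng.normal(size=(n_nodes, 3))
weights = rng.normal(size=(3, n_snapshots))
snapshots = modes_true @ weights

# POD: truncated SVD of the snapshot matrix gives the reduced basis.
u, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = u[:, :3]                 # keep 3 modes

def rom_evaluate(full_field):
    """Project onto the reduced basis and reconstruct (the cheap path)."""
    return basis @ (basis.T @ full_field)

new_field = modes_true @ rng.normal(size=3)   # unseen scenario, same modes
error = np.linalg.norm(new_field - rom_evaluate(new_field))
print(error < 1e-8 * np.linalg.norm(new_field))   # True: 3 modes suffice here
```

Because evaluation is only a couple of small matrix products, such a model can run in real time inside the IoT platform, which is what makes the risk-map visualization feasible.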

Finally, the exploitation of the generated ROMs is greatly improved through a graphical user interface (GUI), which allows for fast evaluations using interactive sliders, real-time trend evaluation with graphs, plotting of contour predictions, etc. ITAINNOVA has a web GUI created in VOILA, which has been successfully adapted by LIBRA for platform implementation.

The deliverables related to this task are D4.4 ‘Python library and documentation for building digital twins from engineering simulation tools (v1.0)’, which was already submitted in M21, and D4.5 ‘Python library and documentation for building digital twins from engineering simulation tools (final)’, which is a final version of the previous deliverable and will be submitted at the end of the project.

Software and tools used

For this task, Python scripting was used in order to adapt and implement the Twinkle library into the IoT platform, to develop the post-processing capabilities according to each use-case, and to adapt the GUI.

GUI to display the real-time digital-twin for air quality and ventilation assessment in Kemi mine.
Image of the Kemi operating area under study, indicating the plane where the digital twin is being evaluated.
GUI to display the real-time digital-twin for geotechnical stability assessment of the tailing pond of La Parrilla mine.
Image of the La Parrilla area under study, indicating the plane where the digital twin is being evaluated.

Task 3.4 Big Data Optimisation and Analysis

Partners involved

The partners involved in Task 3.4 are BRUNEL, ICCS, and LIBRA, with SUBTERRA for Task 3.9. Moreover, TITANIA has been involved as end user, along with LA PARRILLA for Task 3.9.

Objectives and outcomes

The main goal of Task 3.4 is to develop a human assistance tool for early-warning hazard prediction in a sustainable digital mine of the future. The tool is based on data fusion and a Long Short-Term Memory Recurrent Neural Network (LSTM-RNN), and stems from the big data optimisation concept, which involves complex applications with elements such as predictive models, statistical algorithms, and what-if analysis in mine operation practice.


A Human Decision Tool within the Dig_IT Decision Support System (DSS) is developed to alert the mining operator if any slope instability occurs within the open-pit mine. The tool will be able to predict an event at least 16 minutes before its real occurrence, giving the mining operator time to make strategic operational decisions.

The outcome of this task will incorporate real-time data exchange between Brunel (Data Model) and ICCS (Data Warehouse). The data will then be passed to the Decision Support System (LIBRA) for display in the interactive Human Decision Tool.

The Human Decision Tool, which is part of the Dig_IT IoT Middleware layer, will: 
  • Construct a residual signal and compare it with a predefined threshold to generate alarms in the presence of bad data resulting from limited communication capacity.
  • Perform fault detection and fault-tolerant control (FTC) under communication constraints, with feature/event detection, classification, and prediction in large-scale data environments in supervised and unsupervised modes.
  • Apply efficient evolutionary methods in preprocessing to seamlessly incorporate data denoising and data fusion across the entire early warning system, thereby augmenting the system’s overall robustness, observability, and accuracy.
  • Apply feature engineering techniques to understand the nature of the data, and use short time-series analysis to establish a model for detecting, identifying and predicting events/failures.
  • Develop an early-warning event prediction model that combines statistical time-series modelling, relevant domain knowledge and intelligent search techniques.
  • Apply the hazard prediction model using the real-time Dig_IT Decision Support System dashboard.
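The residual-against-threshold idea from the list above can be sketched as follows; the signal, the stand-in prediction, and the 5-sigma threshold are illustrative assumptions (in the task, the prediction would come from the trained LSTM-RNN).

```python
import numpy as np

rng = np.random.default_rng(2)

# Measured signal vs. the model's prediction. The prediction is simulated
# here; in the task it would come from the LSTM-RNN.
t = np.arange(200)
measured = np.sin(0.1 * t) + 0.05 * rng.normal(size=t.size)
predicted = np.sin(0.1 * t)
measured[150:] += 1.5            # injected fault (e.g. onset of slope movement)

# Residual signal compared with a predefined threshold to generate alarms.
residual = np.abs(measured - predicted)
threshold = 5 * 0.05             # 5 sigma of the assumed nominal noise level
alarm = residual > threshold

first_alarm = int(np.argmax(alarm))
print(first_alarm)               # alarm is raised at the injected fault
```

In the deployed tool the threshold logic additionally has to tolerate bad data from limited communication capacity, which the simple sketch above does not model.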

Software and tools used

The Fault-Tolerant Control software package is built on Python 3 with standard libraries such as NumPy, pandas, Matplotlib, Seaborn, Keras, and scikit-learn. Any reasonably powerful machine with a capable CPU/GPU and the required Python libraries installed can be used to train the model. The model output from the prediction model will be updated in the Decision Support System (DSS) via the MQTT protocol that has been established between the CPU and the Dig_IT Platform.

Task 4.3 Fluid Dynamics digital twins design and development: Risk Maps


Partners involved

The technical development has been carried out entirely by ITAINNOVA.

The end-users have supported ITAINNOVA in the definition of the activities according to their interests and needs: MARINI and TAPOJARVI regarding the air quality models, and TITANIA regarding the water quality model.

Objective and Outcomes

The purpose of this task is to build fluid dynamic RT-DTs for air and water quality. To that end, a CFD methodology is developed and validated with experimental data. This methodology is then used to simulate a large number of different scenarios (a design of experiments, DoE). The CFD results are postprocessed with the library developed in T4.4 to build the virtual sensors of each mine (real-time digital twins). Those virtual sensors will be integrated in the Decision Support System (DSS) of the Dig_IT IoT platform to allow the end-user to see in real time the risk maps for air/water quality.

The main goal of this WP is to develop RT-DTs of engineering assets, geotechnical and fluid dynamic processes, so this task contributes directly to the achievement of this goal.

Task 5.6 will directly use the outcomes of this task, as the final ROM for each use-case will be integrated and displayed in DSS of the final IoT platform.

What has been done in Task 4.3?

On the one hand, regarding the air quality models for MARINI and KEMI, a CFD methodology has been developed and validated with experimental measurements performed in the real use-cases. A specific solver for each use case has been developed using the open-source CFD software OpenFOAM, including the appropriate modifications in the code according to the requirements of each mine regarding heat transfer and buoyancy phenomena, pollutant transport, etc. After that, a Design of Experiments has been defined and run, covering the range of the input variables according to the end-users’ requirements. Finally, the results of all these simulations have been used to build the virtual sensor (ROM), which will finally be implemented in the IoT platform.

On the other hand, regarding the water quality analysis for TITANIA, a data-based model has been developed to predict in real time the relevant pollutants (suspended solids and nickel) released to the environment. For this task, historical data was used to generate and train the prediction model, implemented through Python scripting. Likewise, the resulting model will be embedded in the DSS of the IoT platform to provide real-time information on the prediction of the pollutants in the mine water streams.
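As a minimal illustration of such a data-based model (the actual model form and input variables are not specified here), the sketch below fits a linear least-squares predictor for suspended solids from two invented process variables.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical historical data: flow rate and pH as inputs, suspended
# solids concentration as the target. All values are illustrative.
n = 300
flow = rng.uniform(10, 50, n)
ph = rng.uniform(6, 9, n)
solids = 2.0 * flow - 5.0 * ph + 30 + rng.normal(0, 1.0, n)

# Linear model fitted by least squares as a stand-in for the trained model.
X = np.column_stack([flow, ph, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, solids, rcond=None)

def predict(flow_value, ph_value):
    """Real-time prediction of suspended solids for new process conditions."""
    return coef @ [flow_value, ph_value, 1.0]

print(round(float(predict(30.0, 7.0)), 1))   # close to 2*30 - 5*7 + 30 = 55
```

Once trained, such a predictor is cheap to evaluate, which is what allows it to run in the DSS on near real-time data streams.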

For this task, the open-source code OpenFOAM was used to perform the CFD simulations. In addition, Python scripting was used to automate the pre- and post-processing activities of the workflow, and to develop the data-based model for the TITANIA use-case. The library developed in T4.4 has been used to generate the ROMs.
CO2 concentration prediction in the operating area of Kemi mine (CFD results).

Ventilation flow path in the operating area of Kemi mine (CFD results), when rock loading is starting and the stope is blocked by rocks.

Task 4.2 Geo-spatial attributes digital twins design and development

Partners involved

SUBTERRA has participated in and led T4.2, along with MARINI MARMI, TITANIA and LA PARRILLA, which have been involved as end users.

Objectives and outcomes

The main goal of Task 4.2 is to develop geotechnical digital twins at the end users’ locations. Task 4.3 will use the outcomes of this task, drawing on outputs of the La Parrilla numerical model to develop the ROMs for the digital twin.

What has been done in Task 4.2?

In this task, three digital twins have been developed: La Parrilla, Titania and Marini Marmi.

The calculation model of La Parrilla has been finalised, using the FLAC 3D software. For this model, the geometries obtained from the drone flights have been taken into account, as well as the data from the installed sensors described in the previous WP2.

As a first step, the actual 3D model of the tailings pond has been represented, using the photogrammetric restitution of the images obtained by the UAS technique. This first approximation serves to compare the geometries across the different flights performed, as well as to detect the appearance of cracks or other elements that could compromise the integrity of the structure.

The first step in the development of the numerical model is the elaboration of the geometry. The meshing of the elements is a complex process, but it requires maximum detail so that the subsequent calculations are not affected due to an incorrect selection of the elements that make up the tailings pond. In the elaboration of the geometry of the model, all the constructive elements present have been taken into account, as well as the materials used for the waterproofing of the ground to avoid the infiltration of percolating water into the subsoil by means of the application of a geomembrane.

The results shown in Figure 1 correspond to the current situation of the tailings pond, where a groundwater level is located below the dam, and a non-evolving pore pressure due to the cessation of activity at the La Parrilla mine.

Figure 1. Numerical model results for current scenario in La Parrilla Tailing Storage Facility

The numerical model for Titania follows a similar methodology to that presented for the La Parrilla model, although in this case the behaviour of a slope in the open pit is simulated. The photogrammetric restoration of the different flights has been carried out and the geometries have been compared. These data have been taken into account for the geometry. Figure 2 shows the geometry obtained in the most recent UAS flight (November 2022). Figure 6 shows the area to be calculated, where the greatest instabilities occur. The geometry that FLAC 3D will use for the calculations has been obtained.

Figure 2. 3D model generated using UAS images performed in the last visit to Titania.

The Marini Marmi model represents the update on the progress of the underground works. At Subterra, this progress is received periodically from the Marini Marmi team and is represented in a 3D model created with Leapfrog. In this model, the progress of the underground galleries can be observed, as well as their crossing through complex zones such as faults, or low-resistance materials such as clays. Figure 3 shows an image of the 3D model representing the excavated galleries. The sections that are not represented in 3D show the progress planned but not yet executed.

Figure 3. 3D model of Marini underground galleries excavated.

Edge computing devices planning and interoperability


Partners involved

ICCS together with SCHNEIDER.

This activity involves the end users MARINI, TITANIA and KEMI.

Objective and Outcomes

The task aims to provide planning and interoperability for edge computing devices.

The task helps define the edge computing device architecture and implementation to support the overall IIoT platform system. Its outcomes affect WP5 and WP6.


What has been done?

The task has been completed and validated, and is pending field testing (WP6). The overall scheme of the edge computing devices and functions was designed and implemented. Localization sensors used include UWB and GNSS, while environmental sensors include portable gas sensors and sensors for temperature, humidity, and noise levels. Biometric sensors include photoplethysmography (PPG), heart rate, SpO2, galvanic skin response, and skin temperature. Voice recognition was also included.

Cybersecurity Technologies


Partners involved

ROTECH together with ICCS.

This activity involves all the end users: MARINI, TITANIA, LA PARRILLA, HANNUKAINEN and KEMI.

Objective and Outcomes

The purpose of this task is to develop security management and services such as confidentiality, integrity and authentication, analysing and identifying security/cryptographic mechanisms and techniques to protect data communications in the context of resource-constrained devices.

Its outcomes affect tasks T2.4, T3.2, T3.3 and T3.6.


What has been done in Task 3.5?

The Internet of Things (IoT) ecosystem is dedicated to providing connectivity to physical devices, enabling the collection and sharing of sensed data, and its implementation in the mining industry faces many challenges, mainly related to connectivity, especially in underground mine sites. As mining operations and the relevant IoT devices become connected, several security issues could arise, e.g., vulnerability to cyber-attacks, which will require additional investment in security systems.

The cybersecurity management in the Dig_IT project aims to provide a security layer for communication over different protocols on mobile and resource-constrained devices. A security solution has been implemented with the aim of achieving minimal impact on the current mines’ network infrastructure, along with a new and improved end users’ network infrastructure compliant with the security requirements.

In particular, task T3.5 addressed the communication security between resource-constrained devices, namely the Smart Garments and the Drone, and the network infrastructure. The identified solution aimed to provide security on the MQTT and Wi-Fi protocols, the former used for Smart Garment communication and the latter for Drone communication.

The security solution planned for the Wi-Fi communication of UAV/UAS monitoring is no longer provided, due to the loss of image quality that could affect the veracity of the 3D model, as well as the large size of the images, which would take a long time to send via Wi-Fi. The Smart Garment branch, instead, has been secured on the MQTT link with MQTTS (MQTT over TLS), which provides encryption and authentication for MQTT communications, ensuring that data is transmitted securely and can only be accessed by authorized parties. In addition, further security measures at the MQTT payload level have been adopted, integrating a cryptographic signature and an integrity code.

The cryptographic signature field has been implemented via symmetric signature (i.e., keyed hash function). In particular, HMAC has been selected in conjunction with the SHA256 hash function. The integrity code is instead implemented as CRC16 in little-endian byte ordering. The combination of these cryptographic functionalities increases the security level by providing a message-level authentication and an integrity check and anti-tampering.
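A minimal sketch of this payload-protection scheme is shown below, assuming HMAC-SHA256 for the signature and the CRC-16/CCITT variant provided by Python’s binascii.crc_hqx for the integrity code (the project’s exact CRC polynomial and field layout are not stated, so both are assumptions), with the CRC packed in little-endian byte order.

```python
import binascii
import hashlib
import hmac
import struct

SECRET_KEY = b"shared-secret"    # illustrative; provisioned out of band

def protect(payload: bytes) -> bytes:
    """Append integrity code and message-level signature to an MQTT payload."""
    crc = binascii.crc_hqx(payload, 0xFFFF)      # CRC-16/CCITT variant (assumed)
    crc_le = struct.pack("<H", crc)              # little-endian byte ordering
    sig = hmac.new(SECRET_KEY, payload + crc_le, hashlib.sha256).digest()
    return payload + crc_le + sig

def verify(message: bytes) -> bytes:
    """Check signature and CRC; return the payload or raise ValueError."""
    payload, crc_le, sig = message[:-34], message[-34:-32], message[-32:]
    expected = hmac.new(SECRET_KEY, payload + crc_le, hashlib.sha256).digest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    if struct.unpack("<H", crc_le)[0] != binascii.crc_hqx(payload, 0xFFFF):
        raise ValueError("bad integrity code")
    return payload

msg = protect(b'{"hr": 72, "spo2": 98}')         # hypothetical garment payload
assert verify(msg) == b'{"hr": 72, "spo2": 98}'  # untampered message verifies
```

The keyed HMAC provides message-level authentication and anti-tampering, while the CRC catches accidental corruption; any modification of the payload, CRC, or signature makes verification fail.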

Furthermore, other main security measures are implemented in the Dig_IT platform:

  • Kafka MTLS Security: Transport Layer Security (TLS) is used to secure communication between Kafka brokers and clients. Mutual TLS (mTLS) is employed to authenticate the brokers and clients to each other using digital certificates. This ensures that data is transmitted securely and can only be accessed by authorized parties;
  • SASL for User Access Management: Simple Authentication and Security Layer (SASL) is used to provide authentication and authorization for Kafka clients. This ensures that only authorized users can access the Kafka cluster and the data stored within it;
  • User Management for MQTT: Access to MQTT is managed using username and password authentication. This ensures that only authorized users can access the MQTT broker and the data transmitted over it.