Kaunas hosts eUMaP workshop on Data Integration & Digital Twin Platforms and plenary meeting

October 15, 2024 | By admin

Kaunas, Lithuania. Kaunas University of Technology (KTU), the lead beneficiary of WP5 within the eUMaP project, successfully hosted the sixth eUMaP workshop on September 17th. The workshop focused on data integration and Digital Twin platforms, bringing together experts and participants for insightful presentations and discussions. The event also featured a practical session in which participants collaborated in groups, stepping into the roles of different stakeholders to brainstorm ideas for enhancing the eUMaP platform.

The following day, the eUMaP project partners gathered for the second plenary meeting of 2024. This session provided a key opportunity to review the progress across all work packages and strategize the next steps as the project approaches its final stages. 

Session 1: Data & Data Integration

• Efficiency in energy use: pattern discovery and crisis identification in residential hot-water power consumption data (Lina Morkūnaitė, KTU)

Lina Morkūnaitė discussed a joint study conducted between the eUMaP project and another ongoing Horizon Europe project at KTU – “SmartWins”, focusing on energy consumption patterns in EU households. The research highlights that almost half of energy demand in buildings is dedicated to space and water heating.

A key point in her presentation was the change in domestic hot water consumption during the COVID-19 pandemic. The study identified a significant shift in domestic hot water consumption patterns during the crisis conditions.

Lina detailed the study’s methodology, which involved merging data from residential apartment buildings with Google Community Mobility Reports. These mobility reports, developed by Google in response to the pandemic, provide anonymized data on movement trends, helping public health officials make informed decisions. For this study, the team focused on mobility data for Kaunas City, particularly in the retail/recreation and transit stations categories.
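The merging step can be sketched with pandas; the column names and values below are hypothetical stand-ins, since the study's actual schema is not given:

```python
import pandas as pd

# Hypothetical daily hot-water consumption readings from an apartment building
consumption = pd.DataFrame({
    "date": pd.to_datetime(["2020-03-01", "2020-03-02", "2020-03-03"]),
    "building_id": ["B1", "B1", "B1"],
    "hot_water_kwh": [41.2, 38.7, 25.4],
})

# Hypothetical mobility-report rows for Kaunas City: percentage change
# from baseline in the two categories used in the study
mobility = pd.DataFrame({
    "date": pd.to_datetime(["2020-03-01", "2020-03-02", "2020-03-03"]),
    "retail_and_recreation_pct": [2, -5, -48],
    "transit_stations_pct": [1, -8, -52],
})

# Join the two sources on the calendar date so each consumption row
# carries the mobility context for that day
merged = consumption.merge(mobility, on="date", how="inner")
print(merged)
```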

She further explained the predictive modelling approach, using an ensemble stacking classifier – a meta-ensemble learning technique that combines multiple classifiers to improve prediction accuracy. Lina presented the results of the predictive models, along with the extraction of daily consumption patterns based on crisis severity levels and hourly consumption clustering.
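A minimal sketch of an ensemble stacking classifier in scikit-learn; the features, base learners, and meta-learner here are illustrative stand-ins, as the study's exact configuration is not described:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Toy stand-in for the engineered consumption/mobility features,
# with three classes representing crisis severity levels
X, y = make_classification(n_samples=400, n_features=8, n_classes=3,
                           n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Stacking ensemble: base classifiers feed their predictions into a
# meta-learner that produces the final severity label
stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
        ("dt", DecisionTreeClassifier(random_state=0)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(X_train, y_train)
print(f"test accuracy: {stack.score(X_test, y_test):.2f}")
```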

Key conclusions from the study:

  • Using Google Community Report data enabled the classification of crisis severity levels, providing a framework that can be applied to other crises that limit citizen mobility.
  • The classifier demonstrated high accuracy on both training and test datasets, indicating minimal overfitting and effective generalization.
  • Domestic hot water consumption patterns exhibited significant differences between baseline (normal conditions) and the most severe crisis levels.
  • PCA and k-means clustering helped identify consumption clusters for each hour of the day and severity level.
  • Combining the predictive algorithm with daily patterns and clusters enables targeted control actions for enhanced system management.
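The PCA-plus-k-means step mentioned above can be sketched as follows, with synthetic daily profiles standing in for the real consumption data:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical matrix of daily profiles: one row per day,
# one column per hour of domestic hot water consumption
profiles = np.vstack([
    rng.normal(1.0, 0.1, (30, 24)),   # "baseline" days
    rng.normal(2.0, 0.1, (30, 24)),   # "crisis" days with shifted demand
])

# Reduce the 24-dimensional profiles before clustering
reduced = PCA(n_components=2).fit_transform(profiles)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(reduced)
print(np.bincount(labels))  # size of each consumption cluster
```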

This study effectively investigated the impact of crises on domestic hot water consumption, providing valuable insights for improved control strategies in energy systems.

• Data synthesis (Marius Ivaškevičius, KTU)

Marius Ivaškevičius delivered an insightful presentation on data synthesis, starting by defining synthetic data as artificially generated data, created by computers rather than derived from real-world occurrences. This data is either generated from existing databases or through algorithms and models that replicate the properties and characteristics of real-world data.

The primary reason for using synthetic data is that real-world data can often be difficult to acquire, and in many cases, it may be sensitive, personal, or confidential – such as financial records, medical histories, or mobile communications. Synthetic data offers a solution to this problem: it is cost-effective, easy to produce, and perfectly labelled.

However, one of the challenges is ensuring that synthetic data accounts for the wide range of potential real-world events, which can be difficult to predict and replicate fully.

To generate synthetic data, it is crucial to first define the type of data needed, identify relevant data sources, and generate data according to specified requirements. Marius provided several examples, including generating data with the same distribution as an existing dataset, noise generation and image denoising, training word vectors by removing words from sentences, and data augmentation in image classification. He also mentioned automatic labelling of datasets, Generative Adversarial Networks (GANs), and AlphaGo, which employs unsupervised learning from simulations.
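The simplest of these approaches – generating synthetic data with the same distribution as a real sample – can be sketched as follows (here the "real" data is itself simulated, and a log-normal model is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for real, sensitive hourly energy readings
real = rng.lognormal(mean=0.5, sigma=0.3, size=1000)

# Fit the simplest possible generative model: estimate the
# log-normal parameters from the real sample...
log_real = np.log(real)
mu, sigma = log_real.mean(), log_real.std()

# ...and draw a fresh synthetic sample from the same distribution.
# The synthetic values can be shared without exposing the originals.
synthetic = rng.lognormal(mean=mu, sigma=sigma, size=1000)

print(f"real mean={real.mean():.2f}, synthetic mean={synthetic.mean():.2f}")
```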

Marius concluded by showcasing ASHRAE’s Great Energy Predictor II – a tool used to estimate how much energy a building will consume based on various datasets, highlighting its practical application in energy prediction.

• Real-time IoT data integration – our approach and use-cases in developing Digital Twins (Justas Kardoka, KTU)

Justas Kardoka’s presentation centred on Zabbix, a platform designed to monitor IT infrastructure, with a focus on its application in IoT and Digital Twin development. The presentation highlighted how Zabbix can be used to efficiently manage IoT infrastructure by collecting historical data, automatically registering devices from different vendors, detecting problems, and integrating with other systems.

A key feature discussed was Low-Level Discovery (LLD), which automates the creation of items by sending data to predefined discovery rules. This allows users to structure the incoming data using prototypes and templates. Two use cases, based on LoRaWAN sensor data and BACnet data, were presented to demonstrate the effectiveness of LLD.
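A sketch of what an LLD discovery payload can look like: Zabbix discovery rules consume a JSON array of objects whose `{#MACRO}` keys are substituted into item and trigger prototypes. The sensor inventory below is hypothetical:

```python
import json

# Hypothetical LoRaWAN sensor inventory reported by a gateway
sensors = [
    {"id": "lora-001", "room": "A101", "type": "temperature"},
    {"id": "lora-002", "room": "A102", "type": "humidity"},
]

# Build the discovery payload: each object exposes {#MACRO} keys
# that Zabbix substitutes into item/trigger prototypes
lld_payload = json.dumps([
    {"{#SENSORID}": s["id"], "{#ROOM}": s["room"], "{#TYPE}": s["type"]}
    for s in sensors
])
print(lld_payload)
```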

The Zabbix API was another focal point, showing how historical data and detected issues, along with other information collected within Zabbix, can be accessed and integrated into other systems via the API. This flexibility allows for seamless integration, making it possible to monitor various devices and systems in real-time.
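As an illustration, the Zabbix API is a JSON-RPC interface; a history query can be sketched as below. The URL, token, and item ID are placeholders, the request is only constructed rather than sent, and token-passing conventions vary between Zabbix versions (newer releases prefer an `Authorization: Bearer` header):

```python
import json
import urllib.request

ZABBIX_URL = "http://zabbix.example.com/api_jsonrpc.php"  # placeholder host
AUTH_TOKEN = "PLACEHOLDER_TOKEN"  # e.g. an API token created in Zabbix

# JSON-RPC request for the latest 10 numeric-float history values of one item
request_body = {
    "jsonrpc": "2.0",
    "method": "history.get",
    "params": {
        "itemids": ["12345"],   # hypothetical item ID
        "history": 0,           # 0 = numeric float
        "sortfield": "clock",
        "sortorder": "DESC",
        "limit": 10,
    },
    "id": 1,
}

req = urllib.request.Request(
    ZABBIX_URL,
    data=json.dumps(request_body).encode(),
    headers={
        "Content-Type": "application/json-rpc",
        "Authorization": f"Bearer {AUTH_TOKEN}",
    },
)
# urllib.request.urlopen(req) would return the JSON-RPC response;
# not executed here because it needs a live Zabbix server
print(request_body["method"])
```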

Zabbix agents can be installed across many supported systems to monitor their availability and resource usage, making it an invaluable tool for managing IoT services running on separate devices. Additionally, Zabbix’s calculated items feature allows for the creation of new data items based on existing data, supported by a variety of aggregate functions – essential for creating Digital Twins.

Certain IoT data is also accessible via specific vendor or platform APIs, which can be integrated into Zabbix through HTTP agents. This approach allows Zabbix to manage integrations independently, eliminating the need for middleware development since Zabbix itself can perform requests for data and then process them.

In summary, Zabbix LLD enables the integration of different types of IoT data into a unified system, and with the Zabbix API, Digital Twin solutions can be developed and enhanced using the data Zabbix collects. Its wide adaptability makes it a versatile tool for IoT and Digital Twin applications.

Session 2: Digital Twin Platforms

• FU digital twin platform (Paris A. Fokaides, FU)

Paris delivered a compelling presentation on asset and operational ratings and the use of digital twins in building energy management. He began by defining asset rating as a measure of a building’s energy potential based on its design and materials, reflecting theoretical energy efficiency under standard conditions. Asset rating focuses on the efficiency of the building’s design without considering variations in real-world use, such as occupancy rates or extended operating hours. While useful, this rating doesn’t account for how effectively the building’s systems perform when subject to daily operational elements.

He then introduced the concept of operational rating, which differs from asset rating by focusing on the building’s actual energy performance during day-to-day use. This rating considers factors such as occupant behaviour, maintenance practices, and operational efficiency, offering a more comprehensive understanding of a building’s energy performance. By providing a realistic measure of energy use, the operational rating delivers more actionable insights compared to the asset rating.

Paris emphasized the importance of a standardized approach for operational ratings to ensure reliable and consistent assessments of real-world energy performance. Such standards help ensure that data is collected uniformly, is reliable, and highlights areas for improvement. To this end, Frederick University and Kaunas University of Technology have been working together to develop a five-step operational rating standard. This framework includes defining assessment objectives, data collection, data analysis and validation, calculation methods, and reporting. The standard is still in development but will serve as a critical tool for enhancing energy performance evaluations.

In terms of implementation, Paris discussed how the calculation of operational ratings relies on Energy Performance Indicators (EPIs). The more metrics that are introduced, the clearer the picture of the building’s energy efficiency becomes. The final step of the standard focuses on reporting, where gaps in performance are identified and actionable recommendations are provided.
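As a minimal illustration of the EPI idea (the standard's actual indicator set is not specified here, and all figures below are invented), floor-area-normalized indicators can be computed as:

```python
# Hypothetical metered data for one building over a year
floor_area_m2 = 2_400.0
annual_energy_kwh = 312_000.0
annual_co2_kg = 71_760.0

# Two common Energy Performance Indicators: energy use intensity
# and carbon intensity, both normalized by floor area
eui_kwh_per_m2 = annual_energy_kwh / floor_area_m2
carbon_kg_per_m2 = annual_co2_kg / floor_area_m2

print(f"EUI: {eui_kwh_per_m2:.1f} kWh/m2/yr, "
      f"carbon: {carbon_kg_per_m2:.1f} kgCO2/m2/yr")
```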

Transitioning from theory to practice, Paris highlighted the role of Digital Twins in delivering real-time operational insights. A digital twin is a dynamic, up-to-date virtual replica of a physical asset or environment. By integrating Building Information Modeling (BIM) and Internet of Things (IoT) technologies, real-world data from the physical asset feeds into the digital twin, allowing for real-time monitoring and optimization of energy use, occupancy, and environmental conditions.

Finally, Paris presented a case study from Frederick University Campus, where digital twin technology was applied across five buildings. A total of 75 sensors and energy meters were installed to collect data. He demonstrated how real-time data was integrated and displayed through the digital twin, offering a clear, dynamic view of the building’s performance. Additionally, historical data gathered through the digital twin proved to be valuable for conducting energy audits, for example. The study also included weather station data to provide insights into environmental conditions affecting the buildings. Digital twins can revolutionize energy management, offering precise, real-time insights and enabling data-driven optimization for operational efficiency.

• Digital Twin for KTU campus buildings operational carbon and indoor climate monitoring and improvement (Darius Pupeikis, KTU)

Darius Pupeikis’ presentation focused on the Digital Twin for KTU’s Building Operational Carbon and Indoor Climate Monitoring and Improvement project, which aims to monitor and reduce the CO2 footprint of KTU Campus buildings. The project breaks down carbon emissions by different energy types and analyses the energy use across buildings, systems, and equipment.

Darius highlighted the advanced digitalization technologies employed in the project, including photogrammetry, Building Information Modeling (BIM), 360º Panorama Tours, IoT, Data Analytics, and Machine Learning. These technologies facilitate real-time monitoring by collecting data from over 2000 parameters broadcasted via APIs, all fed from physical sensors installed throughout the KTU Campus. 

He then outlined the project’s IT Architecture, covering hardware, APIs, middleware, data aggregation and storage, evaluation and calculation processes, geometrical and visual representation, tabular data visualization, integration, and the user interface layer.

The project serves as an ecosystem for education, research, and innovation, leveraging KTU’s infrastructure for various academic and practical applications. Key principles of the project include:

  • Co-creation with stakeholder involvement.
  • A focus on demonstrator projects that push the boundaries of current technologies.
  • Commitment to open and accessible data.
  • Application across multiple scales: from the campus level to individual buildings, spaces, elements, and equipment.
  • Broad applicability for education and research, benefiting everyone from schools to businesses and Horizon Europe research projects.

Darius also showcased daily and hourly visualizations of the operational CO2 footprint across KTU Campus Buildings, covering key areas such as thermal energy use for heating and hot water, electricity, water supply, sewage disposal, waste generation, solar power, and electric vehicles (EV) charging. He highlighted the positive impact of the installed solar power plants and EV charging stations, which together contributed to a 14% reduction in the campus’ carbon footprint.
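A footprint breakdown of this kind can be sketched as consumption multiplied by per-utility emission factors; the quantities and factors below are invented for illustration, not KTU's actual figures:

```python
# Hypothetical monthly consumption per utility and illustrative
# emission factors (kg CO2e per unit); real factors are country-specific
consumption = {"heat_kwh": 180_000, "electricity_kwh": 95_000, "water_m3": 1_200}
factors = {"heat_kwh": 0.15, "electricity_kwh": 0.12, "water_m3": 0.34}

# Footprint per utility, then the campus-level total
footprint = {k: consumption[k] * factors[k] for k in consumption}
total = sum(footprint.values())

# Avoided emissions from on-site solar generation, as in the campus case
solar_kwh = 20_000
avoided = solar_kwh * factors["electricity_kwh"]
print(f"gross: {total:.0f} kg, net after solar: {total - avoided:.0f} kg")
```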

In closing, he compared these results with KTU’s most modern and energy-efficient building, MLAB, noting that the ability to compare performance at both macro and micro levels allows for the identification of areas for improvement in energy efficiency and sustainability across the campus.

• Damage assessment platform for buildings leveraging Google Street View (Nikolaos Schetakis, ALMA)

Nikolaos introduced the project Earthquake Risk Platform for European cities Cultural Heritage Protection (ERA4CH), an initiative focused on developing an automated workflow for assessing building damage following earthquakes. The core of the project revolves around collecting façade images of buildings in targeted cities, calculating risk scores based on structural characteristics, and generating heatmaps that display areas with varying levels of collapse risk. These heatmaps, created for six cities, highlight buildings at different risk levels.

The process begins with the polygon generation phase, where users can interact with a map centred on their area of interest. By clicking on different positions on the map, users create polygons, calculate routes within the selected area, and obtain precise coordinates along the routes. This is followed by setting parameters, such as adjusting grid spacing and selecting interval steps, which directly influence the level of detail in the analysis.

Building risk scores are then calculated by assigning specific weights to various structural features based on their importance. This results in heatmaps that visually represent risk levels across the selected cities. The workflow has been tested on six different cities, with unique polygon configurations and parameter settings for each location. The final output is a heatmap for each city, indicating areas with varying levels of seismic risk as assessed by the model, followed by a performance and statistical analysis for each area.
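The weighted scoring step can be sketched as a weighted sum over structural features; the features, weights, and scores below are hypothetical, not the project's actual model:

```python
# Hypothetical structural features extracted from façade images,
# each scored 0..1, with illustrative importance weights
buildings = [
    {"soft_storey": 0.9, "cracks": 0.4, "age": 0.7},
    {"soft_storey": 0.1, "cracks": 0.2, "age": 0.3},
]
weights = {"soft_storey": 0.5, "cracks": 0.3, "age": 0.2}

def risk_score(features: dict) -> float:
    """Weighted sum of feature scores; weights sum to 1, so the
    result stays in 0..1 and can be colour-mapped onto a heatmap."""
    return sum(weights[f] * v for f, v in features.items())

scores = [risk_score(b) for b in buildings]
print([round(s, 2) for s in scores])
```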

In conclusion, Nikolaos addressed several limitations of the current platform, such as outdated imagery, the need for enhanced domain expertise, and the need to improve image retrieval methods. He also mentioned future improvements like incorporating a more robust VLM model, refining building typology modelling, and enhancing the accuracy of building score calculations.

• Updates on the eUMaP platform (Napoleon Papoutsakis, ALMA & Konstantinos Apostolopoulos, GSH)

Napoleon Papoutsakis and Konstantinos Apostolopoulos shared the latest progress and updates on the eUMaP platform, through an extensive presentation and a video showcasing the platform’s advanced data visualization capabilities for key case studies (Larnaca, Kaunas, and Thessaloniki), along with the features of its time series analyser and the forecasting process.

The platform’s data visualization offers multiple formats to suit varying needs:

  1. Interactive visualization of Building Information Modeling (BIM) files and geospatial layers, powered by GSH’s platform.
  2. Consumption data visualization options that allow users to toggle between daily, weekly, and monthly data views, or select custom date ranges of interest.
  3. Integration of GSH’s platform with data charts into a Streamlit app.

For example, the Larnaca case study provides near real-time consumption data via API, while in the Kaunas study, users can explore data specific to each building. Additionally, the platform offers a general overview of geospatial layers, energy consumption trends in different locations, and their variations across the year.

The time series analyser is designed to make advanced data analysis accessible to non-technical users, offering possibilities to reveal hidden data patterns. Its key features include: 

  • Trends and seasonality decomposition, enabling users to isolate trends and seasonal fluctuations to predict future patterns more accurately.
  • Rolling statistics and the Dickey-Fuller test for smoothing data fluctuations and assessing stationarity.
  • Seasonal patterns and autocorrelation, allowing users to explore temporal relationships and identify cyclical behaviours through Autocorrelation (ACF) and Partial Autocorrelation (PACF).

Additionally, the platform has implemented an AI Wizard, acting as a personal data analyst that offers expert interpretations and actionable insights. This AI tool provides explanations for complex data visualizations, helping users who may be unfamiliar with handling large datasets. It also suggests advanced analysis models based on detected patterns in the data.

On another note, the forecasting process is straightforward:

  1. The user selects hourly or daily data representation.
  2. Chooses from available forecasting models.
  3. Visualizes results using interactive plots.

Recent platform updates include the integration of heating energy consumption data for the Kaunas case study, water consumption data for Thessaloniki, and geospatial layer and BIM file visualization using Geosystem’s platform.

The eUMaP project

The eUMaP project aims to address the challenges posed by the COVID-19 pandemic on critical infrastructure such as energy, water, waste, and telecommunication systems. It proposes the development of an open platform that enables local authorities to effectively manage and plan the demand and supply of building utilities during quarantine or lockdown situations, ensuring resilience and continuity. The platform is based on earth observation data and integrated with open BIM platforms in five European cities.
