Categories
Uncategorized

From Heatmaps to AI 📊: The First Step in Understanding Your Data

Introduction

Every journey into AI begins with a crucial first step: understanding your data. Many clients approach us eager to jump straight into machine learning (ML), but without a clear grasp of their data or meeting key prerequisites for ML, this leap often leads to frustration. Why? ML can only produce meaningful results when built on a solid foundation—with data preprocessing playing a vital role.

This is where data visualization comes in. Seeing your data clearly and gaining new perspectives and insights is the essential first step. Some methods, like heatmaps, go even further, empowering technical teams with detailed analysis while giving non-technical stakeholders an intuitive, easy-to-understand view of optimization opportunities.

Heatmaps

For those unfamiliar with heatmaps, they are visual representations of data where values are displayed as colors. This makes it easy to spot patterns, trends, and anomalies at a glance, providing an intuitive way to understand complex information. Let’s explore some examples!

In the following set of heatmaps, the x-axis represents the hour of the day, and the y-axis displays the days of the week. Each cell within the heatmap reflects the average energy consumption for a specific hour and day, providing a concise visual summary of the building’s energy use. This approach is comparable to a pivot table with conditional formatting, where data is organized systematically and shaded to highlight key patterns and anomalies.
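Conceptually, such a heatmap really is just a pivot table. As a sketch (using synthetic meter data for a hypothetical office building, not a real one), the whole transformation fits in a few lines of pandas; any plotting tool, or Excel conditional formatting, can then shade the cells by value:

```python
import numpy as np
import pandas as pd

# Hypothetical 15-minute electricity readings for one office building:
# a ~80 kW load during weekday office hours, ~20 kW baseline otherwise.
rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", "2024-03-31 23:45", freq="15min")
office_hours = (idx.hour >= 9) & (idx.hour < 17) & (idx.dayofweek < 5)
kw = pd.Series(20 + 60 * office_hours + rng.normal(0, 3, len(idx)), index=idx)

# The heatmap data: rows = day of week, columns = hour of day,
# values = average consumption in that cell.
df = kw.to_frame("kw")
df["dow"], df["hour"] = df.index.dayofweek, df.index.hour  # 0 = Monday
heat = df.pivot_table(index="dow", columns="hour", values="kw", aggfunc="mean")

print(heat.shape)                         # (7, 24)
print(heat.loc[0, 12] > heat.loc[6, 12])  # True: busy weekday noon vs. quiet Sunday
```

Coloring this table (blue for low values, red for high) reproduces exactly the kind of weekly heatmap discussed below.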

The first heatmap showcases an office building with a well-configured weekend setback. It clearly shows low energy consumption during night hours and weekends, shaded in blue, indicating that energy use is well-managed during non-operating hours. Peak usage occurs during standard working hours, 9 a.m. to 5 p.m. on weekdays, with slight extensions to 6 p.m. on Mondays and Wednesdays.

 

In contrast, the second heatmap highlights an office building with operational inefficiencies. Starting with the weekend setback, we see that Saturday is well-managed, but Sunday shows an anomaly. From 2 p.m. to 7 p.m., energy consumption unexpectedly rises, disrupting the consistent blue pattern of low energy use throughout the day. Additionally, there’s a problem with the startup and shutdown periods. If the building operates from 7 a.m. to 5 p.m., why is it starting up as early as 4 or 5 a.m.? The night setback, which is set to begin at 8 p.m., is also somewhat late. It would be more efficient if the setback were activated earlier, around 6 or 7 p.m., to minimize unnecessary energy consumption.

Image 1: Daily heatmap examples.

To gain a deeper understanding of your building’s energy usage patterns, it’s crucial to look beyond daily patterns and consider how energy usage changes throughout the year. In this set of heatmaps, the y-axis represents the months of the year, giving us a clear view of how energy consumption fluctuates across the seasons. 

 

In the first heatmap, we observe consistent and effective night setbacks year-round. We also see that the building is cooling-dominated, as the highest energy consumption occurs during the summer months of June, July, and August. Some heating-related energy use is apparent in the mornings during January and February. These patterns are typical for an office building in Central Europe, where gas heating is common, and cooling accounts for the majority of electrical energy consumption in warmer months.

 

In contrast, the second heatmap illustrates inefficient cooling practices. During the summer months, particularly in July and August, night setbacks show higher energy consumption than expected, indicating inefficiency. Additionally, we notice regular energy usage at 4 a.m. in certain months, including June, July, August, and December, which suggests unnecessary operational activity during off-hours.

Image 2: Monthly heatmap examples.

A unique energy pattern is shown in the following heatmap from a building located in Central Europe. The lowest energy consumption occurs during the summer and during daylight hours, exactly where we would otherwise expect a peak. What's responsible for this reversal?


The building is equipped with photovoltaic (PV) panels. This example illustrates how renewable energy sources can significantly alter a building’s energy profile, and heatmaps provide an intuitive way to track and understand these shifts.

Image 3: Energy profile after adding PV panels.

Conclusion

Heatmaps are a powerful yet simple tool for understanding energy consumption patterns. Whether revealing operational inefficiencies in weekly heatmaps or uncovering seasonal trends in yearly heatmaps, they provide a clear, actionable view of a building’s energy usage. By starting with this foundational step, we ensure that the data is not only understood but also prepared for deeper analysis, such as machine learning. This approach enables smarter decision-making and paves the way for AI-driven insights that can optimize energy management across buildings and portfolios.

Categories
Demo Series

Demo Series Part 01 – Energy Twin Interactive

A new video series is here! In this video, we show the full demo of Energy Twin Interactive, our machine learning extension for SkySpark, for daily, weekly, and monthly data.

Categories
Analytics Walkthrough

ET Analytics Walkthrough 03 Utilizing a model for anomaly detection – Part 1

In this tutorial, we showcase how Energy Twin Analytics models can be used for anomaly detection with pre-defined rules, such as the Relative Value Deviation rule.

Categories
Analytics Walkthrough

ET Analytics Walkthrough 02 Visualize a Model Using ET Views

This tutorial focuses on visualizing a model in the ET Views app once it has been identified.

Categories
Analytics Walkthrough

ET Analytics Walkthrough 01 Create a Model

This video demonstrates how to identify a model in the ET Admin app using Energy Twin Analytics.

Categories
Uncategorized

Detecting anomalies in KFC restaurants’ refrigeration equipment❄️

In the fast-paced environment of fast food restaurants, maintaining the right temperature in freezers and coolers is critical. Any deviation from the optimal temperature range can lead to food spoilage, financial losses, and health risks. However, recurring issues with refrigeration equipment—such as frequent door openings, equipment shutdowns, or mechanical failures—pose significant challenges.

Understanding the Challenges: Introduction to Refrigeration Anomaly Detection

To proactively detect and address these issues, ACTuate Facility Technologies, a company specializing in facility energy and system performance optimization for small to midsize buildings, approached us for our data analytics expertise. Through our collaboration, we’re leveraging their extensive dataset from various fast food restaurants to develop sophisticated methods for detecting unusual behavior in refrigeration equipment, such as walk-in and reach-in freezers and coolers.

One of the major challenges in monitoring these systems is the variability in temperature that can occur during normal operations. For example, opening a freezer’s door for a few moments can cause a brief spike in temperature, which typically doesn’t indicate a problem. Regular defrost cycles contribute similarly to the normal variability of the temperature data. However, more sustained deviations or unusual patterns might signal an issue that requires attention.

 

To address this, we need a system that can differentiate between normal fluctuations and potential problems—without the need for significant hardware additions. This is where data modeling and mathematical optimization come into play, allowing us to work effectively with the existing sensors and equipment.
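One simple way to separate a brief door-opening spike from a sustained deviation is to require the temperature to stay out of range for a minimum duration. The sketch below illustrates the idea with made-up thresholds; it is not ACTuate's or Energy Twin's actual detection logic:

```python
import numpy as np

def sustained_deviation(temp_c, limit=-15.0, min_minutes=30, step_minutes=1):
    """Flag samples where a freezer stays above `limit` for at least
    `min_minutes` in a row; brief door-opening spikes are ignored.
    All thresholds here are illustrative, not real tuning values."""
    above = np.asarray(temp_c) > limit
    flags = np.zeros_like(above)
    run = 0
    for i, hot in enumerate(above):
        run = run + 1 if hot else 0
        if run * step_minutes >= min_minutes:
            flags[i - run + 1 : i + 1] = True  # flag the whole run
    return flags

# 2 h of 1-minute freezer data: a 5-min door spike, then a 40-min failure.
temp = np.full(120, -20.0)
temp[10:15] = -12.0    # brief spike: should not be flagged
temp[60:100] = -10.0   # sustained deviation: should be flagged
flags = sustained_deviation(temp)
print(flags[10:15].any(), flags[60:100].all())  # False True
```

A model-based approach, as described next, refines this further by comparing against expected behavior rather than a fixed limit.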

Showcasing Temperature Monitoring Through Data Modeling

Based on discussions with the ACTuate team, we identified occurrences of malfunctions and selected a suitable model structure to analyze the temperature data from freezers and coolers. By fitting this model to the data, we can detect unusual system behavior. The process is entirely data-driven.

 

Our optimization techniques involve using a fluctuating window of past data, which dynamically adjusts in length based on the fit quality. The accuracy of this fit is measured by performance metrics, including Mean Absolute Error (MAE), as shown in the image below.

 

Image 1: Multiple fits with a fluctuating window adjusted based on fit quality.
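The idea of a window that shrinks until the fit quality is acceptable can be sketched as follows. This is only an illustration with a simple linear model and invented tolerances, not Energy Twin's actual model structure:

```python
import numpy as np

def adaptive_window_fit(y, max_win=96, min_win=12, mae_tol=0.5):
    """Fit a linear trend over a trailing window whose length shrinks
    until the in-window MAE is acceptable. A sketch of the idea only."""
    for win in range(max_win, min_win - 1, -12):
        seg = y[-win:]
        t = np.arange(win)
        coef = np.polyfit(t, seg, 1)      # slope, intercept
        fit = np.polyval(coef, t)
        mae = np.mean(np.abs(seg - fit))  # fit-quality metric
        if mae <= mae_tol:
            return win, mae, coef
    return min_win, mae, coef

# Steady temperature with a recent regime change: the window shrinks so
# the fit tracks only the recent, consistent behaviour.
rng = np.random.default_rng(1)
y = np.concatenate([np.full(80, -20.0), np.linspace(-20, -10, 40)])
y += rng.normal(0, 0.1, len(y))
win, mae, _ = adaptive_window_fit(y)
print(win, round(mae, 2))  # window shorter than 96, MAE within tolerance
```

The same principle carries over to richer model structures: shrink the history until the model explains the recent data well.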

Transforming Data Modeling into an Anomaly Detection Rule

One of the most compelling aspects of our collaboration with the ACTuate team is the development of an active detection rule based on data modeling. This approach not only allows us to analyze past temperature data but also transforms that analysis into a proactive monitoring system. 

 

Image 2: Resulting anomaly detection rule based on data modeling.

Real-Time Monitoring

With real-time data, you can also use data modeling to predict whether certain thresholds will be breached and communicate the necessary actions to the on-site team. Sometimes it can be as simple as a reminder to “close the door.”

Image 3: Real-time monitoring with threshold breach detection.
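This kind of threshold prediction can be as simple as extrapolating the recent fitted trend forward in time. A minimal sketch with invented numbers, not the production logic:

```python
import numpy as np

def minutes_to_breach(temp_c, threshold=-12.0, step_minutes=1, fit_len=30):
    """Extrapolate a linear fit of the recent trend to estimate when a
    freezer will cross `threshold`. Illustrative values only."""
    t = np.arange(fit_len)
    slope, intercept = np.polyfit(t, temp_c[-fit_len:], 1)
    if slope <= 0:
        return None                       # cooling or steady: no breach ahead
    cross = (threshold - intercept) / slope  # fitted time index of the crossing
    return max(0.0, (cross - (fit_len - 1)) * step_minutes)

# Freezer warming from -20 °C at 0.1 °C/min after a door was left open.
temp = -20.0 + 0.1 * np.arange(30)
eta = minutes_to_breach(temp)
print(round(eta))  # 51 -> about 51 minutes until the -12 °C threshold
```

An estimated time-to-breach like this is what turns passive monitoring into an actionable alert for the on-site team.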

Conclusion

Through our collaboration with ACTuate Facility Technologies, we have developed a sophisticated approach to monitoring and detecting unusual behavior in refrigeration systems. While our efforts contribute to significant energy savings, the primary objective is to reduce the number of incidents where temperature deviations lead to costly spoilage, ultimately ensuring the safety and quality of the stored product.

Categories
Uncategorized

📈 Announcing the Energy Twin Evaluation – Now Available!

Great news for Energy Twin users! The Energy Twin Evaluation is now available, just in time for Christmas! 🎄

Our new extension, Energy Twin Evaluation, is now available for free to all Energy Twin licensees. Whether you use ET Analytics, ET Interactive, or both, ET Evaluation offers an easy way to check your model’s performance, quantify energy savings, and gain insight into your data.

 

With various aggregation periods, you can easily visualise longer time spans as well as track your energy efficiency journey using the cumulative difference between measurement and prediction.
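The cumulative-difference view is easy to reproduce: it is just a running sum of measurement minus prediction. A small sketch with hypothetical daily values (not real ET Evaluation output):

```python
import numpy as np

# Hypothetical daily kWh: measurement vs. baseline-model prediction.
measured  = np.array([410, 395, 380, 402, 371, 360, 355], dtype=float)
predicted = np.array([420, 418, 415, 421, 413, 410, 409], dtype=float)

# Running total of (measurement - prediction): a steadily falling curve
# means consumption keeps beating the baseline, i.e. savings accumulate.
cum_diff = np.cumsum(measured - predicted)
print(cum_diff)      # savings-to-date in kWh at each day (negative = saved)
print(cum_diff[-1])  # -233.0 -> total deviation over the week
```

Aggregating the same series weekly or monthly gives the longer-horizon views mentioned above.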

Ready to optimize your energy strategy? Head to StackHub now, try out the Energy Twin Evaluation extension, and share your thoughts with us!

Categories
Uncategorized

SmartCoil: How Energy Twin Helps with Real-Time AHU Coil Monitoring 💡

Exciting news! One of the projects where Energy Twin is a development partner launched in October.

SmartCoil – The First Real-Time Monitoring System for AHU Coils

SmartCoil by Sensible revolutionizes the assessment and maintenance of AHU coils by introducing a cutting-edge, data-driven solution. Unlike traditional methods such as visual inspection and pressure drop testing, which are prone to inaccuracies, SmartCoil utilizes a network of strategically placed sensors feeding data to a cloud-based machine learning algorithm. This innovative approach translates raw coil data into actionable insights displayed on the SmartCoil dashboard and compatible BMS systems.

 

By providing real-time information on coil fouling levels, capacity, and energy consumption, SmartCoil empowers users to optimize their coil cleaning schedules, leading to significant cost savings and minimized downtime. The Energy Twin team helps with the data analysis and utilises custom models to provide a data-driven solution.

Categories
Uncategorized

📊 Simplifying Energy Consumption Forecasting with Energy Twin! ⚡️

In the energy industry, professionals face challenges predicting and forecasting energy consumption. 

Energy Twin & Long-term Forecast

Forecasting future energy consumption for extended periods, such as until the end of the year, is a challenge that traditionally requires hours spent in tools such as Excel. Energy Twin can help: a single click forecasts the total kWh consumption.

 

While we understand the significance of cost savings, precise figures can be influenced by various uncontrollable factors. Nevertheless, our solution sets itself apart with continuous updates to the forecast, leveraging real-time measurements. Through this approach, Energy Twin ensures predictions remain relevant and reliable, providing users with up-to-date information for well-informed decisions, with the flexibility of monthly or daily granularity.

Predict energy consumption effortlessly with Energy Twin and make data-backed decisions to overcome challenges and achieve efficiency.

Categories
Uncategorized

Energy Twin – Can ChatGPT help energy engineers?

Using ChatGPT has become a huge trend. It is supposed to make many professions more efficient, so why not energy engineers? We decided to give it a try and integrate it into our Energy Twin extension for SkySpark.

Energy Twin and ChatGPT

We have been experimenting with various ways of summarizing results provided by our SkySpark extension Energy Twin. After successfully integrating the OpenAI API into SkySpark, we ran various tests; let’s look at two of them.

We know that generative AI cannot perform complex calculations; still, one might expect that summing 12 numbers wouldn’t be an issue. See the screenshot above, where it produces the correct equation but, with its usual confidence, the wrong result.

 

On the other hand, with the same input data, it can provide you with a nice poem. 

We’re still scratching our heads about how to monetise the ability to generate poems about energy conservation.