thermal-nn
Thermal Neural Networks - Learn dynamic thermal networks from data. Application demo on an electric motor.
Science Score: 67.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ✓ DOI references: found 4 DOI reference(s) in README
- ✓ Academic publication links: links to arxiv.org, scholar.google, sciencedirect.com
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (10.7%) to scientific vocabulary
Repository
Basic Info
Statistics
- Stars: 46
- Watchers: 1
- Forks: 16
- Open Issues: 0
- Releases: 2
Metadata Files
README.md
Thermal Neural Networks
Lumped-Parameter Thermal Modeling With State-Space Machine Learning
[Wilhelm Kirchgässner](https://github.com/wkirgsn), [Oliver Wallscheid](https://github.com/wallscheid), [Joachim Böcker](https://scholar.google.de/citations?user=vmyBqw0AAAAJ&hl=de&oi=ao)
[Paderborn University](https://www.uni-paderborn.de/en/), [Dept. of Power Electronics and Electrical Drives](https://ei.uni-paderborn.de/en/lea)
Paper: [ScienceDirect](https://www.sciencedirect.com/science/article/pii/S0952197622005279), Preprint: [arXiv 2103.16323](https://arxiv.org/abs/2103.16323)
Abstract
With electric power systems becoming more compact with higher power density, the relevance of thermal stress and precise, real-time-capable, model-based thermal monitoring increases. Previous work on thermal modeling by lumped-parameter thermal networks (LPTNs) suffers from mandatory expert knowledge for their design and from uncertainty regarding the required power loss model. In contrast, deep learning-based temperature models cannot be designed with as few model parameters as an LPTN at equal estimation accuracy. In this work, the thermal neural network (TNN) is introduced, which unifies consolidated knowledge in the form of heat-transfer-based LPTNs with data-driven nonlinear function approximation through supervised machine learning. The TNN approach overcomes the drawbacks of previous paradigms: it has physically interpretable states through its state-space representation, is end-to-end differentiable through an automatic differentiation framework, and requires no material, geometry, or expert knowledge for its design. Experiments on an electric motor data set show that a TNN achieves higher temperature estimation accuracies than previous white-/gray- or black-box models, with a mean squared error of 3.18 K² and a worst-case error of 5.84 K at 64 model parameters.
Purpose
This repository demonstrates the application of thermal neural networks (TNNs) on an electric motor data set.
The data set is freely available at Kaggle.
The TNN declaration and its usage are demonstrated in the Jupyter notebooks: with TensorFlow in TNN_tensorflow.ipynb and with PyTorch in TNN_pytorch.ipynb.
There is also a MATLAB implementation in the matlab folder; see the corresponding README for MATLAB.
Topology

Three function approximators (e.g., multi-layer perceptrons (MLPs)) model the thermal parameters (i.e., thermal conductances, thermal capacitances, and power losses) of an arbitrarily complex component arrangement in a system. Such a system is assumed to be sufficiently representable by a system of ordinary differential equations (not partial differential equations!).
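Concretely, such an LPTN-style ODE system can be sketched (in our notation, not necessarily the paper's) as

$$\frac{\mathrm{d}\vartheta_i}{\mathrm{d}t} = \frac{1}{C_i}\Big(P_i + \sum_{j \neq i} g_{i,j}\,(\vartheta_j - \vartheta_i)\Big),$$

where $\vartheta_i$ is the temperature of component $i$, and the capacitances $C_i$, conductances $g_{i,j}$, and power losses $P_i$ are the outputs of the three function approximators, evaluated on the current system excitation.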
One function approximator outputs the thermal conductances, another the inverse thermal capacitances, and the last the power losses generated within the components. Although thermal parameters are estimated, their ground truth is not required. Instead, measured component temperatures are plugged into a cost function, where they are compared with the temperature estimates that result from the thermal parameters predicted under the current system excitation. Backpropagation through time takes over from there.
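The supervised objective described here is then an ordinary regression loss over the measured temperatures, for example (again in our notation)

$$\mathcal{L} = \frac{1}{T}\sum_{t=1}^{T}\big\lVert \hat{\vartheta}_t - \vartheta_t^{\mathrm{meas}} \big\rVert_2^2,$$

where $\hat{\vartheta}_t$ results from integrating the ODE with the estimated parameters; gradients flow through the unrolled integration steps, which is exactly backpropagation through time.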
The TNN's inner cell works like a lumped-parameter thermal network (LPTN). An LPTN is an equivalent electrical circuit whose parameters can be interpreted as the thermal parameters of a system. A TNN can thus be seen as a hypernetwork that parameterizes an LPTN, which in turn is iteratively solved for the current temperature prediction.
In contrast to other neural network architectures, a TNN needs to know at least which input features are temperatures and which are not. Target features are always temperatures.
In a nutshell, a TNN resolves the difficult-to-grasp nonlinearity and scheduling-vector dependency of the quasi-LPV system that an LPTN represents.
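To make the cell mechanics concrete, here is a toy NumPy sketch of one TNN cell, under our own naming and with random, untrained weights (the actual implementations live in the TensorFlow/PyTorch notebooks): three small MLPs emit positive conductances, inverse capacitances, and losses, and an explicit Euler step integrates the resulting heat-flow ODE.

```python
import numpy as np

rng = np.random.default_rng(42)

def make_mlp(n_in, n_out, hidden=16, scale=0.2):
    """Tiny randomly initialised MLP, a stand-in for a trained function approximator."""
    W1, b1 = rng.normal(0.0, scale, (n_in, hidden)), np.zeros(hidden)
    W2, b2 = rng.normal(0.0, scale, (hidden, n_out)), np.zeros(n_out)
    return lambda x: np.tanh(x @ W1 + b1) @ W2 + b2

def softplus(x):
    # keeps conductances, inverse capacitances, and losses positive
    return np.log1p(np.exp(x))

class ToyTNNCell:
    """One TNN cell: three MLPs emit the LPTN parameters, then an explicit
    Euler step integrates the lumped-parameter heat-flow ODE."""
    def __init__(self, n_temps, n_inputs):
        n_feat = n_temps + n_inputs
        self.n = n_temps
        self.g_net = make_mlp(n_feat, n_temps * (n_temps - 1) // 2)  # conductances
        self.c_net = make_mlp(n_feat, n_temps)                       # inverse capacitances
        self.p_net = make_mlp(n_feat, n_temps)                       # power losses

    def step(self, theta, u, dt=0.02):
        x = np.concatenate([theta, u])
        G = np.zeros((self.n, self.n))
        G[np.triu_indices(self.n, k=1)] = softplus(self.g_net(x))
        G = G + G.T                                 # symmetric pairwise conductances
        c_inv = softplus(self.c_net(x))
        p = softplus(self.p_net(x))
        heat_flow = G @ theta - G.sum(axis=1) * theta  # sum_j g_ij * (theta_j - theta_i)
        return theta + dt * c_inv * (p + heat_flow)

# roll the cell over a short, constant excitation profile
cell = ToyTNNCell(n_temps=4, n_inputs=3)   # e.g. four motor component temperatures
theta = np.full(4, 25.0)                   # initial temperatures
u = np.array([25.0, 1.0, 0.5])             # e.g. coolant temperature, current, speed
for _ in range(200):
    theta = cell.step(theta, u)
```

In a trained TNN, the rollout above would be compared against measured temperatures and the MLP weights updated by backpropagation through time; here the weights are frozen random values purely for illustration.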
Citing
The TNN is introduced in:
@article{kirchgaessner_tnn_2023,
  title = {Thermal neural networks: Lumped-parameter thermal modeling with state-space machine learning},
  journal = {Engineering Applications of Artificial Intelligence},
  volume = {117},
  pages = {105537},
  year = {2023},
  issn = {0952-1976},
  doi = {10.1016/j.engappai.2022.105537},
  url = {https://www.sciencedirect.com/science/article/pii/S0952197622005279},
  author = {Wilhelm Kirchgässner and Oliver Wallscheid and Joachim Böcker}
}
Further, this repository supports and demonstrates the findings on a TNN's generalization to neural ordinary differential equations, as presented at IPEC 2022.
If you want to cite that work, please use
@inproceedings{kirchgässner_node_ipec2022,
  author = {Kirchgässner, Wilhelm and Wallscheid, Oliver and Böcker, Joachim},
  booktitle = {2022 International Power Electronics Conference (IPEC-Himeji 2022 - ECCE Asia)},
  title = {Learning Thermal Properties and Temperature Models of Electric Motors with Neural Ordinary Differential Equations},
  year = {2022},
  pages = {2746-2753},
  doi = {10.23919/IPEC-Himeji2022-ECCE53331.2022.9807209}
}
The data set is freely available at Kaggle and can be cited as
@misc{electric_motor_temp_kaggle,
  title = {Electric Motor Temperature},
  url = {https://www.kaggle.com/dsv/2161054},
  doi = {10.34740/KAGGLE/DSV/2161054},
  publisher = {Kaggle},
  author = {Wilhelm Kirchgässner and Oliver Wallscheid and Joachim Böcker},
  year = {2021}
}
Owner
- Name: Wilhelm Kirchgässner
- Login: wkirgsn
- Kind: user
- Company: @upb-lea @Beckhoff
- Website: wkirgsn.github.io
- Repositories: 19
- Profile: https://github.com/wkirgsn
Data scientist for motion control
Citation (CITATION.cff)
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: "Kirchgässner"
    given-names: "Wilhelm"
    orcid: "https://orcid.org/0000-0001-9490-1843"
title: "thermal-nn: Thermal Neural Networks with an application to an electric motor"
version: 0.0.1
date-released: 2021-03-30
url: "https://github.com/wkirgsn/thermal-nn"
preferred-citation:
  type: article
  authors:
    - family-names: "Kirchgässner"
      given-names: "Wilhelm"
      orcid: "https://orcid.org/0000-0001-9490-1843"
    - family-names: "Wallscheid"
      given-names: "Oliver"
      orcid: "https://orcid.org/0000-0001-9362-8777"
    - family-names: "Böcker"
      given-names: "Joachim"
      orcid: "https://orcid.org/0000-0002-8480-7295"
  doi: "10.1016/j.engappai.2022.105537"
  journal: "Engineering Applications of Artificial Intelligence"
  month: 1
  start: 105537 # First page number
  end: 105537 # Last page number
  title: "Thermal neural networks: Lumped-parameter thermal modeling with state-space machine learning"
  volume: 117
  year: 2023
GitHub Events
Total
- Issues event: 2
- Watch event: 12
- Issue comment event: 2
- Pull request review event: 1
- Pull request review comment event: 2
- Pull request event: 1
- Fork event: 4
Last Year
- Issues event: 2
- Watch event: 12
- Issue comment event: 2
- Pull request review event: 1
- Pull request review comment event: 2
- Pull request event: 1
- Fork event: 4
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 4
- Total pull requests: 2
- Average time to close issues: 3 months
- Average time to close pull requests: 27 days
- Total issue authors: 4
- Total pull request authors: 2
- Average comments per issue: 3.5
- Average comments per pull request: 0.0
- Merged pull requests: 1
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 1
- Pull requests: 1
- Average time to close issues: 2 months
- Average time to close pull requests: N/A
- Issue authors: 1
- Pull request authors: 1
- Average comments per issue: 1.0
- Average comments per pull request: 0.0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- watermellon2018 (1)
- wkirgsn (1)
- reserschnell (1)
- nMaroulis (1)
Pull Request Authors
- cstockha87 (1)
- reserschnell (1)
Top Labels
Issue Labels
Pull Request Labels
Dependencies
- jupyter *
- matplotlib *
- pandas *
- seaborn *
- tensorflow >=2.1.0
- torch *
- tqdm *