News

IT-Talk: The Student Who Knew Too Much: Representation Leakage in Distilled Smart Meter Models


Speaker: Dejan Radovanovic (FH Salzburg)
Date & time: Wednesday, 4 March 2026, 14:00–15:30
Location: FH Salzburg, Campus Urstein, lecture hall 151, Urstein Süd 1, 5412 Puch / Salzburg

Knowledge distillation is a widely used technique to compress neural networks by transferring predictive behavior from a teacher model to a student model. Traditionally, the student learns from both ground-truth labels and softened teacher outputs on real input data. In this talk, we explore a more restrictive and potentially privacy-relevant setting inspired by subliminal learning.
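The classical recipe described above can be sketched as follows. This is a minimal NumPy illustration of a Hinton-style distillation loss; the temperature `T`, mixing weight `alpha`, and the example logits are generic assumptions, not values from the talk:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T yields a softer distribution."""
    z = np.asarray(z, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, label, T=2.0, alpha=0.5):
    """Blend hard-label cross-entropy with a softened KL term
    against the teacher's output distribution."""
    p_s = softmax(student_logits)
    ce = -np.log(p_s[label])                      # cross-entropy with ground truth
    p_t = softmax(teacher_logits, T)              # softened teacher targets
    p_s_T = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t) - np.log(p_s_T)))
    # T**2 rescales gradients of the softened term (standard convention)
    return alpha * ce + (1 - alpha) * (T ** 2) * kl

loss = kd_loss([2.0, 0.5, -1.0], [1.8, 0.7, -0.9], label=0)
```

With `alpha=1.0` this degenerates to ordinary supervised training; with `alpha=0.0` the student learns purely from the teacher's soft outputs.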

We investigate whether task-relevant information can leak from a trained smart meter model through auxiliary outputs that were never explicitly supervised. In our setup, a teacher model is trained on weekly load profiles to predict binary household attributes (e.g., presence of a swimming pool). A student model, however, never sees real data or labels. Instead, it is trained solely to match the auxiliary logits the teacher produces on synthetic probe inputs.
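The data-free matching step can be sketched with a toy linear model. Everything here is an illustrative assumption rather than the talk's actual setup: the models are linear, the 168-dimensional input stands in for an hourly weekly load profile, the probes are Gaussian noise, and the learning rate and step count are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the talk's models: a frozen linear "teacher"
# mapping a 168-dim weekly load profile to 4 auxiliary logits, and a
# same-shaped student that never sees real data or labels.
d_in, d_out = 168, 4
W_teacher = rng.normal(size=(d_out, d_in))   # frozen after (simulated) training
W_student = np.zeros((d_out, d_in))

lr = 0.05  # assumed step size
for _ in range(1000):
    x = rng.normal(size=(256, d_in))          # synthetic Gaussian probe inputs
    t = x @ W_teacher.T                       # teacher's auxiliary logits
    s = x @ W_student.T                       # student's logits on the same probes
    grad = (s - t).T @ x / len(x)             # gradient of mean squared logit error
    W_student -= lr * grad

# How closely the student recovered the teacher's weights from probes alone
err = np.abs(W_student - W_teacher).max()
```

In this linear toy case the student provably recovers the teacher exactly, since Gaussian probes span the input space; the talk's open question is how much of this carries over to nonlinear networks, where effects hinge on initialization and probe distribution.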

Our preliminary results suggest that under certain conditions, such as shared initialization and specific noise distributions, the student can partially recover task performance. However, the effects are highly sensitive to architectural and experimental choices, and our findings are not yet stable enough for publication. We aim to discuss open questions, methodological challenges, and the broader implications for privacy and secure model sharing in energy informatics.