This project explores transformer-based forecasting models for intelligent fault detection in electrical power systems. The approach reframes fault detection as a prediction problem, where the model learns the normal temporal dynamics of high-frequency voltage and current waveforms and identifies deviations as anomalies. By doing so, it avoids the heavy reliance on labeled data that limits many existing machine learning approaches in protection systems.
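To make the framing concrete, here is a minimal sketch of the prediction-based detection idea, not the study's actual pipeline: a small transformer encoder is trained only on fault-free waveforms to forecast the next sample of a multi-channel signal, and samples whose forecast error exceeds a threshold calibrated on normal data are flagged as anomalies. All names (`WaveformForecaster`, `fit`, `anomaly_scores`), the model size, window length, and the 3-sigma threshold are illustrative assumptions rather than details taken from the study.

```python
import math

import torch
import torch.nn as nn


class WaveformForecaster(nn.Module):
    """Transformer encoder that forecasts the next waveform sample from a sliding window."""

    def __init__(self, n_channels=2, d_model=64, nhead=4, num_layers=2, max_len=256):
        super().__init__()
        self.in_proj = nn.Linear(n_channels, d_model)
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))  # learned positional encoding
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dim_feedforward=128, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, n_channels)

    def forward(self, window):                       # window: (batch, time, channels)
        h = self.encoder(self.in_proj(window) + self.pos[:, : window.size(1)])
        return self.head(h[:, -1, :])                # forecast of the next sample


def fit(model, series, window=64, epochs=1, lr=1e-3):
    """Train the forecaster on fault-free waveforms with an MSE objective."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for t in range(window, series.shape[0]):
            pred = model(series[t - window : t].unsqueeze(0))
            loss = nn.functional.mse_loss(pred.squeeze(0), series[t])
            opt.zero_grad()
            loss.backward()
            opt.step()


def anomaly_scores(model, series, window=64):
    """Forecast-error score for every sample after the first `window` samples."""
    model.eval()
    scores = []
    with torch.no_grad():
        for t in range(window, series.shape[0]):
            pred = model(series[t - window : t].unsqueeze(0))
            scores.append(torch.norm(pred.squeeze(0) - series[t]).item())
    return torch.tensor(scores)


# Usage sketch with synthetic stand-ins for voltage/current waveforms.
phase = torch.linspace(0, 40 * math.pi, 2048).unsqueeze(1)
normal = torch.cat([torch.sin(phase), torch.cos(phase)], dim=1)  # fault-free reference signal
test = normal.clone()
test[1200:1260] *= 3.0                                           # injected disturbance as a mock fault

model = WaveformForecaster()
fit(model, normal)                                               # learn normal dynamics only
ref = anomaly_scores(model, normal)
threshold = ref.mean() + 3 * ref.std()                           # simple 3-sigma rule on normal errors
flags = anomaly_scores(model, test) > threshold                  # True where behaviour deviates
```

Note that the detector in this sketch never sees labeled fault examples; it only needs representative normal operation to learn the forecaster and calibrate the threshold, which is what makes the prediction framing attractive when labeled data is scarce.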
The study evaluates several transformer architectures on a large set of physics-based simulations that represent realistic grid conditions, fault types, and operating scenarios. The results demonstrate that prediction-based attention models can achieve high detection accuracy and remain robust even with limited training data and varying grid configurations. This work provides a promising foundation for more adaptive, data-efficient, and resilient protection schemes in future power networks.