2nd International Conference on Advanced Materials Manufacturing & Structures (ICAMMS-25)
October 9 & 10, 2025
Abstract ID: T4S1-16
ADVANCED AUTOENCODER-BASED FRAMEWORKS FOR ROBUST ANOMALY DETECTION IN TIME SERIES DATA
*PREM ANAND T P, ABINAYA A, ABINAYA L, MADHUMITHA M
PRESENTING AUTHOR: ABINAYA L
UNDERGRADUATE STUDENT, RAJALAKSHMI ENGINEERING COLLEGE, CHENNAI, TAMILNADU
ABSTRACT
Anomaly detection is vital for ensuring safety and reliability in helicopter operations, where identifying
unusual patterns can prevent failures. Machine learning models, such as Autoencoders (AE), Recurrent
Neural Networks (RNN), Denoising Autoencoders (DAE), and Variational Autoencoders (VAE), are
widely applied in this domain; however, their comparative performance remains unclear.
The dataset, sourced from Airbus SAS helicopter flight tests, comprises accelerometer readings from
multiple positions and orientations.
This study shows that Autoencoders provide the best overall performance, combining high accuracy,
precision, and recall with efficient training. RNNs, particularly suited for time-series data, excel in recall
and sensitivity but detect fewer anomalies overall. DAEs offer robustness to noise but slightly
underperform compared to AEs, while VAEs show lower accuracy and slower training, limiting their
applicability for helicopter anomaly detection.
This study advances model selection strategies, contributing to safer helicopter operations and improved
anomaly detection in critical systems.
INTRODUCTION
OBJECTIVE
Evaluate RNN, Autoencoder, DAE, and VAE for anomaly detection in time-series data.
Measure effectiveness, accuracy, and precision across models.
Determine the model that most accurately detects anomalies in helicopter data.
Use insights to enhance helicopter maintenance and operational reliability.
SCOPE
This project compares RNN, Autoencoder, DAE, and VAE models for anomaly detection in helicopter
time series data. Using metrics like accuracy, precision, recall, and reconstruction error, it highlights
Autoencoder's superior performance. The models are tested with real-world helicopter sensor data,
focusing on their robustness, training efficiency, and ability to improve safety by detecting critical
anomalies.
MACHINE LEARNING MODELS
AUTOENCODER (AE)
Compresses and reconstructs normal data; high reconstruction error suggests anomalies.
DENOISING AUTOENCODER (DAE)
Trained on noisy data to enhance robustness; detects anomalies by filtering out noise.
VARIATIONAL AUTOENCODER (VAE)
Learns probabilistic representations of normal data; flags anomalies when data doesn’t fit the learned
distribution.
RECURRENT NEURAL NETWORK (RNN)
Learns time-dependent patterns in normal vibration data; flags anomalies based on prediction error (see the sketch after this list).
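All four models share the same decision rule: score each input window by its reconstruction (or prediction) error and flag windows whose error exceeds a threshold. The sketch below illustrates this rule for a generic trained Keras model; `flag_anomalies`, `windows`, and `threshold` are illustrative placeholders, not names from the study.

```python
# Minimal sketch of the shared detection rule: score each window by its
# reconstruction error and flag windows above a chosen threshold.
import numpy as np

def flag_anomalies(model, windows, threshold):
    """Return a boolean mask of anomalous windows and the per-window errors."""
    reconstructed = model.predict(windows, verbose=0)
    # Mean squared error per window, averaged over all non-batch axes.
    errors = np.mean((windows - reconstructed) ** 2, axis=tuple(range(1, windows.ndim)))
    return errors > threshold, errors
```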
METHODOLOGY
Data Preparation
Collect helicopter vibration data, normalize and downsample, and split into training, validation, and test
sets. Use data augmentation to improve noise robustness.
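A minimal preprocessing sketch is given below, assuming the vibration data is already loaded as a NumPy array of shape (n_sequences, sequence_length); the downsampling factor and split fractions are illustrative choices, not values from the study.

```python
# Minimal sketch: downsample, normalize, and split vibration sequences.
import numpy as np

def preprocess(sequences, downsample_factor=4, val_frac=0.1, test_frac=0.1, seed=0):
    # Downsample by keeping every k-th sample along the time axis.
    x = sequences[:, ::downsample_factor]
    # Min-max normalize each sequence to [0, 1].
    mins = x.min(axis=1, keepdims=True)
    maxs = x.max(axis=1, keepdims=True)
    x = (x - mins) / (maxs - mins + 1e-8)
    # Shuffle and split into training, validation, and test sets.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))
    n_test = int(len(x) * test_frac)
    n_val = int(len(x) * val_frac)
    test = x[idx[:n_test]]
    val = x[idx[n_test:n_test + n_val]]
    train = x[idx[n_test + n_val:]]
    return train, val, test
```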
Model Training
Train an RNN to learn normal sequences and detect anomalies based on prediction error. Implement and
train three Autoencoder models (Standard, Denoising, Variational) to reconstruct normal patterns and detect
anomalies using reconstruction error thresholds.
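The denoising variant differs only in its training step: inputs are corrupted with noise while the targets remain the clean windows. A minimal sketch follows, assuming `autoencoder` is an already-compiled Keras model (for example, one built as in the ALGORITHM section below); the noise level and batch size are illustrative assumptions.

```python
# Minimal sketch of denoising-autoencoder training: noisy inputs, clean targets.
import numpy as np

def train_denoising(autoencoder, x_train, noise_std=0.05, epochs=20):
    # Corrupt the training windows with Gaussian noise.
    noisy = x_train + np.random.normal(0.0, noise_std, size=x_train.shape).astype(x_train.dtype)
    # The model learns to reconstruct the original (clean) windows.
    autoencoder.fit(noisy, x_train, epochs=epochs, batch_size=64, validation_split=0.1)
    return autoencoder
```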
Evaluation
Compare models on validation and test sets using accuracy, precision, and recall. Analyze error
distributions to evaluate noise handling and efficiency.
Recommendations
Assess each model’s effectiveness and recommend suitable options for different data conditions.
ALGORITHM
AUTOENCODER
An Autoencoder is a type of neural network used in unsupervised learning. It encodes input data into a
lower-dimensional representation (encoder) and then reconstructs the original data (decoder). The goal is to
minimize the reconstruction error (e.g., Mean Squared Error).
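A minimal Keras sketch of this encoder/decoder structure for windows of vibration data is shown below; the window length, layer sizes, and latent dimension are illustrative assumptions, not the architecture used in the study.

```python
# Minimal sketch of a dense autoencoder trained to minimize reconstruction MSE.
from tensorflow.keras import layers, models

def build_autoencoder(window_len=256, latent_dim=16):
    inputs = layers.Input(shape=(window_len,))
    # Encoder: compress the window to a low-dimensional representation.
    h = layers.Dense(64, activation="relu")(inputs)
    latent = layers.Dense(latent_dim, activation="relu")(h)
    # Decoder: reconstruct the original window from the latent code.
    h = layers.Dense(64, activation="relu")(latent)
    outputs = layers.Dense(window_len, activation="linear")(h)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")  # reconstruction error objective
    return model
```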
AUTOENCODERS ARE COMMONLY USED FOR:
Anomaly detection (flagging data with high reconstruction error)
Dimensionality reduction
Denoising (removing noise from data)
VARIANTS INCLUDE:
Denoising Autoencoder (DAE): Trains on noisy data
Variational Autoencoder (VAE): Uses probabilistic methods for more flexible data generation
ALGORITHM
RECURRENT NEURAL NETWORK
An RNN (Recurrent Neural Network) is a neural network designed to process sequential data by
maintaining a hidden state that is updated at each time step. This allows RNNs to capture temporal
dependencies and learn from past inputs.
CHARACTERISTICS
Sequential Data: Ideal for time-series, text, and speech.
Hidden State: Updates based on current input and previous state.
Long-Term Dependencies: Basic RNNs may struggle with long sequences, but variants like
LSTM and GRU address this issue.
RNNs ARE COMMONLY USED FOR:
Time-Series Prediction
Natural Language Processing
Speech Recognition
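A minimal Keras sketch in this spirit is shown below: an LSTM learns to predict the next sample of a normal vibration window, and large prediction errors flag anomalies. The window length and layer sizes are illustrative assumptions, not the configuration used in the study.

```python
# Minimal sketch of an LSTM next-step predictor for anomaly scoring.
import numpy as np
from tensorflow.keras import layers, models

def build_rnn_predictor(window_len=128, n_channels=1):
    inputs = layers.Input(shape=(window_len, n_channels))
    h = layers.LSTM(64)(inputs)            # hidden state summarizes the sequence
    outputs = layers.Dense(n_channels)(h)  # predict the next time step
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

def prediction_errors(model, windows, next_values):
    """Mean squared prediction error per window; high values suggest anomalies."""
    preds = model.predict(windows, verbose=0)
    return np.mean((preds - next_values) ** 2, axis=1)
```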
PROGRAMMING TOOLS
The programming language used in this project is Python.
Python is ideal for machine learning tasks due to its rich ecosystem of libraries and frameworks,
including:
TensorFlow/Keras: For building and training Recurrent Neural Networks (RNNs),
Autoencoders (AE, DAE, VAE), and other deep learning models.
NumPy/Pandas: For data manipulation, preprocessing, and handling time-series data.
Matplotlib/Seaborn: For plotting and visualizing results such as loss curves and error
distributions.
Scikit-learn: For additional machine learning utilities, such as anomaly scoring and evaluation
metrics.
Python's flexibility and wide support in data science and machine learning make it the primary
choice for this project.
IMPLEMENTATION
DATA PREPARATION
Collect, normalize, and split helicopter vibration data into training, validation, and test sets. Apply data
augmentation and create windowed TensorFlow datasets.
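A minimal sketch of building a windowed TensorFlow dataset from one long signal is shown below; the window length, stride, and batch size are illustrative assumptions.

```python
# Minimal sketch: slice a long 1-D signal into overlapping windows with tf.data.
import tensorflow as tf

def make_windowed_dataset(signal, window_len=256, stride=64, batch_size=64):
    ds = tf.data.Dataset.from_tensor_slices(signal)
    ds = ds.window(window_len, shift=stride, drop_remainder=True)
    ds = ds.flat_map(lambda w: w.batch(window_len))
    # For autoencoder training, the target is the input window itself.
    ds = ds.map(lambda w: (w, w))
    return ds.batch(batch_size).prefetch(tf.data.AUTOTUNE)
```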
MODEL DEVELOPMENT
RNN
Train on normal sequences to detect anomalies via prediction error.
Autoencoder and its variants
Use reconstruction error to detect anomalies, with each variant tailored for different robustness and
complexity needs.
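The variational variant is the only model not sketched earlier. Below is a minimal Keras sketch of it, assuming the classic reparameterization-plus-KL-penalty formulation; all sizes are illustrative assumptions rather than the architecture used in the study.

```python
# Minimal sketch of a variational autoencoder with a reparameterization layer.
import tensorflow as tf
from tensorflow.keras import layers, models

class Sampling(layers.Layer):
    """Sample z = mean + sigma * eps and add the KL regularization term."""
    def call(self, inputs):
        z_mean, z_log_var = inputs
        kl = -0.5 * tf.reduce_mean(1.0 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var))
        self.add_loss(kl)
        eps = tf.random.normal(shape=tf.shape(z_mean))
        return z_mean + tf.exp(0.5 * z_log_var) * eps

def build_vae(window_len=256, latent_dim=8):
    inputs = layers.Input(shape=(window_len,))
    h = layers.Dense(64, activation="relu")(inputs)
    z_mean = layers.Dense(latent_dim)(h)
    z_log_var = layers.Dense(latent_dim)(h)
    z = Sampling()([z_mean, z_log_var])
    h = layers.Dense(64, activation="relu")(z)
    outputs = layers.Dense(window_len)(h)
    vae = models.Model(inputs, outputs)
    vae.compile(optimizer="adam", loss="mse")  # reconstruction loss; KL added above
    return vae
```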
TRAINING & VALIDATION
Train models on training data, using MSE loss for Autoencoders and prediction error for RNNs, and set
anomaly thresholds.
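One common way to set the threshold is from a percentile of the validation reconstruction errors; the sketch below assumes this rule with an illustrative percentile, not necessarily the rule used in the study.

```python
# Minimal sketch: derive an anomaly threshold from validation reconstruction errors.
import numpy as np

def pick_threshold(model, x_val, percentile=95.0):
    recon = model.predict(x_val, verbose=0)
    errors = np.mean((x_val - recon) ** 2, axis=tuple(range(1, x_val.ndim)))
    return float(np.percentile(errors, percentile))
```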
IMPLEMENTATION
EVALUATION
Test models, calculate accuracy, precision, and recall, and refine thresholds based on error analysis.
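Given ground-truth labels and per-window error scores, the reported metrics can be computed with scikit-learn as sketched below; the variable names are placeholders.

```python
# Minimal sketch: threshold the error scores and compute the reported metrics.
from sklearn.metrics import accuracy_score, precision_score, recall_score

def evaluate(y_true, errors, threshold):
    y_pred = (errors > threshold).astype(int)  # 1 = anomaly, 0 = normal
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred),
        "recall": recall_score(y_true, y_pred),
    }
```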
ANALYSIS & CONCLUSION
Compare model performance, recommending Autoencoders for general use and RNNs for time-series
data.
RESULT COMPARISON
Metric                      | AE               | DAE              | VAE              | RNN
Validation Loss             | 0.0168           | 0.0195           | 0.0223           | 0.0205
Mean Reconstruction Error   | 0.0234           | 0.0190           | 0.0220           | 0.014
Normal Data                 | 0.0123           | 0.0142           | 0.0161           | 0.010
Anomalous Data              | 0.0328           | 0.0301           | 0.0285           | 0.035
Reconstruction Error Gap    | 0.0205           | 0.0159           | 0.0124           | 0.0073
Anomaly Detection Threshold | 0.0185           | 0.0200           | 0.0215           | 0.099
Model Accuracy              | 0.95             | 0.93             | 0.91             | 0.94
Precision                   | 0.92             | 0.89             | 0.86             | 0.88
Recall                      | 0.88             | 0.87             | 0.85             | 0.92
Anomaly Detection Count     | 120              | 135              | 150              | 15
Normal vs. Anomalous Error  | 0.0123 vs 0.0328 | 0.0130 vs 0.0345 | 0.0140 vs 0.0350 | 0.010 vs 0.035
Training Time (relative)    | fast             | moderate         | slow             | moderate
CONCLUSION
Autoencoders are the most balanced for general anomaly detection, offering high accuracy and
efficiency.
Autoencoders also require the least training time of the four models, keeping computational cost low.
RNNs excel at detecting anomalies in time-series data with high recall, though they detect fewer
anomalies.
Denoising Autoencoders work well with noisy data but have slightly lower performance.
Variational Autoencoders are better for complex data, but are slower and less accurate.
Overall, Autoencoders are the most versatile, while RNNs are best for time-series
tasks.