
Oluwaseyi.O

Machine Learning Scientist and Researcher

Advanced Control and Intelligent Systems Lab

University of Victoria

About

I am an Applied Machine Learning Scientist and Researcher with experience applying Classical Machine Learning, Deep Learning, and Deep Reinforcement Learning algorithms to creative AI applications in Manufacturing Decision Support Systems (DSS), Time Series Data Analysis and Forecasting, Predictive Maintenance, and Generative AI.


Latest Publication

A Transformer-based Framework For Multi-variate Time Series: A Remaining Useful Life Prediction Use Case

In recent times, large language models (LLMs) have captured a global spotlight and revolutionized the field of Natural Language Processing. One of the factors attributed to the effectiveness of LLMs is the model architecture used for training: the transformer. Transformer models excel at capturing contextual features in sequential data. Since time series data are sequential, transformer models can therefore be leveraged for more efficient time series prediction.


In this work, an encoder-transformer architecture-based framework is proposed for multi-variate time series prediction with a prognostics use case. We validated the effectiveness of the proposed framework on all four sets of the C-MAPSS benchmark dataset for the remaining useful life (RUL) prediction task. The results of the proposed framework on the test datasets were compared with those from 13 other state-of-the-art (SOTA) models in the literature, and it outperformed them all, with an average performance improvement of 137.65% over the next-best model.
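To illustrate the general idea of an encoder-only transformer applied to multivariate time series RUL prediction, here is a minimal sketch in PyTorch. The class name RULTransformer, the layer sizes, window length, and pooling strategy are illustrative assumptions, not the exact configuration used in the paper.

```python
# Illustrative sketch (not the paper's exact architecture): an encoder-only
# transformer that maps a window of multivariate sensor readings to a single
# remaining-useful-life (RUL) estimate. All hyperparameters are assumptions.
import torch
import torch.nn as nn


class RULTransformer(nn.Module):
    def __init__(self, n_features: int, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2, window: int = 30):
        super().__init__()
        # Project each time step's sensor vector into the model dimension.
        self.input_proj = nn.Linear(n_features, d_model)
        # Learned positional embedding so the encoder sees step order.
        self.pos_embed = nn.Parameter(torch.zeros(1, window, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Regression head: pool over time, then predict one RUL value.
        self.head = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window, n_features) window of multivariate sensor data
        h = self.input_proj(x) + self.pos_embed
        h = self.encoder(h)              # (batch, window, d_model)
        h = h.mean(dim=1)                # average-pool over the time axis
        return self.head(h).squeeze(-1)  # (batch,) RUL predictions


# Example: a batch of 8 windows, each with 30 time steps and 14 sensor channels.
model = RULTransformer(n_features=14)
rul = model(torch.randn(8, 30, 14))
print(rul.shape)  # torch.Size([8])
```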
