Mixture of designer experts for multi-regime detection in streaming data

Evan Kriminger, José Príncipe, Choudur Lakshminarayan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Real-time streaming data takes on distinct visible patterns, known as regimes, as a result of changing external influences. Regimes corresponding to hazardous states, such as turbulent flow in oil pipelines or patients experiencing heart arrhythmias, must be identified quickly and accurately by on-line detection algorithms. In this paper, we propose a modification to the mixture of experts framework, which is traditionally used to model piecewise stationary time series. Our proposed modification allows experts to produce features specific to their designated regimes, rather than being limited to prediction error. This approach provides the flexibility to update the mixture modularly as new regimes emerge without the burden of retraining the entire mixture, as is typical in traditional classifiers. Our approach is tested on flow rate data from an oil and gas application, as well as detecting heart arrhythmias from electrocardiogram (ECG) signals. It outperforms traditional classification approaches both in terms of error rate and detector delay.
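
The abstract only outlines the method at a high level. As an illustration of the general idea, here is a minimal sketch, in Python, of a mixture of "designer experts" for regime detection: each expert emits a feature tailored to its own regime (rather than a prediction error), and a new regime is handled by registering one new expert without retraining the others. All class names, the argmax decision rule, and the variance-based features are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

class Expert:
    """One designer expert per regime (hypothetical interface).
    Maps a window of the stream to a scalar score for its regime."""
    def __init__(self, name, feature_fn):
        self.name = name
        self.feature_fn = feature_fn  # regime-specific feature, not prediction error

    def score(self, window):
        return self.feature_fn(window)

class MixtureOfDesignerExperts:
    """Minimal sketch: the detected regime is the expert with the
    largest score on the current window. Adding a regime means adding
    one expert; existing experts are untouched (no joint retraining)."""
    def __init__(self):
        self.experts = []

    def add_expert(self, expert):
        self.experts.append(expert)  # modular update for a newly observed regime

    def detect(self, window):
        scores = [e.score(window) for e in self.experts]
        return self.experts[int(np.argmax(scores))].name

# Illustrative features (assumed, not from the paper): high variance
# flags a "turbulent" regime, low variance a "laminar" one.
mixture = MixtureOfDesignerExperts()
mixture.add_expert(Expert("laminar", lambda w: -np.var(w)))
mixture.add_expert(Expert("turbulent", lambda w: np.var(w)))

# Synthetic stream: a calm segment followed by a volatile segment.
stream = np.concatenate([np.random.normal(0, 0.1, 200),
                         np.random.normal(0, 2.0, 200)])
for t in range(50, len(stream) + 1, 50):
    window = stream[t - 50:t]
    print(t, mixture.detect(window))
```

The modularity claimed in the abstract shows up in `add_expert`: because each expert's feature depends only on its own regime, extending the mixture never requires revisiting the experts already in place, unlike a monolithic classifier that must be retrained on all regimes jointly.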

Original language: English
Title of host publication: Proceedings of the 20th European Signal Processing Conference, EUSIPCO 2012
Pages: 410-414
Number of pages: 5
State: Published - 2012
Event: 20th European Signal Processing Conference, EUSIPCO 2012 - Bucharest, Romania
Duration: 27 Aug 2012 - 31 Aug 2012

Publication series

Name: European Signal Processing Conference
ISSN (Print): 2219-5491

Conference

Conference: 20th European Signal Processing Conference, EUSIPCO 2012
Country/Territory: Romania
City: Bucharest
Period: 27/08/12 - 31/08/12

Keywords

  • Detection
  • Mixture of experts
  • Streaming data
