# The effect of non-linear dynamic invariants in recurrent neural networks for prediction of electrocardiograms

## Date

## Authors

## Journal Title

## Journal ISSN

## Volume Title

## Publisher

## Abstract

The possibility of automatic and accurate prediction of heart failure from the analysis of electrocardiograms (ECG) could be a breakthrough in medicine: cardiologists can sometimes identify diseases and foresee catastrophic events, but they are not always successful. However, ECGs, like many other biological rhythms, are the result of complex, non-linear dynamical systems, believed by many researchers to be chaotic in the mathematical sense. Chaotic signals are extremely sensitive to initial conditions; they look random or noisy, but they are produced by bounded, deterministic systems. Prediction of ECGs is therefore a real challenge.
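The two defining properties mentioned above (sensitivity to initial conditions and a well-defined, positive divergence rate) can be illustrated with a standard toy system. The sketch below uses the logistic map, not ECG data, purely as an illustration; the map, its parameter r = 4, and the starting points are conventional textbook choices, not anything from this work.

```python
import math

def logistic(x, r=4.0):
    """One step of the logistic map, a standard chaotic toy system."""
    return r * x * (1.0 - x)

# Two trajectories starting 1e-10 apart diverge until they become
# essentially unrelated: sensitivity to initial conditions.
a, b, max_sep = 0.4, 0.4 + 1e-10, 0.0
for _ in range(60):
    a, b = logistic(a), logistic(b)
    max_sep = max(max_sep, abs(a - b))
# max_sep is of order 1 even though the initial gap was 1e-10.

# The largest Lyapunov exponent is the average log-derivative along an
# orbit; a positive value quantifies the exponential divergence.
x, acc, n = 0.4, 0.0, 100_000
for _ in range(n):
    acc += math.log(abs(4.0 * (1.0 - 2.0 * x)))
    x = logistic(x)
lyapunov = acc / n  # theory gives ln 2 ≈ 0.693 for r = 4.0
```

The signal stays bounded in [0, 1] yet neighbouring trajectories separate exponentially fast, which is exactly why point-by-point long-term prediction fails for chaotic series.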

This research focused on finding ways to model and predict electrocardiograms using artificial neural networks. Point-by-point prediction is known to be impossible for chaotic time series. Instead, we were looking for a kind of predictability that would allow a network to model the attractor associated with the ECG, rather than to calculate each future value accurately. A predictor with such capabilities could foresee bifurcations in the dynamics and hence predict catastrophic events.
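Exposing an attractor to a predictor is commonly done by delay-coordinate (Takens-style) embedding: each training input is a vector of past signal values, which matches the "past signal values as external inputs" used below. A minimal sketch, where the window size m, the lag tau, and the stand-in signal are illustrative assumptions, not values from this work:

```python
def delay_embed(series, m=3, tau=2):
    """Return m-dimensional delay vectors [x[t], x[t-tau], ..., x[t-(m-1)*tau]]."""
    start = (m - 1) * tau
    return [[series[t - k * tau] for k in range(m)]
            for t in range(start, len(series))]

# A periodic stand-in for an ECG sample; a real study would use
# measured heartbeat data here.
signal = [i % 10 for i in range(30)]
vectors = delay_embed(signal)
# Each vector pairs the current value with its recent past, so a
# network trained on these vectors sees points on the reconstructed
# attractor rather than isolated scalar samples.
```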

We explored the use of Lyapunov exponents (an invariant measure of the divergence of nearby trajectories of a dynamical system) as an aid in the training of predictors based on complex neural networks (CNNs). A CNN is a recurrent network built from harmonic generators: 3-node recurrent neural networks previously trained to reproduce a specific sine wave. Several predictors were designed by training a feed-forward network to reproduce an ECG, using past signal values as external inputs, with harmonic generators trained to reproduce the harmonic components of the signal. These weights were then embedded in a CNN, which was trained until it reached a minimum. All predictors were trained with the back-propagation-through-time algorithm.
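The key idea behind a harmonic generator is that a small recurrent network with the right weights can sustain a sine wave autonomously. The sketch below hand-wires such weights as a 2-D rotation instead of learning them with BPTT as in this work, so it only illustrates the target the training converges to; the frequency (50 steps per cycle) is an assumed example value.

```python
import math

# A hand-wired "harmonic generator": fixed recurrent weights rotate a
# 2-D hidden state, so one unit traces a sine wave with no input.
omega = 2 * math.pi / 50          # assumed frequency: 50 steps per cycle
W = [[math.cos(omega), -math.sin(omega)],
     [math.sin(omega),  math.cos(omega)]]  # recurrent rotation weights

h = [1.0, 0.0]                    # hidden state on the unit circle
wave = []
for _ in range(100):
    h = [W[0][0] * h[0] + W[0][1] * h[1],
         W[1][0] * h[0] + W[1][1] * h[1]]
    wave.append(h[1])             # the output node reads the sine component

# wave now holds two full periods of sin(omega * t), generated purely
# by the recurrence, with no external driving signal.
```

A network trained to reproduce a sine wave must encode an equivalent oscillation in its recurrent weights; composing several such generators gives the CNN built-in oscillators tuned to the signal's harmonic components.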

We found that embedding the Lyapunov exponents in the fashion described above is not enough to make the network fully capture the dynamics of the system, but it did improve short-term prediction. We also found that harmonic generators control the oscillations of the trajectories in long-term predictions. Neither of these characteristics is present in feed-forward networks or plain recurrent neural networks.