Seeing the iceberg under the surface
The electrocardiogram (ECG) and its standardised measures have been guiding cardiologists for more than 100 years. Today it's considered a mature technology that provides huge benefits to medical diagnosis and treatment all over the world.
The ECG is, to a large extent, considered well understood, in particular when it comes to intervals and morphologies. Still, ECG research continues to provide new insights into specific clinical problems such as atrial fibrillation.
Several such specific research areas need further attention, including quality assessment, drug response evaluation, and ECG phenotyping.
Being able to list what research remains may look like the beginning of the end of a research area. In this case, however, these are just the steps required to open up exploration of the part of the iceberg hidden under the surface.
Over the last decade, ECG technology has taken big leaps forward, reshaping the field of ECG measurement and pushing the limits of what is possible in ECG interpretation. These developments are part of a technology wave arising from the maturing of computing across all areas of science and technology.
The rise of computing has made it possible to perform analysis using automated pattern recognition, in addition to traditional rule-based algorithms.
This means that you can teach a computer to play chess without encoding the rules of chess, simply by presenting it with a huge amount of data from previous chess games. In a similar way, a search engine infers what you mean when you type in search terms by making use of all previous searches and selections.
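The same contrast shows up in code: a rule-based detector has its criterion written down by hand, while a pattern recognition approach infers the boundary from labelled examples. Both snippets below are toy illustrations, not real detectors:

```python
from sklearn.tree import DecisionTreeClassifier

# Rule-based: the criterion is written down by a human.
def is_tachycardia_rule(heart_rate_bpm: float) -> bool:
    return heart_rate_bpm > 100  # hand-coded textbook cut-off

# Pattern recognition: the boundary is inferred from labelled examples.
rates = [[62], [75], [88], [105], [120], [140]]   # heart rates (bpm)
labels = [0, 0, 0, 1, 1, 1]                       # 0 = normal, 1 = tachycardia
learned = DecisionTreeClassifier().fit(rates, labels)

print(is_tachycardia_rule(110))   # True
print(learned.predict([[110]]))   # [1] -- learned from data, not encoded
```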
Advances in machine learning algorithms open a range of new possibilities, particularly in the area of ECG analysis.
Interpretation of ECGs and handling of disturbances are, however, very delicate matters, and you shouldn't expect high-quality ECG analysis results from simply feeding raw signals into neural networks or other general machine learning algorithms.
We believe that the key to future automated ECG analysis is to build on the foundation of a hundred years of knowledge and experience with analysing ECGs.
Our approach is to break down the ECG into a very detailed set of content descriptors containing all the information needed for manual or automatic interpretation.
In more detail, our algorithm describes the waveform morphology, time of occurrence, and activation pattern over time of every single beat or disturbance. This information is stored in an intelligible way, making it possible to construct and investigate events, combinations of and relationships between events, and trends of change over time.
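To make this concrete, here is a rough sketch of what one such per-beat descriptor could look like as a data structure. The field names and values below are illustrative examples, not the actual descriptor format:

```python
from dataclasses import dataclass

@dataclass
class BeatDescriptor:
    """One record per detected beat or disturbance (illustrative fields)."""
    onset_s: float         # time of occurrence, seconds from recording start
    beat_type: str         # e.g. "normal", "ventricular", "supraventricular"
    qrs_width_ms: float    # morphology: QRS duration
    rr_interval_ms: float  # time since the previous beat
    p_wave_present: bool   # atrial activation seen before the QRS?

# A recording becomes an intelligible sequence of such records,
# which can be scanned for events, combinations, and trends:
beats = [
    BeatDescriptor(0.80, "normal", 92.0, 810.0, True),
    BeatDescriptor(1.62, "normal", 94.0, 820.0, True),
    BeatDescriptor(2.31, "ventricular", 138.0, 690.0, False),
]
premature_ventricular = [b for b in beats if b.beat_type == "ventricular"]
```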
This information, characterising the ECG signal through a set of descriptors, can be interpreted by both humans and machines, and automated signal interpretation can be achieved by plugging it into machine learning algorithms.
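As a minimal sketch of that hand-off, assuming the descriptors have already been summarised into per-recording features (the feature names, values, and labels below are invented for illustration), a standard classifier could be trained like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Invented per-recording features derived from the descriptors:
# [mean RR interval (ms), RR std deviation (ms), fraction of beats without a P wave]
X = np.array([
    [810.0,  25.0, 0.02],   # regular rhythm, P waves nearly always present
    [640.0, 140.0, 0.95],   # irregular rhythm, P waves mostly absent
    [790.0,  30.0, 0.05],
    [610.0, 155.0, 0.90],
])
y = np.array([0, 1, 0, 1])  # toy labels: 0 = sinus rhythm, 1 = atrial fibrillation

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(model.predict([[650.0, 150.0, 0.92]]))  # -> [1]
```

The point is the division of labour: the descriptors carry the domain knowledge, while the learning algorithm only has to find patterns among them.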
It is the output of this type of ECG analysis software that will be the foundation of future automated interpretation of large ECG signal databases.
Machine learning algorithms are especially useful when coupled with large databases containing descriptors of hundreds of thousands, or even millions, of signals.
Huge databases like this already exist, and there will be many more in just a few years. To understand how these huge signal databases emerge, think of how simple thumb-ECG devices can record data anywhere, at any time.
Cheap standalone measurement devices and smartphone-coupled sensors are now readily available to anyone who wants to record an ECG signal. There are also vendors marketing mobile devices that record ECG signals 24/7, all year round.
Just imagine the prospect of enabling preventive heart care through ubiquitous monitoring of ECG signals using cheap and readily available measurement devices.
Before we can fully realise this vision, we will have to get past a few bumps in the road.
Signals recorded with today's mobile devices are one-lead ECGs, which may limit their practical use. However, they typically record 30-second snippets of the signal, which is enough to analyse the heart rhythm and to find many types of events. Most interestingly, the true power of these databases lies in monitoring for changes, using the huge number of recordings stored for each individual over long periods of time.
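To illustrate what monitoring for changes could look like in practice, here is a toy sketch: one summary value per recording over time for a single individual, with a simple drift check. The series, statistic, and threshold are illustrative, not clinically validated:

```python
import statistics

# Invented series: fraction of irregular beats per 30-second recording,
# one value per day for a single individual.
history = [0.01, 0.02, 0.01, 0.02, 0.03, 0.02, 0.08, 0.11, 0.14]

baseline = statistics.mean(history[:5])   # early recordings
recent = statistics.mean(history[-3:])    # latest recordings

# Flag a drift when the recent average clearly exceeds the baseline.
if recent > baseline + 0.05:
    print(f"Change detected: baseline {baseline:.2f} -> recent {recent:.2f}")
```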
In the near future, people can, and will, record ECGs as often as they want. Data will be collected in databases, and using these databases we will be able to answer many important questions, such as the following (one of which is sketched as a database query after the list):
- What exactly happens in the years before someone develops atrial fibrillation?
- What happened to those who had a certain pattern of descriptors in common three years ago?
- Which features or events are common, or uncommon, among those who have a certain genotype?
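In database terms, questions like these become filters and joins over descriptor tables. Here is a toy sketch of the second question using pandas, with an invented table layout and invented column names:

```python
import pandas as pd

# Invented table: one row per recording, with made-up columns.
df = pd.DataFrame({
    "person_id":      [1, 1, 2, 2, 3],
    "recorded":       pd.to_datetime(["2021-03-01", "2024-03-01",
                                      "2021-06-10", "2024-06-10",
                                      "2024-01-15"]),
    "irregular_frac": [0.04, 0.62, 0.02, 0.03, 0.01],
    "afib_detected":  [False, True, False, False, False],
})

# "What happened to those who had a certain pattern in common three years ago?"
had_pattern = df[(df.recorded.dt.year == 2021) & (df.irregular_frac > 0.03)]
outcome = df[df.person_id.isin(had_pattern.person_id) & (df.recorded.dt.year == 2024)]
print(outcome[["person_id", "afib_detected"]])
```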
The new possibilities that arise when searching huge ECG databases go beyond what is already known. This is the invisible part of the iceberg, and seeing it fully requires new generations of ECG analysis algorithms.
We are working with researchers, cardiologists, healthcare organisations and device manufacturers toward creating a future where automatic algorithms search ECG databases with millions of recordings, finding insights that humans have not yet been able to see or measure.
If you want to know more about Cardiolund Research and the future of ECG analysis, don't hesitate to contact us.