Technique Used In Signal Processing Biology Essay


The time-frequency distribution (TFD) is a technique used in signal processing to provide information about the spectral pattern of a signal as it changes with time. This paper investigates different approaches to implementing and analyzing the TFD, in order to estimate the behavior of the heart as recorded on an Electrocardiogram (ECG). The project is mainly concerned with the patterns of Ventricular Fibrillation (VF) and non-invasive techniques for detecting it. This report suggests several courses of action for detecting VF; the methods used are based on peak detection, averaging, the Short-Time Fourier Transform (STFT) and the Wigner distribution.

It has been found that the Wigner Distribution function is a much better method of computing the TFD, since it has better resolution, while the TFD based on the STFT suffers from a time-frequency resolution trade-off. It has also been established that VF detection cannot rely on one method alone, but on a combination of elementary and more precise techniques.



Heart attacks account for the death of an estimated 111,000 people every year in the UK alone [1]. Ventricular fibrillation (VF) is one of their main causes and is responsible for between 75 and 85% of sudden cardiac arrests [2]. VF occurs when the muscles in the lower chambers of the heart tremble instead of contracting properly, leading to poor blood circulation and eventually a lack of oxygen in the brain; this in turn results in instant death or permanent brain damage. It would be accurate to label this condition a silent killer, mainly because it is not detectable by traditional pulse checks (wrist and neck); furthermore, it cannot be felt by the patient himself, especially if he is lying down. The only way VF is identified is on an Electrocardiogram (ECG). [3]

The ECG records the electrical activity of the heart during the contraction and relaxation of its muscles: the Electrocardiograph leads pick up the changes in electrical potential on the surface of the body, resulting in a graphic representation. For any single heart beat the ECG shows a particular graph; in a normal heart this pattern repeats continuously, as long as the patient does not engage in any physical activity, which in turn produces a relatively constant, repeating set of frequencies. Based on these assumptions, it can be said that a normal heart in a healthy patient produces an almost-periodic function.

A lot of research has been done in the biomedical engineering field to identify ways of detecting VF, and various algorithms have been developed for the purpose of spectral analysis, some of which rely on the STFT and the Wigner distribution, while others are based on Wavelet Transforms. These algorithms are usually implemented on fixed or portable hardware designed to give the patient a minor electric shock whenever VF is detected, i.e. portable defibrillators. [4]

This project is concerned with detecting VF caused by abnormal heart activities (palpitations, skipped beats and unusual muscle movements). The issue is tackled from different perspectives, such as power spectral density analysis of the frequencies produced by such phenomena using the Time-Frequency Distribution (TFD). This paper includes and critically evaluates various methods of computing the TFD, in order to generate a detection algorithm that could assist doctors with monitoring and hopefully eliminate the risk of misdiagnosis.

Chapter one consists of general information regarding the recordings used in this project, along with the filter design and the data format; detailed plots of all the signals used are given in the appendix. In chapter two, various techniques for segmentation and elementary evaluation of the ECG are introduced, their advantages and potential problems are identified, and their improvement is discussed.

Chapter three looks at three different methods of generating the TFD, namely moving average, STFT and Wigner distribution; it weighs these methods against each other and identifies the most suitable one for a VF detection algorithm. Chapter four suggests a detection algorithm based on all the methods used in this paper.

Chapter 1

This chapter includes useful information about the signals used in this project, as well as important terminology related to ECG analysis.

1.1 General information about ECG signals:


Each heart beat is the result of a combination of cardiac muscles contracting and relaxing, and it manifests itself on the ECG as a continuous change in amplitude. A good analogy is to think of the heart as a mechanical engine with a recurring event x: a normal heart produces a periodic function with a period of length x, as shown in figure 1.1, here referred to as the R-R interval. It is important to pause here and become familiar with the different components of the signal, especially the "QRS complex" and the "T and P waves", as the coming sections of the report will refer to them.


Figure 1.1 shows the ECG of a typical heart beat

Using the peak detection method, the start and end of each period can be determined, from which the average and instantaneous heart rate can be found. In a healthy resting adult the heart beat ranges from 1 Hz to 1.5 Hz, i.e. a period of 1 s down to 0.67 s [5]. The amplitude of the signals, however, can vary over a great range: a small change in the placement of the Electrocardiograph leads on the body surface of the same patient can cause a large change in voltage amplitude, so amplitude is not a reliable criterion for judging the condition of the heart.

Two ECG recordings from different patients suffering from VF are shown below; the idea behind plotting these figures at this stage of the report is solely to establish visual recognition of VF patterns, and to start thinking about ways of detecting them.


Figure 1.2: shows abnormal activities in the ECG, marked as F. (ECG: 104)


Figure 1.3: shows abnormal heart activities between 40 and 60 sec. (ECG: cu35)

Comparing the VF patterns in figures 1.2 and 1.3 makes it very clear that VF does not have a set pattern on the ECG and can manifest itself in different ways. Again, this reinforces our assumption that amplitude is not a good criterion for VF detection; nevertheless, it may be useful for detecting R-R intervals in order to work out heart rates and organize the signal into sections representing a full cycle.

1.2 The Data [6]

This section describes the type of data used, its format and general information about the recordings themselves; a table containing information about the patients the ECGs were taken from (age, gender and medications) is included in the appendices. The data used in this project is freely available in the MIT-BIH database. The recordings were taken on analog tapes between 1975 and 1979 at the Beth Israel Hospital Arrhythmia Laboratory on a Del Mar Avionics Model 445, while the digitization took place at the MIT Biomedical Engineering labs. The analog output was played back from the tapes on a Del Mar Avionics Model 660, then converted using an 11-bit ADC with sample values ranging from 0 to 2047; this corresponds to an amplitude range of ±5 mV, i.e. a sample with a value of 1024 is equivalent to 0 V. All of the signals used in this project have a sampling frequency of 360 Hz (except cu35, Fs = 250 Hz). An anti-aliasing band-pass filter from 0.1 to 100 Hz was then used to eliminate most of the noise and unwanted frequencies; however, some other frequencies still remained in the recordings. Figure 1.4 lists the main noisy components and identifies their potential sources.




0.2 - 0.4 Hz        Supply reel on the analog tape; the frequency increases as the tape builds up on the reel.

60 Hz               Power supply.

1.96, 9.1, 42 Hz    These frequency components have very low amplitude and do not contribute much noise.

Figure 1.4 shows the noisy frequencies that might be present in the recordings
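As a concrete illustration of the encoding described above (11-bit samples, 0-2047, with 1024 corresponding to 0 V over a ±5 mV range), the count-to-millivolt conversion can be sketched in Python; `adc_to_mv` is a hypothetical helper, not part of the MIT-BIH tooling:

```python
def adc_to_mv(sample):
    """Convert an 11-bit ADC sample (0-2047) to millivolts.

    Assumes the encoding described in the text: the range 0-2047
    spans -5 mV to +5 mV, so 1024 corresponds to 0 V.
    """
    return (sample - 1024) * (10.0 / 2048)  # 10 mV span over 2048 steps

print(adc_to_mv(1024))  # 0.0  (mid-scale is 0 V)
print(adc_to_mv(0))     # -5.0 (bottom of the range)
```

With this mapping a full-scale sample of 2047 comes out just under +5 mV, since the top code sits one step below the positive rail.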

1.3 Filtering

1.3.1 Filter design

This section describes the process of designing an FIR low-pass filter (LPF) to eliminate all the noise components. First of all, we need to determine what frequencies have to be kept. In a healthy adult the heart rate ranges between 60 and 80 beats per minute, i.e. 1~1.34 Hz [3], and according to the NHS, during ventricular fibrillation this may increase to 140 beats per minute, i.e. about 2.33 Hz. Figure 1.5 shows the impulse response and the frequency response of a digital LPF designed in Matlab:


Sampling frequency = 360 Hz; and the cut-off= 15 Hz;

Cut-off in normalized frequency= 15 / 360= 0.0417;

From those parameters, the impulse response is calculated:


N=100;                            %sampling points (filter taps)
fc=15/360;                        %normalized cut-off (cycles/sample)
n=(0:N-1)-(N-1)/2;                %symmetric time index
hh=2*fc*sinc(2*fc*n);             %truncated-sinc impulse response
stem(hh);title('impulse response')
figure;zplane(hh,1);              %pzp plane
HH=abs(fft(hh,512));nu=(0:511)/512;  %spectrum & normalized frequency axis
figure;plot(nu,HH);title('LPF spectrum');grid on

Figure 1.5 shows the filter's impulse response, and figure 1.6 shows the pole-zero plot (PZP) of the designed filter:



Figure 1.5: the impulse response of the filter          Figure 1.6: the PZP of the filter

1.3.2 Filter testing

In this section the spectrum of a one-minute segment of ECG 104 is produced before and after filtering. The plots shown in figures 1.7 and 1.8 were generated in Simulink; the coefficients of the filter calculated earlier were imported into the "Sketch a digital filter" blockset, as shown in the model below:


Model 1: the model used in Simulink

Here is a comparison of the two spectra; figures 1.7 and 1.8 show the spectrum of the original and the filtered signal respectively:



Figure 1.7: spectrum of the original signal          Figure 1.8: spectrum of the filtered signal

Comparing figures 1.7 and 1.8 clearly shows that frequencies in our signal have been attenuated by the filtering process, and that the cut-off frequency is just a little over our target of 15 Hz, which is fine for our purpose.
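For readers without Matlab, the same windowed-sinc low-pass design can be sketched in NumPy. This is an illustrative re-implementation, not the report's exact filter: the tap count of 101, the Hamming window and the zero-padded FFT grid are assumptions made here for the sketch.

```python
import numpy as np

fs, fc, N = 360.0, 15.0, 101          # sample rate (Hz), cut-off (Hz), taps
n = np.arange(N) - (N - 1) / 2        # symmetric time index
h = 2 * fc / fs * np.sinc(2 * fc / fs * n)   # ideal low-pass, truncated
h *= np.hamming(N)                    # Hamming window reduces the ripple
h /= h.sum()                          # normalise for unity gain at DC

# Magnitude response on a fine frequency grid (zero-padded FFT)
nfft = 2048
freqs = np.fft.rfftfreq(nfft, d=1 / fs)
H = np.abs(np.fft.rfft(h, nfft))
```

The passband gain stays close to 1 below 15 Hz, while a 60 Hz mains component falls deep in the stopband, which is exactly the behavior required of the anti-noise LPF.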

Chapter 2

This chapter looks at basic methods for identifying abnormal activities of the heart. These methods do not necessarily detect VF; however, they can be used as a preliminary step towards that purpose.

2.1. RR intervals

The R-R interval technique relies on a simple algorithm that detects the peaks of the signal and plots the time elapsed between each pair of consecutive peaks. The plot will show any irregularity in the heart beat, return a value for its rate, and identify any peaks that do not exceed a certain threshold (which medically means a weak heart beat). However, this method should only be considered a supporting argument for the detection algorithm; on its own it does not constitute a valid criterion for the patient's condition. (Please refer to the appendices, M-file ref: 02.)
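The peak-detection scheme just described can be sketched in a few lines of Python. The `rr_intervals` helper and its fixed threshold are hypothetical simplifications for illustration; the actual M-file (ref: 02) is not reproduced here.

```python
def rr_intervals(ecg, fs, threshold):
    """Return the R-R intervals (in seconds) of a sampled ECG.

    A sample counts as an R peak when it exceeds `threshold` and is a
    local maximum; as discussed in the text, a fixed threshold must be
    tuned per recording and can miss low-amplitude beats.
    """
    peaks = [i for i in range(1, len(ecg) - 1)
             if ecg[i] > threshold and ecg[i - 1] < ecg[i] >= ecg[i + 1]]
    return [(b - a) / fs for a, b in zip(peaks, peaks[1:])]

# A toy 1 Hz "heartbeat": one spike every fs samples.
fs = 360
ecg = [1.0 if i % fs == 0 else 0.0 for i in range(4 * fs)]
print(rr_intervals(ecg, fs, 0.5))  # -> [1.0, 1.0]
```

A steady list of interval values corresponds to the flat trace of figure 2.2, while an erratic list corresponds to the pattern of figure 2.3.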

ECG 101 (figure 2.1) and cu35 (figure 1.3) are compared and their RR interval plots are shown in the following figures:


Figure 2.1 a plot of a 60 second ECG (101)


Figure2.2: RR interval plot of ECG: 101


Figure 2.3: RR Interval of ECG: CU35

Figure 2.2 demonstrates that the R-R intervals are steady, apart from the beginning of the graph at 0:1 minute, where a big jump happens; this is due to the fact that the signal does not start with an R peak. Overall, for the length of this recording, the heart shows normal activity. On the other hand, figure 2.3 shows erratic heart behavior, in which a few heart beats were too weak to be considered normal; this is shown by the fact that the R-R interval follows no set pattern.

This method of evaluating heart signals has several issues related to the amplitude of the signal; for example, an R peak can be missed out of the plot if its amplitude is lower than the threshold. A way of improving the implementation of this method is to include the standard deviation and the variance of the amplitudes in the calculation of the peak threshold. These issues will be addressed in the detection algorithm.

2.2 R-R integrals

This section investigates the R-R integral method: computing the average integral value of one R-R interval of a given ECG, then comparing it against the rest of the intervals over the whole length of the signal.


ECG(x): the ECG recording

W(x): a sliding rectangular (Rect) window used to isolate one heartbeat

F(x): the average value of the whole integral.


Figure 2.4: the process of RR Interval

Using the same recording (ECG 101, fig. 2.1), a plot for a typical heart recording has been generated: figure 2.5 shows that the signal has a relatively constant average integral within ±20 mV, compared to figure 2.6, which shows the R-R integral of the abnormal signal (cu35, fig. 1.3) ranging within ±400 mV. (Refer to the appendix for M-files, ref: 03.)



Figure 2.5: plot of the average integral value of ECG 101          Figure 2.6: plot of the average integral value of cu35


The R-R integral and R-R interval techniques give compatible results: both show a steady ECG 101 and a troubled cu35. These two complementary techniques are a good elementary evaluation of a heart's condition, but they give no indication of either the cause or the nature of the problem; while they can detect VF, they can also detect other defects, so a more precise technique is needed.

Mathematically, the R-R integral technique has a major flaw, in the sense that two completely different shapes or signals might have equal average integral values, e.g. a triangular wave with an area of X and a square wave with the same area X. Therefore it is not possible to rely on this method alone for VF detection. Chapter four studies this issue in detail and takes it into consideration in the detection algorithm; for now, the integral averaging process is sufficient.
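The per-interval averaging can be sketched as follows, assuming the R-peak indices are already known (for instance from the peak detector of section 2.1); the function name and the toy waveform are illustrative, not the report's M-file (ref: 03).

```python
def interval_integrals(ecg, peaks, fs):
    """Approximate integral of each R-R interval (Riemann sum / fs).

    `peaks` holds the R-peak sample indices; identical beats give
    identical values, so an erratic list signals an abnormal ECG.
    """
    return [sum(ecg[a:b]) / fs for a, b in zip(peaks, peaks[1:])]

# Three identical toy beats with R peaks at samples 2, 6 and 10:
ecg = [0, 1, 2, 1, 0, 1, 2, 1, 0, 1, 2, 1]
print(interval_integrals(ecg, [2, 6, 10], fs=1))  # -> [4.0, 4.0]

# The flaw noted above: different shapes can enclose the same area.
tri  = [0, 1, 2, 1]   # triangle, sum = 4
rect = [1, 1, 1, 1]   # square,   sum = 4 as well
```

The last two lines make the limitation concrete: the integral alone cannot distinguish the two shapes, which is why the method must be combined with others.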

Chapter 3

This section of the report is dedicated to devising ways of generating time-frequency distributions using different methods, as well as critically evaluating those techniques. The goal is to design and improve the power spectral density representation of the ECG recording for analysis and VF detection.

3.1 Frequency Averaging by Interval

A possible way of generating TFD plots is to take the average frequency of each R-R interval (see section 2.1) and plot it against time. While this method produces neither accurate nor instantaneous values for the frequencies as they change with time, it should still give an idea of the changes as they occur. Ideally, the frequencies within any given R-R interval should repeat throughout the signal; in other words, a constant plot against time is an indicator of a stable heart beat, while a changing graph indicates irregularities within a particular R-R interval. Figure 3.1 shows the result for a typical heart beat recording (ECG 101), while figure 3.2 displays the irregular pattern of a recording with VF (ECG 419). (M-file ref: 04.)


Figure 3.1: plot of the frequency average. ECG 101


Figure 3.2: plot of the frequency average. ECG 419

Figure 3.1 shows an almost predictable trend in the frequency averages; in contrast, figure 3.2 shows an irregular pattern, which suggests that within ECG 419 there might be VF, as well as other abnormal heart behavior.

The data lost during the averaging process prevents us from detecting the nature of the problem, as well as exactly when it might have occurred; therefore there is a need for continuous monitoring of the heart over the whole length of the ECG. The next part of the report examines different ways of achieving this.

3.2 Short-Time Fourier Transform (STFT)

This stage of the report investigates the application of the STFT to the ECG signals for the purpose of spectral analysis. This method results in an image that shows the frequencies and their spectral density as they change with time: the data is broken into segments, and the spectral density of each segment is computed using the Fast Fourier Transform (FFT):

X(τ, ω) = Σn x(n) · w(n − τ) · e^(−jωn)    (1) [10]


Formula (1) demonstrates the continuous process of windowing the signal and taking the Fourier Transform of each segment, resulting in a complex representation of each segment of the signal. An example of the STFT is shown below; the signal is a sinusoid sampled at 360 Hz and composed of two constant frequencies (1 and 50 Hz), as per formula (2):

sig1(t) = sin(2π · 1 · t) + sin(2π · 50 · t)    (2)

Using the following Matlab command, we get the STFT representation displayed in figure 3.3:

spectrogram(sig1,window,noverlap,nfft,fs) [11]


sig1: as defined in formula 2.

Window: sliding Hamming window function of length 128;

Noverlap: overlapping of the window function (in this case=100, equivalent to 78%)

Nfft: FFT length (in this case nfft=512)

Fs: sampling frequency = 360 Hz


Figure 3.3: Spectrogram of the sinusoid

The spectrogram in figure 3.3 clearly shows the presence of the two frequencies constituting the signal (sig1), and the fact that they are constant over time.
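The same windowed-FFT process can be reproduced without the Matlab toolbox. The NumPy sketch below uses the window length, overlap and FFT length quoted above (128, 100 and 512); the `stft_mag` helper is a hypothetical stand-in for Matlab's `spectrogram`.

```python
import numpy as np

fs = 360
t = np.arange(4 * fs) / fs                          # 4 s of signal
sig1 = np.sin(2*np.pi*1*t) + np.sin(2*np.pi*50*t)   # the two-tone test signal

def stft_mag(x, win, hop, nfft):
    """Magnitude STFT: slide the window along x and FFT each segment."""
    L = len(win)
    frames = [x[i:i + L] * win for i in range(0, len(x) - L + 1, hop)]
    return np.abs(np.fft.rfft(frames, n=nfft, axis=1))

# window = 128, noverlap = 100  ->  hop of 28 samples; nfft = 512
S = stft_mag(sig1, np.hamming(128), hop=28, nfft=512)
freqs = np.fft.rfftfreq(512, d=1 / fs)
# Each row of S should peak near 1 Hz and near 50 Hz for the whole duration.
```

Averaging the rows of `S` and locating the two maxima recovers the two constant tones, mirroring the two horizontal lines visible in figure 3.3.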

By applying the STFT to ECG 101 (figure 2.1), we can theoretically detect all the frequencies within the signal and observe the changes as they occur on the time axis. Figure 3.4 shows the spectrogram of the original signal (ECG 101):


Figure 3.4: spectrogram of original ECG 101

It is clear from this spectrogram that there are certain frequencies unrelated to the heart's activity; therefore filtering is needed in order to clean the signal of noise.


Figure 3.5: spectrogram of the filtered ECG 101

By applying the filter designed in section 1.3.1, we have managed to dispose of most of the noise.

From the analysis of the two images, we confirm that the filtering process has attenuated some unwanted high frequencies within the original signal; however, the frequency values and time intervals are not discernible on either spectrogram. This is because the windowing process limits the resolution of the time-frequency distribution. Better frequency resolution is achieved by increasing the length of the window (perfect frequency resolution is reached when the window tends to infinity, or the whole length of the signal), but as the window length increases it covers a bigger section of the signal, so the accuracy of the time resolution suffers. This is a well-known property of STFT time-frequency distributions, usually referred to as the time-frequency trade-off. In the next part of this chapter we illustrate this issue and ways of improving the use of the standard STFT technique on our signals.

3.3 Window functions

The project is concerned with biomedical signals; since signals of this type are very unpredictable, it is almost impossible to select a window type or length that is applicable to every ECG recording. The quality of Electrocardiogram recordings relies on too many factors (the accurate placement of the 12 leads, the environment of the room, and even the type of medication), all of which are very difficult to model and which have a big impact on the signal amplitudes and frequencies; consequently, the window type has to be selected empirically.

Hamming and Gaussian windows are deemed the most suitable for ECG analysis; for the purpose of comparison, all of the spectrograms in this report use a Hamming window unless stated otherwise.

3.4 VF detection

Although the time-frequency distribution can detect disturbances in the signal, on its own it is not capable of detecting VF because it has inferior resolution at low frequencies; for that reason we have to combine different methods. A lot of research has been done in the biomedical engineering field to determine the bandwidth of VF, and it has been found that most VF disturbances occur within 4~10 Hz [12], [13]. Theoretically, a band-pass filter with that particular bandwidth should be able to detect such frequencies; figure 3.6 shows the details of a BPF with a bandwidth of 4 to 10 Hz designed in Matlab:


Figure 3.6: 4 10 Hz BPF designed in Matlab speedster

In this section we combine this BPF with the LPF already designed in section 1.3: first to recover the potential VF frequencies between 4~10 Hz, then to detect their presence in the original signal using the STFT, which would confirm the detection of VF.

An ECG recording with confirmed VF has been selected for this purpose; if the results are positive, it should be possible to generalize the method to all signals. The recording used in this case is ECG 419 (from the MIT-BIH Malignant Ventricular Fibrillation Database), sampled at 250 Hz, with notable VF at 0:35 min and between 0:40 and 0:50 min, as shown in figure 3.7:

Figure 3.7: time plot of ECG 419

The following model explains the procedure of extracting the signal with possible VF using the BPF, before comparing it with the spectral analysis of the original low-pass-filtered signal:


ECG signal → LPF (Fc = 15 Hz) and BPF (Fc = 4-10 Hz) → spectrogram for the TFD

The output of the band-pass filter contains the range of frequencies that might be triggered by ventricular fibrillation. We should therefore expect a high spectral density (a dense red color in the spectrogram) at 35 sec and between 40 and 50 sec, confirming the detection of VF. This assumption is borne out by the spectrum of the filtered signal and the spectrogram of the original signal in figures 3.8 and 3.9 respectively.

Figure 3.8: spectrum of the BPF output (x axis: frequency in Hz)          Figure 3.9: spectrogram of signal ECG 419

This method seems able to detect VF; yet to confirm the results, more comparisons and examples are needed, this time with recordings taken over 1 hour. Figures 3.10, 3.11 and 3.12 show the spectrogram and the spectrum of the BPF output for ECGs 425, 421 and 419 respectively; these recordings were taken from the Malignant Ventricular Fibrillation Database and therefore do contain VF. On the other hand, figures 3.13 and 3.14 show the spectrograms of ECG 101 and 100, which are both free of VF disturbances.



Figure 3.10 spectrum and spectrogram of ecg425



Figure 3.11 spectrum and spectrogram of ecg421



Figure 3.12 spectrum and spectrogram of ecg419



Figure 3.13 spectrogram of ECG101 Figure 3.14 spectrogram of ecg100


It is clear from the time-frequency distributions in the first three figures that the power spectral density is not stable throughout the length of the signal; there appear to be some irregular disturbances. Suppose those disturbances were introduced by noise: the pattern of such noise should be almost periodic, or at least stable, on the spectrogram; therefore these glitches cannot have originated from noise. Since we are only plotting the spectrograms of frequencies between 4 and 10 Hz, the TFD represents any potential ventricular fibrillation. To support this further, clean signals have been processed in the same way, and figures 3.13 and 3.14 show a stable power spectral density pattern. On the other hand, the accuracy of the values on the spectrogram still suffers from resolution issues; if the detection algorithm is to use the TFD as a detection tool, it needs a much higher spectrogram resolution. The next section looks at improving these issues.
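The band-pass detection idea can be caricatured in a few lines of NumPy. Instead of an FIR BPF, the sketch below simply measures the fraction of spectral energy falling in the 4-10 Hz band; a high ratio over a window is one crude way a detection algorithm might flag a VF-like episode. The function name, test tones and thresholds are illustrative assumptions, not the report's implementation.

```python
import numpy as np

def band_energy_ratio(x, fs, band=(4.0, 10.0)):
    """Fraction of the signal's spectral energy lying inside `band` (Hz)."""
    X = np.abs(np.fft.rfft(x - np.mean(x))) ** 2   # power spectrum, DC removed
    f = np.fft.rfftfreq(len(x), d=1 / fs)
    in_band = (f >= band[0]) & (f <= band[1])
    return X[in_band].sum() / X.sum()

fs = 250
t = np.arange(10 * fs) / fs
normal = np.sin(2 * np.pi * 1.2 * t)    # ~72 bpm beat, outside the VF band
vf_like = np.sin(2 * np.pi * 6 * t)     # energy concentrated inside 4-10 Hz
print(band_energy_ratio(normal, fs), band_energy_ratio(vf_like, fs))
```

The normal-rate tone yields a ratio near 0 and the 6 Hz tone a ratio near 1, mirroring the stable versus dense-red spectrograms discussed above.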

3.5 Wigner Distribution Function (WDF)

The Wigner Distribution Function was first developed by Eugene Wigner in 1932 in the context of quantum mechanics, and since then the function's capability for producing high-resolution TFDs has been extended to a wide variety of signal analysis [12]. For a given discrete signal x(k):

WD_x(k, ω) = Σ_τ x(k + τ/2) · x*(k − τ/2) · e^(−jωτ)    (1) [13]


WD_x(k, ω): the Wigner distribution of a signal x(k);

τ/2: the amount of delay or time shift applied to x(k)

Equation (1) can be simplified in terms of the FFT, as shown in equation (2):

WD_x(k, ω) = FFT_τ { x(k + τ/2) · x*(k − τ/2) }    (2)

As a general definition, the WDF is the Fourier Transform of the signal's auto-correlation function. This particular transform has cross-term implications, but since the signals used are always real and the sampling frequency is much higher than the highest frequency in the ECG, these problems do not apply [14] [15].

To generate the auto-correlation function of an ECG recording, the discrete signal is first filtered using the LPF designed in section 1.3, and then passed through the autocorrelation blockset in Simulink. For the purposes of evaluation and comparison, the new TFD using the WDF is compared with that of the STFT; the model in figure 3.15 explains the procedure:
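The autocorrelate-then-FFT route can also be sketched directly in NumPy as a pseudo-Wigner distribution. This is an illustrative re-implementation under simplifying assumptions (a rectangular lag window of arbitrary length), not the Simulink model of figure 3.15.

```python
import numpy as np

def pseudo_wigner(x, half=64):
    """Pseudo-Wigner distribution: FFT over the lag variable of the local
    auto-correlation r[n, m] = x[n+m] * conj(x[n-m]), truncated to `half` lags.

    Note the factor-2 frequency scaling inherent in the WD lag variable:
    a tone at normalized frequency f peaks at bin 2*f on this axis.
    """
    x = np.asarray(x, dtype=complex)
    N = len(x)
    m = np.arange(half)
    W = np.zeros((N, half))
    for n in range(N):
        lo, hi = n - m, n + m
        ok = (lo >= 0) & (hi < N)               # stay inside the signal
        r = np.zeros(half, dtype=complex)
        r[ok] = x[hi[ok]] * np.conj(x[lo[ok]])  # local auto-correlation
        W[n] = np.abs(np.fft.rfft(r, 2 * half))[:half]
    return W

# A complex tone at 0.125 cycles/sample should peak at bin 2*0.125*128 = 32.
tone = np.exp(2j * np.pi * 0.125 * np.arange(256))
W = pseudo_wigner(tone)
```

Each row of `W` is an instantaneous spectrum at one sample, which is what gives the WD its sharper time localization compared to the framed STFT.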


Figure 3.15: the model used for auto correlating the signal for WD.

The next figures show the TFD of ECG 101 and 419 using the STFT and the WD. To demonstrate the improvement in the resolution trade-off, all the spectrograms use the same parameters:

Window=hamming(128), with an overlap of 50% (Noverlap=64)

Nfft= 512;

Fs=360Hz for ecg101/ Fs= 250Hz for ecg419

The spectrograms of ECG 101 and ECG 419 using WDF and STFT are given in figures 3.16 and 3.17 respectively:

Wigner Distribution


Short-Time Fourier Transform


Figure3.16: spectrograms of ECG 101

Wigner Distribution


Short-Time Fourier Transform


Figure 3.17: spectrograms of ECG 419

We see from these figures that the WD method has superior time and frequency resolution. In figure 3.16, the frequency values at each time interval are neatly separated from each other on the WD spectrogram, while on the STFT spectrogram all the frequencies are merged together, making the low frequencies hard to distinguish and detect. Figure 3.17 represents ECG 419; as mentioned before, this recording has notable VF at 35 seconds as well as between 40 and 50 seconds. Note how those irregular patterns manifest themselves in both spectrograms: the starting and finishing times of the VF are much more accurate using the WD method.

The accuracy in time and the superior frequency resolution of the WD compared to the STFT make this technique the most suitable for implementation in a detection algorithm.