Fetus Heartbeat Detection Engineering Essay


The quality of a signal is often degraded by different types of noise, and this degradation can be removed by several methods. The Adaptive Noise Canceller (ANC) has important applications in digital signal processing: it de-noises a noise-corrupted signal through adaptive filtering techniques. Here the fetus heartbeat contaminated by the mother's heartbeat is considered, and different adaptive algorithms implemented in Single Input Single Output (SISO) and Multiple Input Single Output (MISO) configurations are used to filter out the noise. MATLAB simulation results confirm the MISO and SISO operating procedures using different algorithms for the detection of the fetus heartbeat.

Index Terms- SISO, MISO, Least Mean Square (LMS), Leaky Least Mean Square (LLMS), Normalized Least Mean Square (NLMS), Recursive Least Square (RLS), Adaptive Noise Canceller (ANC), Adaptive Line Enhancer (ALE).

The characteristics of the input signal, the noise, and the physical system dynamics change with time in many signal-processing situations. We therefore need digital filters that adapt to these changes in order to obtain the desired output. This can be achieved by changing the filter's coefficients in response to changes in the input signal; filters whose coefficients change with the passage of time are called adaptive filters. An adaptive filter has two basic elements: (1) a digital filter and (2) an adaptive algorithm. The digital filter produces an output in response to an input, while the adaptive algorithm adjusts the digital filter's coefficients. Several parameters are associated with the filtering process: d(n) is the desired signal, while the input and output are denoted x(n) and y(n) respectively. The estimation error is denoted e(n), and the three quantities are related by e(n) = d(n) - y(n). The adaptive algorithm is designed to minimize an objective function constructed from the error signal. There are many different adaptation schemes, such as LMS, NLMS, LLMS and RLS; our aim is to extract the desired signal of interest. Some basic applications of adaptive filters are (i) system identification, (ii) sinusoidal tracking and (iii) noise cancellation, and the algorithms are commonly compared in terms of computational complexity, rate of convergence, numerical robustness and misadjustment.
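To make the two basic elements concrete, the following minimal MATLAB sketch runs one adaptive FIR filter over a synthetic input; the signals, the filter order and the LMS-type update shown here are illustrative choices, not part of the original project.

% Minimal adaptive-filter loop (illustrative sketch; signal names are hypothetical).
N  = 1000;                        % number of samples
p  = 8;                           % filter order
x  = randn(N,1);                  % input signal x(n)
d  = filter([1 0.5 -0.3],1,x);    % desired signal d(n) (output of an unknown system)
w  = zeros(p+1,1);                % adaptive filter coefficients
e  = zeros(N,1);                  % error signal
mu = 0.01;                        % step size of the adaptive algorithm
for n = p+1:N
    xv   = x(n:-1:n-p);           % current input vector
    y    = w.'*xv;                % (1) the digital filter produces the output y(n)
    e(n) = d(n) - y;              % estimation error e(n) = d(n) - y(n)
    w    = w + mu*e(n)*xv;        % (2) the adaptive algorithm adjusts the coefficients
end                               %     (LMS shown here; NLMS, LLMS or RLS also fit this loop)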

Adaptive filters play an essential role in digital communication systems. They are attractive because their stability is controlled by ensuring that the filter coefficients remain bounded, and because simple and efficient algorithms, e.g. LMS, NLMS, LLMS and RLS, are available for adjusting the coefficients. These algorithms perform well, and the filters' performance fulfils the design criteria.

1.1. Adaptive Noise Cancellation (ANC)

The aim of a noise canceller is to obtain an estimate of the fetus signal from the noise-corrupted heartbeat. The quality of the fetus signal is degraded by the disturbance caused by the mother's heartbeat; muscular activity and fetus movement also contribute, but the mother's heartbeat is the main disturbance and is therefore removed to obtain a clearer ECG of the fetus. In the ANC, different adaptive algorithms may be implemented in the adaptive filter; the algorithms used here are the LMS, LLMS, NLMS and RLS algorithms. The output obtained from the ANC is then used as the input to the ALE. The ANC enhances the fetus signal by cancelling the noise. In our project, the ANC is also used to remove the 50 Hz hum caused by the power supply, so a new desired signal dnew is created and given to the adaptive algorithms for better results.

Here d(n) is to be estimated from the noise-corrupted signal x(n). Without any further information it is not possible to separate the noise v1(n) from d(n). A reference signal v2(n), which is correlated with v1(n), is therefore used to estimate v1(n). The estimate of v1(n) is then subtracted from x(n) to obtain an estimate of d(n):

e(n)= d^(n)=x(n)-v1^(n)

In this arrangement the thoracic signal is taken as the reference input. The difference between the primary signal and the filter output is the error signal. The output of the ANC is applied as the input to the ALE.
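A minimal MATLAB sketch of this ANC arrangement is shown below. The synthetic abdominal, thoracic and fetal signals, the filter order and the step size are illustrative assumptions, and the LMS update stands in for any of the algorithms of Section 2.

% Adaptive Noise Canceller (ANC) sketch with an LMS placeholder update.
fs = 1000; t = (0:1/fs:5)';                        % sampling rate and duration (assumed)
fetal     = 0.25*sin(2*pi*2.3*t);                  % synthetic fetal component (illustrative)
maternal  = sin(2*pi*1.2*t);                       % synthetic maternal heartbeat
abdominal = fetal + filter([0.6 0.3],1,maternal);  % primary signal: x(n) = d(n) + v1(n)
thoracic  = maternal + 0.05*randn(size(t));        % reference signal v2(n), correlated with v1(n)
p = 12; w = zeros(p+1,1); mu = 0.01;               % filter order and step size (assumed)
e = zeros(size(t));
for n = p+1:numel(t)
    xv   = thoracic(n:-1:n-p);                     % reference input vector
    v1h  = w.'*xv;                                 % estimate of the maternal interference v1(n)
    e(n) = abdominal(n) - v1h;                     % e(n) = d^(n) = x(n) - v1^(n)
    w    = w + mu*e(n)*xv;                         % coefficient update (LMS placeholder)
end
% e now approximates the fetus heartbeat d(n).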

1.2. Adaptive Line Enhancer (ALE)

The ALE is used to remove noise from a periodic signal; in our case the mother's heartbeat is the noise to be removed so that it does not interfere with the fetus signal. The output from the ANC is fed into the ALE to remove the 50 Hz hum present in the signal: the error signal of the ANC is applied as the input signal of the ALE, and a delayed version of this signal is fed into the adaptive filter as the reference.
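The ALE structure can be sketched in MATLAB as follows; the broadband component standing in for the ANC output, the delay D, the filter order and the step size are all illustrative assumptions.

% Adaptive Line Enhancer (ALE) sketch: the reference is a delayed copy of the input.
fs   = 1000; t = (0:1/fs:5)';                 % sampling rate and duration (assumed)
wide = 0.3*randn(size(t));                    % broadband part, standing in for the ANC output
u    = wide + 0.5*sin(2*pi*50*t);             % input contaminated by a 50 Hz hum (assumed)
D = 1; p = 32; w = zeros(p+1,1); mu = 0.005;  % decorrelation delay, order and step size (assumed)
e = zeros(size(u)); y = zeros(size(u));
for n = p+1+D:numel(u)
    xv   = u(n-D:-1:n-D-p);                   % delayed reference vector
    y(n) = w.'*xv;                            % prediction of the periodic (50 Hz) component
    e(n) = u(n) - y(n);                       % error output: input with the hum removed
    w    = w + mu*e(n)*xv;                    % LMS-type coefficient update
end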

2. Algorithms

2.1. The LMS Algorithm

The weight vector update equation of the LMS algorithm is developed from the steepest-descent adaptive filter, whose update is

Wn+1 = Wn + μ E{e(n)x*(n)}

The expectation E{e(n)x*(n)} is generally unknown, so it is replaced by the instantaneous estimate e(n)x*(n).

Weight Vector Update Equation for Single Reference Input:

Wn+1 = Wn + μ e(n)x*(n)

Error signal: e(n) = d(n) - y(n)
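A compact MATLAB sketch of this single-reference LMS filter is given below; the function name lms_anc and its interface are our own choices rather than the project's actual code.

function [e, w] = lms_anc(x, d, p, mu)
% LMS adaptive noise canceller (sketch): x = reference input, d = primary
% (noise-corrupted) signal, p = filter order, mu = step size.
% Returns the error e, which is the estimate of the de-noised signal,
% and the final coefficient vector w.
N = length(x);  w = zeros(p+1,1);  e = zeros(N,1);
for n = p+1:N
    xv   = x(n:-1:n-p);            % input vector x(n)
    y    = w.'*xv;                 % filter output y(n)
    e(n) = d(n) - y;               % error e(n) = d(n) - y(n)
    w    = w + mu*e(n)*conj(xv);   % Wn+1 = Wn + mu e(n) x*(n)
end
end

In the SISO arrangement of Section 3.2, x would be the summed thoracic signal and d the averaged abdominal signal.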

Weight Vector Update Equation for Multiple Signal Input:

W1n+1(k) = W1n(k) + μ e(n) x1*(n-k)

W2n+1(k) = W2n(k) + μ e(n) x2*(n-k)

W3n+1(k) = W3n(k) + μ e(n) x3*(n-k)

W4n+1(k) = W4n(k) + μ e(n) x4*(n-k)

For four reference signals, x1, x2, x3, x4 are the reference inputs of the adaptive filters with coefficient vectors W1, W2, W3, W4, and e(n) is the overall error of the system.

For jointly wide-sense stationary (WSS) processes the step size μ must satisfy

0 < µ < 2/λmax

where λmax is the largest eigenvalue of the input autocorrelation matrix Rx, Wn is the filter coefficient vector and x*(n) is the conjugate of the filter input vector.

In vector form the weight update equation is

Wn+1 = Wn + μ e(n) x*(n)

and the k-th coefficient update equation is

wn+1(k) = wn(k) + μ e(n) x*(n-k)

With the step size within this bound, the weight vector converges in the mean to the Wiener solution:

lim n→∞ E{wn} = Rx^-1 rdx

Advantages

It is easy to code

It has low computational complexity

It gives stable and reliable performance under different signal conditions

Disadvantages

Its convergence is slow when the eigenvalue spread of the input autocorrelation matrix is large

It exhibits excess mean square error (MSE)

2.2. The NLMS Algorithm

The Normalized LMS (NLMS) algorithm is a special case of the standard LMS algorithm in which the step size is optimized at each iteration for fast convergence, at the cost of higher computational complexity.

When ||x(n)||^2 is small over some short stretch of the input signal, the normalized step size can become very large. A small positive constant ε is therefore added to the denominator, i.e. β/(||x(n)||^2 + ε), so the step size is no longer exactly optimal in this case. The computation of ||x(n)||^2 also increases the computational complexity.

Filter coefficient update equation for a single input:

wn+1 = wn + (β/||x(n)||^2) e(n) x*(n)

Step size: NLMS has a time-varying step size μ(n) that improves the convergence speed of the adaptive filter:

0 < µ(n) < 2/(xH(n)x(n)) = 2/||x(n)||^2

µ(n) = β/(xH(n)x(n)) = β/||x(n)||^2

where β is the normalized step size, 0 < β < 2.

Since the autocorrelation matrix Rx is unknown, λmax can be bounded by the trace:

λmax ≤ tr(Rx) = (p+1) E{|x(n)|^2}

which gives the more conservative bound

0 < µ < 2/[(p+1) E{|x(n)|^2}]

The normalization therefore produces a time-varying step size. Its effect is to change the magnitude, but not the direction, of the gradient vector, and it also controls gradient noise amplification.
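A single-input NLMS update in MATLAB, including the regularization constant ε discussed above, might look as follows; the function name nlms_anc and its interface are assumptions.

function [e, w] = nlms_anc(x, d, p, beta, epsilon)
% NLMS adaptive filter (sketch): beta = normalized step size (0 < beta < 2),
% epsilon = small positive constant guarding against ||x(n)||^2 -> 0.
N = length(x);  w = zeros(p+1,1);  e = zeros(N,1);
for n = p+1:N
    xv   = x(n:-1:n-p);               % input vector x(n)
    e(n) = d(n) - w.'*xv;             % error e(n)
    mu_n = beta/(xv'*xv + epsilon);   % time-varying step size mu(n)
    w    = w + mu_n*e(n)*conj(xv);    % wn+1 = wn + mu(n) e(n) x*(n)
end
end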

Weight Vector Update Equation for Multiple Signal Input:

Win+1 = Win + (β/||xi(n)||^2) e(n) xi*(n), for i = 1, ..., 4, i.e.

W1n+1 = W1n + (β/||x1(n)||^2) e(n) x1*(n)

W2n+1 = W2n + (β/||x2(n)||^2) e(n) x2*(n)

W3n+1 = W3n + (β/||x3(n)||^2) e(n) x3*(n)

W4n+1 = W4n + (β/||x4(n)||^2) e(n) x4*(n)

For four reference signals, x1, x2, x3, x4 are the reference inputs of the adaptive filters with coefficient vectors W1, W2, W3, W4, and e(n) is the overall error of the system.

2.3. The LLMS Algorithm

When the autocorrelation matrix Rx has zero eigenvalues, the LMS algorithm has undamped and undriven modes, and the corresponding filter coefficients can drift and eventually overflow. In the LLMS algorithm a leakage factor γ is introduced, which adds a constant γ to each of the eigenvalues of Rx so that they become larger and Rx is no longer singular; this is equivalent to adding white noise of variance γ to the input and provides overflow control of the filter coefficients. The price is that the coefficients of the LLMS algorithm are biased compared with the standard LMS algorithm.

Weight Vector Update Equation for Single Reference Input:

Wn+1 = (1 - μγ) Wn + μ e(n) x*(n)

where 0 < γ << 1 and 0 < μ < 2/(λmax + γ)
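A corresponding leaky-LMS sketch in MATLAB is shown below; the function name llms_anc and its interface are assumptions.

function [e, w] = llms_anc(x, d, p, mu, gamma)
% Leaky LMS adaptive filter (sketch): gamma is the leakage factor, 0 < gamma << 1.
N = length(x);  w = zeros(p+1,1);  e = zeros(N,1);
for n = p+1:N
    xv   = x(n:-1:n-p);                          % input vector x(n)
    e(n) = d(n) - w.'*xv;                        % error e(n)
    w    = (1 - mu*gamma)*w + mu*e(n)*conj(xv);  % Wn+1 = (1 - mu*gamma) Wn + mu e(n) x*(n)
end
end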

Weight Vector Update Equation for Multiple Signal Input:

W1n+1 = (1 - μγ) W1n + μ e(n) x1*(n)

W2n+1 = (1 - μγ) W2n + μ e(n) x2*(n)

W3n+1 = (1 - μγ) W3n + μ e(n) x3*(n)

W4n+1 = (1 - μγ) W4n + μ e(n) x4*(n)

The effect of the leakage factor:

It drives the filter coefficients to zero if the error e(n) or the input signal x(n) becomes zero, and it forces any undamped modes of the system to zero.

Constraints on the step size:

0 < μ < 2/(λmax + γ) guarantees convergence in the mean

Drawback of Leaky LMS:

For a stationary process, the leakage produces a bias in the steady-state solution.

2.4. The RLS Algorithm

In the Recursive Least Squares (RLS) algorithm a sum of squared errors is minimized. The main reason that least squares is adopted is that knowledge of the autocorrelation of the input x(n) and of its cross-correlation with the desired signal d(n) is not required: the least-squares error is calculated directly from the actual values of the input signal and the desired signal. The two signals are thus treated as deterministic rather than random, and the filter coefficients depend on the particular realizations of both signals rather than on their ensemble statistics.

ξ(n) = Σ (i = 0 to n) |e(i)|^2

The weight vector update equation is

Wn = Wn-1 + α(n) g(n)

where the a priori error is

α(n) = d(n) - Wn-1T x(n)

and the gain vector is g(n) = z(n)/[λ + xT(n) z(n)], with

z(n) = P(n-1) x*(n)

P(n) = Rx^-1(n) is the inverse of the deterministic autocorrelation matrix, updated recursively as P(n) = (1/λ)[P(n-1) - g(n) zH(n)].
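The recursion can be written in MATLAB roughly as follows; λ is the exponential weighting factor discussed below, and the function name rls_anc, the initializer delta and the rest of the interface are assumptions.

function [e, w] = rls_anc(x, d, p, lambda, delta)
% Exponentially weighted RLS (sketch): lambda = forgetting factor,
% delta initializes P(0) = delta*I, where P(n) = Rx^{-1}(n); a large delta
% (e.g. 100) is a common choice.
N = length(x);  w = zeros(p+1,1);  e = zeros(N,1);  P = delta*eye(p+1);
for n = p+1:N
    xv   = x(n:-1:n-p);           % input vector x(n)
    z    = P*conj(xv);            % z(n) = P(n-1) x*(n)
    g    = z/(lambda + xv.'*z);   % gain vector g(n)
    e(n) = d(n) - w.'*xv;         % a priori error alpha(n)
    w    = w + e(n)*g;            % Wn = Wn-1 + alpha(n) g(n)
    P    = (P - g*(z'))/lambda;   % P(n) = (1/lambda)[P(n-1) - g(n) zH(n)]
end
end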

Exponentially weighted RLS:

RLS is able to track non-stationary processes because of the exponential weighting:

ξ(n) = Σ (i = 0 to n) λ^(n-i) |e(i)|^2

where λ is an exponential weighting (forgetting) factor with 0 < λ ≤ 1.

For growing-window RLS: λ = 1.

Sliding-window RLS: this is effective for rapidly changing non-stationary processes.

The squared error is minimized over a finite window of L+1 samples:

ξL+1(n) = Σ (i = n-L to n) |e(i)|^2

Drawback of RLS: its high computational complexity makes it unsuitable for many high-speed applications.

Advantages of RLS: fast convergence and zero misadjustment.

3. Project Implementation

3.1. A Brief Introduction

Adaptive filtering is carried out using both SISO and MISO configurations, i.e. the input signals are fed to the adaptive filter in these two different ways. The two techniques give different results.

3.2. SISO Implementation

For SISO, the abdominal signals are summed and averaged, and this average is taken as the primary signal. The thoracic signals are likewise added together to form a single reference input. The difference between the primary signal and the filter output is computed and fed back to the filter, and adaptive filtering is performed.
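As an illustration, the SISO arrangement could be coded in MATLAB roughly as follows, assuming abd and thor are matrices holding the abdominal and thoracic channels (one column per channel) and reusing the lms_anc sketch from Section 2.1; all of these names are our assumptions.

% SISO: a single primary and a single reference channel (illustrative).
primary   = mean(abd, 2);    % average of the abdominal signals (primary signal)
reference = sum(thor, 2);    % thoracic signals summed into one reference input
[e_siso, w_siso] = lms_anc(reference, primary, 12, 0.01);  % any algorithm of Section 2 may be used
% e_siso is the SISO estimate of the fetus heartbeat.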

3.3. MISO Implementation

Primary signal: the average of the abdominal input signals is taken as the primary signal. Thoracic signals: each thoracic signal is applied as the input to its own adaptive filter. Summation: the outputs of these filters are summed to form the estimate of the interference. Error calculation: the difference between the primary signal and this sum is the error, which is fed back to the filters, and adaptive filtering is performed.
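A rough MATLAB sketch of the MISO arrangement with four thoracic references follows; abd, thor (one column per channel), the filter order and the LMS-type update are illustrative assumptions.

% MISO: four thoracic references, each with its own adaptive filter (illustrative).
primary = mean(abd, 2);                   % averaged abdominal signal (primary)
p = 12; mu = 0.01; M = 4;                 % filter order, step size, number of references
W = zeros(p+1, M);                        % one coefficient vector per reference
N = length(primary);  e = zeros(N,1);
for n = p+1:N
    v1hat = 0;
    for k = 1:M
        v1hat = v1hat + W(:,k).'*thor(n:-1:n-p, k);  % sum of the filter outputs
    end
    e(n) = primary(n) - v1hat;            % overall error of the system
    for k = 1:M
        xv     = thor(n:-1:n-p, k);
        W(:,k) = W(:,k) + mu*e(n)*xv;     % Wk(n+1) = Wk(n) + mu e(n) xk*(n)
    end
end
% e is the MISO estimate of the fetus heartbeat.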

4. MATLAB Simulation Results

The different adaptive algorithms give different simulation results; the RLS algorithm is observed to remove the noise best. The ANC has been implemented to remove the 50 Hz hum due to the power supply; since no 50 Hz hum is present in the recorded signal, the ALE is not strictly needed here. The plots for both SISO and MISO using the LMS, NLMS, LLMS and RLS algorithms are shown below. The simulation results for the SISO input technique show that the RLS algorithm gives the better result, and from the results of the MISO input technique it is likewise observed that RLS is the best at noise removal.

4.1. SISO Implementation (ANC-LMS)

4.2. SISO Implementation (ANC-NLMS)

4.3. SISO Implementation (ANC-LLMS)

4.4. SISO Implementation (ANC-RLS)

4.5. MISO Implementation (ANC-LMS)

4.6. MISO Implementation (ANC-NLMS)

4.7. MISO Implementation (ANC-LLMS)

4.8. MISO Implementation (ANC-RLS)

5. Conclusion

This paper shows how the fetus ECG can be obtained clearly for medical purposes. The ANC and ALE are implemented using the SISO and MISO input techniques. The results show that the RLS algorithm gives the best noise-cancelling performance, and the MISO input technique is observed to perform better than SISO.

For future work, other algorithms such as FxLMS and FFT-based block LMS (BLMS-FFT) can also be implemented.
