History of Linear prediction


Revision 3 . . December 14, 2001 9:10 pm by (logged).153.190.xxx
Revision 2 . . December 14, 2001 8:35 pm by Tbackstr [added more content]
Revision 1 . . December 14, 2001 5:12 am by Tbackstr [stub]

Difference (from prior major revision)

Removed: 5d4


Changed: 8c7
x'(n) = sum_i=1^p a_i x(n-i)
:x'(n) = sum_i=1^p a_i x(n-i)

Changed: 12c11
e(n) = x(n) - x'(n)
:e(n) = x(n) - x'(n)

Added: 14a14,24
These equations are valid for all types of (one-dimensional) linear prediction. The differences are found in the way the parameters a_i are chosen.
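
As a minimal sketch of these two equations (in Python with NumPy; the function names and the assumption that samples before the start of the signal are zero are illustrative choices, not part of the definition):

 import numpy as np

 def predict(x, a):
     # x'(n) = sum_i=1^p a_i x(n-i), with x(n) = 0 assumed for n < 0.
     x = np.asarray(x, dtype=float)
     p = len(a)
     x_hat = np.zeros(len(x))
     for i in range(1, p + 1):
         x_hat[i:] += a[i - 1] * x[:-i]
     return x_hat

 def residual(x, a):
     # e(n) = x(n) - x'(n)
     return np.asarray(x, dtype=float) - predict(x, a)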

The most common choice for optimising the parameters a_i is the mean squared error criterion, in this context also called the autocorrelation criterion. In this method we minimise the expected value of the squared error, E{e^2(n)}, which yields the equations

:sum_i=1^p a_i R(i-j) = R(j), for 1 <= j <= p,

where R is the autocorrelation of the signal x(n), defined as R(i) = E{x(n)x(n-i)}.

The above equations are called the normal equations or Yule-Walker equations. In matrix form the equations can be written equivalently as

:R*a = r,

where the autocorrelation matrix R is a symmetric Toeplitz matrix with elements r_i,j = R(i-j), the vector r is the autocorrelation vector with r_j = R(j), and a is the vector of the parameters a_i.
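
As a sketch of the whole autocorrelation method (Python with NumPy; the biased sample estimate R(i) = (1/N) sum_n x(n)x(n-i) and the generic dense solver are assumptions made here for illustration):

 import numpy as np

 def autocorrelation_lp(x, p):
     # Estimate a_1..a_p: form the sample autocorrelations R(0)..R(p),
     # build the symmetric Toeplitz matrix with elements R(i-j) and
     # the vector with elements R(j), then solve R a = r.
     x = np.asarray(x, dtype=float)
     N = len(x)
     R = np.array([np.dot(x[:N - k], x[k:]) / N for k in range(p + 1)])
     R_mat = np.array([[R[abs(i - j)] for j in range(p)] for i in range(p)])
     return np.linalg.solve(R_mat, R[1:])

The generic solver here can be replaced by the fast recursion discussed below.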

Added: 15a26,27
Optimisation of the parameters is a broad topic, and a large number of other approaches have been proposed.
Still, the autocorrelation method is the most common, and it is used, for example, for speech coding in the GSM standard.

Changed: 17c29
This is a humble start on a big topic. Please continue.
Solving the matrix equation R*a = r directly is computationally relatively expensive. The [Gauss algorithm] for matrix inversion is probably the oldest solution, but it does not exploit the symmetric Toeplitz structure of R. A faster method is the [Levinson recursion], proposed by N. Levinson in 1947, which calculates the solution recursively. Later, Delsarte et al. proposed an improvement called the [split Levinson recursion], which requires about half the number of multiplications and divisions; it exploits a special symmetry of the parameter vectors on subsequent recursion levels.
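
As a minimal sketch of the Levinson recursion in its common Levinson-Durbin form for this symmetric case (Python with NumPy; the input is the autocorrelation sequence R(0)..R(p), and the division by the error power assumes it remains positive):

 import numpy as np

 def levinson(R, p):
     # Solve R a = r in O(p^2) operations by exploiting the Toeplitz
     # structure, instead of a generic O(p^3) solver.
     R = np.asarray(R, dtype=float)
     a = np.zeros(p)
     err = R[0]  # prediction error power at order 0
     for m in range(1, p + 1):
         # reflection coefficient of order m
         k = (R[m] - np.dot(a[:m - 1], R[m - 1:0:-1])) / err
         prev = a[:m - 1].copy()
         a[:m - 1] = prev - k * prev[::-1]
         a[m - 1] = k
         err *= 1.0 - k * k
     return a

On the same autocorrelation sequence this returns the same parameter vector as the generic solver above, but the work grows only quadratically with the prediction order p.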
