A Reconstruction Algorithm for Speech Compressive Sensing Using Structural Features

  •  Shanghai Key Laboratory of Navigation and Location-Based Services,
    Shanghai Jiao Tong University

Online published: 2017-09-20

Abstract

It is difficult to reconstruct a speech signal after compressive sampling because the coefficients of the signal in the transform domain are not sparse enough. In this paper, the speech signal is recovered from compressed samples in the frequency domain using structural features. Two hidden variables, amplitude and state, are defined for each modified discrete cosine transform (MDCT) coefficient of the speech signal. The probability density function of the amplitude of an MDCT coefficient is represented by a Gaussian mixture model, the continuity of the states along the frequency axis is modeled by a first-order Markov chain, and the continuity of the amplitudes along the frequency axis is modeled by a Gauss-Markov process. The joint posterior distribution of coefficient, amplitude and state is represented by a factor graph, on which the posterior mean of the coefficient is obtained with the Turbo message passing method, and the speech is then reconstructed. After compressively sampling the MDCT coefficients of a speech segment, we reconstructed the signal using the proposed algorithm and other state-of-the-art algorithms for comparison. The results show that the proposed algorithm achieves the best reconstruction quality for different frames and compression ratios. The spectrograms show that the energy distribution of the signal reconstructed by the proposed algorithm is the closest to that of the original signal. These results indicate that better reconstruction accuracy can be obtained by exploiting the continuity along the frequency axis together with the Turbo message passing method.
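The structured prior described above can be sketched in a few lines of code. The following is a minimal illustration under stated assumptions, not the authors' implementation: the parameter values (p_on, p_stay_on, rho, sigma_theta, noise_std, N, M) and variable names are chosen for illustration, the Gaussian mixture on the amplitude is simplified to a single Gaussian marginal, and the Turbo message-passing reconstruction on the factor graph is only indicated in a comment.

```python
import numpy as np

# Sketch of the structural model (assumed parameters): each MDCT coefficient
# x_k = s_k * theta_k, where the binary state s_k follows a first-order Markov
# chain along the frequency axis and the amplitude theta_k follows a
# Gauss-Markov process along the same axis. (The paper uses a Gaussian mixture
# for the amplitude pdf; a single Gaussian is used here for brevity.)

rng = np.random.default_rng(0)

N = 256          # MDCT coefficients per frame (assumed)
M = 96           # compressive measurements, i.e. compression ratio M/N (assumed)

# First-order Markov chain for the hidden states (1 = active coefficient).
p_on = 0.15                                             # stationary probability of the active state
p_stay_on = 0.85                                        # P(s_k = 1 | s_{k-1} = 1)
p_stay_off = 1 - p_on * (1 - p_stay_on) / (1 - p_on)    # keeps the chain stationary

# Gauss-Markov process for the amplitudes along the frequency axis.
rho = 0.9                                               # neighbor correlation
sigma_theta = 1.0                                       # marginal std of the amplitude

s = np.zeros(N, dtype=int)
theta = np.zeros(N)
s[0] = rng.random() < p_on
theta[0] = sigma_theta * rng.standard_normal()
for k in range(1, N):
    stay = p_stay_on if s[k - 1] == 1 else p_stay_off
    s[k] = s[k - 1] if rng.random() < stay else 1 - s[k - 1]
    theta[k] = rho * theta[k - 1] + np.sqrt(1 - rho**2) * sigma_theta * rng.standard_normal()

x = s * theta    # structured-sparse MDCT coefficient vector of one frame

# Compressive sampling: y = Phi @ x + noise, with a random Gaussian matrix Phi.
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
noise_std = 0.01
y = Phi @ x + noise_std * rng.standard_normal(M)

# The paper then recovers the posterior mean of x from (y, Phi) by Turbo
# message passing on a factor graph coupling the measurement model with the
# Markov-chain and Gauss-Markov priors; that inference loop is not reproduced here.
print("active coefficients:", s.sum(), "measurements:", M)
```

With such a generative sketch one can, for instance, compare baseline recovery algorithms on (y, Phi, x) pairs; the structured correlation along the frequency axis is exactly what the factor-graph priors are meant to exploit.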

Cite this article

JIA Xiaoli, JIANG Xiaobo, JIANG Sanxin, LIU Peilin. A Reconstruction Algorithm for Speech Compressive Sensing Using Structural Features[J]. Journal of Shanghai Jiaotong University, 2017, 51(9): 1111-1116. DOI: 10.16183/j.cnki.jsjtu.2017.09.014
