Journal Biomedical Radioelectronics, No. 2, 2023
Article in issue:
A method of window filtering of speech signals based on decomposition into empirical modes for systems for assessing a person's psychoemotional state
Type of article: scientific article
DOI: https://doi.org/10.18127/j15604136-202302-05
UDC: 621.391
Authors:

A.K. Alimuradov1, P.P. Churakov2, A.Y. Tychkov3, S.Y. Tverskaya4

1-4 Penza State University (Penza, Russia)

Abstract:

Noise contamination of speech signals is a central problem in recognition, voice control, speaker authentication, speech-to-text conversion, and related tasks. In practice, all speech signals are noisy to some degree, and, depending on the noise level, noise can significantly distort study results. A new method of window filtering of speech signals is proposed, based on improved complete ensemble empirical mode decomposition with adaptive noise. The method was evaluated on signals contaminated with white, pink, and brown noise at signal-to-noise ratios from -5 to 15 dB in 5 dB steps. Speech intelligibility improved by 10.56, 7.12, and 10.96 dB for white, pink, and brown noise, respectively.

Based on the research results, it is concluded that the proposed window filtering method can be successfully applied in systems for assessing a person's psycho-emotional state from speech.
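The abstract's evaluation protocol can be sketched in code: generate white, pink, and brown noise, mix each into a clean signal at target SNRs from -5 to 15 dB in 5 dB steps, and measure the resulting SNR. This is a minimal illustrative sketch, not the authors' implementation; the noise generators, the `mix_at_snr` helper, and the sine-wave stand-in for a speech segment are assumptions introduced here.

```python
import numpy as np

def white_noise(n, rng):
    """Flat-spectrum Gaussian noise."""
    return rng.standard_normal(n)

def pink_noise(n, rng):
    """1/f spectrum: shape white noise in the frequency domain."""
    spec = np.fft.rfft(rng.standard_normal(n))
    f = np.fft.rfftfreq(n)
    f[0] = f[1]  # avoid division by zero at DC
    return np.fft.irfft(spec / np.sqrt(f), n)

def brown_noise(n, rng):
    """1/f^2 spectrum: integrate white noise, remove the drift mean."""
    x = np.cumsum(rng.standard_normal(n))
    return x - x.mean()

def mix_at_snr(clean, noise, snr_db_target):
    """Scale `noise` so the clean-to-noise power ratio equals snr_db_target."""
    p_clean = np.mean(clean ** 2)
    p_noise = np.mean(noise ** 2)
    scale = np.sqrt(p_clean / (p_noise * 10 ** (snr_db_target / 10)))
    return clean + scale * noise

def snr_db(clean, noisy):
    """SNR of `noisy` relative to the known clean reference, in dB."""
    err = noisy - clean
    return 10 * np.log10(np.mean(clean ** 2) / np.mean(err ** 2))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 8000, endpoint=False)
speech = np.sin(2 * np.pi * 200 * t)  # stand-in for a voiced speech segment

for name, gen in [("white", white_noise), ("pink", pink_noise), ("brown", brown_noise)]:
    for target in range(-5, 16, 5):  # -5 to 15 dB in 5 dB steps
        noisy = mix_at_snr(speech, gen(len(t), rng), target)
        # the achieved SNR should match the target within numerical tolerance
        assert abs(snr_db(speech, noisy) - target) < 0.5
```

A denoiser under test (here, the proposed ICEEMDAN-based window filter) would then be run on each `noisy` signal, and the intelligibility or SNR gain reported per noise type, as in the abstract's 10.56/7.12/10.96 dB figures.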

Pages: 32-37
For citation:

Alimuradov A.K., Churakov P.P., Tychkov A.Y., Tverskaya S.Y. A method of window filtering of speech signals based on decomposition into empirical modes for systems for assessing the psycho-emotional state of a person. Biomedicine Radioengineering. 2023. V. 26. № 2. P. 32–37. DOI: https://doi.org/10.18127/j15604136-202302-05 (In Russian)

References
  1. Huang X. et al. Spoken language processing: A guide to theory, algorithm, and system development. Prentice hall PTR, 2001.
  2. Schuller B., Batliner A. Computational paralinguistics: emotion, affect and personality in speech and language processing. John Wiley & Sons. 2013.
  3. Huang N. E., Attoh-Okine N. O. The Hilbert-Huang transform in engineering. CRC Press, 2005.
  4. Huang N.E., Shen Z., Long S.R. et al. The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis. Proceedings of the Royal Society of London. 1998. A 454. P. 903–995.
  5. Touati H., Khaldi K. Speech denoising by adaptive filter LMS in the EMD framework. 2018 15th International Multi-Conference on Systems, Signals & Devices (SSD). IEEE. 2018. P. 1–4.
  6. Shen W., Yu Y., Ling L., Ren J., Zhu Q. Speech Noise Reduction by EMD-LMS. IEEE 7th International Conference on Computer Science and Network Technology (ICCSNT). Dalian. China. Oct. 19–20. 2019. P. 485–488.
  7. Bouchair A., Amrouche A., Selouani S.-A., Hamidia M. Empirical Mode Decomposition for Speech Enhancement. International Conference on Electrical Sciences and Technologies in Maghreb (CISTEM). Algiers. Algeria. Oct. 28–31. 2018. P. 1–4.
  8. Colominas M.A., Schlotthauer G., Torres M.E. Improved complete ensemble EMD: a suitable tool for biomedical signal processing. Biomed. Signal Process. Control. 2014. V. 14. P. 19–29.
  9. Alimuradov A.K. et al. A method for determining formant speech intelligibility for assessing the psycho-emotional state of operators of high-responsibility control systems. Izmerenie. Monitoring. Upravlenie. Kontrol'. 2019. № 4 (30). P. 58–69 (In Russian).
  10. Alimuradov A.K., Tychkov A.Yu. Application of the empirical mode decomposition method to the study of voiced speech in the problem of detecting human stress emotions. Vestnik of Perm National Research Polytechnic University. Electrotechnics, Information Technologies, Control Systems. 2020. № 3 (35). P. 7–29 (In Russian).
  11. Certificate of state registration of database № 2016620597 (RF). Verified database of speech commands for voice control systems. Computer programs, databases, topologies of integrated circuits. A.K. Alimuradov. 2016 (In Russian).
Date of receipt: 09.02.2023
Approved after review: 17.02.2023
Accepted for publication: 03.03.2023