
Issue Info:
  • Year: 2024
  • Volume: 13
  • Issue: 25
  • Pages: 93-125
Measures:
  • Citations: 0
  • Views: 19
  • Downloads: 0
  • References: 0
Abstract: 

In traditional speech processing, feature extraction and classification were conducted as separate steps. The advent of deep neural networks has enabled methods that jointly model the relationship between the acoustic and phonetic characteristics of speech and classify it directly from the raw waveform. The first convolutional layer in these networks acts as a filter bank. To enhance interpretability and reduce the number of parameters, researchers have explored the use of parametric filters, with the SincNet architecture being a notable advancement. In SincNet's initial convolutional layer, rectangular bandpass filters are learned instead of fully trainable filters. This approach allows for modeling with fewer parameters, thereby improving the network's convergence speed and accuracy. Analyzing the learned filter bank also provides valuable insights into the model's performance. The reduction in parameters, along with increased accuracy and interpretability, has led to the adoption of various parametric filters and deep architectures across diverse speech processing applications. This paper introduces different types of parametric filters and discusses their integration into various deep architectures. Additionally, it examines the specific applications in speech processing where these filters have proven effective.
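
The abstract describes how SincNet replaces the fully learnable kernels of the first convolutional layer with parametric rectangular band-pass filters, so that only a low cut-off frequency and a bandwidth are trained per filter. The sketch below illustrates this idea in PyTorch; it is not the paper's or the original SincNet implementation, and the class name, layer sizes, and initialization values (SincConv1d, out_channels=80, kernel_size=251, sample_rate=16000) are illustrative assumptions.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class SincConv1d(nn.Module):
    """Illustrative sketch of a SincNet-style sinc-based band-pass convolution.

    Only the low cut-off frequency and the bandwidth of each filter are
    learned; the filter shape itself is fixed to an ideal rectangular
    band-pass in the frequency domain (a difference of sinc functions in time).
    """

    def __init__(self, out_channels=80, kernel_size=251, sample_rate=16000):
        super().__init__()
        if kernel_size % 2 == 0:
            kernel_size += 1  # force an odd length so the filter is symmetric
        self.out_channels = out_channels
        self.kernel_size = kernel_size
        self.sample_rate = sample_rate

        # Learnable parameters: low cut-off and bandwidth in Hz, two per filter
        # (initialization values here are assumptions, not the paper's choice).
        low_hz = torch.linspace(30.0, sample_rate / 2 - 100.0, out_channels)
        band_hz = torch.full((out_channels,), 100.0)
        self.low_hz_ = nn.Parameter(low_hz.unsqueeze(1))
        self.band_hz_ = nn.Parameter(band_hz.unsqueeze(1))

        # Fixed pieces: left half of the time axis (the right half follows by
        # symmetry) and the matching half of a Hamming window.
        n = (kernel_size - 1) // 2
        self.register_buffer("n_", torch.arange(-n, 0).float() / sample_rate)
        self.register_buffer(
            "window_", torch.hamming_window(kernel_size, periodic=False)[:n]
        )

    def forward(self, x):
        # Keep the band inside [0, Nyquist].
        low = torch.abs(self.low_hz_)
        high = torch.clamp(low + torch.abs(self.band_hz_),
                           max=self.sample_rate / 2)

        # Band-pass = difference of two low-pass sinc filters, windowed:
        # [sin(2*pi*f2*t) - sin(2*pi*f1*t)] / (pi*t).
        two_pi_t = 2 * math.pi * self.n_                      # shape (n,)
        left = ((torch.sin(high * two_pi_t) - torch.sin(low * two_pi_t))
                / (two_pi_t / 2)) * self.window_              # (out_channels, n)
        center = 2 * (high - low)                             # value at t = 0
        right = left.flip(dims=[1])                           # mirror image

        filters = torch.cat([left, center, right], dim=1)
        filters = filters / (2 * (high - low))  # normalise the centre tap to 1
        filters = filters.view(self.out_channels, 1, self.kernel_size)

        return F.conv1d(x, filters, padding=(self.kernel_size - 1) // 2)


# A 200 ms batch of 16 kHz waveforms: (batch, channels, samples).
x = torch.randn(4, 1, 3200)
y = SincConv1d()(x)  # -> (4, 80, 3200)
```

With this parameterization the first layer trains 2 × out_channels scalars instead of out_channels × kernel_size kernel weights (160 versus roughly 20,000 for the values above), which is the reduction the abstract credits for faster convergence, and the learned cut-off/bandwidth pairs can be inspected directly to interpret the resulting filter bank.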
