Scientific Information Database (SID)
View: 147
Download: 168
Cites:

Information: Journal Paper

Title

Utilizing Gated Recurrent Units to Retain Long Term Dependencies with Recurrent Neural Network in Text Classification

Pages

89-102

Abstract

Text classification is one of the key research areas in natural language processing (NLP). Most organizations receive customer reviews and feedback on their products and want to act on them quickly; reviewing this text manually takes considerable time and effort and can hurt product sales, so organizations ask their IT teams to apply machine learning algorithms that process such text in real time. The Gated Recurrent Unit (GRU), an extension of the Recurrent Neural Network that adds a gating mechanism to the network, provides such a capability. Recurrent Neural Networks (RNNs) have proved to be the main alternative for sequence classification and are effective at retaining information from past outputs and using those outputs to adjust performance. The GRU model mitigates the vanishing-gradient problem, allowing it to learn long-term dependencies in text data; this benefits use cases such as sentiment analysis, where long-term dependencies must be retained. This paper presents a text classification technique in which sequential word embeddings are processed by the sigmoid-gated units of a GRU within a Recurrent Neural Network. It focuses on classifying text with the GRU method, using a framework that embeds text as fixed-size matrices and thereby specifically informs the network of long-term dependencies. We applied the GRU model to a movie review dataset and obtained a classification accuracy of 87%.
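For context, the gating mechanism the abstract refers to can be written, following the standard formulation of Cho et al. (2014), with sigmoid update and reset gates; the equations below are the generic GRU definition, not notation taken from this paper:

    z_t = \sigma(W_z x_t + U_z h_{t-1} + b_z)                      (update gate)
    r_t = \sigma(W_r x_t + U_r h_{t-1} + b_r)                      (reset gate)
    \tilde{h}_t = \tanh(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h)   (candidate state)
    h_t = z_t \odot h_{t-1} + (1 - z_t) \odot \tilde{h}_t          (new hidden state)

When the update gate z_t stays close to 1, the previous state h_{t-1} is carried forward almost unchanged, which is how the unit retains long-term dependencies without the vanishing gradients of a plain RNN.

As a rough illustration of the pipeline the abstract describes (fixed-size embedded text matrices fed to a GRU with a sigmoid output), the following Keras sketch trains a binary sentiment classifier on the public IMDB movie-review dataset; the layer sizes, sequence length, and training settings are illustrative assumptions, not the authors' configuration:

    import tensorflow as tf

    # Assumed preprocessing: keep the 10,000 most frequent words and
    # pad/truncate each review to 200 tokens so the network receives
    # a fixed-size matrix of word indices.
    vocab_size, max_len = 10_000, 200
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.imdb.load_data(num_words=vocab_size)
    x_train = tf.keras.preprocessing.sequence.pad_sequences(x_train, maxlen=max_len)
    x_test = tf.keras.preprocessing.sequence.pad_sequences(x_test, maxlen=max_len)

    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(vocab_size, 64),       # sequential word embedding
        tf.keras.layers.GRU(32),                         # gated recurrent unit
        tf.keras.layers.Dense(1, activation="sigmoid"),  # binary sentiment output
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=3, batch_size=128,
              validation_data=(x_test, y_test))

A configuration along these lines lands in the same general range as the 87% accuracy the abstract reports, though the exact figure depends on the assumed hyperparameters and dataset split.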

Multimedia

  • No record.

Cites

  • No record.

References

  • No record.

Cite

    APA:

    Chandra, Nidhi, Ahuja, Laxmi, Khatri, Sunil K., & Monga, Himanshu. (2021). Utilizing Gated Recurrent Units to Retain Long Term Dependencies with Recurrent Neural Network in Text Classification. JOURNAL OF INFORMATION SYSTEMS AND TELECOMMUNICATION (JIST), 9(2 (34)), 89-102. SID. https://sid.ir/paper/683071/en

    Vancouver:

    Chandra Nidhi, Ahuja Laxmi, Khatri Sunil K., Monga Himanshu. Utilizing Gated Recurrent Units to Retain Long Term Dependencies with Recurrent Neural Network in Text Classification. JOURNAL OF INFORMATION SYSTEMS AND TELECOMMUNICATION (JIST)[Internet]. 2021;9(2 (34)):89-102. Available from: https://sid.ir/paper/683071/en

    IEEE:

    Nidhi Chandra, Laxmi Ahuja, Sunil K. Khatri, and Himanshu Monga, “Utilizing Gated Recurrent Units to Retain Long Term Dependencies with Recurrent Neural Network in Text Classification,” JOURNAL OF INFORMATION SYSTEMS AND TELECOMMUNICATION (JIST), vol. 9, no. 2 (34), pp. 89–102, 2021, [Online]. Available: https://sid.ir/paper/683071/en

Related Journal Papers

  • No record.

Related Seminar Papers

  • No record.

Related Plans

  • No record.