Scientific Information Database (SID) - Trusted Source for Research and Academic Resources

View: 794
Download: 220
Cites:

Information: Journal Paper

Title

Approximation of the reservoir permeability model using ILS-DLA learning algorithm and LARS sparse coding method

Pages

159-174

Abstract

In this paper, the task is to recover an approximation of a model from a set of its realizations using sparse approximation. The term ‘approximation’ indicates an interpretation that is close enough to the true model, i.e., reality. In geosciences, the realizations are provided by multiple-point statistical (MPS) methods. Realistic modeling of the Earth's interior demands more sophisticated geostatistical methods based on true available images, i.e., the training images. Among the available MPS methods, the DisPat algorithm is a distance-based MPS method that generates appealing realizations for stationary and non-stationary training images by classifying patterns with distance functions and kernel methods. Its ability to model non-stationary images is an advantage of the DisPat method. Realizations generated by the MPS methods form the training set for the sparse approximation. Sparse approximation comprises two steps, sparse coding and dictionary update, which are applied alternately to optimize the trained dictionary. Model selection algorithms such as LARS are used for sparse coding. LARS builds the regression model sequentially, choosing a proper number of variables and adding the best variable to the active set at each iteration. The ILS-DLA dictionary learning algorithm exploits the internal structure of the dictionary, handling overlapping or non-overlapping blocks and performing the inversion according to the structure of the trained dictionary. ILS-DLA is fast in the sense that it inverts the smaller blocks that make up the trained dictionary rather than inverting the entire dictionary. The trained dictionary is updated sequentially by alternating between the sparse coding and dictionary training steps. According to the experiments, the compressed sparsity-based image model is superior to 90% of the generated realizations with 90% probability...
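The alternation described in the abstract (LARS for sparse coding, a least-squares dictionary update) can be sketched roughly as below. This is a minimal illustration, not the authors' implementation: the block-wise ILS-DLA update is replaced here by a plain least-squares (MOD-style) update, and all names (`train_patches`, `n_atoms`, `n_nonzero`) are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lars

def train_dictionary(train_patches, n_atoms=64, n_nonzero=5, n_iter=20, seed=0):
    """Sketch of alternating sparse coding / dictionary update.

    train_patches: array of shape (n_features, n_samples); each column is a
    training patch drawn from the MPS realizations (assumed preprocessing).
    Requires n_samples >= n_atoms for the random initialization below.
    """
    rng = np.random.default_rng(seed)
    n_features, n_samples = train_patches.shape

    # Initialize the dictionary with randomly chosen training patches,
    # normalized to unit-norm atoms.
    D = train_patches[:, rng.choice(n_samples, n_atoms, replace=False)].astype(float)
    D /= np.linalg.norm(D, axis=0, keepdims=True) + 1e-12

    for _ in range(n_iter):
        # Sparse coding step: LARS picks a small active set of atoms per patch.
        W = np.zeros((n_atoms, n_samples))
        for j in range(n_samples):
            lars = Lars(n_nonzero_coefs=n_nonzero, fit_intercept=False)
            lars.fit(D, train_patches[:, j])
            W[:, j] = lars.coef_

        # Dictionary update step: least-squares fit of D to X ≈ D W
        # (a MOD-style stand-in for the block-wise ILS-DLA update).
        D = train_patches @ np.linalg.pinv(W)
        D /= np.linalg.norm(D, axis=0, keepdims=True) + 1e-12

    return D, W
```

The least-squares update here inverts one matrix for the whole dictionary; the speed advantage claimed for ILS-DLA comes from instead inverting the smaller blocks that make up the dictionary's internal structure.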

Cites

• No record.

References

• No record.

Cite

APA:

HOSSEINI, MOHAMMAD, & RIAHI, MOHAMMAD ALI. (2019). Approximation of the reservoir permeability model using ILS-DLA learning algorithm and LARS sparse coding method. JOURNAL OF RESEARCH ON APPLIED GEOPHYSICS, 5(1), 159-174. SID. https://sid.ir/paper/268637/en

Vancouver:

HOSSEINI MOHAMMAD, RIAHI MOHAMMAD ALI. Approximation of the reservoir permeability model using ILS-DLA learning algorithm and LARS sparse coding method. JOURNAL OF RESEARCH ON APPLIED GEOPHYSICS [Internet]. 2019;5(1):159-174. Available from: https://sid.ir/paper/268637/en

IEEE:

MOHAMMAD HOSSEINI and MOHAMMAD ALI RIAHI, “Approximation of the reservoir permeability model using ILS-DLA learning algorithm and LARS sparse coding method,” JOURNAL OF RESEARCH ON APPLIED GEOPHYSICS, vol. 5, no. 1, pp. 159–174, 2019. [Online]. Available: https://sid.ir/paper/268637/en

Related Journal Papers

Related Seminar Papers

• No record.

Related Plans

• No record.

Recommended Workshops





