Search Results

Issue Info: 
  • Year: 

    2013
  • Volume: 

    44
Measures: 
  • Views: 

    167
  • Downloads: 

    149
Abstract: 

By the p-power (or partial p-power) transformation, the Lagrangian function of a nonconvex optimization problem becomes locally convex. In this paper, we present a neural network based on an NCP function for solving nonconvex optimization problems. One of the important features of this neural network is the one-to-one correspondence between its equilibria and the KKT points of the nonconvex optimization problem; in other words, the neural network is proved to be stable and convergent to an optimal solution of the original problem. Finally, examples are provided to show the applicability of the proposed neural network.
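
The abstract does not spell out the network model, and the p-power transformation of the Lagrangian is not reproduced here. The following is a minimal sketch, assuming a problem of the form min f(x) subject to g(x) <= 0 and using the Fischer-Burmeister NCP function, of a gradient-flow system whose equilibria are exactly the KKT points; the toy objective, constraint, and all parameter values are hypothetical.

```python
# Hedged sketch (not the paper's exact model): a gradient-flow "neural network"
# whose equilibria are the KKT points of  min f(x)  s.t.  g(x) <= 0.
# Complementarity is encoded with the Fischer-Burmeister NCP function
#   phi(a, b) = sqrt(a^2 + b^2) - a - b,  which is zero iff a >= 0, b >= 0, a*b = 0.
import numpy as np

def fb(a, b):
    return np.sqrt(a**2 + b**2) - a - b

def simulate(grad_f, g, jac_g, x0, lam0, dt=1e-2, steps=20_000):
    """Forward-Euler integration of
         dx/dt   = -(grad f(x) + J_g(x)^T lam)
         dlam/dt =  phi(lam, -g(x)).
       At an equilibrium both right-hand sides vanish, which is exactly
       Lagrangian stationarity plus lam >= 0, g(x) <= 0, lam * g(x) = 0."""
    x, lam = np.array(x0, float), np.array(lam0, float)
    for _ in range(steps):
        dx = -(grad_f(x) + jac_g(x).T @ lam)
        dlam = fb(lam, -g(x))
        x, lam = x + dt * dx, lam + dt * dlam
    return x, lam

# Hypothetical nonconvex toy problem:  min x1^4 - 2*x1^2 + x2^2  s.t.  x1 + x2 - 1 <= 0
grad_f = lambda x: np.array([4*x[0]**3 - 4*x[0], 2*x[1]])
g      = lambda x: np.array([x[0] + x[1] - 1.0])
jac_g  = lambda x: np.array([[1.0, 1.0]])
print(simulate(grad_f, g, jac_g, x0=[0.5, 0.5], lam0=[0.1]))  # expect convergence near x = (1, 0), lam = 0
```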

Issue Info: 
  • Year: 

    2019
  • Volume: 

    7
  • Issue: 

    1
  • Pages: 

    69-85
Measures: 
  • Citations: 

    0
  • Views: 

    185
  • Downloads: 

    73
Abstract: 

The Newton method is one of the most famous line-search methods for minimizing functions. It is well known that the search direction and step length play important roles in this class of methods for solving optimization problems. In this investigation, a new modification of the Newton method for solving unconstrained optimization problems is presented. A significant merit of the proposed method is that the step length at each iteration is equal to 1. Additionally, the convergence analysis of this iterative algorithm is established under suitable conditions. Some illustrative examples are provided to show the validity and applicability of the presented method, and a comparison is made with several other existing methods.
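
As a point of reference (the paper's specific modification is not given in the abstract), here is a minimal sketch of the classical Newton iteration with the step length fixed at 1; the test function and starting point are purely illustrative.

```python
# Minimal sketch of a Newton-type iteration with unit step length:
#   x_{k+1} = x_k - [Hess f(x_k)]^{-1} grad f(x_k),  step length = 1.
# This is the classical scheme, not the paper's modified method.
import numpy as np

def newton_unit_step(grad, hess, x0, tol=1e-8, max_iter=100):
    x = np.array(x0, float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)   # Newton direction
        x = x + d                          # step length fixed to 1
    return x, k

# Illustrative example: the Rosenbrock function
grad = lambda x: np.array([-400*x[0]*(x[1] - x[0]**2) - 2*(1 - x[0]),
                           200*(x[1] - x[0]**2)])
hess = lambda x: np.array([[1200*x[0]**2 - 400*x[1] + 2, -400*x[0]],
                           [-400*x[0], 200.0]])
print(newton_unit_step(grad, hess, [-1.2, 1.0]))
```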

Issue Info: 
  • Year: 

    2011
  • Volume: 

    37
  • Issue: 

    1
  • Pages: 

    171-198
Measures: 
  • Citations: 

    0
  • Views: 

    421
  • Downloads: 

    178
Abstract: 

We present an effective algorithm for the minimization of locally nonconvex Lipschitz functions based on mollifier functions approximating the Clarke generalized gradient. To this aim, we first approximate the Clarke generalized gradient by mollifier subgradients. To construct this approximation, we use a set of averaged-function gradients and show that the convex hull of this set serves as a good approximation of the Clarke generalized gradient. Using this approximation, we establish an algorithm for the minimization of locally Lipschitz functions. Based on the mollifier subgradient approximation, we propose a dynamic algorithm for finding a direction satisfying the Armijo condition without needing many subgradient evaluations. We prove that the search-direction procedure terminates after finitely many iterations and show how to reduce the objective function value along the obtained search direction. We also prove that the first-order optimality conditions are satisfied at any accumulation point of the sequence constructed by the algorithm. Finally, we implement our algorithm in MATLAB and approximate the averaged-function gradients by the Monte Carlo method. The numerical results show that our algorithm is more efficient and more robust than the GS algorithm, currently perceived to be a competitive algorithm for the minimization of nonconvex Lipschitz functions.
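
A rough sketch of the overall idea, under simplifying assumptions: the gradient of the averaged (mollified) function is estimated by Monte Carlo sampling over a small ball, and the negative of that estimate is used with Armijo backtracking. The paper's convex-hull/quadratic-programming search-direction step is not reproduced here, and the test function, radii, and tolerances below are hypothetical.

```python
# Hedged sketch: Monte-Carlo estimate of the averaged-function (mollified) gradient,
# used as a crude descent direction with Armijo backtracking. Assumes grad f is
# available almost everywhere; the paper's exact direction-finding step is omitted.
import numpy as np

def mollifier_gradient(grad, x, eps, m=30, rng=np.random.default_rng(0)):
    """Mean of grad f over a ball of radius eps around x, estimated by sampling."""
    u = rng.normal(size=(m, x.size))
    u /= np.linalg.norm(u, axis=1, keepdims=True)        # directions on the unit sphere
    r = rng.uniform(size=(m, 1)) ** (1.0 / x.size)        # radii for uniform ball sampling
    return np.mean([grad(x + eps * ri * ui) for ri, ui in zip(r, u)], axis=0)

def minimize_lipschitz(f, grad, x0, eps=0.1, sigma=1e-4, max_iter=200):
    x = np.array(x0, float)
    for _ in range(max_iter):
        g = mollifier_gradient(grad, x, eps)
        if np.linalg.norm(g) < 1e-6:
            eps *= 0.5                                    # refine the smoothing radius
            continue
        d, t = -g, 1.0
        while f(x + t * d) > f(x) + sigma * t * (g @ d) and t > 1e-12:
            t *= 0.5                                      # Armijo backtracking
        x = x + t * d
    return x

# Hypothetical nonsmooth, nonconvex test function
f    = lambda x: abs(x[0]) + 10 * abs(x[1] - x[0]**2)
grad = lambda x: np.array([np.sign(x[0]) - 20*x[0]*np.sign(x[1] - x[0]**2),
                           10*np.sign(x[1] - x[0]**2)])
print(minimize_lipschitz(f, grad, [3.0, 1.0]))
```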

Issue Info: 
  • Year: 

    2014
  • Volume: 

    24
  • Issue: 

    -
  • Pages: 

    1120-1133
Measures: 
  • Citations: 

    2
  • Views: 

    106
  • Downloads: 

    0
Keywords: 
Abstract: 

Author(s): 

Araboljadidi Narges

Issue Info: 
  • Year: 

    2019
  • Volume: 

    4
  • Issue: 

    3
  • Pages: 

    197-208
Measures: 
  • Citations: 

    0
  • Views: 

    442
  • Downloads: 

    0
Abstract: 

In this paper, we present a method for characterizing the solution set of nonconvex optimization problems via their dual problems. The constrained optimization problem considered has pseudoconvex and locally Lipschitz functions, which are not necessarily convex or smooth and include a wide class of nonconvex nonsmooth functions. In the proposed method, a dual problem of mixed Wolfe and Mond-Weir type is formulated to characterize the solution set of the primal problem. First, we introduce some properties of the Lagrangian functions associated with these problems, and then we prove the characterization of their solution sets.
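
For orientation only (the paper's mixed-type dual is not stated in the abstract), the standard Wolfe and Mond-Weir duals of the primal problem min f(x) subject to g_i(x) <= 0, i = 1, ..., m, written with Clarke subdifferentials for locally Lipschitz data, are sketched below; a mixed-type dual typically blends elements of both.

```latex
% Wolfe-type dual (standard form)
\max_{(u,\lambda)} \; f(u) + \sum_{i=1}^{m} \lambda_i g_i(u)
\quad \text{s.t.} \quad
0 \in \partial f(u) + \sum_{i=1}^{m} \lambda_i \, \partial g_i(u), \qquad \lambda \ge 0.

% Mond-Weir-type dual (standard form)
\max_{(u,\lambda)} \; f(u)
\quad \text{s.t.} \quad
0 \in \partial f(u) + \sum_{i=1}^{m} \lambda_i \, \partial g_i(u), \qquad
\lambda_i g_i(u) \ge 0 \ (i = 1, \dots, m), \qquad \lambda \ge 0.
```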

Issue Info: 
  • Year: 

    2024
  • Volume: 

    15
  • Issue: 

    5
  • Pages: 

    239-245
Measures: 
  • Citations: 

    0
  • Views: 

    10
  • Downloads: 

    0
Abstract: 

In this paper, we investigate efficient solutions of optimization problems with multiple criteria and bounded trade-offs. A nonlinear optimization problem is presented to find the relationships between the upper bound on the trade-offs and the objective functions. Using this problem, we determine some properly efficient points that are closer to the ideal point. To this end, we apply the extended form of the generalized Tchebycheff norm. Note that all the presented results hold for general problems and no convexity assumption is needed.
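
For reference (the extended form of the generalized Tchebycheff norm used in the paper may differ), the standard augmented weighted Tchebycheff scalarization with ideal point z* and weights w_i > 0 is

```latex
\min_{x \in X} \ \max_{1 \le i \le p} w_i \bigl( f_i(x) - z_i^{*} \bigr)
\; + \; \rho \sum_{i=1}^{p} \bigl( f_i(x) - z_i^{*} \bigr), \qquad \rho > 0,
```

whose minimizers are known to be properly efficient, with trade-offs bounded in terms of rho and the weights.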

Author(s): 

KAZMI KALEEM RAZA

Journal: 

MATHEMATICAL SCIENCES

Issue Info: 
  • Year: 

    2013
  • Volume: 

    7
  • Issue: 

    -
  • Pages: 

    1-5
Measures: 
  • Citations: 

    0
  • Views: 

    318
  • Downloads: 

    95
  • References: 

    4
Abstract: 

In this paper, we propose a split nonconvex variational inequality problem, which is a natural extension of the split convex variational inequality problem in two different Hilbert spaces. Relying on the notion of prox-regularity, we introduce an iterative method for the new split nonconvex variational inequality problem and establish its convergence. Further, we also establish the convergence of an iterative method for the split convex variational inequality problem. The results presented in this paper are new and different from the previously known results for nonconvex (convex) variational inequality problems. They also generalize, unify, and improve the previously known results in this area.
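
The abstract does not state the iteration itself, so the sketch below only illustrates the problem structure under strong simplifications: convex sets C and Q with cheap projections (the paper's prox-regular, nonconvex setting is not reproduced), a linear operator A, and a projection-type update for the split problem "find x* in C solving VI(F, C) such that Ax* solves VI(G, Q)". All operators, sets, and step sizes are hypothetical.

```python
# Hedged sketch of a projection-type iteration for a split variational inequality
# under convexity assumptions; not the paper's method.
import numpy as np

def solve_split_vi(F, G, A, proj_C, proj_Q, x0, lam=0.05, gamma=0.05, iters=5000):
    x = np.array(x0, float)
    for _ in range(iters):
        y = A @ x
        # move Ax toward the solution set of VI(G, Q); pull the correction back with A^T
        t = proj_Q(y - lam * G(y)) - y
        u = x + gamma * (A.T @ t)
        # projected step for VI(F, C)
        x = proj_C(u - lam * F(u))
    return x

# Toy example: F, G are gradients of convex quadratics; C, Q are boxes
F = lambda x: x - np.array([0.5, 0.5])
G = lambda y: y - np.array([1.0])
A = np.array([[1.0, 1.0]])
proj_C = lambda x: np.clip(x, 0.0, 1.0)
proj_Q = lambda y: np.clip(y, 0.0, 3.0)
print(solve_split_vi(F, G, A, proj_C, proj_Q, x0=[0.0, 0.0]))  # expect roughly [0.5, 0.5]
```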

Author(s): 

NOOR M.A.

Journal: 

OPTIMIZATION LETTERS

Issue Info: 
  • Year: 

    2009
  • Volume: 

    3
  • Issue: 

    3
  • Pages: 

    411-418
Measures: 
  • Citations: 

    1
  • Views: 

    125
  • Downloads: 

    0
Keywords: 
Abstract: 

Author(s): 

AGRELL P.J. | TIND J.

Issue Info: 
  • Year: 

    2001
  • Volume: 

    16
  • Issue: 

    2
  • Pages: 

    129-147
Measures: 
  • Citations: 

    1
  • Views: 

    122
  • Downloads: 

    0
Keywords: 
Abstract: 

Author(s): 

BAGIROV A.M.

Issue Info: 
  • Year: 

    2014
  • Volume: 

    5
  • Issue: 

    1
  • Pages: 

    1-14
Measures: 
  • Citations: 

    0
  • Views: 

    303
  • Downloads: 

    176
Abstract: 

Here, an algorithm is presented for solving minimum sum-of-squares clustering problems using their difference-of-convex representations. The proposed algorithm is based on an incremental approach and applies the well-known DC algorithm at each iteration. It is tested and compared with other clustering algorithms using large real-world data sets.
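
For context, a standard identity (not quoted from the paper itself) shows why the k-cluster sum-of-squares objective over a data set A is a difference of convex functions; a DC algorithm then linearizes the second (subtracted) convex component at each incremental step and minimizes the resulting convex model.

```latex
f(x_1, \dots, x_k)
 = \sum_{a \in A} \min_{1 \le j \le k} \lVert x_j - a \rVert^2
 = \underbrace{\sum_{a \in A} \sum_{j=1}^{k} \lVert x_j - a \rVert^2}_{\text{convex}}
 \;-\; \underbrace{\sum_{a \in A} \max_{1 \le j \le k} \sum_{l \ne j} \lVert x_l - a \rVert^2}_{\text{convex}}.
```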
