The backpropagation algorithm performs gradient descent only in the weight space of a network with fixed topology. The number of layers, neurons, and network weights has an important influence on network performance, so algorithms that can find an appropriate network architecture automatically are highly desirable. Researchers have proposed different algorithms for determining the optimum size of neural networks. Meybodi and Beigy introduced the first learning automata based algorithm, called the WSA algorithm. By turning off unimportant weights, this algorithm not only reduces network complexity but also increases the network's generalization ability. At the beginning, all weights of the network are on and contribute to learning. On weights whose absolute values are less than a threshold value are penalized, and those whose absolute values are larger than another threshold value are rewarded. On weights whose absolute values lie between these two thresholds are neither rewarded nor penalized. By choosing optimum values for these thresholds, we can obtain networks with a minimum number of weights that can learn the training patterns with acceptable error and generalization ability. In this paper we introduce a new learning automata based algorithm, called AWSA, for adapting the parameters of the WSA algorithm. We also introduce a new algorithm, called MWSA, for determining the important weights in multilayer neural networks. These algorithms are applied to a number of problems, such as English digit recognition, Persian printed digit recognition, second-order discrete-time nonlinear function approximation, and Persian phoneme recognition. The results obtained show that the proposed algorithms perform better than other existing algorithms.
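The following is a minimal sketch of the threshold-based reward/penalty rule described above, assuming a simple linear reward-penalty update for the probability that a weight stays on; the names (classify_weight, update_on_probability, theta_low, theta_high) and the specific update constants are illustrative assumptions, not the paper's actual implementation.

```python
def classify_weight(w, theta_low, theta_high):
    """Decide the feedback for an 'on' weight from its magnitude."""
    if abs(w) < theta_low:
        return "penalize"    # small weights are candidates for turning off
    elif abs(w) > theta_high:
        return "reward"      # large weights are kept on
    else:
        return "no_action"   # in-between weights are left unchanged


def update_on_probability(p_on, feedback, a=0.05, b=0.05):
    """Illustrative linear reward-penalty style update of the probability
    that a weight remains on; the automaton turns the weight off once
    this probability becomes sufficiently small."""
    if feedback == "reward":
        p_on = p_on + a * (1.0 - p_on)
    elif feedback == "penalize":
        p_on = p_on * (1.0 - b)
    return p_on


# Example: a weight of magnitude 0.02 with thresholds 0.05 and 0.5
p_on = 1.0                                    # all weights start 'on'
feedback = classify_weight(0.02, 0.05, 0.5)   # -> "penalize"
p_on = update_on_probability(p_on, feedback)  # probability of staying on drops
```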