A STUDY TO IMPROVE A LEARNING ALGORITHM OF NEURAL NETWORKS

Cong Huu Nguyen*1, Thanh Nga Thi Nguyen2, Ngoc Van Dong3
1 Thai Nguyen University, 2 College of Technology – TNU, 3 Ha Noi Vocational College of Electrical Mechanical

Nguyễn Hữu Công et al., Tạp chí KHOA HỌC & CÔNG NGHỆ 93(05): 53 - 59

ABSTRACT
Since the mid-twentieth century, the study of optimization algorithms, especially alongside the development of digital computers, has increasingly become an important branch of mathematics. Nowadays, these mathematical tools are applied in practice to neural network training. In the search for an optimal algorithm that minimizes the convergence time of the solution or avoids weak minima and local minima, the starting point is to study the characteristics of the error surface. For a complex error surface such as a cleft error surface, whose contours are stretched and bent to form clefts and cleft shafts, the old algorithms are not effective. This paper proposes an algorithm to improve the convergence of the solution and the ability to escape from undesired areas on the error surface.

Keywords: neural networks, special error surface, local minima, optimization, algorithms

BACKGROUND
In the search for an optimal algorithm that minimizes the convergence time of the solution or avoids weak minima and local minima, the starting point is to study the characteristics of the error surface and to take it as a basis for improving existing training algorithms or proposing a new one.
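The difficulty that such a cleft-shaped error surface poses for classical training can be illustrated with a minimal sketch. This is not the paper's proposed algorithm; it is plain fixed-step gradient descent on a hypothetical ill-conditioned quadratic, whose stretched elliptical contours serve as a stand-in for a cleft surface. The step size must stay small enough to be stable along the steep cleft walls, which makes progress along the flat cleft axis very slow.

```python
# Illustrative stand-in for a "cleft" error surface (an assumption,
# not taken from the paper): f(w) = 0.5 * (w1^2 + 100 * w2^2).
# Curvature is 1 along w1 (the flat cleft axis) and 100 along w2
# (the steep cleft walls).
def grad(w):
    # Gradient of f at the point w = [w1, w2].
    return [w[0], 100.0 * w[1]]

def gradient_descent(w, lr, steps):
    # Plain gradient descent with a fixed learning rate lr.
    for _ in range(steps):
        g = grad(w)
        w = [w[0] - lr * g[0], w[1] - lr * g[1]]
    return w

# Stability along the steep axis requires lr < 2/100, so even the
# largest safe step barely moves the point along the flat axis.
w = gradient_descent([1.0, 1.0], lr=0.019, steps=200)
# After 200 steps the steep coordinate w[1] is essentially zero,
# while the flat coordinate w[0] still retains roughly 2% of its
# initial value: overall convergence is dominated by the cleft axis.
```

Choosing a larger step (e.g. lr = 0.021 here) makes the steep coordinate oscillate and diverge, so a single fixed learning rate cannot serve both directions at once; this trade-off is the kind of difficulty the paper targets.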
When discussing neural networks, the quality of the trained network is usually of interest (supervised learning). This quality is tied to a quality function, which leads to the concept of the network quality surface. The quality surface is sometimes called by other names: the error surface, or the performance surface. Figure 1 shows an error surface. There are some special features to note about this surface, such as: the slope changes drastically across the parameter space. For this reason, it will be difficult to choose