A Parallel Hopfield Neural Network Guaranteeing the Convergence
Affiliation: Department of Computer Science and Center for AI Research, Korea Advanced Institute of Science and Technology (KAIST), 373-1 Kusong-Dong, Yusung-Gu, Taejon 305-701, Korea
Conference: IJCNN International Joint Conference on Neural Networks
Conference date: 1992
Subject classification: 12 [Management] 1201 [Management Science and Engineering] 081104 [Pattern Recognition and Intelligent Systems] 08 [Engineering] 0835 [Software Engineering] 0811 [Control Science and Engineering] 0812 [Computer Science and Technology]
Abstract: This paper aims to operate the Hopfield neural network model in parallel by means of a new state updating rule, called the majority protocol. The rule is characterized by the fact that a neuron in a parallel Hopfield model changes its state only when its net input is considerably larger (or smaller) than its threshold, regardless of the states of the other neurons that operate simultaneously. While the original Hopfield network may not converge if it operates in parallel, the convergence of the proposed parallel Hopfield network is guaranteed thanks to the novel concept of the majority protocol. The convergence property of the proposed network is theoretically proved by showing that the energy of the network is always monotonically non-decreasing when a state transition under the majority protocol is made. In addition, this paper demonstrates its usefulness by applying the parallel Hopfield model to well-known combinatorial optimization problems.
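The abstract describes a margin-based synchronous update: a neuron changes state only when its net input clears its threshold by a sufficiently large amount. The sketch below shows one way such an update could look; the bipolar states, the scalar `margin` parameter, and the function names are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def majority_update(W, state, theta, margin):
    """One synchronous (parallel) update with a margin rule (illustrative sketch).

    A neuron switches to +1 only when its net input exceeds its threshold by
    more than `margin`, switches to -1 only when it falls below the threshold
    by more than `margin`, and otherwise keeps its current state.
    """
    net = W @ state                        # net input to every neuron at once
    new_state = state.copy()
    new_state[net > theta + margin] = 1    # clearly above threshold
    new_state[net < theta - margin] = -1   # clearly below threshold
    return new_state

def run_parallel_hopfield(W, state, theta, margin, max_iters=1000):
    """Iterate synchronous margin-based updates until a fixed point (or max_iters)."""
    for _ in range(max_iters):
        nxt = majority_update(W, state, theta, margin)
        if np.array_equal(nxt, state):     # no neuron changed: fixed point reached
            return nxt
        state = nxt
    return state

# Usage example: a small symmetric network with bipolar states.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4)); W = (W + W.T) / 2; np.fill_diagonal(W, 0)
s0 = rng.choice([-1, 1], size=4)
print(run_parallel_hopfield(W, s0, theta=0.0, margin=0.5))
```

The margin is what distinguishes this from a plain synchronous Hopfield update, where simultaneous flips can produce oscillations: a neuron whose net input sits close to its threshold simply keeps its state, which reflects the intuition behind the convergence guarantee stated in the abstract.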