A new metaheuristic optimization algorithm, inspired by biological nervous systems and artificial neural networks (ANNs), is proposed for solving complex optimization problems. The proposed method, named the Neural Network Algorithm (NNA), is developed based on the unique structure of ANNs and exploits that structure and its operators to generate new candidate solutions. Requiring no effort for fine-tuning of initial parameters, together with statistically superior performance in reported comparisons, distinguishes the NNA from other optimizers. It can be concluded that ANNs and their particular structure can be successfully modeled as a metaheuristic optimization method for handling optimization problems.
In ANN terminology, the NNA is an adaptive, unsupervised method for solving optimization problems. "Unsupervised" here means that no information about the global optimum is available; the solutions are updated by learning from the environment. The NNA can be viewed as a single-layer perceptron optimization method with self-feedback. Figs. 1 to 3 show more details regarding the NNA.
Fig. 1. Schematic view of generating new pattern solutions.
Fig. 2. Processes of the NNA.
Fig. 3. Schematic view for the performance of the NNA.
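The idea sketched in the figures above can be illustrated in code: pattern solutions act as input data, a weight matrix (non-negative, with each column summing to 1, mirroring ANN weight normalization) combines them into new candidate solutions, and bias and transfer-function operators perturb the search. The sketch below is a simplified illustration under these assumptions, not the authors' original implementation; parameter names such as `n_pop`, `beta`, and the decay factor are illustrative choices.

```python
import numpy as np

def nna(objective, bounds, n_pop=20, n_iter=200, seed=0):
    """Minimal sketch of the Neural Network Algorithm (NNA).

    bounds: (lower, upper) arrays of the search space.
    Returns the best solution found and its objective value.
    """
    rng = np.random.default_rng(seed)
    lo = np.asarray(bounds[0], float)
    hi = np.asarray(bounds[1], float)
    dim = lo.size

    # Initial population of pattern solutions and their costs
    X = rng.uniform(lo, hi, size=(n_pop, dim))
    cost = np.apply_along_axis(objective, 1, X)

    # Weight matrix: non-negative, each column sums to 1
    W = rng.random((n_pop, n_pop))
    W /= W.sum(axis=0, keepdims=True)

    i0 = cost.argmin()
    x_best, w_best, best_cost = X[i0].copy(), W[:, i0].copy(), cost[i0]
    beta = 1.0  # modification (bias) probability, shrunk each iteration

    for _ in range(n_iter):
        # New pattern solutions as weighted combinations of the population
        X = X + W.T @ X

        # Self-feedback: move each weight column toward the best column
        W = np.abs(W + 2.0 * rng.random(W.shape) * (w_best[:, None] - W))
        W /= W.sum(axis=0, keepdims=True)

        for i in range(n_pop):
            if rng.random() < beta:
                # Bias operator: reset a fraction of variables at random
                k = max(1, int(np.ceil(beta * dim)))
                idx = rng.choice(dim, k, replace=False)
                X[i, idx] = rng.uniform(lo[idx], hi[idx])
            else:
                # Transfer-function operator: move toward the best solution
                X[i] += 2.0 * rng.random(dim) * (x_best - X[i])

        X = np.clip(X, lo, hi)
        cost = np.apply_along_axis(objective, 1, X)
        i_best = cost.argmin()
        if cost[i_best] < best_cost:
            best_cost = cost[i_best]
            x_best, w_best = X[i_best].copy(), W[:, i_best].copy()
        beta *= 0.99  # reduce the bias probability over iterations

    return x_best, best_cost
```

For example, `nna(lambda v: float(np.sum(v**2)), ([-5.0, -5.0], [5.0, 5.0]))` searches the 2-D sphere function; note that apart from population size and iteration count, no problem-specific parameters need tuning, in line with the parameter-free claim above.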
Lecture introducing the NNA and its strategies for solving optimization problems by Dr. Sadollah (12 June 2020 / 1399/03/23), invited by the Soft Computing Research Society (SCRS), New Delhi, India.
Teaching the NNA in English (you can download the video via the link below):
You can download the NNA PowerPoint slides for your presentation in English:
Interested readers may download the open-source NNA code via the link below:
Also, for solving constrained optimization problems, scholars may apply the penalty function method to the standard unconstrained NNA code.
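The penalty approach just mentioned can be sketched as a wrapper that any unconstrained optimizer can minimize directly. The function names and the static penalty coefficient `rho` below are illustrative assumptions, not part of the original NNA code:

```python
def penalized(objective, constraints, rho=1e3):
    """Static penalty wrapper for inequality constraints g(x) <= 0.

    rho is an illustrative penalty coefficient; in practice it is tuned
    to the problem scale or increased over the course of the run.
    """
    def f(x):
        # Sum of squared constraint violations (zero when x is feasible)
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return objective(x) + rho * violation
    return f

# Example: minimize x^2 subject to x >= 1, written as g(x) = 1 - x <= 0
f = penalized(lambda x: x[0] ** 2, [lambda x: 1.0 - x[0]])
```

Feasible points are evaluated at their true objective value, while infeasible points pay a cost proportional to the squared violation, steering the unconstrained search back into the feasible region.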
Some Related Publications: