Date of Award

6-16-2016

Document Type

Thesis

Degree Name

Master of Science in Computer Science

Department

Department of Electrical and Computer Engineering

First Advisor

Gilbert L. Peterson, Ph.D.

Abstract

Machine learning algorithms have become a ubiquitous, indispensable part of modern life. Neural networks are one of the most successful classes of machine learning algorithms and have been applied to solve problems previously considered the exclusive domain of human intellect. Several methods exist for selecting a neural network's configuration; the most common is error back-propagation. Back-propagation often produces neural networks that perform well but that do not achieve an optimal solution. This research explores the effectiveness of an alternative feed-forward neural network weight selection procedure called synaptic annealing. Synaptic annealing is the application of the simulated annealing algorithm to the problem of selecting synaptic weights in a feed-forward neural network. A novel formalism describing the combination of simulated annealing and neural networks is developed. Additionally, a novel extension of the simulated annealing algorithm, called anisotropicity, is defined and developed. The cross-validated performance of each synaptic annealing algorithm is evaluated and compared to that of back-propagation when trained on several typical machine learning problems. Synaptic annealing is found to be considerably more effective than traditional back-propagation training on classification and function approximation data sets. These significant improvements in feed-forward neural network training performance indicate that synaptic annealing may be a viable alternative to back-propagation in many applications of neural networks.
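The abstract describes synaptic annealing as simulated annealing applied to the synaptic weights of a feed-forward network. The sketch below is a minimal illustration of that general idea only; it is not the thesis's formalism and does not include the anisotropic extension. The network shape, perturbation scheme, cooling schedule, and all hyperparameters (T0, cooling, steps) are hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(weights, X):
    """Single-hidden-layer feed-forward pass with tanh activation."""
    W1, W2 = weights
    return np.tanh(X @ W1) @ W2

def loss(weights, X, y):
    """Mean squared error over the training set."""
    return np.mean((forward(weights, X) - y) ** 2)

def synaptic_annealing(X, y, hidden=8, T0=1.0, cooling=0.995, steps=5000):
    """Select network weights by simulated annealing (illustrative sketch)."""
    n_in, n_out = X.shape[1], y.shape[1]
    weights = [rng.normal(0, 1, (n_in, hidden)),
               rng.normal(0, 1, (hidden, n_out))]
    best, best_loss = [w.copy() for w in weights], loss(weights, X, y)
    current_loss, T = best_loss, T0
    for _ in range(steps):
        # Propose a perturbed weight configuration; here the perturbation
        # scale is tied to the temperature, one common (assumed) choice.
        proposal = [w + rng.normal(0, T, w.shape) for w in weights]
        new_loss = loss(proposal, X, y)
        # Metropolis acceptance: always take improvements, and accept a
        # worse configuration with probability exp(-delta / T).
        if (new_loss < current_loss
                or rng.random() < np.exp((current_loss - new_loss) / T)):
            weights, current_loss = proposal, new_loss
            if current_loss < best_loss:
                best, best_loss = [w.copy() for w in weights], current_loss
        T *= cooling  # geometric cooling schedule (one of many options)
    return best, best_loss

# Tiny usage example on a synthetic regression problem (illustrative only).
X = rng.uniform(-1, 1, (64, 2))
y = np.sin(X[:, :1] * X[:, 1:])
trained, err = synaptic_annealing(X, y)
print(f"final training MSE: {err:.4f}")
```

Because the search relies only on loss evaluations, not gradients, it can in principle escape the local minima that trap back-propagation, at the cost of many more forward passes.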

AFIT Designator

AFIT-ENG-MS-16-J-060

DTIC Accession Number

AD1054216
