Date of Award

9-1999

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Department of Mathematics and Statistics

First Advisor

Dennis W. Quinn, PhD

Abstract

Reducing a neural network's complexity improves its ability to generalize to future examples. Like an overfitted regression function, a neural network may miss its target because of the excess degrees of freedom stored in unnecessary parameters. Over the past decade, research on network pruning has produced non-statistical algorithms such as Skeletonization, Optimal Brain Damage, and Optimal Brain Surgeon, which remove the connections with the least salience. Views conflict as to whether more than one parameter can be removed at a time. The methods proposed in this research use statistical multiple comparison procedures to remove multiple parameters from the model when no significant difference exists among them. While computationally intensive, the Tukey-Kramer method compares well with Optimal Brain Surgeon in both pruning and network performance. When the Tukey-Kramer method's sampling requirements become inefficient, Weibull distribution theory alleviates the computational burden of bootstrap resampling by substituting single-sample analysis, while maintaining comparable network performance.
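
To make the pruning idea concrete, the following is a minimal, hypothetical Python sketch, not the dissertation's code: it applies statsmodels' Tukey HSD routine (which handles unequal group sizes in the Tukey-Kramer manner) to toy bootstrap samples of weight estimates, and prunes together every weight whose comparison with a zero-reference group finds no significant difference. The zero-reference framing, variable names, and data are illustrative assumptions.

```python
from itertools import combinations

import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
B = 50  # bootstrap replicates per weight (illustrative)

# Toy bootstrap distributions standing in for resampled weight
# estimates: two weights centered at zero, one clearly nonzero,
# and a zero-reference group to compare against (an assumption
# made here for demonstration).
samples = {
    "zero_ref": rng.normal(0.00, 0.05, B),
    "w1":       rng.normal(0.00, 0.05, B),
    "w2":       rng.normal(0.00, 0.05, B),
    "w3":       rng.normal(0.90, 0.05, B),
}

values = np.concatenate(list(samples.values()))
groups = np.repeat(list(samples.keys()), B)

# Tukey-Kramer pairwise comparisons at a 5% familywise error rate.
result = pairwise_tukeyhsd(values, groups, alpha=0.05)

# Comparisons are reported in combinations order over the sorted
# group labels; prune every weight whose comparison with the zero
# reference fails to reject equality of means.
pairs = list(combinations(result.groupsunique, 2))
prunable = [a if b == "zero_ref" else b
            for (a, b), rej in zip(pairs, result.reject)
            if "zero_ref" in (a, b) and not rej]
print(result.summary())
print("prune together:", prunable)  # expected here: ['w1', 'w2']
```

In the dissertation's setting, the per-weight samples would presumably come from bootstrap resampling of the trained network rather than the synthetic draws above; the sketch only shows how a multiple comparison procedure can justify removing several parameters simultaneously.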

AFIT Designator

AFIT-DS-ENC-99-02

DTIC Accession Number

ADA368072

Comments

The author's Vita page has been removed.
