Doctor of Philosophy (PhD)
Department of Mathematics and Statistics
Dennis W. Quinn, PhD
Reducing a neural network's complexity improves its ability to generalize to future examples. Like an overfitted regression function, a neural network may miss its target because of the excess degrees of freedom stored in unnecessary parameters. Over the past decade, research on pruning networks has produced non-statistical algorithms such as Skeletonization, Optimal Brain Damage, and Optimal Brain Surgeon, which remove the connections with the least salience. Views conflict on whether more than one parameter can be removed at a time. The methods proposed in this research use statistical multiple comparison procedures to remove several parameters at once when no significant difference exists among them. Although computationally intensive, the Tukey-Kramer method compares well with Optimal Brain Surgeon in both pruning and network performance. When the Tukey-Kramer method's bootstrap sampling requirements become prohibitive, Weibull distribution theory alleviates the computational burden by replacing bootstrap resampling with single-sample analysis, while maintaining comparable network performance.
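The abstract's central idea, using a multiple comparison procedure to prune a group of weights whose saliences are not significantly different from the least salient one, can be sketched as follows. This is a minimal illustration, not the dissertation's implementation: the weight names, salience samples, and the use of SciPy's studentized-range distribution for the Tukey-Kramer critical value are all assumptions made for the example.

```python
import numpy as np
from scipy.stats import studentized_range

def tukey_kramer_prune(samples, alpha=0.05):
    """Find weights whose mean salience does not differ significantly
    from the least salient weight (Tukey-Kramer procedure).

    samples: dict mapping weight name -> 1-D array of bootstrap
             salience estimates for that weight (hypothetical data).
    Returns the set of weight names judged prunable together.
    """
    names = list(samples)
    k = len(names)
    means = {w: samples[w].mean() for w in names}
    ns = {w: len(samples[w]) for w in names}
    # Pooled within-group variance (MSE) and its degrees of freedom.
    df = sum(ns[w] - 1 for w in names)
    mse = sum(((samples[w] - means[w]) ** 2).sum() for w in names) / df
    # Critical value from the studentized range distribution.
    q_crit = studentized_range.ppf(1 - alpha, k, df)
    low = min(names, key=means.get)  # least-salient weight
    prunable = {low}
    for w in names:
        if w == low:
            continue
        # Tukey-Kramer allowance, valid for unequal group sizes.
        hsd = q_crit / np.sqrt(2) * np.sqrt(mse * (1 / ns[w] + 1 / ns[low]))
        if abs(means[w] - means[low]) <= hsd:
            prunable.add(w)  # not significantly more salient: prune as well
    return prunable

rng = np.random.default_rng(0)
# Hypothetical bootstrap saliences: w1 and w2 near zero, w3 clearly larger.
samples = {
    "w1": rng.normal(0.01, 0.02, 30),
    "w2": rng.normal(0.015, 0.02, 30),
    "w3": rng.normal(0.50, 0.02, 30),
}
print(sorted(tukey_kramer_prune(samples)))
```

Here the salient weight `w3` survives while the statistically indistinguishable low-salience weights can be removed in one step, which is the contrast with one-at-a-time methods such as Optimal Brain Surgeon.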
Duckro, Donald E., "Multiple Comparison Pruning of Neural Networks." (1999). Theses and Dissertations. 5116.