Doctor of Philosophy (PhD)
Department of Operational Sciences
Kenneth W. Bauer, Jr., PhD
This research advances feature and model selection for feedforward neural networks. Feature selection involves determining a good feature subset from a set of candidate features. Model selection involves determining an appropriate architecture (number of middle nodes) for the neural network. Specific advances are made in neural network feature saliency metrics used for evaluating or ranking features, in the statistical identification of irrelevant noisy features, and in the statistical investigation of reduced neural network architectures and reduced feature subsets. New feature saliency metrics are presented which provide a more succinct quantitative measure of a feature's importance than other similar metrics. A catalogue of feature saliency metric definitions and interrelationships is also developed, consolidating the set of available metrics for the neural network practitioner. The statistical screening procedure developed for identifying noisy features statistically compares the saliency of each candidate feature with the saliency of a known noisy feature. The neural network selection algorithms are developed by posing the neural network model as a nonlinear regression statistical model and using the likelihood ratio test statistic within a backwards sequential procedure to search for a parsimonious model with equivalent prediction accuracy. Additionally, a comprehensive statistically based methodology is developed for identifying both a good feature set and an appropriate neural network architecture for a specific situation.
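As a rough illustration of the noise-injection screening idea described above, the sketch below computes a simple weight-based feature saliency for a trained two-layer network and flags candidate features whose saliency does not exceed that of a deliberately injected noise feature. The saliency formula used here (sum over hidden nodes of |w_ij|·|v_j|) is one common member of the metric family, not necessarily the dissertation's exact definition, and the single-threshold comparison is a simplified stand-in for its statistical test.

```python
# Hypothetical sketch: weight-based feature saliency plus noise-feature
# screening. The saliency definition and all names here are assumptions
# for illustration, not the dissertation's exact formulation.

def saliency(input_to_hidden, hidden_to_output):
    """Saliency of input i: sum over hidden nodes j of |w_ij| * |v_j|."""
    n_inputs = len(input_to_hidden)
    n_hidden = len(input_to_hidden[0])
    return [
        sum(abs(input_to_hidden[i][j]) * abs(hidden_to_output[j])
            for j in range(n_hidden))
        for i in range(n_inputs)
    ]

def screen_features(saliencies, noise_index):
    """Flag candidate features whose saliency does not exceed that of the
    injected noise feature (a simplified, non-statistical stand-in for the
    dissertation's statistical comparison of saliencies)."""
    threshold = saliencies[noise_index]
    return [i for i in range(len(saliencies))
            if i != noise_index and saliencies[i] <= threshold]

# Toy trained network: 4 inputs (index 3 is the injected noise feature),
# 2 middle (hidden) nodes, 1 output.
W = [[0.90, -1.20],   # feature 0: strongly connected
     [0.05,  0.02],   # feature 1: weakly connected (candidate for removal)
     [1.10,  0.70],   # feature 2: strongly connected
     [0.08,  0.04]]   # feature 3: injected noise feature
v = [1.5, -0.8]

s = saliency(W, v)
print(screen_features(s, noise_index=3))  # -> [1]: only feature 1 is flagged
```

In the dissertation's actual procedure the comparison is statistical (saliencies are compared across training replications rather than by a single point threshold), but the structure is the same: any feature no more salient than known noise is a candidate for removal.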
Steppe, Jean M., "Feature and Model Selection in Feedforward Neural Networks" (1994). Theses and Dissertations. 6586.