Date of Award

6-1994

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Department of Operational Sciences

First Advisor

Kenneth W. Bauer, Jr., PhD

Abstract

This research advances feature and model selection for feedforward neural networks. Feature selection involves determining a good feature subset given a set of candidate features. Model selection involves determining an appropriate architecture (number of middle nodes) for the neural network. Specific advances are made in neural network feature saliency metrics used for evaluating or ranking features, statistical identification of irrelevant noisy features, and statistical investigation of reduced neural network architectures and reduced feature subsets. New feature saliency metrics are presented which provide a more succinct quantitative measure of a feature's importance than other similar metrics. A catalogue of feature saliency metric definitions and interrelationships is also developed which consolidates the set of available metrics for the neural network practitioner. The statistical screening procedure developed for identifying noisy features involves statistically comparing the saliency of candidate features with the saliency of a known noisy feature. The neural network selection algorithms are developed by posing the neural network model as a nonlinear regression statistical model and using the likelihood ratio test statistic within a backward sequential procedure to search for a parsimonious model with equivalent prediction accuracy. Additionally, a comprehensive statistically based methodology is developed for identifying both a good feature set and an appropriate neural network architecture for a specific situation.
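
The sketch below is an illustrative reading of the noise-feature screening idea described in the abstract, not the dissertation's exact procedure. It injects a known Gaussian noise feature, computes a simple weight-based saliency for each input over several random restarts, and uses a paired one-sided t-test to flag candidate features whose saliency is not distinguishable from that of the noise feature. The weight-product saliency, the tiny gradient-descent network, and the 0.05 threshold are all assumptions chosen for brevity.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def train_net(X, y, hidden=5, lr=0.05, epochs=2000):
    """Train a one-hidden-layer sigmoid network by batch gradient descent (MSE loss)."""
    n, p = X.shape
    W1 = rng.normal(scale=0.5, size=(p, hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    for _ in range(epochs):
        H = 1.0 / (1.0 + np.exp(-X @ W1))          # hidden-layer activations
        err = H @ W2 - y                            # residuals of the linear output
        gW2 = H.T @ err / n                         # gradient wrt hidden-to-output weights
        gW1 = X.T @ ((err @ W2.T) * H * (1 - H)) / n  # backpropagated input-layer gradient
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2

def weight_saliency(W1, W2):
    """Saliency of each input: sum over hidden nodes of |w_ij| * |v_j| (an assumed metric)."""
    return (np.abs(W1) * np.abs(W2).T).sum(axis=1)

# Synthetic data: the response depends on features 0 and 1; feature 2 is irrelevant.
n = 200
X = rng.normal(size=(n, 3))
y = np.sin(X[:, [0]]) + 0.5 * X[:, [1]] + 0.1 * rng.normal(size=(n, 1))

# Append a known Gaussian noise feature to serve as the screening benchmark.
X_aug = np.hstack([X, rng.normal(size=(n, 1))])

# Saliencies across several random restarts provide a sample for the comparison.
saliencies = np.array([weight_saliency(*train_net(X_aug, y)) for _ in range(10)])
noise_sal = saliencies[:, -1]

# Paired one-sided test: is each candidate feature significantly more salient than noise?
for j in range(X.shape[1]):
    t, p_two = stats.ttest_rel(saliencies[:, j], noise_sal)
    p_one = p_two / 2 if t > 0 else 1 - p_two / 2
    verdict = "keep" if p_one < 0.05 else "screen out as noisy"
    print(f"feature {j}: mean saliency {saliencies[:, j].mean():.3f}, p={p_one:.3f} -> {verdict}")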

AFIT Designator

AFIT-DS-ENS-94-1

DTIC Accession Number

ADA280709

Comments

The author's Vita page is omitted.
