Date of Award

6-1995

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Department of Mathematics and Statistics

First Advisor

Mark E. Oxley, PhD

Abstract

Researchers rely on the mathematics of Vapnik and Chervonenkis to capture quantitatively the capabilities of specific artificial neural network (ANN) architectures. The quantifier is known as the V-C dimension and is defined on sets of functions or sets. Its value is the largest cardinality ℓ of a set of vectors in R^d such that there is at least one set of vectors of cardinality ℓ for which every dichotomy of that set into two subsets can be implemented by some function or set in the class. Stated another way, the V-C dimension of a set of functions is the largest cardinality such that there exists one set of that cardinality which can be shattered by the set of functions. A set of functions is said to shatter a set if each dichotomy of that set can be implemented by some function in the set. There is an abundance of research on determining the value of the V-C dimension of ANNs. In this document, research on the V-C dimension is refined and extended, yielding formulas for evaluating the V-C dimension of the set of functions representable by a feed-forward, single hidden-layer perceptron artificial neural network. The fundamental thesis of this research is that the V-C dimension is not an appropriate quantifier of ANN capabilities.
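
The shattering definition in the abstract can be illustrated with a small brute-force check. The sketch below is not part of the dissertation: the helper names `separable` and `shattered`, the use of scipy.optimize.linprog as a feasibility test, and the 2-D example points are all illustrative assumptions. It asks whether every dichotomy of a small point set in R^2 can be implemented by a single linear threshold unit, the single-neuron case of the perceptron networks considered here.

```python
# Minimal sketch (illustrative, not from the dissertation): brute-force test of
# whether single linear threshold units in R^2 shatter a given point set.
from itertools import product
import numpy as np
from scipy.optimize import linprog

def separable(points, labels):
    """Return True if some (w, b) realizes the labeling, i.e.
    labels[i] = +1 where w . x_i + b >= 0 and -1 otherwise."""
    pts = np.asarray(points, dtype=float)
    y = np.asarray(labels, dtype=float)
    n, d = pts.shape
    # Feasibility LP: find (w, b) with y_i * (w . x_i + b) >= 1 for all i,
    # written as A_ub @ [w, b] <= b_ub for scipy's linprog.
    A_ub = -y[:, None] * np.hstack([pts, np.ones((n, 1))])
    b_ub = -np.ones(n)
    res = linprog(c=np.zeros(d + 1), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * (d + 1))
    return res.success

def shattered(points):
    """True iff every dichotomy of the point set is realized by some
    linear threshold unit, i.e. the set is shattered."""
    return all(separable(points, labs)
               for labs in product([1, -1], repeat=len(points)))

# Three points in general position in R^2 can be shattered ...
print(shattered([(0, 0), (1, 0), (0, 1)]))          # True
# ... but the four corners of a square cannot (the XOR dichotomy fails).
print(shattered([(0, 0), (1, 0), (0, 1), (1, 1)]))  # False
```

Under these assumptions, the first call reports that three points in general position are shattered while the four corners of a square are not, matching the well-known value of 3 for the V-C dimension of linear threshold units on R^2.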

AFIT Designator

AFIT-DS-ENC-95J-01

DTIC Accession Number

ADA297408
