Degree Name

Master of Science


Department

Department of Operational Sciences

First Advisor

Matthew J. Robbins, PhD


Abstract

We develop a Markov decision process (MDP) model to examine military medical evacuation (MEDEVAC) dispatch policies in a combat environment. The problem of deciding which aeromedical asset to dispatch to which service request is complicated by the threat conditions at the service locations and the priority class of each casualty event. MEDEVAC requests arrive sequentially, with the location and priority of each casualty known upon arrival. The United States military uses a 9-line MEDEVAC request system that classifies casualties into three priority levels. Depending on the threat level indicated by the request, an armed escort may be required. The proposed MDP model indicates how to optimally dispatch aeromedical helicopters to casualty events in order to maximize steady-state system utility. Utility depends on the number of casualties, their priority classes, and the locations of the MEDEVAC units and casualty events. Instances of the dispatching problem are solved using a value iteration dynamic programming algorithm. Computational examples investigate optimal dispatch policies under different threat situations and potential armed escort delays.
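The abstract names value iteration as the solution method. As a point of reference only, the sketch below shows a generic discounted-reward value-iteration routine for a finite MDP; the thesis's actual MEDEVAC state space, transition probabilities, and utility function are not reproduced here, and the arrays P and R and the discount factor gamma are hypothetical placeholders.

```python
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """Solve a finite MDP by value iteration.

    P     : (S, A, S) array of transition probabilities P[s, a, s']
    R     : (S, A) array of expected one-step rewards (utilities)
    gamma : discount factor in (0, 1)
    Returns the optimal value function V and a greedy policy.
    """
    S, A, _ = P.shape
    V = np.zeros(S)
    while True:
        # Bellman backup: Q[s, a] = R[s, a] + gamma * sum_s' P[s, a, s'] * V[s']
        Q = R + gamma * (P @ V)      # shape (S, A)
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)
        V = V_new

# Tiny hypothetical instance: 3 states, 2 dispatch actions, random data.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    P = rng.dirichlet(np.ones(3), size=(3, 2))  # row-stochastic transitions
    R = rng.uniform(size=(3, 2))                # placeholder utilities
    V, policy = value_iteration(P, R)
    print("Optimal values:", V)
    print("Greedy dispatch action per state:", policy)
```

In the dispatching setting described above, each state would encode the status of the MEDEVAC units together with the location and priority of the outstanding request, and each action would select which unit, if any, to dispatch.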
