Date of Award

3-2022

Document Type

Thesis

Degree Name

Master of Science in Operations Research

Department

Department of Operational Sciences

First Advisor

Phillip R. Jenkins, PhD

Abstract

Members of the armed forces greatly rely on having an effective and efficient medical evacuation (MEDEVAC) process for evacuating casualties from the battlefield to medical treatment facilities (MTFs) during combat operations. This thesis examines the MEDEVAC dispatching problem and seeks to determine an optimal policy for dispatching a MEDEVAC unit, if any, when a 9-line MEDEVAC request arrives, taking into account triage classification errors and the possibility of having blood transfusion kits on board select MEDEVAC units. A discounted, infinite-horizon, continuous-time Markov decision process (MDP) model is formulated to examine this problem and to compare the generated dispatching policies to the myopic policy of sending the closest available unit. We utilize an approximate dynamic programming (ADP) technique that leverages a random forest value function approximation within an approximate policy iteration algorithmic framework to develop high-quality policies for both a small-scale problem instance and a large-scale problem instance that cannot be solved to optimality. A representative planning scenario involving joint combat operations in South Korea is developed and utilized to investigate the differences between the various policies. Results from the analysis indicate that applying ADP techniques can improve current practices by as much as 29% with regard to a life-saving performance metric. This research is of particular interest to the military medical community and can inform the procedures of future military MEDEVAC operations.
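The approximate policy iteration loop described in the abstract — simulate the current policy to collect value samples, fit a random forest to those samples, then improve the policy by a one-step lookahead against the fitted values — can be illustrated with a toy sketch. Everything below is invented for demonstration and is not the thesis model: the problem data (two units, three request zones, response times, rewards) are made up, and a discrete-time stand-in replaces the thesis's continuous-time MDP. It shows only the shape of the algorithm.

```python
# Toy sketch of approximate policy iteration with a random forest value
# function approximation. All problem data are hypothetical; the thesis's
# continuous-time MEDEVAC MDP is replaced by a discrete-time stand-in.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

N_UNITS, N_ZONES, GAMMA = 2, 3, 0.95
# RESP[u][z]: invented response time of unit u to a request in zone z.
RESP = np.array([[1.0, 3.0, 5.0],
                 [5.0, 3.0, 1.0]])
BUSY_STEPS = 2                          # steps a dispatched unit stays busy
ACTIONS = list(range(N_UNITS)) + [None] # dispatch unit u, or decline

def step(state, action):
    """One transition. state = (busy_0, busy_1, zone of current request)."""
    busy, zone = list(state[:N_UNITS]), state[N_UNITS]
    reward = 0.0
    if action is not None and busy[action] == 0:
        reward = 10.0 - RESP[action][zone]  # faster response -> higher reward
        busy[action] = BUSY_STEPS
    busy = [max(b - 1, 0) for b in busy]
    return (*busy, int(rng.integers(N_ZONES))), reward  # next request arrives

def feats(state):
    return np.array(state, dtype=float)

def greedy_action(state, vfa):
    """Policy improvement: one-step lookahead against the fitted values."""
    feas = [a for a in ACTIONS if a is None or state[a] == 0]
    nexts, rewards = [], []
    for a in feas:
        ns, r = step(state, a)              # one sampled transition per action
        nexts.append(feats(ns)); rewards.append(r)
    q = np.array(rewards) + GAMMA * vfa.predict(np.array(nexts))
    return feas[int(np.argmax(q))]

# Bootstrap the forest on zero targets so iteration 1 reduces to the
# myopic baseline (maximize immediate reward = send the closest free unit).
vfa = RandomForestRegressor(n_estimators=10, random_state=0)
X0 = np.array([feats((i % 3, (i // 3) % 3, i % N_ZONES)) for i in range(9)])
vfa.fit(X0, np.zeros(len(X0)))

for _ in range(2):                          # approximate policy iteration
    X, y = [], []
    for _ in range(100):                    # policy evaluation by simulation
        s0 = s = (0, 0, int(rng.integers(N_ZONES)))
        G, disc = 0.0, 1.0
        for _ in range(10):                 # truncated return as value sample
            s, r = step(s, greedy_action(s, vfa))
            G += disc * r
            disc *= GAMMA
        X.append(feats(s0)); y.append(G)
    vfa = RandomForestRegressor(n_estimators=10, random_state=0)
    vfa.fit(np.array(X), np.array(y))       # refit the value approximation
```

In the actual thesis, the evaluation step would sample the continuous-time dynamics of 9-line request arrivals and service completions, and the state would carry far richer information (triage classification, blood-kit availability), but the evaluate-fit-improve structure is the same.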

AFIT Designator

AFIT-ENS-MS-22-M-166

DTIC Accession Number

AD1173047
