Abstract: With the advent of nuclear weapons in the middle of the 20th century, the United States and other world powers have pursued nuclear nonproliferation and counterproliferation policies. These efforts address a wide variety of issues, including preventing the spread of nuclear materials and nuclear weapons technology to countries that do not yet have them, and neutralizing or mitigating threats posed by those who have illegitimately obtained, or are in the process of obtaining, nuclear materials or a nuclear device. The aim of this dissertation is to propose statistical methods that address technical challenges in implementing nonproliferation and counterproliferation policy. We focus on three problems: urban nuclear source localization, arms-control monitoring, and proliferation detection in dual-use scientific research (research that can be used for military or civilian applications). Effective urban source localization requires constructing a configuration of radiation detectors that is optimal for estimating the source location. We propose constructing a Gaussian process surrogate model of the mutual information design criterion used for configuration selection. We demonstrate that optimizing this inexpensive surrogate model leads to effective source localization over the space of all possible configurations for a simulated source localization scenario in downtown Washington, DC. For the arms-control monitoring problem, we propose a novel Bayesian hidden Markov model (HMM) that represents both a concealed arms-development process being monitored and the observable data emitted by the process. Our model reparameterizes the HMM transition probabilities in terms of activity durations, allowing us to answer key questions about the monitored process. We also propose a novel framework for incorporating subject-matter expertise about the process and potential noise into the inference.
We demonstrate the proposed approach on an underground explosives test case study. To address the dual-use technology problem, we propose two new variational inference loss functions for fitting a deep learning text classification model with uncertainty quantification. The framework is computationally tractable for large models and meets important uncertainty quantification objectives, including producing predicted class probabilities that reflect our prior conception of how different classes are related.
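To give a flavor of the surrogate-modeling idea for detector placement, the sketch below fits a Gaussian process to a small number of evaluations of an expensive design criterion and then optimizes the cheap surrogate instead. This is a hypothetical illustration, not the dissertation's actual implementation: the `mutual_information` function here is a stand-in placeholder for the real mutual information criterion (which would be estimated, e.g., by Monte Carlo over simulated source locations), and a single detector's 2-D coordinates stand in for a full detector configuration.

```python
# Hedged sketch: Gaussian process surrogate of an expensive design criterion.
# All names below are illustrative assumptions, not the dissertation's code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def mutual_information(config):
    # Placeholder for the expensive mutual information criterion; in the
    # real problem this would involve simulating detector responses.
    return -np.sum((config - 0.5) ** 2)

# Evaluate the criterion at a small budget of candidate configurations
# (here: 2-D coordinates of one detector, for illustration only).
X_train = rng.uniform(0, 1, size=(30, 2))
y_train = np.array([mutual_information(x) for x in X_train])

# Fit the inexpensive Gaussian process surrogate to those evaluations.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
gp.fit(X_train, y_train)

# Optimize the surrogate (not the expensive criterion) over a dense set
# of candidate configurations.
grid = rng.uniform(0, 1, size=(5000, 2))
best = grid[np.argmax(gp.predict(grid))]
```

In practice the surrogate would be optimized with a proper search routine rather than a random grid; the point is that each surrogate evaluation is cheap, so the configuration space can be explored far more thoroughly than with the raw criterion.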
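The duration reparameterization of HMM transition probabilities can be sketched in miniature. Under the standard geometric dwell-time assumption, a state with self-transition probability p has expected duration 1/(1 − p), so a target duration d maps to p = 1 − 1/d. The left-to-right activity structure and the helper below are illustrative assumptions, not the dissertation's actual Bayesian model.

```python
# Hedged sketch: build a left-to-right HMM transition matrix from expected
# activity durations, using the geometric dwell-time identity p = 1 - 1/d.
import numpy as np

def transition_matrix(durations):
    """Illustrative helper: map expected durations (in time steps) of
    sequential activities to an HMM transition matrix."""
    k = len(durations)
    A = np.zeros((k, k))
    for i, d in enumerate(durations):
        stay = 1.0 - 1.0 / d  # self-transition probability for duration d
        if i + 1 < k:
            A[i, i] = stay
            A[i, i + 1] = 1.0 - stay  # advance to the next activity
        else:
            A[i, i] = 1.0  # final activity is absorbing
    return A

# Three sequential activities expected to last 10, 5, and 20 time steps.
A = transition_matrix([10.0, 5.0, 20.0])
```

Placing priors directly on the durations, rather than on raw transition probabilities, is what makes it natural to encode subject-matter expertise such as "this activity typically takes about ten time steps."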