Reviewed by
Walid Nasrallah
American University of Beirut
The introduction, subtitled “Contemplating a National Strategy for Modeling and Simulation”, offers a case study in how not to advocate for a cause. The author overstates the expense of the desired outcome by comparing it to the space race of the 1960s. Next, the proposed benefits of the program are undersold through an over-generalized and diffuse classification into “global” and “domestic” tentative use cases. The coup de grâce amounts to a self-insult: the first concrete step in this “National Strategy” consists of helping the field of “Modeling and Simulation” define itself as a legitimate academic discipline. Would any venture capitalist invest in a start-up that has a nice patent, but which asks for more money than it needs (because that is how much you paid for Facebook), does not know its value proposition to a user, and hopes the venture capitalist can tell it what to make and then somehow convince people to buy it?
The next chapter is a cogent reminder of the need to consult a dictionary when picking up an obscure term from a book or article. The author starts off with a decent description of von Neumann–Morgenstern decision theory, noting correctly that its many present-day adherents use the adjective “prescriptive” to qualify its intended uses (Howard, 1988). The meaning of the term “prescriptive” and of its synonym (in this sense) “normative” is that the theory tells decision makers how they ought to act if they intend to adhere to rules that, on the surface, appear to be the only guarantee of internal consistency. The author may have neglected to keep this definition in mind, since, upon bringing up Kahneman and Tversky’s (1979) decision science (Prospect Theory; see also Kahneman, 2013), he emphatically notes that decisions observed and described by researchers do not follow the formulas of normative decision theory. Descriptive and normative theories are not meant to match; otherwise they would not need to be studied separately. Merely knowing the title of a popular exposition of behavioral decision research, “Predictably Irrational” (Ariely, 2010), and contemplating the distinction between predictability (which makes Prospect Theory a science) and rationality (which makes it a different branch of science from normative decision theory) would have saved this chapter from being written. The rest of the chapter focuses on the domain of military decision making and delves into the details of an unjustifiably complex generalization of Prospect Theory without any thought to the combinatorial repercussions of the multiplicity of variable types. The chapter ends with a sample problem with four decisions. None of the decisions is presented in game-theoretic form, i.e. with a perceived benefit from knowing what actions an adversary might take. An astute reader might use this as an example of precisely where a concise decision-theoretic model would do the most good.
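To make the prescriptive/descriptive gap concrete, here is a sketch of my own (not anything from the chapter) that values the same pair of prospects two ways: by expected value, standing in for the normative theory, and by Prospect Theory’s value and probability-weighting functions, using the median parameters Tversky and Kahneman later fitted in the cumulative version of the theory (α ≈ 0.88, λ ≈ 2.25, γ ≈ 0.61):

```python
def v(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, steeper for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def w(p, gamma=0.61):
    """Inverse-S probability weighting: overweights small p, underweights large p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def expected_value(prospect):
    """Normative stand-in: value a list of (probability, outcome) pairs linearly."""
    return sum(p * x for p, x in prospect)

def pt_value(prospect):
    """Descriptive model: weight the probabilities and curve the outcomes."""
    return sum(w(p) * v(x) for p, x in prospect)

sure_thing = [(1.0, 50)]
gamble = [(0.5, 100), (0.5, 0)]

assert expected_value(sure_thing) == expected_value(gamble)  # normatively tied
assert pt_value(sure_thing) > pt_value(gamble)               # descriptively not
```

An expected-value maximizer is indifferent between the two prospects; the descriptive model predicts the sure thing is chosen, which is what experimental subjects actually do. The two theories answer different questions, and neither is embarrassed by the other.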
Let us skip to chapter six, entitled “Business Process Modeling”. Its main error lies in underestimating the difficulty of validating a model. The authors correctly describe how rare it is for an initial iteration of model building to match observations of past behavior. But, once a past behavior is reasonably approximated, they consider it full steam ahead. This may hold true in mathematical modeling, where inference by induction can be made as rigorous as one wants. In statistical modeling, there are well-known error sources that an experienced statistician can point to and attempt to alleviate. But in simulation, the model builder has the freedom to make the model do anything whatsoever. There is only one form of learning that can come from a model built purely for simulation: if the simulation displays a behavior, then the model builder knows that the assumptions of the model are sufficient to account for that behavior. Simulation cannot tell you which assumptions are necessary, only that they are sufficient; conflating the two is the source of many novice simulation errors. To their credit, the authors do describe Dynamic Systems modeling, which is purely mathematical, but by subsuming this description among simulation methodologies, they short-change a student reader’s understanding of both types of modeling.
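The sufficiency/necessity distinction deserves a concrete illustration (mine, not the authors’): two unrelated sets of assumptions, a Gibrat-style multiplicative growth model and a preferential-attachment process, both reproduce the same observed behavior, a right-skewed size distribution. Matching that observation therefore validates neither set of assumptions as the necessary one.

```python
import random
import statistics

random.seed(1)

def multiplicative_growth(n_firms=2000, steps=50):
    """Model A: every firm's size is multiplied by an independent random
    factor each period (Gibrat-style growth)."""
    sizes = [1.0] * n_firms
    for _ in range(steps):
        sizes = [s * random.uniform(0.9, 1.12) for s in sizes]
    return sizes

def preferential_attachment(n_units=5000, p_new=0.05):
    """Model B: each arriving unit founds a new firm with probability p_new,
    otherwise it joins an existing firm chosen proportionally to firm size
    (picking a uniformly random existing unit achieves exactly that)."""
    units = [0]                # units[k] = index of the firm unit k belongs to
    n_firms = 1
    for _ in range(n_units):
        if random.random() < p_new:
            units.append(n_firms)
            n_firms += 1
        else:
            units.append(random.choice(units))
    sizes = [0] * n_firms
    for f in units:
        sizes[f] += 1
    return sizes

# Two very different generative assumptions, one observed behavior:
for sizes in (multiplicative_growth(), preferential_attachment()):
    assert statistics.mean(sizes) > statistics.median(sizes)  # right-skewed
```

A model builder who stops at “my simulation reproduces the skew” has learned only that his assumptions suffice, not that they are the ones operating in the real world.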
Chapter seven is about finite element analysis, which hardly belongs under the same topic: yes, it simulates reality, but, like Newtonian physics, the simulation is close to exact, and, like quantum physics, it can be made arbitrarily exact along any single dimension.
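For readers unfamiliar with what “arbitrarily exact” means in practice, here is a sketch of my own devising: mesh refinement for the one-dimensional model problem −u″ = sin(πx), u(0) = u(1) = 0, discretized with linear elements and a lumped load vector (algebraically the same as central differences), solved with the Thomas algorithm:

```python
import math

def solve_poisson(n):
    """Solve -u'' = sin(pi x), u(0)=u(1)=0, on n equal intervals with linear
    elements and a lumped load, via the Thomas (tridiagonal) algorithm."""
    h = 1.0 / n
    # Interior nodes x_1..x_{n-1}; stiffness matrix is (1/h) * tridiag(-1, 2, -1).
    a = [-1.0 / h] * (n - 1)   # sub-diagonal
    b = [2.0 / h] * (n - 1)    # diagonal
    c = [-1.0 / h] * (n - 1)   # super-diagonal
    d = [h * math.sin(math.pi * (i + 1) * h) for i in range(n - 1)]  # load
    for i in range(1, n - 1):            # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * (n - 1)                  # back substitution
    u[-1] = d[-1] / b[-1]
    for i in range(n - 3, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return u

def max_error(n):
    """Maximum nodal error against the exact solution sin(pi x) / pi^2."""
    u = solve_poisson(n)
    exact = lambda x: math.sin(math.pi * x) / math.pi ** 2
    return max(abs(u[i] - exact((i + 1) / n)) for i in range(n - 1))

# Halving the mesh size cuts the error by roughly 4x (second-order accuracy):
print(max_error(20), max_error(40))
```

Refine the mesh and the error shrinks predictably, which is exactly the sense in which this material is closer to numerical analysis than to the behavioral modeling occupying the rest of the book.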
The final chapter deals with a definition. In true scholarly form, the authors cite multiple sources for definitions of the term in question, “Interoperability”. Unfortunately, the extent to which the multiple wordings of these definitions indicate any divergence of opinion is greatly exaggerated. The chapter mounts a “straw man” attack on a problem that seems manufactured: the interoperability of interoperability definitions. A semantic equivalent of the whole chapter can probably be found in a fragment of the first sentence of its summary: “[…] interoperability […] is of great importance in the military domain.” This importance is brought home by the multiplicity of US DoD acronyms used in the chapter, probably enough to get a reader into trouble, but not enough to get out.
To wrap up, the “Handbook” as a whole does a fine job of enumerating techniques (and their military acronyms), but offers no insight into their distinctions, strengths, weaknesses, or incompatibilities. The techniques are organized under distinct domains of use, but there is much redundancy and no cross-referencing. One gets the impression that the authors were each handed a word-count goal along with a randomly shuffled topic whose basic tenets they misconstrue. The articles that came out would not survive a day’s scrutiny on a crowd-sourced publication venue like Wikipedia, nor get any up-votes on an interactive medium like Quora or Reddit.
Readers interested in actually learning about real-world applications of modeling and simulation would get more structure from a keyword web search. A novice might precede the keyword search by internalizing some basic definitions, conspicuously missing from the “Handbook”: A model is a way of selectively organizing information to pose a specific set of questions about the real world. Advance knowledge of the questions you wish to pose helps keep the models from becoming impossibly complex. One way of “solving” a model to obtain answers to questions is simulation. Simulation can be variable-based and stochastic or agent-based (also stochastic). Social networks are one of many ways to organize agents in an agent-based simulation. Simulation can show that a model can give a certain answer (sufficient conditions) but cannot guarantee an explanation of the answer (necessary conditions). For that, you need mathematical solutions. Mathematical ways to solve a model encompass, among others, Game Theory and System Dynamics, and can be discrete or continuous (whereas simulation is always discrete). Even with mathematical models, the mapping of model variables to real-world variables is a challenge in the social sciences, less so in the physical sciences. With that, you can start a productive search of how versatile, useful, and frustrating simulation and modeling can be.
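As a minimal, self-contained illustration of these definitions (my sketch, not the “Handbook”’s), here is one epidemic model “solved” two ways: as variable-based difference equations on population totals, and as a stochastic agent-based simulation built from the same assumptions:

```python
import random

random.seed(42)

# Shared parameters: transmission rate, recovery rate, population, seed cases.
BETA, GAMMA, N, I0, STEPS = 0.3, 0.1, 1000, 20, 200

def aggregate_sir():
    """Variable-based: iterate difference equations on population totals."""
    s, i, r = N - I0, I0, 0.0
    for _ in range(STEPS):
        new_inf = BETA * s * i / N
        new_rec = GAMMA * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return r                          # final epidemic size (recovered)

def agent_sir():
    """Agent-based: every individual carries its own state; the same rates
    become per-agent transition probabilities."""
    state = ['I'] * I0 + ['S'] * (N - I0)
    for _ in range(STEPS):
        p_inf = BETA * state.count('I') / N
        nxt = []
        for st in state:
            if st == 'S' and random.random() < p_inf:
                nxt.append('I')
            elif st == 'I' and random.random() < GAMMA:
                nxt.append('R')
            else:
                nxt.append(st)
        state = nxt
    return state.count('R')

print(aggregate_sir(), agent_sir())   # comparable final sizes, different machinery
```

The two runs answer the same question with comparable final epidemic sizes, but only the agent-based version can be extended with a social network among the agents, and neither run, by itself, proves that its assumptions are the necessary ones.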
ARIELY, D. (2010). Predictably Irrational: The Hidden Forces That Shape Our Decisions. Revised edition. Harper Perennial, New York.
HOWARD, R. A. (1988). Decision Analysis: Practice and Promise. Management Science 34 (6): 679–95. doi:10.1287/mnsc.34.6.679.
KAHNEMAN, D. (2013). Thinking, Fast and Slow. Farrar, Straus and Giroux, New York.
KAHNEMAN, D. and TVERSKY, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica 47 (2): 263–91. doi:10.2307/1914185.