ASD TECHNICAL REPORT 61-276

A SURVEY AND SUMMARY OF MATHEMATICAL AND SIMULATION MODELS AS APPLIED TO WEAPON SYSTEM EVALUATION

RANDALL E. CLINE
OPERATIONS RESEARCH DEPARTMENT
INSTITUTE OF SCIENCE AND TECHNOLOGY
THE UNIVERSITY OF MICHIGAN

OCTOBER 1961

FINAL REPORT
CONTRACT AF 33(616)-7317
PROJECT 5101
TASK 50951

AERONAUTICAL SYSTEMS DIVISION
AIR FORCE SYSTEMS COMMAND
UNITED STATES AIR FORCE
WRIGHT-PATTERSON AIR FORCE BASE, OHIO

200 - December 1961 - 15-623

FOREWORD

Contract AF 33(616)-7317 between the United States Air Force and The University of Michigan extended from 1 May 1960 to 30 June 1961 and was budgeted for the equivalent of one full-time research worker. The contract was monitored by the Aeronautical Systems Division, Directorate of Systems Dynamic Analysis, Synthesis and Analysis Division. Mr. R. H. Sudheimer served as the project engineer.

Work at The University of Michigan was carried out in the Institute of Science and Technology, Operations Research Department, under the direction of Mr. E. H. Smith, Department Head. Mr. R. E. Cline served as principal investigator.

ABSTRACT

The report contains a description of work performed on Contract AF 33(616)-7317. The major objective of the contract was to conduct a survey to obtain information concerning the utilization of mathematical modeling and simulation as techniques for weapon system evaluation, as employed by the principal organizations engaged in this work. The report includes a description of the information collection program, a summary and classification of the information obtained, and conclusions concerning methods by which better information dissemination may be accomplished. Fifty-six uniform abstracts of weapon system modeling programs and an extensive bibliography of published reports are included in the appendices.

From this survey it is concluded that many weapon system evaluation programs are not well documented and that information flow between organizations is generally poor. The following recommendations are made: (1) there is a need for a theoretical examination of the effects of replacing distributions by expected values in digital computer simulation models; (2) there should be further development and utilization of modular digital computer codes; and (3) standards for documenting and abstracting programs should be established which would have tri-service approval and adoption.

TABLE OF CONTENTS

1. Introduction
2. Outline of the Research Program
3. The Survey Program .... 3
4. A Classification of Weapon System Evaluation Programs and Summary of Techniques Employed
5. Remarks on the Conference on Weapon System Evaluation Techniques .... 10
6. Summary and Conclusions .... 11

Appendix A. Summary of Trips and Organizations
Appendix B. Mail Questionnaire .... 16
Appendix C. Abstracts .... 18
Appendix D. Report List .... 133
Appendix E. Summary of Conference on Weapon System Evaluation Techniques
List of References

LIST OF ABBREVIATIONS

AADCP     Army Air Defense Command Post
ABM       Air Battle Model
A/C       Aircraft
ADC       Air Defense Command
ADSID     Air Defense System Integration Division
AFBMD     Air Force Ballistic Missile Division
AFCOA     Air Force Chief of Operations Analysis
AFDAP     Air Force Development and Planning
AFESG     Air Force Electronic System Division
ARDC      Air Research and Development Command
ARGMA     Army Ground Missile Agency
ATABE     Automatic Target-to-Battery Evaluation
ASD       Aeronautical Systems Division
ASM       Air-to-Surface Missile
BC        Battery Commander
BOPP      BOMARC Pattern Patrol
BMD       Ballistic Missile Division
BMEWS     Ballistic Missile Early Warning System
BRL       Ballistic Research Laboratory
BUSANDA   Bureau of Supplies and Accounts
BUWEPS    Bureau of Weapons
CAMP      Cost of Alternative Military Programs
CAP       Combat Air Patrol
CCDD      Air Force Command and Control Development Division
CCM       Counter-Countermeasures
CEIR      Corporation for Economic and Industrial Research
CEP       Circular Probable Error
CIC       Combat Information Center
D/A       Department of the Army
DCA       Defense Communications Agency
ECM       Electronic Countermeasures
ESP       Event Sequenced Program
ESS       Experimental SAGE Sector
GE/OD     General Electric Ordnance Department
GE/TEMPO  General Electric Technical Military Planning Operation
G/S       Guidance System
GUISE     Guidance System Evaluation
IAWR      Institute for Air Weapons Research
IBM       International Business Machines
ICBM      Intercontinental Ballistic Missile

MBM       Missile Battle Model
MISER     Manned Interceptor SAGE Evaluation Routine
MIT       Massachusetts Institute of Technology
MS        Maintenance System
NORAD     North American Air Defense Command
NORDAM    Northrop Damage Assessment Model
OEG       Operations Evaluation Group
OMEGA     Operations Evaluation Model Group Air Force
ORO       Operations Research Office
OTAC      Ordnance Tank-Automotive Command
PACAF     Pacific Air Force
PADS      Passive-Active Data Simulation
PRC       Planning Research Corporation
R and D   Research and Development
RCAF      Royal Canadian Air Force
SABOT     SAGE BOMARC Test
SAC       Strategic Air Command
SAFE      Strategy And Force Evaluation
SAGE      Semi-Automatic Ground Environment
SASS      SAGE ATABE Simulation Study
SATIN     SAGE Air Traffic Integration
SDC       System Development Corporation
SIF       Selective Identification Feature
SMITE     Simulation Model of Interceptor Terminal Effectiveness
SOP       Standing Operating Procedure
SRG       Systems Research Group
STAGE     Simulation of Total Atomic Global Exchange
TAPRE     Tracking in an Active-Passive Radar Environment
TDDL      Time Division Data Link
TDG/TNA   Track Data Generation/Tracking and Analysis
USAEPG    United States Army Electronic Proving Ground
USAF      United States Air Force
USAFE     United States Air Force Europe
USAFESD   United States Air Force Electronic Systems Division
USASADEA  United States Army Signal Air Defense Engineering Agency
USCONARC  United States Continental Army Command
WADC      Wright Air Development Center
WADD      Wright Air Development Division
WARM      Weapons Assignment Research Model
WSEG      Weapons Systems Evaluation Group
WSL       Weapons Systems Laboratory

LIST OF ABSTRACTS IN APPENDIX C

Abstract No.  Organization                                       Designation                                  Page No.
 1   Aeronutronic Systems, Incorporated                Shillelagh                                    18
 2   Aeronutronic Systems, Incorporated                Anti-tank Minefield Study                     20
 3   Applied Physics Laboratory                        Fleet Air Defense                             22
 4   Bendix Systems Division                           Eagle                                         25
 5   Corporation for Economic and Industrial Research  Combat Simulation Model                       27
 6   Chance-Vought Corporation                         Manned Space Flight Simulator                 29
 7   Chance-Vought Corporation                         Manned Space Flight Simulator (Fixed Base)    31
 8   General Electric-TEMPO                            MARK I Study                                  34
 9   Hughes Aircraft Company                           SMITE                                         36
10   Institute for Air Weapons Research                SCRAMBLE Model                                38
11   MIT-Lincoln Laboratory                            SIMPLEX Study                                 39
12   The MITRE Corporation                             XOVER                                         41
13   The MITRE Corporation                             SASS                                          43
14   The MITRE Corporation                             TLQ-8 Simulation                              45
15   The MITRE Corporation                             BOPP                                          47
16   The MITRE Corporation                             MISER                                         49
17   The MITRE Corporation                             SASTRO                                        51
18   The MITRE Corporation                             TAPRE                                         53
19   The MITRE Corporation                             SABOT                                         55
20   The MITRE Corporation                             PADS                                          57
21   The MITRE Corporation                             TDG/TNA                                       59
22   The MITRE Corporation                             BOBCAT                                        61
23   Norair Division-Northrop Corporation              MS Model                                      64
24   Norair Division-Northrop Corporation              Radar Reflectivity Model                      66
25   Norair Division-Northrop Corporation              SM-62A                                        68
26   Norair Division-Northrop Corporation              NORDAM                                        70
27   Norair Division-Northrop Corporation              Maintenance Study                             72
28   Norair Division-Northrop Corporation              Terminal Attack Model                         74
29   Operations Research Office                        Small Unit Ground Combat                      76
30   Operations Research Office                        THEATERSPIEL                                  78
31   Operations Research Office                        Study 43.4                                    81
32   Operations Research Office                        Maintenance Model                             83
33   Operations Research Office                        Simulation of SAM Systems                     85
34   Planning Research Corporation                     Project P-18                                  87
35   Planning Research Corporation                     Mobility of Ballistic Missile Forces          89
36   Radio Corporation of America                      BMEWS                                         91
37   The RAND Corporation                              Movement of Military Air Cargo                93

38   The RAND Corporation                              STRATEGIC PLANNER                             95
39   The RAND Corporation                              MUSTARD I and II                              97
40   The RAND Corporation                              SAFE                                          99
41   The RAND Corporation                              Alternative Central War Strategies            101
42   Stanford Research Institute                       Project 2351                                  103
43   Strategic Air Command, Hq                         Event Sequenced Program                       105
44   Strategic Air Command, Hq                         Project Rothamsted                            107
45   Strategic Air Command, Hq                         Comparison of GAM-77 and GAM-87               108
46   Strategic Air Command, Hq                         59A War Game                                  110
47   System Development Corporation                    WARM                                          112
48   System Development Corporation                    GUISE                                         114
49   System Development Corporation                    Air Surveillance Tracking Subsystem           116
50   System Development Corporation                    System Training Program                       118
51   Systems Research Group                            MILITRAN I                                    120
52   Systems Research Group                            Marine Air Defense Simulation Model           122
53   Technical Operations, Incorporated                Air Battle Model II                           124
54   United States Air Force, Hq                       Simplified Penetration Model                  126
55   The University of Michigan                        ATABE Model                                   128
56   Weapon Systems Evaluation Group                   Fallout Model                                 131

1. INTRODUCTION

With the advent of large-scale, complex man-machine systems, many organizations have been using modeling and simulation as techniques for system evaluation. Because of an emphasis on military programs, much of this work has been oriented toward evaluating the performance of existing weapon systems and predicting the performance of future weapon systems. Although references to reports on non-military applications of modeling and simulation have been collected in a number of bibliographies[1], no corresponding assimilation of military applications has been attempted. Moreover, although classified symposia are held by various organizations at which military applications of modeling and simulation are reported, many programs are not reported, and the papers presented are generally brief and describe only the broad aspects of programs. Consequently, it is currently difficult to ascertain what problems have been considered and what techniques have been employed.

The present report contains the results of a survey of weapon system evaluation techniques conducted for the United States Air Force by The University of Michigan under Contract AF 33(616)-7317. By agreement with the monitoring organization, the survey was generally limited to programs in progress or completed since 1957, and major emphasis was directed toward programs which seemed pertinent to Air Force weapon systems. Within the limitations of the contract, neither a complete census of organizations engaged in weapon system evaluation nor complete abstracting of all programs of the organizations contacted was attempted. Thus, this report provides references to only some of the programs in progress or completed since 1957 for which information was obtained.

2. OUTLINE OF THE RESEARCH PROGRAM[2]

As a guide in conducting the survey, the following types of information were to be obtained about the various models whenever possible.
(a) Criteria used for judging system performance by use of the model.
(b) Important parameters, variables, and operating procedures taken into account by the model.

[1] References 1 and 2.
[2] The outline of the type of information to be obtained which appears in this section is abstracted from the formal statement of work, dated 6 April 1960. In this outline, the term "model" is employed in the generic sense of a mathematical or simulation model.

Manuscript released by the author June 1961 for publication as an ASD Technical Report.

(c) Description of the model used to relate the criteria to the parameters, variables, and operating procedures. This includes such things as:
    (1) Logical flow charts, when available.
    (2) Form of input and output to the model.
    (3) Experimental procedures using the model.
    (4) Generality of the model.
(d) Procedures used to validate the model and to test the ability of the model to predict system performance.
(e) The pattern of construction for the model in terms of building blocks, classified according to similar levels of importance, their relation and assembly, and how they are expressed, for example, as appropriate routines, subroutines, or tables.
(f) A description of how the model was used and how well it met the needs and specifications of the interested agencies.
(g) Estimates of the time and costs needed for the construction of the model and for performing experiments on the model.

Using the information collected, the programs of the various agencies contacted were to be abstracted whenever possible, and each abstract was to contain the following breakdown of information:

(a) Title and date of model.
(b) Objective of model.
(c) Type of model or analysis.
(d) Identification.
    (1) Agency sponsoring the project.
    (2) Agency responsible for the research.
    (3) Principal investigators.
(e) Inputs: weapon systems, operating procedures, parameters.
(f) Outputs.
    (1) Principal results.
    (2) Parameter variation studies.
(g) Availability of reports and programs.
(h) Type of analysis performed.

(i) Type and model of computer used.
(j) Current use of the model.

Further, this information was to be assimilated in an attempt to draw preliminary conclusions about the general methods of model construction and weapon system evaluation.

3. THE SURVEY PROGRAM

In conducting the survey, contacts with organizations were established through visits by project personnel. These visits, listed in Appendix A with other pertinent information, were for the purpose of obtaining information identifying programs appropriate for inclusion in the survey, the principal personnel, and relevant reports. Whenever feasible, individual programs were discussed in detail with the principal investigator or other cognizant persons. For organizations such as RAND or ORO, in which many programs are in progress, discussing programs in detail was clearly impossible within the limited time scheduled for a visit. However, the programs of these organizations are generally well documented and known to other persons engaged in modeling and simulation work. Consequently, only a sample of the programs of RAND and ORO is included in this report.

In addition to the difficulties in discussing programs in detail resulting from time limitations, it was found in many organizations that, in the case of completed programs, the persons who had performed the work were frequently no longer employed by the organization; and, in the case of programs in progress, the principal investigator often had not completely formulated the problem or decided upon the technique of analysis to be employed. Because of the impracticability of discussing programs in detail, it was apparent that detailed information could best be obtained from documents and other written material. With respect to existing policies for documenting programs, wide variations were observed both among and within organizations.
Whereas some programs are documented in reports containing readable technical descriptions of the techniques of analysis employed, others contain only summaries of the gross details or, at the opposite extreme, many technical details but no general description. (This latter type of document is best exemplified by reports containing program codes for digital computers but no accompanying flow charts.) Finally, for a number of programs discussed with persons at the organizations visited, no formal reports had been prepared. This was found for completed programs as well as programs still in progress. In some cases, pertinent material was never processed beyond the form of handwritten notes.

In addition to the problem of variability in policies for documenting programs, a number of organizations were visited at which reports, when prepared, were considered proprietary by either the contracting organization or the sponsor and were subject to limited distribution. These limitations in the distribution of reports for proprietary reasons were found not only between military services but also within divisions of a service. Subsequent requests for documents of this type were refused in all but one case.

The difficulties encountered in obtaining reports, and in interpreting them when obtained, led to the construction of a questionnaire which was used to supplement information from both interviews and reports. This questionnaire, exhibited in Appendix B, was employed both at organizations visited and in contacting organizations to which visits were not made. Since the format of the questionnaire was found to have general applicability for summarizing various programs, it was also employed in the preparation of abstracts for inclusion in the present report. Whenever possible, these abstracts were prepared directly from parts A through D of completed mail questionnaires with a minimum of editorial changes. Appended to these abstracts are general remarks concerning related work of other organizations. These abstracts, included in Appendix C, provide the basis for much of the summarization of information presented in Section 4 of this report.

A list of the reports and reprints acquired during the survey is included in Appendix D. In addition, references to other reports are contained in various abstracts. However, a number of these reports (especially internal reports) were considered proprietary at the time they were requested.

In Section 4, the information collected during the survey, using a combination of visits to organizations, reports acquired, and completed mail questionnaires, will be summarized and discussed.
For this purpose, classifications of programs based on the general techniques of analysis employed and on the utilization of the weapon systems or subsystems considered will be introduced. Although these classifications are based on descriptive information and to a large extent reflect subjective interpretations of similarities and differences among the techniques of analysis, it is hoped that this first attempt at structuring and characterizing modeling and simulation programs will motivate interest in expanding and improving such classifications.

4. A CLASSIFICATION OF WEAPON SYSTEM EVALUATION PROGRAMS AND SUMMARY OF TECHNIQUES EMPLOYED

Although the abstracts of programs in Appendix C exhibit a wide variety of problems in which modeling and simulation have been employed as techniques for weapon system evaluation, similarities among the various programs can be discerned. For this purpose, the following general classification of programs in terms of the basic types of analyses employed is introduced, with references to the appropriate abstracts in Appendix C.[3]

I.   Man-machine simulation: (6) (7) (17)
II.  Game-type simulation: (30) (31) (40) (50)
III. Analytical models: (1) (2) (26) (35) (45) (56)
IV.  Digital computer simulation models:
     1. Air defense models: (3) (4) (13) (19) (22) (33) (42) (44) (52) (55)
     2. Penetration models: (10) (46) (54)
     3. Two-sided war games: (9) (16) (28) (29) (43) (53)
     4. Maintenance models: (8) (23) (27) (32) (34)
     5. Network flow models: (11) (37)
     6. Miscellaneous: (12) (14) (15) (18) (20) (21) (24) (25) (36) (38) (41) (47) (48) (49) (51)

[3] Throughout this section, numbers in parentheses refer to correspondingly designated abstracts in Appendix C.

As indicated in this summary, the majority of the programs abstracted are classified as digital computer simulation models. Thus, although the techniques of analysis employed in programs designated as types I, II, and III are briefly discussed, no further delineation of subclassifications is attempted for them.

Within the context of the present discussion, the category of programs designated as man-machine simulation consists of those programs in which pieces of equipment with human operators are integrated directly into a laboratory prototype to represent corresponding or related portions of a real system. As illustrated by the programs abstracted, man-machine simulation in a laboratory environment combines men and equipment with analog and/or digital computers to simulate

other components of the system. Since simulations of this type, which permit the investigator to observe the performance of equipment and human operators under various conditions, are frequently employed in system design and development programs, the majority of these programs are conducted in engineering groups. Accordingly, few of the persons contacted during the survey were directly concerned with man-machine simulations.

Although similar to man-machine simulation programs in the use of human participants, game-type simulations are distinguished by their increased emphasis on the role of the participant, who must employ experience and judgment in making decisions. Most frequently constructed for their educational benefits, game-type simulations essentially provide laboratory environments in which the participant is required to make decisions similar to those to be encountered in a real situation. In contrast to man-machine simulation, computers (if employed) in game-type simulation generally serve only as a means for rapidly storing, processing, and displaying pertinent information, and not to represent portions of the system per se. Game-type simulations are illustrated by the abstracts of SAFE and INDIGO in Appendix C. Also, references to game-type simulations of the Logistics Systems Department of RAND are included in the report list.

From a general viewpoint, the categories designated as analytical models and computer simulation models differ from man-machine and game-type simulations in that representations in the models must be completely specified. That is, in programs of types III and IV, all relevant factors concerning men and/or machines are represented abstractly, and the results of human decisions, judgment, reaction times, etc., are considered only in the contexts specified a priori by the model builder.
Analytical models ("paper and pencil" models) are considered as analyses in which outputs are obtained using mathematical and/or statistical tools, with hand calculation, to derive closed functional forms. In general, these forms are derived in terms of one or more parameters, as exemplified by the abstracts in Appendix C. A computer, if employed in conjunction with an analytical model, is used in estimating parameters or in performing parameter variation studies. In view of the diversity of mathematical and statistical techniques employed in analytical models, and the comparatively small number which were considered major programs within the organizations contacted, no further classification is considered to distinguish among the analytical models.

Within the general partitioning of computer simulation models indicated above, one finds certain similarities among air defense models, penetration models, and two-sided war games. These

similarities become apparent when it is observed that within each abstract the inputs and factors can generally be partitioned into (1) measures which characterize the offensive forces, (2) measures which characterize the defensive forces, and (3) measures which characterize the interactions of the offensive and defensive forces. More specifically, although dependent in part upon the general purpose of the program, these measures can be categorized in the following manner:

For Offensive Forces
(1) Number and types of offensive launching platforms, with designation of the weapons and/or decoys to be deployed.
(2) Initial location and characterization of motion of offensive launching platforms (either in absolute terms or relative to other platforms).
(3) Criteria for weapon release (including deployment rates, if appropriate).
(4) Alternative tactics and conditions under which weapons are to be employed.
(5) Availability and characteristics of equipment for countermeasures, etc.

For Defensive Forces
(1) Number and types of defensive launching platforms, with designation of the weapons to be deployed.
(2) Initial location of defensive launching platforms and, if appropriate, a characterization of their motion.
(3) Rates of fire (measured directly or in terms of time delays).
(4) Various modes of operation and firing doctrines to be employed.
(5) Availability and characteristics of equipment to counter countermeasures.

For Interactions of Offensive and Defensive Forces
(1) For the various offensive weapon-defensive target combinations, the probabilities of accomplishing various levels of physical damage or degradation in operation.
(2) For the various defensive launching platform-offensive launching platform (and/or weapon) combinations, the probabilities of acquisition, lock-on, and kill by defensive weapons.
(3) Changes in the above measures as the tactics and/or modes of operation of the offensive and defensive forces change.
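As one way of seeing the partition concretely, the three groups of measures can be sketched as the inputs to a hypothetical model. The following is a modern illustration only; the field names and probability values are invented and are not taken from any of the surveyed programs.

```python
from dataclasses import dataclass, field

@dataclass
class ForceDescription:
    """Measures characterizing one side's forces, offensive or defensive."""
    platforms: dict            # platform type -> number deployed
    locations: list            # initial locations of launching platforms
    doctrine: str              # weapon-release criteria or firing doctrine
    countermeasures: list = field(default_factory=list)

def p_damage(interaction_table, weapon, target, default=0.0):
    """Interaction measure: probability that `weapon` achieves a given
    level of damage against `target`, looked up from an assumed table
    keyed by (weapon, target) pairs."""
    return interaction_table.get((weapon, target), default)
```

An interaction table such as {("asm", "radar_site"): 0.7} then supplies the third group of measures, and a change in tactics or mode of operation is represented simply by substituting a different table.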

Within the various abstracts in Appendix C one finds, in general, that for air defense models the measures which characterize the defensive forces are considered in detail, whereas the measures which characterize the offensive forces are simplified. In the case of penetration models the converse is generally true. For two-sided war games, the emphasis on detailed as opposed to simplified representations is largely dependent upon the proposed uses of the model. For example, in the two-sided global air war considered in Air Battle Model II (53), there is a general symmetry in the assumed structure of the opposing forces. Thus, the details considered in the interaction of the offensive forces of one side with the defensive forces of the other are generally identical to those of the converse interaction. However, in two-sided war games such as a fighter-bomber duel (9), this symmetry does not exist.

Consideration of detailed and simplified representations produces a spectrum of models, ranging from completely stochastic models, in which each variable is introduced using assumed or experimentally measured distributions, to deterministic models, in which each distribution is replaced by an expected value. Persons who are concerned with long-range forecasts tend to have little use for statistical distributions, since it is not feasible to do more than make educated guesses as to the range of values of parameters in a system still in the predesign stage. On the other hand, in evaluating an existing system many of the distributions can be measured, and it is sometimes quite important to make use of these distributions in the analysis. The intermediate points of this spectrum of models consist of inputs some of which are introduced in terms of distributions and others of which are introduced simply as expected values.
When persons responsible for models of this type were questioned, they expressed no doubts concerning the legitimacy of their decisions as to which inputs should be treated one way and which should be treated the other. In each case, however, the decisions seem to have been made on the basis of factors peculiar to the particular problem. Moreover, no person in this survey could cite even a special theory which would give conditions under which a distribution can be replaced by an expected value without masking the effects of other variables. Since many of the completed questionnaires did not specify which inputs were introduced stochastically, no delineation of types of models on this basis can be made.

A second criterion of interest in distinguishing among the various models in the three subclassifications is the technique by which the system is updated in time.
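The open question noted above, namely when a distribution may be replaced by its expected value without distorting results, is easy to illustrate with a small modern sketch. The deadline, the mean, and the exponential service time below are hypothetical choices, not drawn from any surveyed program.

```python
import random

def completion_rate(deadline, mean_service, stochastic=True,
                    trials=100_000, seed=1):
    """Fraction of operations finished by `deadline` when the service
    time is drawn from an exponential distribution (stochastic=True)
    versus simply fixed at its expected value (stochastic=False)."""
    rng = random.Random(seed)
    done = 0
    for _ in range(trials):
        t = rng.expovariate(1.0 / mean_service) if stochastic else mean_service
        if t <= deadline:
            done += 1
    return done / trials
```

With a deadline equal to the mean service time, the stochastic model completes on time only about 63% of the time (1 - 1/e), while the expected-value model completes on time always. The substitution changes the answer whenever the output depends nonlinearly on the input, which is precisely the condition no one surveyed could characterize in general.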

In examining the methods employed to update the system in models having time as an independent variable, two general techniques have been observed. These are (1) to update the system periodically, using a constant updating interval of time, and (2) to update the system whenever a significant event occurs. (The latter is sometimes called the "time status record method" or "event sequence method" for updating.[4])

Using periodic updating, the total period in which a system is to be operative is partitioned into finitely many intervals of equal length, say Δt, and updating is done in each interval. With the minor variations noted below, this general type of updating is employed in the programs in abstracts (10), (53), and (54).

In some models, however, it is expedient to adopt the alternative procedure of updating only when significant events occur. This is a particularly appropriate technique in models of systems in which system performance is closely related to human intervention. For example, in a model of a system in which a human operator initiates an operation at a specified time, say t0, which requires time t1 (t1 may be sampled from a distribution), there is no need to consider him again until time t = t0 + t1. This is opposed to the first method, in which the updating is periodic and the human operator might be examined one or more times before the operation was completed. Formally, the time status record method comprises a collection of upcoming events, each labeled with its time of occurrence and filed in order of increasing time. Each stage of the computation may generate additional events and times. Following each stage, the computation moves to the event with the smallest time label. Thus, the interval of time used is not constant but is dependent upon events or sequences of events. This type of updating is employed in the programs summarized in abstracts (3) and (55).
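The time status record method amounts to keeping a priority queue of upcoming events ordered by time. A minimal modern sketch in Python follows; the event labels and the fixed service time of 3.0 are hypothetical, chosen only to mirror the human-operator example above.

```python
import heapq

def run_event_sequenced(initial_events, horizon):
    """Advance a simulation by jumping directly to the next scheduled
    event (the 'time status record' or 'event sequence' method).
    `initial_events` is a list of (time, label) pairs; each processed
    event may file additional events, as the operator event does here."""
    queue = list(initial_events)
    heapq.heapify(queue)                 # events filed in order of increasing time
    log = []
    while queue:
        t, label = heapq.heappop(queue)  # move to the event with the smallest time label
        if t > horizon:
            break
        log.append((t, label))
        if label == "operator_start":
            # the operator needs 3.0 time units; nothing about him is
            # examined again until t + 3.0, in contrast to periodic updating
            heapq.heappush(queue, (t + 3.0, "operator_done"))
    return log
```

Simulated time advances by irregular jumps determined by the events themselves rather than by a fixed increment Δt, which is the essential contrast with periodic updating.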
In addition to this general distinction among programs on the basis of the use of a fixed time interval or the time status record for updating information, there are also variations among the programs of each type. For programs using fixed-time-interval updating, this includes variations such as the introduction of major and minor program cycles in which updating intervals of lengths Δt and kΔt are employed (where k is of the form k = 1/n, with n a fixed positive integer). Thus, when there are few significant changes in a system, time is increased by increments of Δt. However, for critical portions of the program in which more detail is required concerning sequences of events, time is increased by increments of kΔt. For example, this type of updating is employed in the SCRAMBLE model (10), in which Δt = 6 minutes and kΔt = 1.5 minutes.

For programs in which the time status record is employed, variations are introduced by the different procedures adopted in preparing inputs. In the ATABE model (55), a minimal number of key events are specified a priori, and subsequent chains of events dependent on the outcome of key events are generated only as required. The contrasting philosophy employed in some programs, e.g., the Missile Battle Model[5], is to specify as many events as possible prior to starting computations within the main portion of the program. Although the choice between these methods for using the time status record lacks theoretical justification, it was noted by persons using the latter method that the maximal list of precalculated events can be employed efficiently in models which are to be replicated several times.

Within the abstracts of programs in Appendix C categorized as maintenance models, one finds general similarities in the inputs and variables of the models. However, the degree of similarity is dependent upon criteria such as the echelon structure of the system, the amount of detail introduced (in terms of distributions as opposed to expected values), and the number of cost factors considered. Because of lack of information, it can only be conjectured that, if sufficiently detailed descriptions (e.g., flow charts) of these models were available, a subclassification indicating basic similarities and differences among the models could be constructed.

Analogous to the programs on which maintenance models are constructed, the network flow models have general similarities in terms such as the numbers and designation of nodes and connecting links.

[4] Reference 3.
Although these suggest the possibility of constructing subclassifications, again sufficient information was not obtained. The only observation made concerning the miscellaneous category is to call attention to the general remarks appended to the abstract of MILITRAN I (51).

[5] Reference 4.

5. REMARKS ON THE CONFERENCE ON WEAPON SYSTEM EVALUATION TECHNIQUES

As a part of Contract AF 33(616)-7317, a Conference on Weapon System Evaluation Techniques was held on 11, 12, and 13 May 1961 at The University of Michigan's Dearborn Center in Dearborn,

Michigan. The purpose of this conference was to bring together representatives from various organizations to discuss the use of modeling and simulation as techniques for weapon system evaluation. Because of the wide variation in the backgrounds, interests, and orientations of persons attending the conference, and the variety of weapon systems being considered by the organizations represented, there were few points on which agreement was unanimous. There was, however, discussion of a number of general and specific points. These points are enumerated in a summary of the conference included as Appendix E.

6. SUMMARY AND CONCLUSIONS

The results and conclusions of this survey of weapon system evaluation techniques are as follows: (1) A number of problems were ascertained which should be investigated. These include (a) a theoretical examination of the effects of replacing distributions by expected values in digital computer simulation models, and (b) the further development and utilization of modular computer codes. (2) Many weapon system evaluation programs are not well documented, and information flow between organizations is generally poor. As a result, persons in various organizations are found to be working on closely related problems without the benefits of mutual exchange of information. This is compounded by the restrictions resulting from both proprietary and security considerations, and from the preoccupation of persons with their own work. (3) With regard to (2), it is concluded that a program should be established for the purpose of disseminating information concerning the uses of modeling and simulation as techniques for weapon system evaluation. In view of the similarities among programs concerning weapon systems, such a program should have tri-service approval. (4) Standards with tri-service approval and adoption for documenting and abstracting programs should be established.
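The problem raised in conclusion (1)(a) can be illustrated numerically: for a nonlinear measure of effectiveness f, evaluating f at the expected value of an input generally differs from averaging f over the input's distribution. The kill-probability function and the uniform miss-distance distribution below are assumptions chosen only to exhibit the effect, not quantities from any surveyed model.

```python
import random

# Illustration (not from the report) of why replacing a distribution by its
# expected value can mislead: for a nonlinear effectiveness function f,
# f(E[X]) generally differs from E[f(X)].

random.seed(1)

def p_kill(r):
    # Hypothetical kill probability falling off with miss distance r.
    return 1.0 / (1.0 + r * r)

# Assumed miss distance: uniform on [0, 10] yards; its expected value is 5.
samples = [random.uniform(0.0, 10.0) for _ in range(100_000)]

via_expected_value = p_kill(5.0)                                   # f(E[X])
via_distribution = sum(p_kill(r) for r in samples) / len(samples)  # E[f(X)]

# The two answers differ markedly: about 0.038 versus roughly 0.15.
```

A theory of the kind the report calls for would characterize when this discrepancy is small enough to ignore.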
As observed in Section 4, no person contacted could cite a theory to give conditions under which a distribution can be replaced by an expected value without masking the effects of other variables. Since this problem is basic to any decision between constructing detailed or simplified models, the need for theoretical study is obvious. With respect to the construction of modular programs

and general tools to be employed in digital computer coding, the programs of Technical Operations, Inc., and Systems Research Group are oriented in this direction. However, until this information is disseminated, the tools cannot be generally adopted by other organizations. In view of the time required to establish working contacts with organizations, any continuation or expansion of an information collection program of the type described in this report should be initiated before the list of persons at organizations in Appendix A becomes obsolete. Moreover, any subsequent work should be planned in such a way as to make maximal use of questionnaires. Since many programs are executed on a yearly basis, a suggested output of an additional information collection program is the publication of annual supplements to the present report with whatever refinements or extensions of the preliminary classification are appropriate. In conjunction with such a program, a central library of reports on modeling and simulation programs should be established, either within a government agency or by contractual agreement, to provide persons engaged in this type of work with information. Conceivably, the function of a library program could range from simply preparing and distributing lists of programs, reports, and principal investigators to the acquisition and dissemination of detailed information such as reports and subroutines for computer programs. However, in view of the problems of limited distribution of reports for proprietary reasons, it is probably impossible to establish a program of the latter type. As indicated in Section 3, there are wide variations in existing policies for documenting programs. Consequently, standards should be established to specify a format for preparing reports and abstracts and the types of information to be included.
Since many problems concerning weapon systems within one military service have possible application to similar or related problems of the other services, these standards should be established and adopted by all services.

APPENDIX A: Summary of Trips and Organizations Visited

Organization visited; date of visit; IST personnel; organization personnel:

Stanford Research Institute, Menlo Park, California; 7/25/60; IST: R. Cline, R. Thrall; Organization: A. Brown, W. Madow, C. Perry, A. Christman, G. Evans, I. Yabroff, L. Davies, L. Low, D. Guthrie, Jr.

Lockheed Aircraft Corporation, Menlo Park, California; 7/26/60; IST: R. Cline, R. Thrall; Organization: P. Taulbee, A. Block, G. denBroeder, C. Boll

Sylvania Electronics Defense Laboratory, Mountain View, California; 7/26-27/60; IST: R. Cline, R. Thrall; Organization: F. Russo, B. Wambsganss, J. Harley, L. Hunter, R. Krolick, F. Proschan

Stanford University, Stanford, California; 7/27/60; IST: R. Thrall; Organization: A. Bowker, H. Solomon, J. Lieberman, H. Chernoff

The RAND Corporation, Santa Monica, California; 7/28/60 and 8/5/60; IST: R. Cline, R. Thrall; Organization: E. Quade, N. Dalkey, L. Wegner, D. McCarvey, C. Sturdevant, J. Peterson, F. Tonge, F. Eldridge

Air Force Ballistics Division, Inglewood, California; 7/29/60; IST: R. Cline, R. Thrall; Organization: N. Hu

Systems Development Corporation, Santa Monica, California; 7/29/60 and 8/5/60; IST: R. Cline, R. Thrall; Organization: J. Hedenberg, W. Warren, W. Karush, R. Carmichael, R. Knight, R. Deutsch, R. Kao

General Electric, TEMPO, Santa Barbara, California; 8/1/60; IST: R. Cline, R. Thrall; Organization: E. Fullenwider, D. Taylor, W. Thompson, F. Jackson

Douglas Aircraft Corporation, Santa Monica, California; 8/2/60; IST: R. Cline, R. Thrall; Organization: R. Stillinger, F. Eastman, B. James

Aeronutronic Systems, Incorporated, Santa Ana, California; 8/2/60; IST: R. Cline, R. Thrall; Organization: H. Weiss, W. Barnum

Space Technology Laboratories, Inc., Hawthorne, California; 8/3/60; IST: R. Cline, R. Thrall; Organization: R. Rector, G. Welch, R. Hull

Norair Division, Northrop Corporation, Hawthorne, California; 8/3-5/60; IST: R. Cline, R. Thrall; Organization: J. Taylor, J. Gehrig, J. Paris, E. Kamiya, J. Oliver, D. Gouze, P. Chaiken, G. Ivanoff, S. VanNorman

Ramo-Wooldridge, Incorporated, Canoga Park, California; 8/4/60; IST: R. Cline, R. Thrall; Organization: J. Salzer, A. Vazsonyi, W. Bauer

Planning Research Corporation, Los Angeles, California; 8/4/60; IST: R. Cline, R. Thrall; Organization: J. Gordon, I. Garfunkel

Ramo-Wooldridge, Incorporated, Canoga Park, California; 10/17/60; IST: R. Cline; Organization: F. Marzocco, J. Dulin, R. Perkins

Hughes Aircraft Corporation, Culver City, California; 10/19/60; IST: R. Cline; Organization: H. Crowder, P. Kennard, B. Seid

Lockheed Aircraft Corporation, Burbank, California; 10/20/60; IST: R. Cline; Organization: S. Frey, W. Jones, J. Kahn

Applied Physics Laboratory, Silver Spring, Maryland; 11/15/60; IST: R. Cline, R. Thrall; Organization: C. Meyer, M. Waddell

Operations Research Office, Bethesda, Maryland; 11/15/60; IST: R. Cline, R. Thrall; Organization: L. Rumbaugh and others

Hq Air Research & Development Command, Andrews Air Force Base, Maryland; 11/15/60; IST: R. Cline, R. Thrall; Organization: P. English, H. Harris

CEIR, Arlington, Virginia; 11/16/60; IST: R. Cline, R. Thrall; Organization: H. Fassberg, T. Goldman, J. Faucett, L. Tepper

Weapons System Evaluation Group, Office of the Assistant Secretary of Defense, Washington, D. C.; 11/16/60; IST: R. Cline, R. Thrall; Organization: S. VanVoorhis, H. Everett

Operations Evaluation Group, Office of Chief of Naval Operations, Navy Department, Washington, D. C.; 11/16/60; IST: R. Cline, R. Thrall; Organization: D. Mela

US Army Strategy & Tactics Analysis Group (STAG), Washington, D. C.; 11/17/60; IST: R. Thrall; Organization: A. DeQuoy

Federal Systems Division, IBM Corporation, Rockville, Maryland; 11/17/60; IST: R. Thrall; Organization: B. Oldfield

Technical Operations, Incorporated, Washington, D. C.; 11/17/60; IST: R. Cline; Organization: S. Harrison, J. Jenkins

Office of Naval Research, Department of the Navy, Washington, D. C.; 11/17/60; IST: R. Cline; Organization: S. Shtulman, S. Selig

Analytic Services Incorporated, Alexandria, Virginia; 11/17/60; IST: R. Cline, R. Thrall; Organization: J. Garafola, B. Geehter, R. Alterowitz, J. Matheson, J. Berteman

Institute for Air Weapons Research, Chicago, Illinois; 1/9/61; IST: R. Cline, J. Miller; Organization: N. Painter, W. Searcy, M. Taibleson

Air Battle Analysis Division, Headquarters USAF, Washington, D. C.; 1/26/61; IST: R. Cline, R. Thrall; Organization: A. Daniels, W. Jones

IBM Research Center, Yorktown Heights, New York; 4/10/61; IST: R. Cline, J. Miller; Organization: J. Navarro, R. Shaw

Radio Corporation of America, Moorestown, New Jersey; 4/11/61; IST: R. Cline; Organization: A. Davies, M. Hawley

United Aircraft Corporation, E. Hartford, Connecticut; 4/11/61; IST: J. Miller; Organization: A. Sherman, H. Hesse

Systems Research Group, Mineola, L.I., New York; 4/12/61; IST: R. Cline; Organization: H. Shapiro, E. Levine

The MITRE Corporation, Bedford, Massachusetts; 4/13/61; IST: R. Cline, J. Miller; Organization: M. Phelps, J. Dominitz, J. Gallant

Technical Operations, Incorporated, Burlington, Massachusetts; 4/13/61; IST: R. Cline, J. Miller; Organization: R. Langevin, D. Batten, R. Behnke, D. Meals

Lincoln Laboratories, Lexington, Massachusetts; 4/14/61; IST: J. Miller; Organization: A. Armenti, J. Nolan, R. Prosser, O. Selfridge

Conference attended; date; IST personnel:

Third War Games Symposium, The University of Michigan; 10/6-8/60; IST: R. Cline, R. Thrall

Symposium on Mathematical Optimization Techniques, The RAND Corporation, Santa Monica, California; 10/18-20/60; IST: R. Cline

Seventh International Meeting of Institute of Management Sciences, New York, New York; 10/20-22/60; IST: R. Thrall

APPENDIX B: Questionnaire

A. Identification of contract on which model is being constructed:
1. What agency is sponsoring the contract?
2. What agency is responsible for monitoring the research on the contract?
3. What is the official title of the contract?
4. What name is used within your organization to designate the contract?
5. When was work on the model initiated?
6. What is the target date for completion of the model?
7. Who is the principal investigator assigned to the research?

B. Description of model:
1. What physical system or subsystem is being modeled?
2. What is the principal objective of the model? [ ] System evaluation [ ] System design [ ] Other (please specify)
3. What type of model is being constructed? [ ] Analytical [ ] Computer simulation (a) Real time [ ] Fast time [ ] (b) Type and size of computer used? (c) Approximate time required for one run of the model on the computer? [ ] Other (please specify)
4. (a) What input information is necessary in order to use the model? (b) How are these inputs introduced into the model (in terms of distributions, expected values, etc.)?
5. What are the variables in the model?
6. What outputs are obtained from the model?
7. What criteria are employed within the model to evaluate system performance or system design?

C. Validation and use of the model:
1. (a) Have any experiments been completed using the model? If so, please specify. (b) Who was responsible for these experiments?

2. Have any tests been performed to validate the model?
3. (a) Are any experiments or tests being planned? (b) Who will be responsible for these tests or experiments?
4. What has been the general response of the sponsor to this work?
5. (a) If this model has been completed, what agency received the results? (b) For what purposes have the model and significant results been used (either by the agency receiving the results or by other agencies)?

D. Documentation:
1. Are there any published or unpublished reports available which describe the use of this model? (please identify)
2. Are there logical flow charts available which can be used, either in conjunction with the documentation listed above or separately, to aid in summarizing the pertinent features of the models?

E. General Discussion:
1. What is the general philosophy within your organization relative to the construction of detailed models as opposed to simplified models?

APPENDIX C: Abstracts

ABSTRACT (1) Aeronutronic Systems, Inc.

A. Identification:
1. Agency sponsoring contract: OTAC
2. Agency monitoring research: ARGMA
3. Official title of contract: DA-04-495-ORD-1835
4. In-house designation: Shillelagh
5. Date of initiation: January 1959
6. Completion: Model under continual revision throughout program
7. Principal investigator: W. C. Barnum

B. Description of model:
1. Physical system being modeled: Tank armament
2. Principal objective of model: System evaluation, system design. During initial stages of the contract the modeling effort was directed toward assistance in system design. At this point in the program the main effort is toward system evaluation.
3. Type of model: Analytical
4. Input information: Inputs describe the characteristics of friendly and enemy armor, armament, and tactics of employment, as well as the natural environment. At various stages of development the inputs have been expected values; these are now being replaced by distributions as the distributions become known. In the case of analog simulation, inputs have been supplied by actual missile hardware.
5. Variables in model:
Armor: Target size, Target vulnerability, Target speed, Number of targets, Probability of selection
Armament: Rate of fire, Accuracy, Lethality, Number of rounds available, Range
Natural environment: Altitude, Temperature, Visibility, Terrain

6. Outputs from model: Probability of kill
7. Criteria to evaluate system performance: Expected number of successful engagements.

C. Validation and use of the model:
1. Experiments completed: Analytical calculations are being confirmed by analog simulations.
2. Tests to validate model: None
3. Future experiments and/or tests: Actual test firings and field tests will be conducted with the complete system. Aeronutronic will conduct R and D tests. Army Ordnance will conduct engineering tests. Fort Knox, Ky. will conduct user tests.
4. Response of sponsor: Independent modeling at BRL, Aberdeen Proving Ground has generally confirmed the results.

D. Documentation:
1. Published or unpublished reports: Aeronutronic Internal Reports 4-S-7, 4-S-61, 4-S-84
2. Logical flow charts available: In the above documents.

E. General Discussion: The following comments on the construction of models, included with the completed questionnaires, express the general philosophy adopted by this group: The amount of time and expense required to program a detailed model is not returned unless one completely understands the nature and validity of the model outputs. A computer is easily capable of swamping the worker with outputs. On several occasions it has been necessary to return to simplified models to understand the results obtained from the detailed model. Further, the detailed model requires inputs in the form of distributions. Generally, the forms of these distributions are unknown, as are their parameters. The use of simplified submodels gives insight into the nature of these detailed parameters. In time it is then possible to construct the larger model, built upon the foundation of a number of simplified models.

ABSTRACT (2) Aeronutronic Systems, Inc.

A. Identification:
1. Agency sponsoring contract: Picatinny Arsenal
2. Agency monitoring research: Picatinny Arsenal
3. Official title of contract: DA-04-495-501-ORD-1776
4. In-house designation: Antitank Minefield Study
5. Date of initiation: August 1959
7. Principal investigator: Benjamin Lapidus

B. Description of model:
1. Physical system being modeled: The systems being modeled are various combinations of mines/fuses/fields/tactics/tanks possible for antitank warfare.
2. Principal objective of model: System evaluation, system design. The intent of the model is to evaluate the system, but the evaluation will, of course, aid in future system design.
3. Type of model: Analytical
4. Input information: Inputs describe characteristics of individual mine/fuse combinations versus individual tanks. These characteristics include costs of the mine/fuse combination in various terms, for example, cost or weight. How inputs are introduced: Expected values
5. Variables in model:
Mine/Fuse: Weight, Cost, Production, Logistic, Installation, Reliability
Tank: Vulnerability to each mine/fuse, Effective tank width
Field: Field physical arrangement, Distance between mines in a row, Distance between rows, Length of front, Number of rows, Random
Tactic: Enemy technique to bridge field
6. Outputs from model: Minimum number, cost, or weight of mines per yard of front to obtain a given level of destruction.

7. Criteria to evaluate system performance: Minimum cost to obtain a given level of destruction of enemy tanks.

C. Validation and use of the model:
1. Experiments completed: None
2. Tests to validate model: None; however, the input data have been obtained through field tests.
4. Response of sponsor: The sponsor believes the results of sufficient value to be distributed within the various R and D and user agencies.
5. (a) Agency receiving results: Picatinny Arsenal (b) Use of model and significant results: Unknown

D. Documentation:
1. Published or unpublished reports: Aeronutronic Report S-836
2. Logical flow charts available: Available in above report.
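A computation of the kind named in item B.6 above (minimum mines per yard of front for a given destruction level) can be sketched as follows. The single-row encounter model (effective tank width divided by mine spacing) and all numerical values are illustrative assumptions, not data or methods from the Aeronutronic study.

```python
# Hypothetical sketch of a minimum-density criterion: find the smallest
# number of mines per yard of front such that the probability of destroying
# a crossing tank reaches a required level. Encounter model and numbers
# are assumed for illustration only.

def destruction_probability(mines_per_yard, rows, tank_width_yd, p_kill):
    """With density spread over `rows` rows, spacing within a row is
    rows / density yards; P(encounter in one row) = min(1, width / spacing);
    rows are treated as independent."""
    spacing = rows / mines_per_yard
    p_row = min(1.0, tank_width_yd / spacing) * p_kill
    return 1.0 - (1.0 - p_row) ** rows

def minimum_density(required_level, rows, tank_width_yd, p_kill, step=0.01):
    # Simple upward search in steps of `step` mines per yard of front.
    d = step
    while destruction_probability(d, rows, tank_width_yd, p_kill) < required_level:
        d += step
    return round(d, 2)

d = minimum_density(required_level=0.8, rows=3, tank_width_yd=3.5, p_kill=0.7)
# d -> 0.51 mines per yard of front under these assumed values
```

Replacing the assumed encounter probability with measured vulnerability data would turn the same search into the model's actual criterion.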

ABSTRACT (3) Applied Physics Laboratory

A. Identification:
1. Agency sponsoring contract: U.S. Navy - Bureau of Naval Weapons
3. Official title of contract: Nord 7386
5. Date of initiation: February 1959
6. Completion: May 1960
7. Principal investigator: R. J. Hunt

B. Description of model:
1. Physical system being modeled: Any system of anti-air guided missile batteries can be used in the simulation, with emphasis on fleet air defense.
2. Principal objective of model: The principal objective is to evaluate doctrines of weapon assignment over a spectrum of attack conditions, for both a single guided missile ship and a fleet of ships with several guided missile ships.
3. Type of model: Computer simulation, fast time. Type of computer: IBM 7090 with 32,000 words of core storage. Time required for one run: 5 seconds. The program uses about 20,000 words of storage and about 5,000 lines of coding, excluding input and output routines.
4. Input information:
Defense inputs: Number of batteries; Number of missiles in storage for each battery; Number of radars for each battery or for each ship; Launcher reload time; Number of batteries and number of battery types; Battery position; Salvo size
Attack inputs (as many as three target types are allowed; each type is defined by the inputs listed below): Target speed; Spacing between successive aircraft; Distribution of decoys; Altitude; Carriers of air-to-surface missiles are noted; ASM or bomb release range; Jamming or non-jamming
Inputs which depend on the attack-defense combination: Non-decoy kill probability; Decoy kill probability; Maximum and minimum intercept ranges for each battery type and each target; Missile time of flight; Target acquisition time; Kill recognition time; Number of surviving non-decoys; Maximum allowable current engagements; Maximum detection range; Doctrine for target selection

There are many variations for missile assignment doctrines which can be arranged by suitable adjustments of inputs. Basically, for a given battery and a given target, the range of missile fire can be divided into as many as 12 intervals. For each interval, maximum and minimum ranges are defined, as well as acquisition time, maximum current engagements allowed, and whether targets are to be selected on a least-engaged-nearest basis or randomly from the live targets in the interval. The least-engaged-nearest doctrine is used for non-jamming targets and the random doctrine for jamming targets. For each battery, the sequential order of selecting intervals is defined. How inputs are introduced: Kill probabilities, acquisition time, and missile times of flight are input as a function of intercept ranges. All inputs are expected values.
5. Variables in model: Any of the inputs in 4 above can be considered as variables. However, as pointed out in 6 below, the independent variable output is the number of bombers and the dependent variable output is the probability of at least N non-decoy penetrators surviving to the weapon release point. Although any of the inputs in 4 may be parameters, target spacing, acquisition time, number of batteries, and salvo size are the most frequent parameters for other fixed conditions of the attack and defense.
6. Outputs from model: For a particular set of inputs describing a specific attack-defense situation, a raid of bombers of known size attacks the defense. A random number is drawn for each salvo fired by the defense and compared with the kill probability to determine the future eligibility of the selected target. For each selected raid size, the battle is fought an arbitrary number of times (but fixed by input), say one hundred. The penetration probability is computed as the ratio of the number of times the defense is penetrated by N attacking bombers to the number of times the battle is fought for a given raid size. N is an input.
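The two target-selection doctrines described above can be sketched as follows. The dictionary representation of a target is an assumed convenience for illustration, not the format used in the actual program.

```python
import random

# Sketch (with assumed data structures) of the two selection doctrines
# described above: non-jamming targets are taken "least engaged, nearest";
# jamming targets are drawn at random from the live targets in the interval.

def select_target(live_targets, jamming):
    """Each target is a dict with 'range' and 'engagements' keys (assumed)."""
    if not live_targets:
        return None
    if jamming:
        return random.choice(live_targets)
    # Least engaged first; break ties by nearest range.
    return min(live_targets, key=lambda t: (t["engagements"], t["range"]))

targets = [
    {"id": "A", "range": 40.0, "engagements": 2},
    {"id": "B", "range": 25.0, "engagements": 1},
    {"id": "C", "range": 30.0, "engagements": 1},
]
chosen = select_target(targets, jamming=False)  # -> target "B"
```

The tuple key makes the tie-breaking order explicit: engagement count dominates, range decides among equally engaged targets.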
The program is arranged to investigate those values of raid size which are of interest. That is, attention is concentrated on raid sizes which give penetration probabilities between zero and one. The output from the machine consists of printout (1) of all inputs, (2) of the probability of at least N penetrators for each raid size, and (3) for each raid size, the number of missiles fired for those runs which result in penetration and the number fired for those runs which do not result in penetration.
7. Criteria to evaluate system performance: The criterion used to evaluate system performance is the probability that at least N non-decoy attackers penetrate beyond a weapon release point or a minimum engagement range, as a function of the number of attackers in the raid.

C. Validation and use of the model:
1. Experiments completed: Experiments have been run to investigate (1) the best salvo size to use for missile ships, (2) the efficacy of equipment scheduling in simulations, (3) the advisability of assigning zones of engagement priority for targets near the defended point, (4) the effects on the defense of simultaneous attacks from high and low altitudes and different speeds, and (5) the effects of non-uniform target spacings. Mr. R. J. Hunt was responsible for these experiments.
2. Tests to validate model: No tests have been performed to validate the model.
3. Future experiments and/or tests: Future tests are planned. Mr. R. J. Hunt will be responsible for these tests.
4. Response of sponsor: The results of the simulation are of interest to the staff members of the Applied Physics Laboratory and to cognizant members of the Naval Bureau of Weapons who are concerned with the analysis and usefulness of the developments undertaken by the Laboratory, as well as those more complex tactical problems associated with large-scale missile systems, ships, and task forces. These people seem to be more than satisfied with the efforts and results of the simulation.
5. See 4 above.
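The replication procedure described in B.6, in which a random number drawn for each salvo is compared with the kill probability and the penetration probability is taken as the fraction of replayed battles penetrated, can be sketched as follows. The structure and all numbers are illustrative assumptions, not the Applied Physics Laboratory code.

```python
import random

# Minimal sketch (assumed structure) of the replication procedure described
# above: each salvo's outcome is decided by comparing a random number with
# the kill probability, and the penetration probability is the fraction of
# replayed battles in which at least N bombers survive.

random.seed(2)

def fight_once(raid_size, salvos, p_kill):
    survivors = raid_size
    for _ in range(salvos):
        if survivors == 0:
            break
        if random.random() < p_kill:  # this salvo kills its selected target
            survivors -= 1
    return survivors

def penetration_probability(raid_size, salvos, p_kill, n_penetrators, battles=100):
    wins = sum(
        fight_once(raid_size, salvos, p_kill) >= n_penetrators
        for _ in range(battles)
    )
    return wins / battles

p = penetration_probability(raid_size=10, salvos=8, p_kill=0.7, n_penetrators=5)
```

Sweeping `raid_size` while holding the other inputs fixed reproduces the report's notion of concentrating on raid sizes whose penetration probabilities lie strictly between zero and one.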

D. Documentation:
1. Published or unpublished reports: Applied Physics Laboratory/Johns Hopkins University Internal Memorandum CLA-837, May 24, 1960, by R. J. Hunt (Confidential)
2. Logical flow charts available: None published.

E. General Discussion: In this model it is assumed that the attack is of sufficiently short duration that the movements of ships can be neglected. Hence, with fixed battery positions the model is quite similar to the ATABE model (55). Other programs identified include: (1) Combat Air Patrol Model (documented in Applied Physics Laboratory/Johns Hopkins University Internal Memorandum CLA-835), (2) Air Battle Analyzer, (3) Digital Computer War Games, (4) ECM (Analog) Simulation, (5) Polaris Model, (6) Deterrence Model

ABSTRACT (4) Bendix Systems Division

A. Identification:
1. Agency sponsoring contract: Navy
2. Agency monitoring research: Bendix and Navy
3. Official title of contract: Eagle Missile
4. In-house designation: Eagle
5. Date of initiation: April 1960
6. Completion: December 1960
7. Principal investigator: Mr. Paul V. Ponce

B. Description of model:
1. Physical system being modeled: Outer perimeter defense through long-endurance missile fighters. Inner perimeter defense through surface-to-air missiles.
2. Principal objective of model: System evaluation and system design.
3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 704. Time required for one run: 1 minute.
4. Input information: Raid models; System configuration parameters; Model parameters; Type of output desired. How inputs are introduced: Distributions are sampled. Expected values are sampled if normally distributed.
5. Variables in model: Maximum detection range of surveillance system; Number and position of surveillance radars, radar capability; Bomber type, speed of threat, spacing, decoy density, type of weapons delivered, number of weapons delivered, countermeasures used, etc., target cross section; Number and position of CAP, number of missiles carried by each interceptor, missile range, AI-radar range, etc.; CIC procedures and time delays; Radar and tactical time delays; Kill probability of missile; Defense level desired; Accuracy of sampling desired; Defense sector desired; Defense environment desired, etc.

6. Outputs from model: Penetration or survival probability; All aborts; All kills; All assignments
7. Criteria to evaluate system performance: Penetration probability as a function of raid size; Abort probability, etc.

C. Validation and use of the model:
1. Experiments completed: Sensitivity analysis and measure of effectiveness. P. Ponce was responsible for these experiments.
3. Future experiments and/or tests: Future tests are planned. P. Ponce will be responsible.
4. Response of sponsor: Very favorable.
5. (a) Agency receiving results: Navy Bureau of Weapons (b) Use of model and significant results: Study of defense by land-based systems and carrier-based systems.

ABSTRACT (5) C-E-I-R (Corporation for Economic and Industrial Research)

A. Identification:
1. Agency sponsoring contract: United States Army Electronic Proving Ground (USAEPG)
2. Agency monitoring research: Systems Division, CDEV, USAEPG
3. Official title of contract: War Gaming for Signal Combat Systems
4. In-house designation: Combat Simulation Model
5. Date of initiation: 1 April 1958
6. Completion: Prototype June 1961
7. Principal investigator: Dr. L. R. Ford, Jr. (shortly to be Mr. J. K. Rocks)

B. Description of model:
1. Physical system being modeled: General-purpose division-size model of ground combat, with primary emphasis on communications electronics systems.
2. Principal objective of model: System evaluation.
3. Type of model: Computer simulation, fast time. Type and size of computer used: IBM 709 (32K). Time required for one run: ratio of real time to simulated time is 2:1.
4. Input information: Author's note: A seventeen-page FORTRAN listing of inputs and variables was attached to the completed questionnaire received from Dr. Ford. Without accompanying descriptive information it was impossible to ascertain the most pertinent inputs and variables in the model. In general terms, inputs include complete descriptions of tactical deployment, unit composition, communication system, doctrine of utilization, etc., for Red and Blue forces. Variables include factors such as the movement of troop units, characteristics of messages transmitted, and effectiveness of weapons against various types of units.
6. Outputs from model: As specified by the user, it is possible to output any of the variables at any time interval which is a multiple of one minute. (It is planned to have a built-in data reduction capability, but this is not at present in existence.)
7. Criteria to evaluate system performance: Primary: Tactical performance in terms of success of mission, casualties incurred or inflicted, time required to complete mission, etc.
Secondary: Transmission delays, system overloads, etc.

C. Validation and use of the model:
1. Experiments completed: Debugging and sensitivity testing only. T. Caldwell was responsible for these tests.
2. Tests to validate model: Tests have been performed to validate the model.
3. Future experiments and/or tests: Future experiments are planned. J. K. Rocks (CEIR) in conjunction with Lt. Col. C. D. Harding (USAEPG) will be responsible for these tests.
4. Response of sponsor: In general, enthusiastic, tempered with some trepidation as to their ability to make reasonable use of the model.

E. General Discussion: Other communication models have been developed at Lincoln Laboratory (11), RAND, and Technical Operations, Inc.

ABSTRACT (6) Chance Vought Corporation

A. Identification:
1. Agency sponsoring contract: Chance Vought Corporation
2. Agency monitoring research: Chance Vought Corporation - Astronautics Division
3. Official title of contract: Manned Space Flight Simulator Facility
4. In-house designation: Same as above
5. Date of initiation: 1 August 1960
6. Completion: 1 August 1961
7. Principal investigator: W. B. Luton

B. Description of model:
1. Physical system being modeled: Hypothetical boost-glide, one-man space vehicle. (Also capable of lunar vehicle simulation)
2. Principal objective of model: System evaluation, system design. Includes research for man-machine integration, display and control optimization, and development of flight techniques for a variety of space vehicle configurations and missions, including boost, orbit, orbit transfer, satellite rendezvous, cis-lunar, lunar orbit, lunar landing, lunar boost, re-entry, and landing phases.
3. Type of model: Computer simulation, real time. Type and size of computer: Analog-digital combination including 560 amplifiers, a high-speed small digital computer (2500-word memory), and a 12-input, 16-output-channel analog-digital-analog conversion unit; vehicle motion is computed in six degrees of freedom. Time required for one run: Duration of the simulated mission in real time. Includes a limited three-degree-of-freedom, moving-base cockpit simulator. Also includes a pitching base with ±90° rotation for longitudinal load factor simulation to ±1 g, and a horizon-star field optical projection.
4. Input information: Translational equations of motion; Rotational equations of motion; Forces and moments; Dynamic pressure; Aerodynamic terms and data; Vehicle orientation; Moving base; Horizon-star projector; Display parameters; Control parameters; Orbital target parameters; Vehicle configuration. The specific inputs needed to use the model are: (a) Definition of primary gravity body - radius, acceleration of gravity at surface, rotation rate.
(b) Definition of reference inertial plane - angle relative to equatorial plane of primary gravity body.

(c) Initial position of vehicle - longitude, latitude, course heading (geographic, etc.) relative to primary gravity body, and displacement from and course heading relative to reference inertial plane. (d) Primary body longitude of vernal equinox at initial time. (e) Motion of other moving bodies (target, etc.) relative to primary gravity body. (f) Atmospheric density variation with altitude. (g) Vehicle configuration - reference area, dimensions, etc. (h) Aerodynamic force and moment coefficients. (i) Thrust forces (function of specific engine and altitude above primary body surface, inclinations to vehicle axes, and moment arms). (j) Control systems equations and gains - aerodynamic, reaction, and boost. How inputs are introduced: Items (a) through (f) above are introduced in the digital solutions, computed at approximately 1/2-second intervals. Items (g) through (j) are real-time analog computations introduced into the digital solutions at 1/20- to 1/2-second intervals.
5. Variables in model: Control displacement; Display parameters; Moving base parameters; Pitching base parameters; Horizon-star projection parameters; Orbital target parameters; Flight path parameters; Space location parameters
6. Outputs from model: Navigation track; Altitude versus velocity trajectory; Velocity versus distance to destination; Orbital parameters; Parameter differentials between vehicle and target; Time versus all significant parameters (altitude, velocity, fuel quantity, attitude, etc.)
7. Criteria to evaluate system performance: Primary criterion employed is how successfully a simulated flight is performed. Records provide the basis for evaluating.

C. Validation and use of the model:
3. Future experiments and/or tests: Tests will be conducted to validate model performance. Experiments of a general research nature are being planned. W. B. Luton will be responsible for these tests.
4. Response of sponsor: The facility is expected to provide a high capability for life sciences and space mechanics research.
It is also expected to support training and training-methods research.
D. Documentation:
1. Published or unpublished reports: None. Pertinent features are described in a brochure entitled "Manned Space Flight Simulator."

ABSTRACT (7) Chance Vought Corporation
A. Identification:
1. Agency sponsoring contract: Chance Vought Corporation
2. Agency monitoring research: Chance Vought Corporation - Astronautics Division
3. Official title of contract: Manned Space Flight Simulator - Fixed Base
4. In-house designation: Fixed Base Simulator
5. Date of initiation: October 1958
6. Completion: Was first completed February 1959, updated in April 1960 and again in March 1961
7. Principal investigator: (Currently) A. D. Schaezler
B. Description of model:
1. Physical system being modeled: (Currently) DYNA SOAR I Boost Glide Vehicle
2. Principal objective of model: System evaluation, system design. Includes research for man-machine integration, display and control optimization, and development of flight techniques for a variety of space vehicle configurations and missions including boost, orbit, re-entry, orbital rendezvous, lunar orbit, lunar landing, lunar boost, and earth landing phases.
3. Type of model: Computer simulation, real time. Type and size of computer: Analog - 560 amplifier capacity; current program is using 300 amplifiers. Time required for one run: Duration of the simulated mission in real time. Includes a fixed base cockpit simulator.
4. Input information:
Translational equations of motion
Rotational equations of motion
Forces and moments
Dynamic pressure
Aerodynamic terms and data
Vehicle orientation
Display parameters
Control parameters
Orbital target parameters
Vehicle configuration
The specific inputs needed to use the model are:
Definition of primary gravity body - radius, acceleration of gravity at surface, rotation rate.
Definition of inertial reference plane - angle relative to equatorial plane of primary gravity body.
Displacement from and course heading of vehicle relative to reference inertial plane.
Motion of other moving bodies (target, etc.) relative to primary gravity body.
Atmospheric density variation with altitude.
Vehicle configuration - reference area, dimensions, etc.
Aerodynamic force and moment coefficients.

Thrust forces
Control systems equations and gains - aerodynamic, reaction, and boost.
5. Variables in model:
Control displacement
Display parameters
Orbital target parameters
Flight path parameters
Space location parameters
6. Outputs from model:
Navigation track
Altitude versus velocity trajectory
Velocity versus distance to destination
Orbital parameters
Parameter differentials between vehicle and target
Time versus all significant parameters (altitude, velocity, fuel quantity, attitude, etc.)
7. Criteria to evaluate system performance: Primary criterion employed is how successfully a simulation flight is performed. Records provide the basis for evaluation.
C. Validation and use of the model:
1. Experiments completed: Four experiments have been completed and one is currently being conducted, involving nearly 400 simulated flights.
The first experiment was conducted to investigate the feasibility of manual control of DYNA SOAR I during orbit, re-entry, and hypersonic glide phases of flight. (1959)
The second experiment was conducted to investigate minimum display requirements for DYNA SOAR I. (1959)
The third experiment was conducted to make further study of once-around orbital flight in DYNA SOAR I. (1959)
(The above experiments were conducted by Chance Vought during participation in the Boeing Airplane Company Phase I DYNA SOAR I Study program, AF 33(600)-37706.)
The fourth experiment was conducted by Chance Vought as a research and development program to further investigate orbit, re-entry, and hypersonic glide flight; in addition, manual flight techniques for escape velocity re-entries and satellite rendezvous (terminal phase) were investigated.
The fifth experiment is currently being conducted by Chance Vought as a subcontract for Minneapolis-Honeywell as a part of DYNA SOAR Guidance System Analysis, Minneapolis-Honeywell Purchase Order No. 89-45498.
W. B. Luton, H. E. Sewell, and A. D. Schaezler were responsible for these experiments.
2.
Tests to validate model: Only those tests necessary to demonstrate performance in accordance with analytical prediction have been made.
3. Future experiments and/or tests: There are two firm programs scheduled for completion during the first half of 1961:
A simulator study for BMD titled "Permanent Satellite Base and Logistics Study," ARDC-SR17532.
A simulator study for WADD titled "Study of the Flight Control Requirements for a Manned Lunar Vehicle," AF 33(616)-7626, Project No. 8219, Task 86215.
J. W. Bilodeau and F. T. Gardner will be responsible for these experiments.

4. Response of sponsor: The response has been very favorable from all agencies who have monitored or reviewed the experiments.
5. (a) Agency receiving results: The agencies who have received results of the experiments are Boeing Airplane Company and United States Air Force.
(b) Use of model and significant results: Results have been used to date as general research data in formulating conclusions in space vehicle study programs.
D. Documentation:
1. Published or unpublished reports: Chance Vought Aircraft Reports:
E9R-12003 "DS-I Cockpit Simulator Tests - 6 Degrees of Freedom"
E9R-12109 "Minimum Display Studies Using DYNA SOAR I Cockpit Characteristics Simulator"
E9R-12110 "Additional Once Around Orbit Investigation Using DYNA SOAR I Cockpit Characteristics Simulator"
AST/EOR-13016 "Fixed Base Simulation Studies of Atmospheric Re-entries"
2. Logical flow charts available: Pertinent features are described in a brochure entitled "Manned Space Flight Simulator"

ABSTRACT (8) GE TEMPO
A. Identification:
1. Agency sponsoring contract: Subcontract from General Electric Ordnance Department, prime contract BUWEPS.
2. Agency monitoring research: BUSANDA
3. Official title of contract:
4. In-house designation: MARK I Guidance Operational Availability Study
5. Date of initiation: February 1960
6. Completion: June 1960
7. Principal investigator: W. B. Thompson
B. Description of model:
1. Physical system being modeled: FBM (Polaris) System - missile subsystem
2. Principal objective of model: System evaluation
3. Type of model: Computer simulation. Type and size of computer: IBM 650. Time required for one run: 20 minutes.
4. Input information:
Cruise length
Spares provisioned
Operating and non-operating failure rates
Checkout interval and length of one checkout
Remove and replace times for G/S
How inputs are introduced: Failure rates in the form of distributions, others in the form of constants or expected values.
5. Variables in model:
Checkout interval
Number of spares of each module provisioned
6. Outputs from model:
Operational availability of battery during patrol (ratio of up time to total time)
Ending inventories of each module
Total missile-hours of downtime accrued
Number of missiles down because of undetected failures during each interval between missile checks
Number and identity of missiles "out of action" for the duration of the patrol as a result of shortages of spare modules
7. Criteria to evaluate system performance:
Operational availability level achieved
Number of missiles lost for duration of patrol

C. Validation and use of model:
1. Experiments completed: Roughly 100 patrols have been run. W. B. Thompson, TEMPO OR Group, was responsible.
2. Tests to validate model: SSB(N) George Washington and Patrick Henry are validating.
3. Future experiments and/or tests: See above. GE/TEMPO and GE/OD will compare actual with simulated patrols.
4. Response of sponsor: Favorable
5. (a) Agency receiving results: General Electric Ordnance Department, and Bureau of Naval Weapons, SP-23.
(b) Use of model and significant results: Provisioning of Polaris MARK I Guidance System.
D. Documentation:
1. Published or unpublished reports: Thompson, W. B., Operational Availability of the Polaris MARK I Guidance System (RM60 TMP-36) (Confidential), August 31, 1960.
2. Logical flow charts available: See RM60 TMP-36 (Confidential)
E. General Discussion: Other programs identified include:
412L Information Processing Study
Cost of Alternative Military Programs (CAMP)
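The bookkeeping behind the availability output described in the abstract above (ratio of up time to total time, with failures drawn from distributions, detected only at periodic checkouts, and repaired from a finite stock of spares) can be sketched in a few lines. This is a minimal illustrative Monte Carlo in a modern language; the exponential failure model and every numerical value below are assumptions for illustration, not parameters from the TEMPO study.

```python
import random

def simulate_patrol(patrol_hours=1500.0, n_missiles=16, spares=6,
                    checkout_interval=100.0, checkout_hours=2.0,
                    failure_rate=2e-4, replace_hours=4.0, seed=1):
    """One simulated patrol. Failures accrue silently and are found only
    at periodic checkouts; a failed module is swapped if a spare remains,
    otherwise the missile is out of action for the rest of the patrol."""
    rng = random.Random(seed)
    downtime = 0.0
    lost = 0                                   # missiles lost for the patrol
    # time-to-failure for each missile, redrawn after each repair (assumed
    # exponential, i.e. a constant failure rate)
    ttf = [rng.expovariate(failure_rate) for _ in range(n_missiles)]
    t = checkout_interval
    while t <= patrol_hours:
        for i in range(n_missiles):
            if ttf[i] < t:                     # failure found at this checkout
                downtime += (t - ttf[i]) + checkout_hours
                if spares > 0:                 # remove and replace from spares
                    spares -= 1
                    downtime += replace_hours
                    ttf[i] = t + rng.expovariate(failure_rate)
                else:                          # no spare: down until end of patrol
                    downtime += patrol_hours - t
                    ttf[i] = float("inf")
                    lost += 1
        t += checkout_interval
    availability = 1.0 - downtime / (n_missiles * patrol_hours)
    return availability, lost
```

Averaging `simulate_patrol` over many seeds (the study's "roughly 100 patrols") and varying `checkout_interval` and `spares` corresponds to exercising the two variables the abstract names.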

ABSTRACT (9) Hughes Aircraft
A. Identification:
1. Agency sponsoring contract: USAF
2. Agency monitoring research: Various USAF agencies
3. Official title of contract: No single contract
4. In-house designation: None
5. Date of initiation: 1957
7. Principal investigator: P. H. Kennard
B. Description of model:
1. Physical system being modeled: Manned interceptors versus enemy bombers; called SMITE (Simulation Model of Interceptor Terminal Effectiveness).
2. Principal objective of model: System evaluation, system design, and tactical recommendations to USAF agencies.
3. Type of model: Fast time. Type and size of computer: IBM 704, 709, 7090 - 32,000 words of core storage; no tapes, no drums used. Time required for one run: A nominal 100 trials (one run) takes about 1 minute on the 704/709; on the 7090 the average time is 30 seconds for 100 trials.
4. Input information: Inputs to the model consist of parameters describing the initial system geometry with distributions of vectoring errors; range and aspect angles of an ideal rollout point with respect to the target(s); target(s) and interceptor(s) flight path parameters with distributions of initial speeds and headings; radar detection distributions based on single-scan probability of detection; time delay distributions; distributions of countermeasure effects; interceptor(s) and target(s) logic parameters for desired tactics; missile flight time parameters; and armament kill probabilities based on launch geometry and fire control system parameters. How inputs are introduced: Some parameters are introduced as fixed values, others in terms of distributions (normal and log-normal).
5. Variables in model:
Interceptor parameters - speed, heading, tactics logic
Target parameters - speed, heading, radar cross-section, countermeasures effects
Fire control system parameters - logic parameters governing flight control and fire control equations
Armament parameters - probability of kill, flight times, acceptable launch error
Ground environment - initial geometry (tactics)
6. Outputs from model:
Standard outputs - After N trials, one run yields: probability of detection, conversion and kill (Pk); mean detection range; 90th percentile of detection range; sort of detection range for each trial; mean range at launch for success; 90th percentile and sort of mean combat time; standard deviation of combat time; and sort of aspect angle, or angle between headings at launch, for each trial.
Special outputs - After one trial: success or failure of trial, target and interceptor position and heading, detection range, lock-on range, launch range, missile information, combat time. A special output after each iteration enables the user to plot target and interceptor trajectories.

7. Criteria to evaluate system performance: The criterion of effectiveness chosen is Pk, the probability of detection, conversion and kill, which is the probability that the given interceptor will be able to detect the target, convert to a desired tactic, prepare and launch the armament, and destroy the target.
C. Validation and use of the model:
1. Experiments completed: Numerous experiments have been completed to obtain measures of effectiveness of weapon systems in various tactical situations. Extensive parametric studies also have been completed. The model has been used by many engineers.
2. Tests to validate model: Comparisons of model results with sample flight test data have agreed favorably.
3. Future experiments and/or tests: Continuing use. P. H. Kennard will be responsible for tests.
4. Response of sponsor: Widely accepted. Copied by several other companies.
5. (a) Agency receiving results: Various USAF agencies
(b) Use of model and significant results:
Determination of weapon system design parameters
Evaluation of systems effectiveness
Determination of tactical employment recommendations
The model has been integrated by MITRE Corporation into the MISER program.
D. Documentation:
1. Published or unpublished reports: Report No. TM-601, Manned Interceptor Effectiveness Evaluation Model, Hughes Aircraft Company, 15 April 1958, Confidential.
2. Logical flow charts available: None available in documents for external distribution.
E. General Discussion: Other programs identified include:
Massed Raid Model
Interceptor Snap-up Model
Simulation of Maintenance Shop Operation
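The trial structure underlying Pk (each of the nominal 100 trials must survive detection, conversion, and kill stages, with inputs drawn partly from distributions) can be sketched as a simple Monte Carlo loop. The per-stage models and all numerical values below are invented placeholders for illustration only; they are not Hughes' equations or SMITE parameters.

```python
import random

def estimate_pk(n_trials=100, seed=7):
    """Monte Carlo estimate of Pk = P(detect, convert, and kill),
    mirroring the trial structure described in the abstract above."""
    rng = random.Random(seed)
    kills = 0
    detect_ranges = []
    for _ in range(n_trials):
        # vectoring error drawn from a normal distribution, as in the
        # abstract's "normal and log-normal" input treatment (sigma assumed)
        vector_error = rng.gauss(0.0, 2.0)
        detect_range = 25.0 - abs(vector_error)   # assumed geometry, n.mi.
        if rng.random() > 0.90:                   # single-scan detection fails
            continue
        detect_ranges.append(detect_range)
        if rng.random() > 0.80:                   # conversion to tactic fails
            continue
        if rng.random() < 0.70:                   # kill, given armament launch
            kills += 1
    pk = kills / n_trials
    mean_detect = sum(detect_ranges) / len(detect_ranges)
    return pk, mean_detect
```

The returned pair corresponds to two of the standard outputs named above: the Pk estimate and the mean detection range over successful detections.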

ABSTRACT (10) Institute for Air Weapons Research
A. Identification:
1. Agency sponsoring contract: United States Air Force
2. Agency monitoring research: Weapons Guidance Laboratory, WADC (now ASD)
3. Official title of contract: Contract AF 33(616)-3274
4. In-house designation: SCRAMBLE Model
5. Date of initiation: 1956-7
6. Completion: 1959
7. Principal investigator: Program under the direction of Dr. John W. Wester, Jr., Associate Director, IAWR.
B. Description of model:
1. Physical system being modeled: Penetrability-survivability of SAC against the Soviet air defense system in the 1965 era.
2. Principal objective of model: Basically system evaluation.
3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 704. Time required for one run: 4 hours, exclusive of input preparation.
D. Documentation:
1. Published or unpublished reports: IAWR Report 59R13 provides a general introduction to the SCRAMBLE model. More detailed summaries are contained in IAWR Reports 59R14, 15, 16, and 17, with appropriate references to other material.
2. Logical flow charts available: Logical flow charts with three levels of detail are included in IAWR Reports 59R19 and 20 and IAWR Document 59D30M.
E. General Discussion: Most of the descriptive material concerning the SCRAMBLE model is contained in classified reports. Particularly good general discussions of the various portions of the model are contained in the documents cited in D.1 above.

ABSTRACT (11) Massachusetts Institute of Technology Lincoln Laboratory
A. Identification:
1. Agency sponsoring contract: USCONARC
2. Agency monitoring research: USCONARC
3. Official title of contract: SIMPLEX Study (The work on the model is only a small part of the contract.)
4. In-house designation: SIMPLEX Study
5. Date of initiation: May 1959
6. Completion: Indefinite
7. Principal investigator: Mr. David F. Clapp
B. Description of model:
1. Physical system being modeled: An abstract communications network.
2. Principal objective of model: The model was constructed in order to study the effectiveness, for communications in a hostile environment, of routing techniques involving a probabilistic element.
3. Type of model: Analytical, computer simulation; fast time. Type and size of computer: IBM 709/7090 with 32K memory. Approximate time required for one run: Run time varies between 15 minutes and one hour, depending on size of network, traffic rate, and length of simulated time interval.
4. Input information:
Characteristics of the nodes:
Identification number
Size of storage
Receive-transmit capability
Network interconnections
Message generation rates
Message lengths
How inputs are introduced: Messages are generated by selection from a Poisson distribution whose mean is specified. Message lengths are chosen from a discrete approximation to an exponential distribution whose mean is specified. Network interconnections are given by naming the nodes at the ends of the links.
5. Variables in model:
Number of nodes
Interconnections between nodes
Traffic rates
Routing techniques (via program modification)
Buffer storage lengths

6. Outputs from model:
Number of messages in network as a function of time (mean, SD, maximum, minimum) and histogram
Traverse time of messages (mean, SD, maximum, minimum) and histogram
Lost messages through buffer overflow at each node, as they occur
7. Criteria to evaluate system performance:
Percentage of messages lost as a function of utilization factor.
Mean traverse time as a function of utilization factor (where utilization factor = input rate / service rate).
C. Validation and use of model:
1. Experiments completed: Various runs using random routing and networks of 50 and 100 nodes with three and nine connections per node. Various runs using directory routing and tree networks of 121, 85, and 156 nodes with three, four, and five branches per node, respectively. Mr. David F. Clapp was responsible for these experiments.
3. Future experiments and/or tests: Tests of a mixed random-directory routing technique have been planned for the near future. Mr. David F. Clapp will be responsible for these tests.
4. Response of sponsor: Favorable.
5. (a) Agency receiving results: USCONARC
(b) Use of model and significant results: None as yet.
D. Documentation:
1. Published or unpublished reports:
Prosser, R. T., Routing Procedures in Communications Networks, Part I: Random Procedures, MIT Lincoln Laboratory Group Report 22G-0041, 12 July 1960.
Routing Procedures in Communications Networks, Part II: Directory Routing (forthcoming).
Clapp, D. F., On a Communications Network Simulation Program, MIT Lincoln Laboratory Group Report 22G-0016, 11 May 1960.
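The traffic model the abstract above describes (Poisson message generation, message lengths from a discretized exponential distribution, losses through buffer overflow, and loss measured against a utilization factor of input rate over service rate) can be illustrated for a single node. This is a one-node sketch under assumed parameter values, not the multi-node SIMPLEX program itself.

```python
import random

def simulate_node(sim_time=1000.0, arrival_rate=0.8, mean_length=1.0,
                  service_rate=1.0, buffer_size=10, seed=3):
    """Event-driven sketch of one node: Poisson arrivals, discretized
    exponential message lengths, finite buffer; returns the fraction of
    messages lost and the utilization factor (input rate / service rate)."""
    rng = random.Random(seed)
    t = 0.0
    queue = 0                        # messages held in this node's buffer
    arrived = lost = 0
    next_arrival = rng.expovariate(arrival_rate)
    next_service = float("inf")      # no message in transmission yet
    while t < sim_time:
        if next_arrival < next_service:
            t = next_arrival
            arrived += 1
            if queue < buffer_size:
                queue += 1
                if queue == 1:       # transmitter was idle; start sending
                    length = max(1, round(rng.expovariate(1.0 / mean_length)))
                    next_service = t + length / service_rate
            else:
                lost += 1            # buffer overflow: message dropped
            next_arrival = t + rng.expovariate(arrival_rate)
        else:
            t = next_service
            queue -= 1
            if queue > 0:            # start sending the next queued message
                length = max(1, round(rng.expovariate(1.0 / mean_length)))
                next_service = t + length / service_rate
            else:
                next_service = float("inf")
    utilization = arrival_rate * mean_length / service_rate
    return lost / arrived, utilization
```

Sweeping `arrival_rate` and plotting the returned loss fraction against the returned utilization factor reproduces, in miniature, the first performance criterion listed above.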

ABSTRACT (12) The MITRE Corporation
A. Identification:
1. Agency sponsoring contract: Air Force Command and Control Development Division (CCDD)
2. Agency monitoring research: A MITRE "in-house" project.
3. Official title of contract: AF 33(600)39852
4. In-house designation: XOVER
5. Date of initiation: July 1960
6. Completion: August 1960
7. Principal investigator: L. S. Hager
B. Description of model:
1. Physical system being modeled: Passive triangulation equipments in a SAGE multi-radar, multi-jammer ECM environment.
2. Principal objective of model: System evaluation. The objective is to predict the amount and type of passive data available to a SAGE D.C. in an ECM environment.
3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 704 with 16K core registers. Time required for one run: Approximately 1/4 real time.
4. Input information:
Radar locations
Radar antenna patterns
Aircraft paths
Jammer power
Time scale to be used
How inputs are introduced: All deterministic values
5. Variables in model:
Radar antenna patterns
Passive logic
Aircraft paths
Jammer power
Time interval for analysis
6. Outputs from model:
Times that each jammer is detected by each site.
Amount of jammer power reaching the sites.
Range and azimuth of all aircraft from each site.
7. Criteria to evaluate system performance: None - this program was designed merely to predict system performance based upon known radar characteristics.
C. Validation and use of the model:
1. Experiments completed: Many runs were made duplicating live tests, and parameters were established enabling prediction for a higher-density jamming environment. J. Roberts was responsible for these experiments.

2. Tests to validate model: Successful tests have been performed.
3. Future experiments and/or tests: None at present. A. Milbert will be responsible for these tests.
5. (a) Agency receiving results: USAFESD - 416L Project Office
(b) Use of model and significant results: To explain radar behavior in an ECM environment and to predict system capability for large jamming raids.
D. Documentation:
1. Published or unpublished reports: W-3636 "Simulation Study of Passive Inputs," 11 January 1961 (SECRET)
E. General Discussion: The MITRE (12-22), RCA (36), and SDC (47-50) abstracts illustrate programs which are closely associated with hardware design and development.

ABSTRACT (13) The MITRE Corporation
A. Identification:
1. Agency sponsoring contract: Air Defense System Integration Division (ADSID)
2. Agency monitoring research: A MITRE "in-house" project
3. Official title of contract: AF 33(600)39852
4. In-house designation: SASS (SNORT-ATABE Simulation Study)
5. Date of initiation: October 1959
6. Completion: July 1960
7. Principal investigator: M. M. Phelps
B. Description of model:
1. Physical system being modeled: The system being modeled is the NIKE-Missile Master System operating in the ATABE (Automatic Target And Battery Evaluation) option.
2. Principal objective of model: System evaluation, system design, to determine initial SAGE parameter values.
3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 704 - 32K memory; all of core is used; the symbolic deck consists of about 13,500 cards. Time required for one run: 10 replications of a 20-target raid, each replication representing 50 minutes of real time, take about 15 minutes.
4. Input information:
System geometry parameters
ATABE logic parameters
Aircraft parameters
Missile parameters
Radar parameters
Tracking parameters
Man/machine parameters
Aircraft flight paths
Battery parameters
How inputs are introduced:
SAGE radar errors - normally distributed
Human delay times - rectangularly distributed
Kill and lock-on - determined by comparing probabilities with pseudo-random numbers
Batteries may be randomly put out of action with a given probability
The remaining inputs are deterministic in nature
5. Variables in model: The variables and inputs are the same.
6. Outputs from model: Octal correction (if any) printouts and parameter printouts. Periodically during the running of a test the means and variances of the following data are recorded:

Fire units available, assigned, and out of action.
Intercepts and kills (expected and actual).
Assignments, targets (with priorities), point and area penetrators, system load score, underassignment.
In addition the current fire scale is recorded. At the completion of each test the above data are printed out together with the following:
A list showing the actual point penetrators for each replication.
Total number of target-frames.
Number of intercepts, expected and actual kills for the AJAX missiles.
The number of targets not shot at.
A table of fire units showing the means and variances of each type of missile fired by each fire unit, together with the means and variances of the number of their assignments and total assignment times.
The means and variances of the number of various types of ineffective assignments.
Also included are frequency, probability density, and cumulative probability distributions of:
Shots per engagement.
Distance of intercept point from fire unit per intercept.
Distance of first intercept from defense center per track.
Distance of intercept point from defense center per kill.
Battery-frames of underassignment per track.
Engagement time per engagement.
In addition, various detailed data are available under sense switch control.
7. Criteria to evaluate system performance: The primary system performance criterion used is the percentage of targets killed before reaching the bomb release line.
C. Validation and use of the model:
1. Experiments completed: None, other than the validation experiments below. M. M. Phelps and J. O. Gallant were responsible for these experiments.
2. Tests to validate model: Many comparisons were made between live tests and model runs. The model has been tentatively validated.
D. Documentation:
1. Published or unpublished reports:
Plans for SNORT-ATABE Simulation Study (SASS), TM-128 (MITRE Technical Memorandum, 16 February 1959 - Confidential)
SASS Program User's Manual, TM-2767 (MITRE Technical Memorandum, 20 May 1960 - Confidential)
Table Listing for SASS, TM-2926 (MITRE Technical Memorandum, 9 December 1960 - Unclassified)
2. Logical flow charts available: One very gross flow chart is given in TM-2767 above.
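The mixed stochastic input treatment that the SASS abstract lists (normally distributed radar errors, rectangularly distributed human delays, kill and lock-on decided by comparing a probability with a pseudo-random number, and batteries randomly put out of action) can be sketched directly. Every numerical value below (sigmas, bounds, probabilities) is an illustrative assumption, not a SASS parameter.

```python
import random

def draw_stochastic_inputs(rng):
    """One draw of the stochastic inputs described in item 4 of the
    abstract above, using stand-in numerical values."""
    return {
        # SAGE radar errors - normally distributed (sigma assumed)
        "radar_error": rng.gauss(0.0, 0.25),
        # Human delay times - rectangularly (uniformly) distributed
        "human_delay": rng.uniform(2.0, 8.0),
        # Kill and lock-on - a probability compared with a pseudo-random number
        "killed": rng.random() < 0.6,
        "locked_on": rng.random() < 0.85,
        # A battery randomly put out of action with a given probability
        "battery_out": rng.random() < 0.05,
    }

# one draw per replication; the deterministic inputs would stay fixed
draws = [draw_stochastic_inputs(random.Random(i)) for i in range(10)]
```

Replications of a raid then differ only through draws like these, which is what makes the recorded means and variances of the outputs meaningful.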

ABSTRACT (14) The MITRE Corporation
A. Identification:
1. Agency sponsoring contract: Air Force Electronic Systems Division (USAFESD)
2. Agency monitoring research: A MITRE "in-house" project.
3. Official title of contract: AF 33(600)39852
4. In-house designation: TLQ-8 Simulation
5. Date of initiation: February 1960
6. Completion: June 1960
7. Principal investigator: J. Rial
B. Description:
1. Physical system being modeled: AN/TLQ-8 jammer locator and a special SAGE tracking program logic.
2. Principal objective of model: System evaluation, system design. To design a suitable tracking logic and evaluate its performance.
3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 704 with 1K core registers. Time required for one run: 5 minutes (single track, 50-100 frames).
4. Input information:
Aircraft positions versus time
Error distributions of various TLQ-8 processes
Tracking parameters to be optimized
How inputs are introduced: RMS values of flat probability distributions.
5. Variables in model:
AN/TLQ-8 error parameters
Target maneuver parameters
Initiation parameters
Tracking parameters
6. Outputs from model: Averages of many runs for:
Per cent tracks lost
Heading errors
Speed errors
Track positions
Data positions
7. Criteria to evaluate system performance:
Heading, speed, and distance errors
Track disassociation from radar data

C. Validation and use of the model:
1. Experiments completed: Runs have been performed to optimize tracking logic and parameters, and an evaluation of this tracking logic was obtained. J. Rial was responsible for these experiments.
2. Tests to validate model: Live system tests were run from October to December 1960.
3. Future experiments and/or tests: None at present. A. J. Milbert will be responsible for these tests.
4. Response of sponsor: Favorable
5. (a) Agency receiving results: The MITRE Corporation and USAFESD.
(b) Use of model and significant results: To optimize the TLQ-8 program logic and to determine the feasibility of the system prior to live testing.
D. Documentation:
1. Published or unpublished reports:
TM-2547, Operational and Mathematical Specifications for a TLQ-8 Tracking Logic.
TM-2769, Error Analysis of TLQ-8 and Triangulation Systems.
W-3429, TLQ-8 Simulation and Tracking Program.
W-3430, A Simulation Study of TLQ-8 Tracking.
The above documents were published by The MITRE Corporation.

ABSTRACT (15) The MITRE Corporation
A. Identification:
1. Agency sponsoring contract: Air Defense System Integration Division (ADSID)
2. Agency monitoring research: A MITRE "in-house" project.
3. Official title of contract: AF 33(600)39852
4. In-house designation: BOPP (BOMARC Pattern Patrol)
5. Date of initiation: October 1958
6. Completion: June 1960
7. Principal investigator: Robert E. Smith
B. Description of model:
1. Physical system being modeled: SAGE control of the BOMARC B missile in an ECM environment.
2. Principal objective of model: System evaluation.
3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 704 with 32K core plus 3 input tapes. Time required for one run: Approximately 10 to 50 minutes, depending on the number and size of raids and on the amount of data output desired. Runs require approximately 1/3 real time in the normal mode of operation.
4. Input information:
System geometry
Missile flight parameters
Radar parameters
Guidance parameters
Target parameters are fed into another simulation program (PADS) which simulates radar data, including noise jammers, corresponding to the target characteristics. Targets can total 100 aircraft. The output tape from PADS is used as input to another simulation (TAPRE) which generates the raid outlines for the jammed environment. Both the PADS output and the TAPRE output are used as inputs to BOPP.
How inputs are introduced: Parameters are entered both as distributions and as expected values.
5. Variables in model:
Target information:
Raid size and shape
Number of jamming and non-jamming targets
Target altitude and flight path
Target spacing
Radar information:
Site locations
Radar reflection characteristics

Missile information:
Number of salvos to be fired
Launch site location
Time delay between salvos
Raster option seeker
6. Outputs from model:
Number of missiles launched, time of flight.
Number of lock-ons and kills, as well as location (X,Y) of missile at kill.
Number of missiles with seekers on during a subframe.
Number of missiles dropped due to simulated tracking.
7. Criteria to evaluate system performance: Kills/missile
C. Validation and use of the model:
1. Experiments completed: None. The model is available; however, no experiments are planned at this time.
D. Documentation:
1. Published or unpublished reports: Specifications for BOPP Simulation, TM-2753, The MITRE Corporation (Confidential)
2. Logical flow charts available: Logical flow charts are available in the document cited above.

ABSTRACT (16) The MITRE Corporation
A. Identification:
1. Agency sponsoring contract: Department of National Defence, Canada. Programmed by H. S. Gellman & Co., Ltd. Lincoln Laboratory aided in specifying the model.
2. Agency monitoring research: A MITRE "in-house" project.
4. In-house designation: MISER (Manned Interceptor SAGE Evaluation Routine)
5. Date of initiation: May 1958
6. Completion: January 1959
7. Principal investigator: J. Dominitz
B. Description of model:
1. Physical system being modeled: MISER simulates the complete engagement of a single manned interceptor against a single target in a SAGE environment.
2. Principal objective of model: System evaluation, system design.
3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 704, 32K memory; approximately 16K locations are used by the simulation. Time required for one run: An average intercept takes approximately 40 seconds.
4. Input information:
System geometry parameters
Radar parameters
Target parameters
Aircraft parameters
Tracking parameters
Guidance parameters
Interceptor profile parameters
How inputs are introduced: Both expected values and distributions are utilized.
5. Variables in model: The variables are the same as the inputs described in 4(a).
6. Outputs from model: Standard output is a series of frequency distributions for various values saved over a series of runs having the same initial geometry and differing only in a random fashion. The per cent of runs resulting in detection, lock-on, and kill, as well as the number terminated due to target loss, interceptor loss due to tracking, or tactic impossible, are also standard outputs. The quantities saved as frequency distributions are:
Track crossing angle at rollout; successful, unsuccessful, and combined runs.
Time to go to intercept at rollout; successful, unsuccessful, and combined runs.
Azimuth look angle at rollout, detection, no lock-on.
Azimuth look angle at rollout, no detection.
Elevation look angle at rollout, detection, no lock-on.
Elevation look angle at rollout, no detection.
Minimum separation, detection, no lock-on.
Minimum separation, no detection.
Azimuth look angle at minimum separation, detection, no lock-on.
Azimuth look angle at minimum separation, no detection.
Elevation look angle at minimum separation, detection, no lock-on.
Elevation look angle at minimum separation, no detection.

Time at intercept; successful, unsuccessful, and combined runs.
Aspect angle computed at fire.
Separation between target and interceptor in X and Y direction; successful, unsuccessful, combined.
Table outputs are printed including:
Computation counts per interval.
Drops per interval.
In addition, various detailed data are available under sense switch control.
7. Criteria to evaluate system performance: The primary criterion used to evaluate system performance is the percentage of runs resulting in detection, conversion, and kill. Of secondary importance is the time to intercept.
C. Validation and use of the model:
1. Experiments completed: A number of tests have been run using MISER to investigate the effects of various guidance parameters and tactics on intercept success. J. Dominitz was responsible for these tests.
2. Tests to validate model: A brief try was made to calibrate MISER, but not enough useful field data was collected to validate the model.
3. Future experiments and/or tests: None at present. J. Dominitz will be responsible for these tests.
5. (a) Agency receiving results: The results of the studies utilizing this model have been published in various documents. Many organizations have access to this work. Some of these are: ADSID, ADC, SDC, Hughes, and McDonnell Aircraft.
(b) Use of model and significant results: Changes to several of the SAGE guidance parameters and tactics have been recommended.
D. Documentation:
1. Published or unpublished reports:
MISER User's Manual, The MITRE Corporation, TM-147-1 and S.1.
A Model of the Operation of the Manned Interceptor in a SAGE Environment, The MITRE Corporation, TM-2507.
MISER Progress Reports #2 and #3, The MITRE Corporation, TM-437 and W-3016.
Initial Test Plans for Verification of Proposed Tactics Parameters, The MITRE Corporation, TM-421.
TDDL ECM Simulation Tests, MITRE Corporation, W-2868.
2. Logical flow charts available: MISER Programmer's Guide, The MITRE Corporation, TM-174
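MISER's standard output, frequency distributions of a quantity accumulated over runs that share the same initial geometry and differ only randomly, amounts to histogramming repeated trials. The sketch below illustrates only that accumulation pattern; the stand-in engagement model (a normal time-to-intercept and a fixed success probability) is invented for illustration and is not MISER's guidance logic.

```python
import random
from collections import Counter

def run_once(rng):
    """One simulated intercept with an assumed, purely illustrative model."""
    time_at_intercept = rng.gauss(300.0, 30.0)   # seconds, assumed
    success = rng.random() < 0.75                # assumed overall kill chance
    return time_at_intercept, success

def frequency_distribution(n_runs=500, bin_width=20.0, seed=5):
    """Accumulate time-at-intercept as a binned frequency distribution over
    runs differing only in a random fashion, plus the per cent successful."""
    rng = random.Random(seed)
    hist = Counter()
    successes = 0
    for _ in range(n_runs):
        t, ok = run_once(rng)
        hist[int(t // bin_width) * bin_width] += 1   # left edge of bin
        successes += ok
    return hist, successes / n_runs
```

Keeping separate counters keyed by outcome would give the "successful, unsuccessful, and combined" variants of each distribution listed above.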

ABSTRACT (17)
The MITRE Corporation
A. Identification:
   1. Agency sponsoring contract: Air Defense Systems Integration Division (ADSID)
   2. Agency monitoring research: A MITRE "in-house" project. (Originally Lincoln Laboratory)
   3. Official title of contract: AF 33(600)39852
   4. In-house designation: SASTRO
   5. Date of initiation: January 1958
   6. Completion: January 1959
   7. Principal investigator: S. Ornstein
B. Description of model:
   1. Physical system being modeled: JAMTRAC Triangulation System
   2. Principal objective of model: System evaluation, system design. To determine feasibility of using strobe displays as an aid to identification and tracking of jamming A/C.
   3. Type of model: Computer simulation, real time.
      Type and size of computer: AN/FSQ-7 (XD-1) (Modified - all radar data displayed bright); 65K core registers, uses 40K.
      Time required for one run: 1 - 2 hours.
   4. Input information:
      Target data (strobes) and true reference position; azimuth width - multiple targets along azimuths which differ by less than a specified threshold are reported as a single strobe.
      Target errors - mean azimuth and azimuth jitter
      Program parameters
      Console switch actions for monitoring
      How inputs are introduced: Mean errors; standard deviation of errors (normal dist.)
   5. Variables in model:
      Target flight paths
      Raid configurations
      Radar site configuration
      Track monitor positions and distribution of team responsibilities
      Strobe correlation parameters
      Experimentation with monitoring aids - display of data and available switch actions
   6. Outputs from model:
      Record of switch actions taken by each track monitor
      Raid areas and shapes
      Distance of tracks from raids
      Distance of tracks from true target position
      Record of track positions and velocities each frame
51

   7. Criteria to evaluate system performance:
      Per cent reference positions associated with system tracks
      Miss distances for those not associated
      Number of ghosts (tracks on false targets) carried
      Per cent total time during which maximum number of associations were maintained
C. Validation and use of model:
   1. Experiments completed: Air Force Operator Training, Experimentation, Evaluation of System and Operator. March-April, 1959. C. LaSonde and M. Duffy were responsible for these experiments.
   2. Tests to validate model: JAMTRAC Tests - July and September, 1960.
   3. Future experiments and/or tests: Features of this simulation were incorporated into the TAPRE System.
   4. Response of sponsor: Not applicable.
   5. (a) Agency receiving results: Lincoln Laboratory, IBM Corporation, SDC, ADSID.
      (b) Use of model and significant results: Used by SDC as a guide in development of the JAMTRAC program. Used by MITRE in further development of CCM (e.g., TAPRE).
D. Documentation:
   1. Published or unpublished reports:
      The MITRE Corporation:
         W-100 SASTRO Evaluation Testing
         W-229 Evaluation Rept., SASTRO Evaluation Testing
         W-412 Programmers Guide and Reference Manual to the SASTRO II Simulation System
      Lincoln Laboratory:
         6D-2832 Operators Manual for the SASTRO II Switch Program
         6D-2541-1 A Test System for Raid CCM Studies
   2. Logical flow charts available: W-412 is the best source of programming information.
52

ABSTRACT (18)
The MITRE Corporation
A. Identification:
   1. Agency sponsoring contract: Air Defense System Integration Division (ADSID)
   2. Agency monitoring research: A MITRE "in-house" project
   3. Official title of contract: AF 33(600)39852
   4. In-house designation: TAPRE (Tracking in an Active-Passive Radar Environment)
   5. Date of initiation: January 1959
   6. Completion: December 1959
   7. Principal investigator: R. Davis
B. Description of model:
   1. Physical system being modeled: The JAMTRAC semi-automatic triangulation tracking system. Tracks are established and semi-automatically maintained (SAGE-like) using active data (range, azimuth) and passive data (azimuth-only).
   2. Principal objective of model: System evaluation, system design. This program simulated extensive hardware changes (JAMTRAC consoles) prior to construction of prototype models.
   3. Type of model: Computer simulation, real time and/or fast time.
      Type and size of computer: AN/FSQ-7 (XD-1) computer (core, drums, tape); 65K core registers, uses 52K.
      Time required for one run: Real time: 20 minutes to 3 hours. Fast time: 5 to 7 minutes for 50 tracks, 75 frames, depending on the type of output processing desired.
   4. Input information:
      Tape: A/C position and velocity references; A/C radar returns (range, azimuth) with errors
      Card or tape: System geometry; tracking parameters; program items to initiate tracking (fast time)
      Consoles: Switch actions to initiate and monitor tracking (real time)
      How inputs are introduced: Radar mean errors and standard deviation of errors. Errors resulting from system geometry, e.g., projection of target returns to the stereographic plane, etc.
   5. Variables in model:
      A/C configurations and radar characteristics
      Tracking program parameters - correlation thresholds, smoothing parameters
      Switch program parameters - many switch action options were provided for experimentation
      Tracking program logics - several alternate logics were developed
53

   6. Outputs from model:
      Tabulation and direct graphs from printer
      Mean, standard deviation, and percentile errors of speed, heading, and distance:
         collective errors on 50 tracks each frame
         errors on individual tracks each frame
      Number of tracks lost after specified time intervals
      Recording of specified items and tables for system analysis
      Periodic positions and velocities (tape) for input to BOMARC Pattern Patrol Simulation
   7. Criteria to evaluate system performance: Minimization of tracking errors and track disassociation from radar data.
C. Validation and use of the model:
   1. Experiments completed: Extensive testing was done to evaluate and optimize parameters for various active-passive tracking techniques. R. Davis was responsible for these experiments.
   2. Tests to validate model: Two live system tests were run during the summer of 1960 using one of the many TAPRE simulation logics.
   3. Future experiments and/or tests: None at present. A. Milbert will be responsible for these tests.
   5. (a) Agency receiving results: MITRE, SDC, USAFESD
      (b) Use of model and significant results: Optimize parameters, verify logics, develop improved logics.
D. Documentation:
   1. Published or unpublished reports:
      TM-216 "Specifications for a Simulation System to Study Tracking in an Active-Passive Radar Environment."
      TM-2607 "Initial TAPRE Test Plans for Verification and Improvement of CCM Air Surveillance."
      W-412 "Programmers' Guide and Reference Manual to the SASTRO II Simulation System."
      W-2912 "Programmers' Guide and Reference Manual to the TAPRE Simulation System."
      W-2827 "Operators' and Users' Guide to the TAPRE Simulation System."
      W-2636 "Switch Interpretation Manual for the TAPRE System."
      W-2636, S.1 "Supplement to Switch Interpretation Manual for the TAPRE System."
      W-3810 "TAPRE Simulation Study: Optimization and Comparative Evaluation of Various Automatic Passive-Active Tracking Techniques."
   2. Logical flow charts available: Contained in W-412 and W-2912 above.
54
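The per-frame error statistics the abstract above lists as outputs (mean, standard deviation, and percentile errors over the tracks in a frame) amount to a simple reduction over the collective errors; a minimal sketch in Python, where the 0.5-mile error spread, the 50-track frame, and the choice of the 90th percentile are all illustrative assumptions, not values from the TAPRE program:

```python
import math
import random

def error_summary(errors):
    """Mean, standard deviation, and 90th-percentile of per-track
    errors (e.g., speed, heading, or distance errors collected over
    the 50 tracks in one frame)."""
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / n
    # 90th-percentile error: value below which 90% of tracks fall
    p90 = sorted(errors)[min(n - 1, int(0.9 * n))]
    return mean, math.sqrt(var), p90

random.seed(1)
# Hypothetical distance errors (miles) for 50 tracks in one frame
frame_errors = [abs(random.gauss(0.0, 0.5)) for _ in range(50)]
mean, sd, p90 = error_summary(frame_errors)
print(round(mean, 3), round(sd, 3), round(p90, 3))
```

Running the same reduction on each track individually, rather than on a frame's collection, gives the per-track error histories the abstract also mentions.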

ABSTRACT (19)
The MITRE Corporation
A. Identification:
   1. Agency sponsoring contract: Air Defense System Integration Division (ADSID)
   2. Agency monitoring research: A MITRE "in-house" project
   3. Official title of contract: AF 33(600)39852
   4. In-house designation: SABOT (SAGE BOMARC Test)
   5. Date of initiation: July 1958
   6. Completion: July 1959
   7. Principal investigator: R. Hirschkop
B. Description of model:
   1. Physical system being modeled: SABOT simulates the interception of a single target by a single BOMARC-B missile in a SAGE environment. The missile, SAGE radars, and missile-target seeker are the subsystems modeled.
   2. Principal objective of model: System evaluation, system design.
   3. Type of model: Computer simulation, fast time.
      Type and size of computer: IBM-704; 32K magnetic core memory; no drums (will soon be available on IBM-7090).
      Time required for one run: An average single missile-single target intercept takes approximately 1 1/2 minutes of computer time.
   4. Input information: The necessary inputs are:
      Target flight characteristics and path
      Target and missile tracking radar location and error quanta
      Location of missile launch site
      Missile model parameters
      SAGE guidance parameters
      How inputs are introduced: Both expected values and distributions are utilized by this system.
   5. Variables in model: Same as 4 above.
   6. Outputs from model: At the end of every case, which consists of a specified number of runs, frequency distributions are tabulated for data collected at specified times during the intercept, e.g., at lock-on, at start of search. These distributions describe items such as missile heading errors, missile-target geometry, fuel aboard missile, etc. Also tabulated is the number of runs which were terminated for various reasons - lost target due to tracking, no lock-on, etc.
   7. Criteria to evaluate system performance: The number of targets locked onto and the number of targets intercepted have both been used as criteria for system performance.
C. Validation and use of the model:
   1. Experiments completed: Several hundred runs have been made against a variety of targets with different values for several guidance parameters. These parameter values have been evaluated relative to system effectiveness. R. Hirschkop and J. Dominitz were responsible for these experiments.
55

   2. Tests to validate model: An effort to calibrate the model using live data has been started, but no results are available yet.
   3. Future experiments and/or tests: A continuation of C.1 and C.2 is planned.
   5. (a) Agency receiving results: Boeing Aircraft Company, JADSTO
      (b) Use of model and significant results: For the optimization of some parameters used in the guidance function.
D. Documentation:
   1. Published or unpublished reports:
      "SABOT User's Manual," The MITRE Corporation, TM-2555
      "Mathematical Specifications for SABOT," Boeing Aircraft Company Document D5-5825
   2. Logical flow charts available: "SABOT Flow Charts," Boeing Aircraft Company Document D5-2204
56
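The end-of-case frequency distributions SABOT tabulates are, in modern terms, histograms accumulated over a case's Monte Carlo runs; a minimal sketch, in which the bin width, run count, and miss-distance spread are illustrative assumptions rather than SABOT values:

```python
import random
from collections import Counter

def tabulate(values, bin_width):
    """Group values collected over many runs into fixed-width bins,
    producing an end-of-case frequency distribution keyed by the
    lower edge of each bin."""
    return Counter(int(v // bin_width) * bin_width for v in values)

random.seed(7)
# Hypothetical miss distances (feet) from a 200-run case
misses = [abs(random.gauss(150.0, 60.0)) for _ in range(200)]
dist = tabulate(misses, bin_width=50.0)
for lo in sorted(dist):
    print(f"{lo:5.0f}-{lo + 50:5.0f} ft: {dist[lo]:3d}")
```

The same tabulation would be repeated for each saved quantity (heading error, fuel aboard, geometry), alongside a count of runs terminated for each failure reason.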

ABSTRACT (20)
The MITRE Corporation
A. Identification:
   1. Agency sponsoring contract: Air Defense System Integration Division (ADSID)
   2. Agency monitoring research: A MITRE "in-house" project
   3. Official title of contract: AF 33(600)39852
   4. In-house designation: PADS (Passive-Active Data Simulation)
   5. Date of initiation: 1958
   6. Completion: Basic package was completed in 1958, but there is a continuing series of additions to the model.
   7. Principal investigator: L. S. Hager and W. H. Mead
B. Description of model:
   1. Physical system being modeled: SAGE Long Range Radar Subsystem generating active and passive radar data including SIF data.
   2. Principal objective of model: A radar data source for other programs. PADS produces an output tape containing radar data and true track information.
   3. Type of model: Computer simulation, fast time.
      Type and size of computer: IBM 704, 32K memory, 2 tape units. Program has been converted to IBM 7090.
      Time required for one run: Real time if 70 aircraft paths and 4 radar sites are simulated.
   4. Input information: Radar locations, radar scan rates, and aircraft paths are necessary inputs to the model.
      How inputs are introduced: Radar data azimuth and range errors use a normal distribution with specifiable standard deviations and biases. Noise is distributed randomly, and the amount is based on a specifiable mean used in conjunction with a Poisson distribution.
   5. Variables in model: Radar locations, radar data error (azimuth, range) and random noise parameters, aircraft paths (starting location and velocity; ensuing turns, altitude and speed maneuvers).
   6. Outputs from model: A binary tape containing aircraft locations and velocities at regular intervals and radar data per site per second.
   7. Criteria to evaluate system performance: Not applicable.
C. Validation and use of the model:
   1. Experiments completed: It has been used as a source of radar data by many programs, e.g., ESS tests, TAPRE tests, SATIN tests.
   2. Tests to validate model: Since several of the using programs also use live ESS data, there is a continuing evaluation of the reasonableness of the model.
   3. Future experiments and/or tests: SATIN tests, among others.
   4. Response of sponsor: PADS has been effectively used by many simulations requiring active and passive radar data.
57

   5. (a) Agency receiving results: Air Force Electronic Systems Division (AFESD) receives the benefits of model usage.
      (b) Use of model and significant results: Experimentation and evaluation of SAGE, SATIN, TAPRE, etc.
D. Documentation:
   1. Published or unpublished reports:
      Operational Specifications for Passive-Active Data Simulation (PADS), The MITRE Corporation, TM-2671 and supplements.
      Mathematical Specifications for Passive-Active Data Simulation (PADS), The MITRE Corporation, TM-188
   2. Logical flow charts available: None
58
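The PADS error model described above - normally distributed azimuth and range errors with specifiable biases and standard deviations, plus Poisson-distributed random noise - can be sketched as follows; the parameter values and the uniform placement of noise returns are assumptions for illustration, not PADS specifications:

```python
import math
import random

def poisson(mean, rng):
    """Sample a Poisson count by Knuth's multiplication method."""
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def radar_returns(true_az, true_rng_mi, rng,
                  az_sd=0.5, az_bias=0.0, rng_sd=0.25, rng_bias=0.0,
                  noise_mean=2.0, max_range=200.0):
    """One scan's data for a single site: the target return with
    normally distributed azimuth/range errors (specifiable bias and
    standard deviation), plus a Poisson-distributed number of random
    noise returns placed uniformly over the coverage."""
    returns = [(true_az + rng.gauss(az_bias, az_sd),
                true_rng_mi + rng.gauss(rng_bias, rng_sd))]
    for _ in range(poisson(noise_mean, rng)):
        returns.append((rng.uniform(0.0, 360.0),
                        rng.uniform(0.0, max_range)))
    return returns

rng = random.Random(3)
scan = radar_returns(45.0, 120.0, rng)
print(len(scan), scan[0])
```

Writing such returns per site per scan, alongside the true positions, gives the paired radar-data/true-track output tape the abstract describes.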

ABSTRACT (21)
The MITRE Corporation
A. Identification:
   1. Agency sponsoring contract: Air Defense System Integration Division (ADSID)
   2. Agency monitoring research: A MITRE "in-house" project
   3. Official title of contract: AF 33(600)39852
   4. In-house designation: TDG/TNA (Track Data Generation/Tracking and Analysis)
   5. Date of initiation: 1958
   6. Completion: 1958
   7. Principal investigator: L. S. Hager and W. H. Mead
B. Description of model:
   1. Physical system being modeled: Radar inputs and tracking (non-ECM environment); SAGE long-range radars and tracking programs.
   2. Principal objective of model: System evaluation, system design of SAGE tracking function.
   3. Type of model: Computer simulation, fast time.
      Type and size of computer: 704 computer (core, drums and tape), 8192-register core.
      Time required for one run: TDG - 15 minutes; TNA - 10 minutes for 100 runs.
   4. Input information:
      TDG - radar locations, aircraft paths, radar scan time, radar error distributions
      TNA - tracking parameters (uses TDG output tape as an input)
      How inputs are introduced: Radar errors use a normal distribution in conjunction with a standard deviation; the number of noise returns per radar-track-look is based on a Poisson distribution.
   5. Variables in model:
      TDG - radar location, radar errors, radar random noise, aircraft path
      TNA - track data as supplied by TDG output tape, tracking logic parameters.
   6. Outputs from model:
      TDG - binary output tape containing radar data and true track locations.
      TNA - tracking error analysis (position, speed, heading errors), both printed and on CRT film.
   7. Criteria to evaluate system performance: Tracking errors.
C. Validation and use of the model:
   1. Experiments completed: The model was used to develop changes to the SAGE tracking logic. L. S. Hager and W. H. Mead were responsible for these tests.
   2. Tests to validate model: Tracking logic changes developed with the model were validated in ESS tests.
59

   3. Future experiments and/or tests: None
   5. (a) Agency receiving results: ADSID received tracking logic changes.
      (b) Use of the model and significant results: Tracking logic changes, based on design work utilizing the model, were implemented in SAGE.
D. Documentation:
   1. Published or unpublished reports:
      Lincoln Laboratory: TDG User's Manual, 6D-2661; TNA User's Manual, 6D-2659
   2. Logical flow charts available: None
60

ABSTRACT (22)
The MITRE Corporation
A. Identification:
   1. Agency sponsoring contract: Department of National Defence, Canada. Programmed by H. S. Gellman & Co. Ltd. Lincoln Laboratory aided in specifying the model.
   2. Agency monitoring research: A MITRE "in-house" project monitored by JADSTO.
   4. In-house designation: BOBCAT
   5. Date of initiation: February 1958
   6. Completion:
   7. Principal investigator: J. Dominitz and R. Kingsbury
B. Description of model:
   1. Physical system being modeled: BOBCAT simulates a single BOMARC A interceptor missile engagement against a single target in the SAGE environment. The SAGE radars, tracking logic, guidance logic and missile performance are simulated.
   2. Principal objective of model: System evaluation; system design (was used to determine optimum values for various parameters). Presently the simulation is being used to predict the results of various target profiles for the live Category III BOMARC A tests at the Eglin Gulf Test Range.
   3. Type of model: Computer simulation, fast time.
      Type and size of computer: Presently programmed for the IBM 704 (with drums) and requires slightly more than 8200 memory locations. It will soon be available on the IBM 7090 and will require approximately 16,000 memory locations.
      Time required for one run: A single missile-target engagement requires approximately 0.75 minute of 704 computer time.
   4. Input information: Inputs consist of parameters describing the geometry of the system, target profile, radar site positions and accuracy, missile characteristics, various time delays and several probabilities involving radar and seeker operation.
      How inputs are introduced: Both expected values and distributions are utilized.
   5. Variables in model:
      Target parameters: initial position, velocity, altitude, heading, maneuvers (in heading only)
      SAGE parameters: general system parameters, tracking parameters (missile and target), radar parameters (position and errors), guidance parameters, noise table for radars
      Missile parameters: missile model constants, missile characteristics, missile seeker parameters
61

   6. Outputs from model: Standard output is a series of frequency distributions for various values saved over a series of runs having the same initial geometry and differing only in a random fashion. The per cent of runs resulting in missile lock-on, as well as the number of runs terminated due to target or missile tracking difficulties, are also standard outputs. The quantities presently saved as frequency distributions are:
      a) Miss distance
      b) Computation rate
      c) Missile position error vector
      d) Missile velocity error vector
      e) Relative separation error in x and y
      f) Lock-on time error
      g) Command deflection
      h) Time of minimum separation
      i) Deflection and nod error
      j) Range error
      It is also possible to examine events in detail under limited sense-switch control.
   7. Criteria to evaluate system performance: The primary criterion used to evaluate system performance is the percentage of runs resulting in missile lock-on. The miss distance and the guidance computation rate are of secondary importance.
C. Validation and use of the model:
   1. Experiments completed: Many cases have been run using BOBCAT in tracking studies, guidance investigations and the present presimulation of planned live flight tests. These cases have involved many target profiles and variation of a large number of the pertinent parameters. J. Dominitz, R. Harvey, L. S. Hager and R. Kingsbury were responsible for these tests.
   2. Tests to validate model: Tests have been and are now being made in an effort to calibrate and validate the simulation.
   3. Future experiments and/or tests: It is planned to use the BOBCAT simulation in further investigation of the BOMARC A intercept capability after the task of validating the model is completed. J. Dominitz and R. Kingsbury will be responsible for these tests.
   5. (a) Agency receiving results: The results of the studies utilizing this model have been published in various documents. Many organizations have access to this work. Some of these are: ADSID, ADC, Boeing, SDC.
      (b) Use of model and significant results: Several SAGE parameters have been adopted as prescribed by the simulation results. Changes have been made in tracking logics to aid interceptions by the BOMARC A. Several of the live profiles to be flown during the Category III testing were modified after simulation results predicted the occurrence of problems should these profiles be used.
D. Documentation:
   1. Published or unpublished reports:
      "Simulation of a BOMARC Intercept in a SAGE Environment," Department of National Defence, Canada, COR/DSE Memorandum 58/1 (Secret).
      "BOBCAT Users' Manual," 6M-5772, Lincoln Laboratory (Confidential)
62

      Supplements to "BOBCAT Users' Manual":
         6M-5772 Supp #1, Lincoln Laboratory (Confidential)
         TM-195, MITRE Corp. (Unclassified)
         TM-298, MITRE Corp. (Confidential)
      "Programmers Guide-BOBCAT" (unpublished)
      "Initial Report on Results of a SAGE BOMARC A Simulation Using the BOBCAT Program," 6M-5865, Lincoln Laboratory (Secret)
      "BOBCAT, A Simulation Study of the BOMARC A in a SAGE Environment," 6M-5939, Lincoln Laboratory (Secret)
      "Further Test Results of the BOBCAT Simulation," TM-287, MITRE Corp. (Secret)
   2. Logical flow charts available: Logical flow charts are available in "Programmers Guide-BOBCAT".
63
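A BOBCAT-style "case" - many runs sharing the same initial geometry and differing only in a random fashion, tallied into a per-cent lock-on figure and counts of terminated runs - can be sketched as below; the two probabilities stand in for the full missile/radar simulation and are purely illustrative:

```python
import random

def run_case(n_runs, lock_on_prob, track_loss_prob, rng):
    """One case of n_runs Monte Carlo runs.  Each run either loses
    the target to tracking difficulties, achieves seeker lock-on,
    or fails to lock on; the case summary reports the per-cent of
    runs with lock-on and the termination tallies."""
    locked = lost_track = no_lock = 0
    for _ in range(n_runs):
        if rng.random() < track_loss_prob:
            lost_track += 1        # lost target due to tracking
        elif rng.random() < lock_on_prob:
            locked += 1            # successful seeker lock-on
        else:
            no_lock += 1           # no lock-on
    return {"pct_lock_on": 100.0 * locked / n_runs,
            "lost_track": lost_track, "no_lock": no_lock}

summary = run_case(500, lock_on_prob=0.85, track_loss_prob=0.05,
                   rng=random.Random(11))
print(summary)
```

In the real program each run is a full engagement simulation, and the quantities saved per run (miss distance, lock-on time error, etc.) feed the frequency distributions listed under B.6.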

ABSTRACT (23)
Norair Division
A. Identification:
   1. Agency sponsoring contract: WADD
   2. Agency monitoring research: WADD
   3. Official title of contract: SR 17530, Design Criteria for Automatic Test and Checkout Systems
   4. In-house designation: MS Model
   5. Date of initiation: 1 August 1960
   6. Completion: 1 January 1961
   7. Principal investigator: J. R. Oliver, R. S. Schweisthal
B. Description of model:
   1. Physical system being modeled: Maintenance System
   2. Principal objective of model: System Criteria
   3. Type of model: Computer simulation, fast time.
      Type and size of computer: IBM 704.
      Time required for one run: 20 minutes.
   4. Input information:
      Detailed Complexities
      Gross Cost
      Gross Reliability
      How inputs are introduced: Expected values.
   5. Variables in model:
      Cost: Manpower, Spares, Transportation, Equipment
      Time: Test, Remove and Replace, Transportation
   6. Outputs from model:
      Total Cost
      Down Time
      Elapsed Time for Maintenance
      Best Mode for Maintenance
      Best Location for Maintenance
   7. Criteria to evaluate system performance:
      Cost
      Reaction Time
      Availability
64

C. Validation and use of the model:
   1. Experiments completed: Technical experiments within SR 17530. J. R. Oliver was responsible for these experiments.
   2. Tests to validate model: Yes
   3. Future experiments and/or tests: Yes
   4. Response of sponsor: Good
   5. (a) Agency receiving results: WADD
      (b) Use of model and significant results: Policy for Design Criteria for Test Systems.
D. Documentation:
   1. Published or unpublished reports: Northrop NB 61-1, "Design Criteria for Automatic Test and Checkout Systems - Final Report"
   2. Logical flow charts available: None
65

ABSTRACT (24)
Norair Division
A. Identification:
   1. Agency sponsoring contract: Norair Division, Northrop Corporation. Company sponsored.
   2. Agency monitoring research: Systems Analysis Group - Norair Division.
   3. Official title of contract: Not applicable
   4. In-house designation: Radar Reflectivity Model
   5. Date of initiation: June 1960
   6. Completion: October 1960
   7. Principal investigator: Homer Ralles, George Ivanoff
B. Description of model:
   1. Physical system being modeled: The detection model was developed to simulate a radar-target combination so as to provide information necessary to optimize the Snark Weapon System.
   2. Principal objective of model: System evaluation.
   3. Type of model: Computer simulation, fast time.
      Type and size of computer: IBM 704.
      Time required for one run: 0.07 hour.
   4. Input information:
      Radar blip-scan ratio
      Radar signal-to-noise ratio
      Radar range capability
      Target flight altitude
      Average echo area versus aspect angle of the Snark
      How inputs are introduced: The equation for the cumulative probability of detection is used to determine the distribution of detection ranges as a function of the lateral displacement of the target track along the radar base line. For any given track perpendicular to the base line, the probability of detection can be accumulated scan by scan until the desired level of detection is achieved. Repeating this process for various tracks, a distribution of detection ranges as a function of lateral displacement can be determined for a given level of detection probability.
   5. Variables in model:
      Radar range parameters
      Blip-scan ratio
      Target echo area parameters
      Target aspect angle parameters
      Target velocity parameters
      Target altitude parameters
      Radar scan rate parameters
      Operator factor
   6. Outputs from model: Standard output consists of a printout of the cumulative probabilities of target detection at various frontal ranges and at discrete displacement ranges from the radar station.
66

      A plot of target range at the predetermined cumulative probability of detection will result in a distribution of the acceptable level of detection for all target penetration distances within radar range.
   7. Criteria to evaluate system performance: A separate kill submodel must be used to evaluate the vulnerability of the vehicle to a particular kill mechanism.
C. Validation and use of the model:
   1. Experiments completed: Several X- and S-band radars were investigated, and the detection of the SM-62 against each type was computed for further evaluation. Homer Ralles and George Ivanoff of the Systems Analysis Group were responsible for these experiments.
   2. Tests to validate model: No tests have been performed to validate the model.
   3. Future experiments and/or tests: None are presently planned. Subsequent effort depends on funding. The Systems Analysis Group will be responsible.
   4. Response of sponsor: The results of this work have been examined by the Snark Project Office, WADD, who were satisfied with the model as used to evaluate the benefits of a study undertaken at company expense.
   5. (a) Agency receiving results: Not applicable
      (b) Use of model and significant results: See item 4.
D. Documentation:
   1. Published or unpublished reports:
      WSDS-121 (showing results)
      Proposal NB 60-276 (outlining the problem)
   2. Logical flow charts available: None.
67
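The scan-by-scan accumulation described in the abstract above follows the standard relation P = 1 - (1 - p1)(1 - p2)...(1 - pn), where each p is the single-scan (blip-scan) detection probability at the range of that scan; a minimal sketch with illustrative probabilities:

```python
def cumulative_detection(single_scan_probs):
    """Accumulate probability of detection scan by scan:
    P_cum = 1 - product(1 - p_i), where p_i is the single-scan
    (blip-scan) detection probability at the range of scan i."""
    no_detect, out = 1.0, []
    for p in single_scan_probs:
        no_detect *= (1.0 - p)
        out.append(1.0 - no_detect)
    return out

# Illustrative blip-scan probabilities as the target closes in range
scans = [0.05, 0.10, 0.20, 0.35, 0.50]
for i, p in enumerate(cumulative_detection(scans), start=1):
    print(f"scan {i}: cumulative P(detect) = {p:.3f}")
```

In the Norair model the per-scan probabilities would come from the blip-scan ratio and signal-to-noise data at each scan's range, and the sweep over lateral displacements yields the detection-range distribution described above.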

ABSTRACT (25)
Norair Division, Northrop Corporation
A. Identification:
   1. Agency sponsoring contract: United States Air Force Research and Development Command (ARDC)
   2. Agency monitoring research: Wright Aeronautical Development Division (WADD), Weapon System Project Office
   3. Official title of contract: SM-62A Snark Weapon System
   4. In-house designation: Operational Range and Altitude Performance of the Snark Intercontinental Missile (SM-62A)
   5. Date of initiation: Initial work commenced in 1955; the supporting model was adapted in 1958, with refinements added over the next two years.
   6. Completion: The model was initially completed, checked out, and the results evaluated late in 1959, with a model expansion in mid-1960.
   7. Principal investigator: Originally S. D. VanNorman, succeeded by Homer A. Ralles and then John H. Tinley.
B. Description of model:
   1. Physical system being modeled: The SM-62A performance is dependent on configuration and environmental conditions. The flight profile as a function of maximum range, release altitude and the condition variables is modeled to determine the expected values and probability distributions of range and altitude.
   2. Principal objective of model: System evaluation; system design is a secondary application.
   3. Type of model: Computer simulation, fast time.
      Type and size of computer: IBM 704, using almost the complete memory storage.
      Time required for one run: Due to the large number of runs needed to satisfy a selected level of confidence for the chi-squared goodness of fit for a normal distribution, approximately 0.2 hour per run was required.
   4. Input information: The variables are the input parameters and are selected according to mission, trajectory and season.
      The input parameters are:
         Mach number
         Temperature
         Velocity as a function of the above
         Wind Velocity
         Fuel Quality - density, heat content
         Specific Fuel Consumption
         Icing
         Lift to Drag
         Weight of Missile
         Net Propulsive Thrust
      How inputs are introduced: The values which can be described by a distribution, a variance and an expected value are so defined. The normal distribution is assumed for all distribution functions. Other values are defined by analytic expressions.
   5. Variables in model: The input parameters, the season, trajectory and mission commands represent all the variables. The input parameters are described in section 4.
68

   6. Outputs from model: The outputs are presented as probability distributions of range values using sampled input parameters for each season, trajectory and mission command sequence. The same output is obtained for the maximum altitude determination. A special output included is an icing condition evaluation.
   7. Criteria to evaluate system performance: The criterion of evaluation is the expected value resulting from a computer run compared with the established contractual range specification.
C. Validation and use of the model:
   1. Experiments completed: The simulated conditions compared closely to Air Force calculations for comparable mission operations. The operational launch and test locations are dissimilar, and as only operational missions were simulated, no actual validation has been attempted. Test results have supported the magnitude of model results.
   2. Tests to validate model: Only statistical validation using a chi-squared goodness of fit test.
   3. Future experiments and/or tests: None
   4. Response of sponsor: Favorable.
   5. (a) Agency receiving results: Weapons System Project Office of WADD.
      (b) Use of model and significant results: The results have been compared with operational assignments against Soviet targets. Consideration of programmed flight deviation tactics and cross-section echo reduction has been possible by referencing the model results. As mentioned in section C.1, the military user has compared the model results to their independent calculations and actual test flights.
D. Documentation:
   1. Published or unpublished reports:
      Simulation of Operational Range and Altitude Performance, NB 60-157, May 1960, Unclassified.
      Other documents are classified and have the following company identification:
         Range and Altitude Performance of the SM-62 Under Operational Conditions, NOR 59-273, April 1959, Secret.
         SM-62A Operational Performance Capability, NOR 59-415, 7 September 1959, Secret.
   2. Logical flow charts available: Yes.
69
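The procedure described in the abstract above - sampling normally distributed inputs, forming the distribution of range, and testing that distribution for normality with a chi-squared goodness-of-fit statistic - can be sketched as follows; the range equation, its coefficients, and the input variances are hypothetical stand-ins for the classified SM-62A data:

```python
import random
import statistics

def simulated_range(sfc, wind, weight):
    """Hypothetical range equation standing in for the SM-62A
    flight-profile computation: range falls with specific fuel
    consumption and missile weight, and shifts with tail wind.
    Coefficients are illustrative only."""
    return 5500.0 * (1.0 / sfc) * (250000.0 / weight) + 2.0 * wind

rng = random.Random(5)
# Sample normally distributed inputs (expected value, spread given)
ranges = []
for _ in range(1000):
    sfc = rng.gauss(1.0, 0.03)            # specific fuel consumption
    wind = rng.gauss(0.0, 25.0)           # tail wind, knots
    weight = rng.gauss(250000.0, 5000.0)  # missile weight, lb
    ranges.append(simulated_range(sfc, wind, weight))

mean_r = statistics.fmean(ranges)
sd_r = statistics.stdev(ranges)

# Chi-squared goodness of fit against a normal with the sample
# moments, using four equal-probability bins (expected 250 in each)
fit = statistics.NormalDist(mean_r, sd_r)
edges = [fit.inv_cdf(q) for q in (0.25, 0.5, 0.75)]
counts = [0, 0, 0, 0]
for r in ranges:
    counts[sum(r > e for e in edges)] += 1
chi2 = sum((c - 250.0) ** 2 / 250.0 for c in counts)
print(round(mean_r), round(sd_r), round(chi2, 2))
```

The sample expected value is the quantity compared against the contractual range specification under criterion 7; the chi-squared statistic checks the normality assumption the model rests on.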

ABSTRACT (26)
Norair Division, Northrop Corporation
A. Identification:
   1. Agency sponsoring contract: Company funded
   4. In-house designation: Northrop Damage Assessment Model (NORDAM)
   5. Date of initiation: June 1959
   6. Completion: November 1959 (modifications have been made since this date)
   7. Principal investigator: E. Kamiya
B. Description of model:
   1. Physical system being modeled: Weapon effectiveness.
   2. Principal objective of model: To evaluate weapons and their characteristics as they affect their capability against various types of targets.
   3. Type of model: Computer simulation, fast time.
      Type and size of computer: IBM 704, 8K.
      Time required for one run: Approximately 30 minutes.
   4. Input information:
      Target: configuration, vulnerability
      Weapon: yield, number
      Delivery: number and location of aim points, accuracy
      Limits: maximum and/or minimum damage to be evaluated; number of times process is to be repeated
      How inputs are introduced:
         Lethal radius = f(yield, vulnerability)
         Accuracy (CEP) = f(σx, σy)
         Aim points = (x0, y0)
         Target configuration = rectangular coordinates
   5. Variables in model: Coordinates for each bomb impact, the independent variable, are obtained by a Monte Carlo technique. The dependent variable is the cumulative fraction of bomb damage to the target.
   6. Outputs from model: A plot of the fraction of target destroyed as a function of the number of bombs on target.
   7. Criteria to evaluate system performance: The effects of weapon yield and delivery accuracy on the number of bombs required to achieve a given target destruction level.
70

C. Validation and use of the model:
   1. Experiments completed:
      ICBMs vs. area targets
      Naval Shore Bombardment
      Tactical Field Artillery
      E. Kamiya, R. Seelenfreund and E. Browne were responsible for these experiments.
   2. Tests to validate model: None
   3. Future experiments and/or tests: Planned, but awaiting funds. G. Thornton, et al., will be responsible.
   4. Response of sponsor: The company is pleased with the results to date and sees great flexibility for future modifications to handle a greater variety of damage assessment problems.
   5. (a) Agency receiving results: Not applicable.
      (b) Use of model and significant results: Has been and is being used by Northrop to obtain inputs to many studies requiring damage assessment data.
D. Documentation:
   1. Published or unpublished reports:
      "Computer Simulation of a Multiple Bomb Target Coverage Problem," Northrop/Norair Document WSDS-66, 2 November 1959.
      "Target Damage Assessment Model," Northrop/Norair Brochure NB 60-120, May 1960.
      Thornton, G. W., "Target Damage Assessment," Northrop/Norair Computing and Data Processing Report COMPARE No. 60-A631, 7 January 1960.
E. General Discussion: Although not stated above, the negative exponential is employed for calculating the expected damage to targets. This general approach has been employed in numerous other programs.
71
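The NORDAM approach - Monte Carlo bomb impact coordinates drawn about the aim point, with the cumulative fraction of the target destroyed as the dependent variable - can be sketched as below. For simplicity the sketch uses cookie-cutter damage inside the lethal radius rather than the negative-exponential damage function the abstract mentions, and all names and numbers are illustrative:

```python
import math
import random

def fraction_destroyed(n_bombs, aim, sigma, lethal_radius,
                       target_w, target_h, rng, grid=20):
    """Monte Carlo sketch of the multiple-bomb coverage problem:
    impact coordinates are drawn normally about the aim point
    (sigma plays the role of delivery accuracy; CEP = 1.1774*sigma
    for a circular normal), and a cell of the rectangular target
    counts as destroyed once any impact falls within the lethal
    radius (cookie-cutter damage)."""
    impacts = [(rng.gauss(aim[0], sigma), rng.gauss(aim[1], sigma))
               for _ in range(n_bombs)]
    destroyed = 0
    for i in range(grid):
        for j in range(grid):
            cx = (i + 0.5) * target_w / grid
            cy = (j + 0.5) * target_h / grid
            if any(math.hypot(cx - x, cy - y) <= lethal_radius
                   for x, y in impacts):
                destroyed += 1
    return destroyed / (grid * grid)

rng = random.Random(9)
# Fraction of a 2 x 2 mile target destroyed vs. number of bombs
for n in (1, 2, 4, 8):
    f = fraction_destroyed(n, aim=(1.0, 1.0), sigma=0.5,
                           lethal_radius=0.6, target_w=2.0,
                           target_h=2.0, rng=rng)
    print(f"{n} bombs: {f:.2f} of target destroyed")
```

Repeating the whole process many times and averaging yields the plotted curve of fraction destroyed versus number of bombs described under B.6.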

ABSTRACT (27) Norair Division Northrop Corporation A. Identification: 1. Agency sponsoring contract: This model was sponsored by the Northrop Corporation as part of a study to improve the maintenance qualities of company products. 2. Agency monitoring research: Not applicable. 3. Official title of contract: Not applicable. 4. In-house designation: Not applicable. 5. Date of initiation: 1956 6. Completion: September 1957. Since then some changes have been made for application to specific weapon systems. 7. Principal investigator: J. R. Watson, D. Gregor B. Description of model: 1. Physical system being modeled: This model is designed to evaluate an operational element (weapon) within any specified military maintenance organization and under any specified maintenance plan. Many of the maintenance problems are associated with the maintenance plan, organization, and environment of the using organization. This model divorces these problems from the basic maintenance requirements of the equipment, allowing the suitability of that equipment to be evaluated within the projected maintenance environment. This measure of suitability (maintenance index) can be used to predict manpower, equipment, and money resources which must be allocated to achieve given levels of operation or posture. 2. Principal objective of model: To evaluate the equipment which is involved in a weapon support system. 3. Type of model: Analytical. Type and size of computer: IBM 7090 (total core 40,000; useful core 32,000). Time required for one run: 30 minutes. 4. Input information: The following information is necessary in order to use the model: (1) A parts breakdown: This breakdown should be inclusive of and limited to all the parts which are significant to maintenance, i.e., they require some discrete maintenance activity such as replacement, repair, lubrication, etc. (2) The frequency at which the following events occur for each part: a. Failure b. Replacement (scheduled) c. Inspection d. 
Servicing (replenishing consumables) (3) The estimated time or effort involved in accomplishing the following functions for each part: a. Troubleshooting b. Replacement c. Inspection (including tests and checkouts) d. Servicing (4) In addition, the expected amount of time the equipment will be operated and the modes of operation must be known (operations plan). 72

How are these inputs introduced: Expected values. 5. Variables in model: The expected amount of operating time or the modes of operation can be made variable. 6. Outputs from model: The output from the model is a maintenance index which is related to the number of maintenance men required. This figure may be in terms of man-hours required per operating hour. In addition, the number of men required at various skill levels and types of skills may be determined. 7. Criteria to evaluate system performance: The criterion used in evaluating equipment performance is the amount of maintenance effort, manpower, or skills required. C. Validation and use of the model: 1. Experiments completed: The model has been applied to a number of preliminary aircraft designs. Of these, only one (the T-38) has been produced and used in a quantity sufficient to justify verification of the results. Data obtained from the Category II tests of this aircraft have been compared with the original prediction. The data from the Category II tests are not directly applicable because the model was exercised for training command operation of a fully developed aircraft, i.e., with the normal bugs inherent in every new aircraft eliminated. Total maintenance requirements are dynamic, dependent upon changes in design, learning processes, environment, and other factors. However, projections of the Category II results indicate a reasonable correlation between the results of the application of the model and reality. The data were obtained from Air Force records acquired during Category II testing at Edwards Air Force Base. 2. Tests to validate model: See C-1 (a) above. 3. Future experiments and/or tests: Category III and Training Command data will be collected as it becomes available. Training command will be available for these tests. 4. Response of sponsor: Response has been favorable. Since the publication of the report, similar work has been done by the Air Force (MIL 9412 and MIL-M-26512) and industry. 
(Republic Corp. AF Contract AF 33(616)-6495.) 5. (a) Agency receiving results: This model was widely distributed throughout the Air Force (WADC, Headquarters USAF, and Training Command). (b) Use of model and significant results: Evaluating preliminary designs and operational use of various aircraft by USAF Training Command and foreign Air Force Training Commands. D. Documentation: 1. Published or unpublished reports: "A Method for Predicting Weapon and Support System Maintainability," NAI 57-848, September 1957. 2. Logical flow charts available: Yes. 73
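The expected-value computation implied by B4–B6, event frequencies per part multiplied by per-event effort and summed into a man-hours-per-operating-hour index, might be sketched as follows. Field names and the two-valued return are illustrative assumptions, not the NAI 57-848 formulation.

```python
def maintenance_index(parts, operating_hours):
    """Expected-value maintenance index: maintenance man-hours required
    per equipment operating hour, summed over all maintenance-significant
    parts, plus the total man-hours over the operations plan.  Each part
    is a dict of per-operating-hour event frequencies and the man-hours
    each event consumes (field names are assumed for illustration)."""
    per_hour = 0.0
    for p in parts:
        # unscheduled failure: troubleshoot, then replace or repair
        per_hour += p["failure_rate"] * (p["troubleshoot_hrs"] + p["replace_hrs"])
        per_hour += p["sched_replace_rate"] * p["replace_hrs"]   # scheduled replacement
        per_hour += p["inspect_rate"] * p["inspect_hrs"]         # inspection, tests, checkouts
        per_hour += p["service_rate"] * p["service_hrs"]         # replenishing consumables
    return per_hour, per_hour * operating_hours
```

The per-hour figure corresponds to the maintenance index described in B6; dividing it among skill categories would give the manpower breakdown the abstract mentions.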

ABSTRACT (28) Norair Division Northrop Corporation A. Identification: 1. Agency sponsoring contract: AFDAP, Headquarters, USAF 2. Agency monitoring research: AFDAP 7. Principal investigators: J. Taylor, W. A. Armstrong, and J. F. Paris B. Description of Model: 1. Physical system being modeled: Simulation of terminal phase and air interception from roll out at the positioning point where intensive radar search begins through armament launch. 2. Principal objective of model: To determine the sensitivity of, and trade-offs between, the various parameters in the air battle, such as fighter and bomber speeds, terminal attack angle, radar range, ECM, etc. 3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 704 (8K). Time required for one run: About 20 minutes for one case, which generally consisted of about 250 runs through the model (Monte Carlo technique). 4. Input information: Fighter and bomber speeds Maximum ranges (search and tracking) Radar reference range (Ro) Midcourse angle Terminal attack angle Antenna sweep angle and rate Armament range and speed 6. Outputs from model: For each case: Probability of detection and conversion Range and aspect angle at armament launch Number of runs in which fighter detected bomber Average detection range Number of each type aborted run Total number of runs Number of times lock is broken Time lost by interceptor while lock was broken C. Validation and use of the model: 1. Experiments completed: Approximately 1000 cases were run and are tabulated and analyzed in Northrop report NAI-57-1301, Vol. I - III. J. L. Taylor was responsible for these tests. 2. Tests to validate model: Results compared favorably with results of Operations "Lockon" and "Wolfpack." 3. Future experiments and/or tests: None at Northrop. 4. Response of sponsor: The model and its results have been well received by the sponsor. 
The model has been made available to other organizations - MITRE and SDC - and the model has been used at Northrop by the Bendix Corporation. 5. (a) Agency receiving results: USAF Headquarters, Development and Planning (AFDAP) 74

D. Documentation: 1. Published or unpublished reports: Northrop Reports NAI-57-1240 Vol. I - VI NAI-57-1301 Vol. I - III NOR 60-31 2. Logical flow charts available: In report NAI-57-1240, Vol. V and NOR 60-31. E. General Discussion: See also Abstract (9) 75
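A per-case Monte Carlo of the kind described (some hundreds of runs per case, scoring detections, conversions, and average detection range) might be sketched as follows. This is an illustration only: the log-normal detection-range fluctuation and the conversion rule are assumptions standing in for the model's radar and geometry equations.

```python
import random

def intercept_cases(runs, ref_range, max_range, launch_range, seed=7):
    """One case of the terminal-intercept Monte Carlo: each run samples a
    detection range about the radar reference range (log-normal
    fluctuation is an assumed stand-in for radar-equation scatter) and
    scores detection and conversion to armament launch."""
    random.seed(seed)
    detects, converts, det_ranges = 0, 0, []
    for _ in range(runs):
        r_detect = min(ref_range * random.lognormvariate(0.0, 0.3), max_range)
        if r_detect > launch_range:          # fighter sees bomber in time
            detects += 1
            det_ranges.append(r_detect)
            # assumed rule: conversion succeeds unless detection is marginal
            if r_detect > 1.5 * launch_range:
                converts += 1
    avg = sum(det_ranges) / detects if detects else 0.0
    return {"runs": runs, "detections": detects,
            "conversions": converts, "avg_detect_range": avg}
```

A case in the original sense would repeat this for one setting of the input parameters (speeds, angles, ranges) and tabulate the resulting counts.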

ABSTRACT (29) Operations Research Office A. Identification: 1. Agency sponsoring contract: Department of the Army 2. Agency monitoring contract: Department of the Army 3. Official title of contract: DA 44-109-QM-266 for research and development 4. In-house designation: None 5. Date of initiation: 1954 6. Completion: Feasibility study completed 1954, first version of model completed 1959, second version due for completion in fall of 1961. 7. Principal investigator: Hebron E. Adams B. Description of model: 1. Physical system being modeled: Small unit ground combat. 2. Principal objective of model: The principal objective is to study Army combat equipment, organizations, tactics, and doctrines. 3. Type of model: Computer simulation, slow time. Type and size of computer: UNIVAC Scientific 1103A with 20,480 words of addressable memory. Time required for one run: One hour for a 30-minute pitched battle at company level with full output. 4. Input information: Input to the model consists of parameters describing the militarily significant properties of the battlefield, the opposing forces, the tactical situation, and various probabilities. How inputs are introduced: The inputs are introduced as step-function distributions in tabular form. 5. Variables in model: Battlefield parameters: Terrain - elevation slope trafficability Vegetation Man-made objects - roads bridges buildings field fortifications Water - rivers lakes 76

Force parameters: Vulnerability parameters Maneuver capability parameters Intelligence parameters Firepower parameters - weapon parameters ammunition parameters Situation parameters: Initial dispositions of forces Mission parameters for opposing forces 6. Outputs from model: A periodic output consists of final dispositions, remaining strengths, and remaining ammunition of the various components of the opposing forces. In addition to the periodic output, a special output identifies the pertinent information connected with each event occurring during the course of the battle: these events are firing, moving, dismounting, and intelligence acquisition. 7. Criteria to evaluate system performance: All evaluation is handled outside the model by use of the output described in B6. C. Validation and use of the model: 1. Experiments completed: About 15 repetitions of each of three situations employing combined arms teams as opposing forces have been simulated with the model. These have been used to observe what effect a change in the accuracy of a tank's main armament has on the outcome of a company-sized battle within a given tactical framework. These experiments were designed by Dr. Joseph A. Bruner and the Armor Study Group of the Operations Research Office. 2. Tests to validate model: No systematic tests have been performed to validate the model. 3. Future experiments and/or tests: A series of platoon-level infantry battles will be run soon. The results of these battles will be compared with the results of some recent U. S. Army Combat Development Experimentation Center battles. This comparison, though interesting, will not be a validation of the model. Robert G. Hendrickson will be responsible for these experiments. 4. Response of sponsor: Favorable 5. (a) Agency receiving results: Department of the Army received publications describing the feasibility study and the first version of the model. D. Documentation: 1. 
Published or unpublished reports: Johns Hopkins University, Operations Research Office, Technical Memorandum T-389, by Hebron E. Adams, Richard E. Forrester, Joan F. Kraft, and Barbara B. Oosterhout. 2. Logical flow charts available: Logical flow charts are available in the document cited in D1. More detailed charts are in preparation but are not presently available. E. General Discussion: Abstracts included from ORO illustrate only a sample of programs of the organization. 77

ABSTRACT (30) Operations Research Office A. Identification: 1. Agency sponsoring contract: Chief of Research and Development, Department of the Army 2. Agency monitoring research: Deputy Chief of Staff for Operations. 3. Official title of contract: Work is being done under one section of the overall Operations Research Office work program. The title of the sub-task which is being reported for this questionnaire is "THEATERSPIEL," code number 35.10. 4. In-house designation: THEATERSPIEL or "Theater Level War Game." 5. Date of initiation: The development of the models used has taken place over a number of years (6 or 7) both in ORO and at Army facilities. The work of putting the sub-models into a computer was begun at ORO in January 1961. 6. Completion: The first runs of the first version of the model are expected to take place in June of 1961. This is considered the first step in an evolutionary improvement cycle. Work on these models will continue over a period of time probably to be measured in years. 7. Principal investigator: Richard E. Zimmerman B. Description of model: 1. Physical system being modeled: The models are used to evaluate part of the results of one period of play of a war game. They also keep track of results preparatory to the next period's play, a bookkeeping procedure that reduces time and clerical work requirements to a point where the players and the game controllers can concentrate their efforts on important decisions. A central framework keeps track of physical locations and current status of the various fighting units for each side. 
The principal parts of the computer evaluation attached to this framework are 1) the Ground Combat model, consisting of a section which evaluates the direct fighting and one which evaluates the effects of support weapons (particularly bombardment); 2) a model which evaluates the air combat support, air-to-air combat, and surface-to-air missile activity; 3) a section on logistics which provides limits on the movement of forces and the supply to forces. The models do not specifically take into account the tactical movement of forces (at least at present), nor do they take into account theater intelligence. Thus, important parts of the problem are not included in the simulation; these are supplied by the (human) controller, together with the overall and detailed strategy which the Red and Blue players specify and can, if they wish, change for each battle period. 2. Principal objective of model: The principal objective of the model is indeed system evaluation, but this is meant in a much larger sense and at a greater level of aggregation than seems to be true of many models. The system is an entire theater of war, and what is evaluated is the Army's capability of waging war within this theater. Note that any particular printout from the computer would not form an evaluation in itself, as the "simulation" only assists in the evaluations and in the progress of the game. 3. Type of model: A computer simulation is being constructed. (It is not clear at this stage of progress whether we will have a real time simulation or a fast time simulation. The hand-played games on which the computer simulation is based might be called slow time simulations in that one day of play required several days of evaluation. It seems likely that the end result of the mechanization we are undertaking will be a fast time simulation.) Type and size of computer: 1103A (UNIVAC Scientific) with a single core memory. Time required for one run: 20 minutes. 78

4. Input information: The Red and Blue players issue "mission-type orders" to their fighting units, whose geography is kept track of on maps of which the computer takes no cognizance. The mission-type orders of the opposing players are translated into movements of particular divisions by the Controller in a room separate from the rooms occupied by each of the two player teams. The Controller decides which divisions on each side will be grouped together for battles. He moves the symbols representing divisions in accordance with the computer output which says which side wins. This is based on the strengths of the opposing divisions, suitable movement rules, and chance factors. The inputs on strengths, etc., have been put into the computer initially and are updated each day so that the Controller's daily input is considerably less detailed than the initial input at the start of a series of plays. In the various inputs a great deal of "aggregation" is necessary. A single figure of merit is used for the relative strength of opposing divisions, for example. Further, whole squadrons of planes are played against each other in air combat, rather than single fighters. In one sense these are "expected values." The initial input to the model before the series of plays will include information on approximately 200 units for each side with about 20 pieces of data for each unit. This is referred to as the Troop and Station List. For the updating process for each day's play, the inputs are what units are grouped together on a side, who is the attacker, a judgmental alternate location in the event of a win or loss by the side being considered (if an attacker wins, he moves forward; if he loses, he stays where he is), suitable information on the missions of air units, information on priorities for logistic support, etc. It is estimated that, for each day's play, information on approximately half of the units will be required. 
This will involve approximately 6 pieces of information for each such unit. How inputs are introduced: In general the inputs are introduced wherever possible as multipliers. That is, the number of "table lookups" has been kept as low as possible by using suitable multipliers instead. 5. Variables in model: Variables include the numbers, strengths and postures of all of the various units for any particular game. 6. Outputs from model: The outputs are the outcomes of individual ground or air battles, casualties and consumption of material through use in battle or through destruction in battle. Each day's use of the model results in answers which are first reported to Control, who then edits them for reporting to the players. (See also answer to B1) 7. Criteria to evaluate system performance: The ground battle is won or lost depending in large part upon the strength and the capability of the various units relative to one another. This capability is in terms of an arbitrary "combat potential" which in turn is based on judgments of firepower, weapon lethal areas and similar matters all judged in a subjective context. The ratio of the firepowers of the opposing units is converted to a "win" probability dependent upon posture and terrain of the opposing sides. The combat potential is made to vary depending on whether or not the combat potential can be reduced by the fire of support weapons before the ground action and can be otherwise modified (depending on whether logistics supply is forthcoming from the section of the model devoted to logistics). With the probability established in this manner, the outcome of any particular battle is stochastic. C. Validation and use of the model: 1. Experiments completed: The hand-played versions of the models have been tested only against the judgment of experienced military officers in our organization and in games played at US Army installations. 2. 
Tests to validate model: To the extent that the model uses historical data for such matters as casualties, it can be said to have been at least partly validated. One of the difficulties is in the use of untried weapons when the model is applied to future situations. 79

3. Future experiments and/or tests: None directed to validation of the model. 4. Response of sponsor: While the work we have done has not been completed, the evidence is that such work will be of interest to the sponsor inasmuch as the sponsor is engaging in a program that parallels the work we expect to complete soon. D. Documentation: 1. Published or unpublished reports: The basic models used in hand-play of the war games are described in ORO-T-383 which has been printed but is not yet available for distribution as of 15 May 1961. 2. Logical flow charts available: None 80
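The battle-resolution step described in B7, a combat-potential ratio converted to a "win" probability modified by posture and terrain, with the outcome then drawn stochastically, might be sketched as follows. The logistic-style mapping from ratio to probability is an assumed stand-in for the model's tabulated conversion, and all names are illustrative.

```python
import random

def battle_outcome(atk_potential, def_potential, posture_factor=1.0, rng=None):
    """Resolve one ground battle in the spirit of THEATERSPIEL: the ratio
    of opposing combat potentials becomes a win probability (assumed
    mapping: ratio 1.0 with neutral posture gives 0.5), adjusted by a
    posture/terrain factor favoring the defender when > 1, and the
    outcome is drawn stochastically."""
    rng = rng or random.Random(0)
    ratio = atk_potential / def_potential
    p_win = ratio / (ratio + posture_factor)   # assumed conversion curve
    attacker_wins = rng.random() < p_win
    return attacker_wins, p_win
```

Support-weapon fire and logistics shortfalls would enter, as the abstract describes, by modifying the two potentials before this conversion.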

ABSTRACT (31) Operations Research Office A. Identification: 1. Agency sponsoring contract: Department of the Army 2. Agency monitoring research: Chief of Research and Development, Department of the Army 3. Official title of contract: Combat Intelligence Organization and Procedures 4. In-house designation: Study 43.4 5. Date of initiation: March 1958 6. Date of completion: March 1961 7. Principal investigator: Dr. William R. Sickles B. Description of model: 1. Physical system being modeled: The system being modeled is the information-tactical decision-battle action cycle in ground combat. 2. Principal objective of model: The principal objective of the model is to provide a means of evaluating relationships between various classes of information and classes of action taken by the decision maker. 3. Type of model: The physical model is a two-sided free-play manual war game (INDIGO) designed specifically for the study of various information and control subsystems. All action that develops is a direct result of orders submitted to Control by Red and Blue player teams. That portion of the model concerned with analysis of the data generated by the game involved utilization of the 1103A computer. 4. Input information: Input to the model consists of initial positions, task organizations, procedures and rules for play and assessment including numerous distributions of random numbers. How inputs are introduced: By means of a Rule Manual. 5. Major variables in the model may be grouped as follows: Tactical Movement Deployment Contact Battle Conventional artillery Nuclear artillery Casualties Air Operations Engineer Operations Intelligence Ground reconnaissance and surveillance Air reconnaissance and surveillance Communications 6. Outputs from model: Output of the physical model consists of a set of IBM punch cards, each card containing a record of the status and/or action concerning one military unit at one point in game time. 
(A point in game time is equivalent to a 30-minute interval.) 81

Specifically, each card contains such coded information as the unit involved, time, location at beginning and at end of interval, orders issued, action occurring, messages concerning enemy units, casualties, time on target, results of artillery or nuclear fire. These outputs from the physical model become inputs to the statistical or computer model. Output from the computer model consists of a series of charts and calculated coefficients of correlation expressing the degree and direction of relationship between numerous pairs of classes of information input and order output. 7. Criteria to evaluate system performance: The criterion employed is the degree to which classes of messages are predictably related to classes of orders. Generally speaking, an r > .40 meets this criterion. C. Validation and use of the model: 1. Experiments completed: The model has been employed in one play of the game. Dr. Frank J. Harris was responsible for this experiment. 2. Tests to validate model: None. 3. Future experiments and/or tests: This model and the results of one play are currently being evaluated internally. Further planning awaits the results of this review. 4. Response of sponsor: This work will not be available to the sponsor for evaluation until internal review has been completed. D. Documentation: 1. Published or unpublished reports: The draft report currently being reviewed is entitled "Methods of Analyzing War-Game Data with Reference to Information Requirements." 2. Logical flow charts available: A logical flow chart (schematic) is available in the document cited above. 82
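The statistical step described in B6–B7, correlating counts of a message class with counts of an order class and applying the r > .40 criterion, is the standard product-moment correlation, which can be sketched as follows (function names are illustrative).

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between a class of information
    inputs and a class of order outputs (e.g., counts per game interval)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0.0 or sy == 0.0:
        return 0.0            # a constant series carries no relationship
    return cov / (sx * sy)

def predictably_related(xs, ys, threshold=0.40):
    """Apply the report's working criterion: a message class and an order
    class are treated as predictably related when r exceeds about .40."""
    return pearson_r(xs, ys) > threshold
```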

ABSTRACT (32) Operations Research Office A. Identification: 1. Agency sponsoring contract: Department of the Army 2. Agency monitoring research: Department of the Army, Office Chief of Research and Development 5. Date of initiation: 1956 6. Completion: 1960 7. Principal investigator: Mrs. Betty Holz B. Description of model: 1. Physical system being modeled: Automotive maintenance of the combat vehicles of a tank battalion on a penetration-type march. 2. Principal objective of model: System evaluation. 3. Type of model: Computer simulation, fast time. Type and size of computer: UNIVAC 1103A, 4,096-word core, 16,384-word drum, card reader and punch, and 4 tape units are required. Time required for one run: Running time varies with input parameters from 2 minutes to 2 hours. Real time to computing time ratio varies from 500:1 to 200:1. 4. How inputs are introduced: Input includes parameters describing pertinent characteristics of the battalion organization and march formation. Distributions are used for predicting when deficiencies will occur, how many parts are defective, what parts are defective, and when deficiencies will be immobilizing. Input parameters also set the amount of distortion to be used in sampling from these distributions. Vehicle and part parameters are also required. 5. Variables in model: Organization parameters Mechanic parameters Vehicle organization parameters Operating procedure parameters Basic load of spare parts Formation parameters March geometry parameters Formation towing parameters Vehicle parameters Deficiency parameters Speed parameters Vehicle towing parameters Part parameters Repair time parameters Repair requirement parameters Importance sampling parameters 6. Outputs from model: Estimate of distance unit will move 90% of the time before a given percentage of its combat vehicles are either immobile or dead. 83

7. Criteria to evaluate system performance: The distance that the unit has a 90% probability of going before it loses a given percentage of its vehicles through immobility or abandonment. C. Validation and use of the model: 1. Experiments completed: See ORO-SP-165. Mrs. Betty Holz was responsible for these experiments. 4. Response of sponsor: Varied. D. Documentation: 1. Published or unpublished reports: Holz and Clark, "Tests of Randomness of the Bits of a Set of Pseudo-Random Numbers," ORO-SP-77, November 1958. Boose, Sutherland, and Wing, "Analysis of Data Pertinent to Operational Mobility of M-48 Tanks (U)," ORO-SP-38, January 1959 (Confidential) Clark and Holz, "Exponentially Distributed Random Numbers," Johns Hopkins Press, 1960. Hughes and Weinert, "Analysis of M-48 Tank Automotive Deficiencies During Exercise Big Thrust," ORO-SP-166, February 1961 (Confidential) Holz, Hughes, Robinson, Sutherland, Weinert, Halpert, and Lind, "An Application of Computer Simulation to the Study of Operational Mobility of Armored Units," ORO-SP-165, Spring, 1961 (Confidential) 2. Logical flow charts available: There are logical flow charts for the model. In very general form some of them have been included in ORO-SP-165. More detailed flow charts are available, but ORO has no plans to publish them. 84
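The output statistic in B6–C7, the distance the unit will exceed 90% of the time before losing a given fraction of its vehicles, is a lower-tail percentile of replicated march simulations, and might be sketched as follows. The per-kilometer failure chance is an assumed stand-in for the model's deficiency distributions, and all names are illustrative.

```python
import random

def march_distance_run(rng, n_vehicles, loss_fraction, fail_rate_per_km):
    """One replication: march until the given fraction of combat vehicles
    is immobile or abandoned.  A constant per-kilometer failure chance
    stands in for the model's sampled deficiency distributions."""
    losses, km = 0, 0
    threshold = int(loss_fraction * n_vehicles)
    while losses < threshold:
        km += 1
        for _ in range(n_vehicles - losses):
            if rng.random() < fail_rate_per_km:
                losses += 1
    return km

def distance_at_90pct(samples):
    """Distance the unit has a 90% probability of exceeding before
    reaching the loss threshold: the 10th percentile (conservative,
    lower-index convention) of the replicated march distances."""
    xs = sorted(samples)
    return xs[max(0, int(0.10 * len(xs)) - 1)]
```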

ABSTRACT (33) Operations Research Office A. Identification: 1. Agency sponsoring contract: D/A 5. Date of initiation: 1955 6. Completion: 1956 7. Principal investigator: B. Description of model: 1. Physical system being modeled: Simulation of surface-to-air missile systems. 2. Principal objective of model: To determine influences of various parameters. 3. Type of model: Computer simulation, fast time. Type and size of computer: 1103A. Time required for one run: About 2 minutes/play. 4. Input information: Basic loads In-flight missile reliability Warhead lethality Coefficients describing time-of-flight characteristics Equipment characteristics (radars, etc.) Delay times Firing doctrine Complete description of threat spacing altitude x-section velocity decoy/bomber ratio How inputs are introduced: A mixture of exact values and expected values. 5. Variables in model: The variables and inputs are approximately the same. 6. Outputs from model: All live aircraft, cell and target number Distribution of missile expenditures Distribution of missiles wasted Distribution of missiles that failed Distribution of missiles that killed (K-kills) Game length Random number that started game 7. Criteria to evaluate system performance: Equipment characteristics variation Firing doctrine variation Basic load variation C. Validation and use of the model: 1. Experiments completed: Many papers have resulted from the model. About 2500 hours of production have been run on the computer. 2. Tests to validate model: None 85

3. Future experiments and/or tests: None 4. Response of sponsor: Highly favorable 5. (a) Agency receiving results: Army agencies D. Documentation: 1. Published or unpublished reports: ORO R-17, Appendix S ORO R-17, Appendix T ORO R-17, Appendix G (early version of model) 2. Logical flow charts available: Logical flow charts are available in Appendices S and G listed above. 86

ABSTRACT (34) Planning Research Corporation A. Identification: 1. Agency sponsoring contract: Advanced Logistics Research Division, Bureau of Supplies and Accounts, Department of the Navy. 2. Agency monitoring research: Same 3. Official title of contract: 4. In-house designation: Project P-18 5. Date of initiation: January 1960 6. Completion: March 1961 7. Principal investigator: Mr. Andrew J. Clark B. Description of model: 1. Physical system being modeled: A General Multi-Echelon Supply System, consisting of up to 400 stockage locations organized in almost any echelon structure providing that each location is served by only one of the other locations as a supply source. 2. Principal objective of model: System evaluation, system design. The model can serve as an on-line management decision technique; however, it is also structured for system analysis and evaluation. Its main use at present is as a research vehicle. 3. Type of model: Analytical, computer simulation, fast time. Type and size of computer: IBM 7090, 32,000 words core storage, 12 tape units, off-line input-output. Time required for one run: 30 minutes. The model is actually two models: An analytic Decision model which produces least-cost inventory policies for all stockage locations, and a Monte Carlo Simulation model which replicates the system operating with the optimal policies. 4. Input information: Echelon structure data defining the number and kind of inventory locations and how they are interrelated; cost data such as ordering, holding, shortage, repair, and disposal costs; time data such as activation-deactivation dates, lead-times, and repair cycles; failure data such as expected failures over time, kind of demand probability distributions, and wearout, local repair, and distant repair rates, control data defining the mode of operation of the model and the kind of output data desired. 
How inputs are introduced: All factors are used by the model as input except for demand probability distributions which are internally generated from input factors. 5. Variables in model: The stock position at each stockage location, split out into serviceables on-hand, reparables on-hand, and due-ins. (These are variables in the mathematical sense; there are also a large number of parameters which may be varied in accordance with input data.) 6. Outputs from model: The Decision model produces policies for each stockage location in terms of reorder points, stock control levels, disposal points, and stock retention levels, all as functions of time. The Simulation model produces a chronological list of events occurring in the simulation and tables of resulting costs, split out by kind of cost and stockage location. Also output is a list of all inputs and certain intermediate results of the computation as selected by assignment of input factors. 7. Criteria to evaluate system performance: The analytic Decision model employs least system cost as its criterion; performance of the system in the Simulation model is measured in terms of cost, shortages, and numbers of transactions of various types. 87

C. Validation and use of the model: 1. Experiments completed: Eight case runs have been completed, generally for the purpose of an integrated program checkout. All runs used artificial but realistic data. Mr. Andrew J. Clark and Mr. Howard Metcalfe were responsible for these experiments. 2. Tests to validate model: The last three case runs completed have served to validate the model in a restricted sense. 3. Future experiments and/or tests: A series of case runs are planned using real data obtained from the Aviation Supply Office, Philadelphia. Mr. Andrew J. Clark and Mr. Howard Metcalfe will be responsible. 4. Response of sponsor: The sponsor has been enthusiastic with this research project; there has been wide-spread interest in several branches of the Navy Department. 5. (a) Agency receiving results: The sponsoring agency has received preliminary results. (b) Use of model and significant results: No use has been made of the model yet since its main function at present is to facilitate research on multi-echelon inventory problems. D. Documentation: 1. Published or unpublished reports: PRC R-113, "Optimal Policies for a Multi-Echelon Inventory Problem," 10 July 1959; A. J. Clark, H. Scarf. PRC R-113, "Case Studies on the Multi-Echelon Inventory Problem," 1 December 1959; A. J. Gradwohl. 2. Logical flow charts available: Only a very broad scale flow chart is available showing the interrelationships between the various major portions of the model. The entire model was programmed directly in FORTRAN. 88
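The Simulation half of the model, replicating a stockage location operating under the Decision model's reorder-point policies while accumulating costs by kind, might be sketched for a single location as follows. The (s, S) policy form, uniform demand, immediate replenishment, and all parameter names are illustrative assumptions, not the PRC formulation.

```python
import random

def simulate_stockage(periods, s, S, max_demand, order_cost, hold_cost,
                      short_cost, seed=3):
    """Minimal single-location analogue of the Simulation model: operate
    a reorder-point policy (reorder to S when on-hand falls to s or
    below) against random demand, accumulating costs split out by kind
    as the full multi-echelon model does per stockage location."""
    rng = random.Random(seed)
    on_hand = S
    costs = {"order": 0.0, "holding": 0.0, "shortage": 0.0}
    for _ in range(periods):
        demand = rng.randint(0, max_demand)    # assumed uniform demand
        if demand > on_hand:
            costs["shortage"] += short_cost * (demand - on_hand)
            on_hand = 0
        else:
            on_hand -= demand
        costs["holding"] += hold_cost * on_hand
        if on_hand <= s:                       # reorder point reached
            costs["order"] += order_cost
            on_hand = S                        # immediate replenishment (assumed)
    return costs
```

The actual model extends this to up to 400 locations in an echelon structure, with lead-times, repair cycles, and internally generated demand distributions.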

ABSTRACT (35) Planning Research Corporation A. Identification: 1. Agency sponsoring contract: Air Force Ballistic Missile Division (AFBMD), Headquarters, ARDC 2. Agency monitoring research: AFBMD 3. Official title of contract: Contract AF 04(647)-372 6. Completion: Final Reports dated 5 December 1959 B. Description of model: 1. Physical system being modeled: The purpose of this study is to investigate mobility as an operational means of increasing the survivability of Ballistic Missile forces or systems. 2. Principal objective of model: The principal objective of the models constructed was to evaluate the vulnerability of the system when various types of movements were permitted. 3. Type of model: Analytical; however, computer simulation could easily be employed. 4. Input information: Number of mobile units, velocity and geometry of movements (e.g., linear versus area). Enemy intelligence time lag, reliability, accuracy, and number of enemy warheads. How inputs are introduced: The model consists of various mathematical equations relating the input parameters and variables. These equations are developed in the document cited below. 5. Variables in model: The basic parameters characterizing mobile forces and enemy attack are: The total surface area allocated for mobile force tactical movement. The total path length allocated for mobile force tactical movement. The average network spacing. The average instantaneous movement speed of a mobile unit undergoing tactical movement, provided every stop duration is short in comparison with the enemy mean information time lag. The number of mobile units in the mobile force undergoing tactical movement (operating mobile units). The lethal diameter of detonating enemy warheads. The total number of detonating enemy warheads in the attack against the friendly mobile units. The enemy mean information time lag. In addition, the following auxiliary parameters are used: The friendly force estimate of the enemy mean time lag. 89

The friendly force estimate of the total number of detonating enemy warheads in an attack against friendly mobile units. The friendly force estimate of the lethal diameter of the detonating enemy warheads. The mean movement interval of each mobile unit which employs the tactic. The Movement Cycle Ratio (the average number of movement cycles in the enemy mean time lag). Since the following parameters are functions of the movement space (path, network, or area), only their general definitions are given here. For specific definitions the reader is referred to the text. The Characteristic Measure of Movement (the measure of path length or area within reach of each mobile unit in the enemy mean time lag). The Allocation Parameter (the total space allocated for movement per Characteristic Measure of Movement). The Efficiency Index of the Characteristic Measure of Movement. The Coverage Parameter (the ratio of the measure of coverage the enemy can achieve to the measure of coverage required for certain kill). 6. Outputs from model: The outputs obtained from the model consist of various graphs relating quantities such as the probability of kill to coverage, the per cent degradation in mobile units, the survival probability to the ratio of the enemy time lag to the estimate of that time lag, etc. C. Validation and use of the model: 1. Experiments completed: No experiments are described in the report cited below. 5. Agency receiving results: The results were received by AFBMD. D. Documentation: 1. Published or unpublished reports: The results of this study are reported in document PRC R-130 dated 5 December 1959. E. General discussion: This program is similar in purpose to the work of V. Sturdevant at RAND. 90
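The Coverage Parameter above lends itself to a simple area-coverage calculation. The following is an illustrative modern sketch; the function name and numbers are assumptions for illustration, not the report's equations, which are developed in PRC R-130. The idea: a mobile unit is taken to be equally likely to be anywhere in the space it can reach within the enemy mean information time lag, so its kill probability is roughly the lethal area the enemy warheads can cover divided by that reachable space.

```python
import math

def p_kill_mobile(n_warheads, lethal_diameter, reachable_area):
    """Fraction of the reachable movement area covered lethally,
    capped at 1.0 (certain kill when coverage exceeds the area)."""
    lethal_area = n_warheads * math.pi * (lethal_diameter / 2.0) ** 2
    return min(1.0, lethal_area / reachable_area)

# 10 warheads of lethal diameter 2 against 1000 units of reachable area.
print(round(p_kill_mobile(10, 2.0, 1000.0), 4))  # 0.0314
```

Enlarging the allocated movement area drives the ratio down, which is the survivability argument the study examines.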

ABSTRACT (36) Radio Corporation of America A. Identification: 1. Agency sponsoring contract: U.S. Air Force 2. Agency monitoring research: U.S. Air Force, BMEWS System Project Office 3. Official title of contract: Ballistic Missile Early Warning System 4. In-house designation: BMEWS 5. Date of initiation: February 1961 6. Completion: May 1961 7. Principal investigator: A.D. Davies B. Description of model: 1. Physical system being modeled: BMEWS Data Gathering and Threat Evaluation Systems. 2. Principal objective of model: System Evaluation, System Design. Determine levels of performance of system elements which are adequate to meet the current threat. 3. Type of model: Analytical; computer simulation, fast time. Type and size of computer: IBM 709. Time required for one run of model: Less than one minute. 4. Input information: For each missile: Times of radar fan penetrations Azimuth Range Range Rate Detection Probabilities Also time to impact For each radar: Signal-to-noise level The blanking, gating or filtering situation with respect to azimuth, range and range rate. For threat evaluation: Event weights and thresholds Sampling rules How inputs are introduced: All are fixed values. This is an expected value model. Sampling rules call for the omission of missiles having certain properties (e.g., appearing between certain azimuths; beyond a given range, etc.) 5. Variables in model: Fan penetration schedule Probability of detection Size of raid Blanking and filtering situation Radar fan penetration parameters (range, range rate, azimuth, elevation rate) 6. Outputs from model: Times at which various warning thresholds were crossed. 7. Criteria to evaluate system performance: The amount of warning time obtained for the particular raid. 91

C. Validation and use of the model: 1. Experiments completed: Not yet 2. Tests to validate model: No. (No raids are planned to check the model. However, we will check some limiting cases by alternate calculations.) 3. Future experiments and/or tests: Yes. A. D. Davies will be responsible. 4. Response of sponsor: Good D. Documentation: 1. Published or unpublished reports: Not yet 92
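The event-weight and threshold inputs imply an accumulation scheme for threat evaluation. The sketch below is a speculative modern illustration (all names and numbers are assumed, not RCA's code): weighted detection events are summed in time order, and the model reports the time each warning threshold is first crossed, which is the output listed in item 6.

```python
def warning_times(events, thresholds):
    """events: (time, weight) pairs; thresholds: warning score levels.
    Returns {threshold: first time the cumulative score reaches it}."""
    crossed = {}
    score = 0.0
    for time, weight in sorted(events):
        score += weight
        for level in thresholds:
            if level not in crossed and score >= level:
                crossed[level] = time
    return crossed

# Three weighted fan-penetration events against two warning thresholds.
events = [(10.0, 1.0), (12.5, 2.0), (15.0, 1.5)]
print(warning_times(events, [2.0, 4.0]))  # {2.0: 12.5, 4.0: 15.0}
```

Since this is an expected-value model, the weights are fixed inputs rather than random draws; the warning time for a raid falls out deterministically.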

ABSTRACT (37) The RAND Corporation A. Identification: 1. Agency sponsoring contract: USAF 2. Agency monitoring research: USAF 3. Official title of contract: Project RAND 4. In-house designation: Project RAND 5. Date of initiation: July 1960 6. Completion: September 1961 7. Principal investigator: A. H. Pascal B. Description of model: 1. Physical system being modeled: Movement of routine military air cargo from U.S. to overseas locations 2. Principal objective of model: Evaluate impact of changes in policies 3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 7090. Time required for one run: 2 minutes. 4. Input information: Shipments: number per day, weight, volume, priority, destination. Loading instructions. How inputs are introduced: Number per day, weight, and volume of shipments for particular destination-priority combinations are simulated by use of empirical probability distributions. Loading instructions are written to simulate the load master's activity. 5. Variables in model: Aircraft types Maximum backlog times Daily aircraft schedule constraints 6. Outputs from model: Number of required trips per year between origin-destination combinations 7. Criteria to evaluate system performance: None within the model. C. Validation and use of the model: 1. Experiments completed: None as yet. 2. Tests to validate model: None as yet. 93

3. Future experiments and/or tests: Yes. A. H. Pascal, analyst, and D. Hopp, programmer, will be responsible. 4. Response of sponsor: Interest indicated. D. Documentation: 1. Published or unpublished reports: "The Effects of Cargo Characteristics on Routine Airlift Operations," D-8578, A. H. Pascal, The RAND Corporation. 2. Logical flow charts available: None in final form. E. General Discussion: Abstracts included from RAND illustrate only a sample of programs of the organization. 94
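The abstract's input method, shipment counts, weights, and volumes drawn from empirical probability distributions, amounts to resampling observed data. A minimal sketch, with hypothetical historical values:

```python
import random

def empirical_sampler(observations, rng):
    """Sample from the empirical distribution defined by observed values."""
    return lambda: rng.choice(observations)

rng = random.Random(1961)
observed_daily_counts = [3, 5, 5, 7, 4, 5, 6]   # assumed historical data
draw = empirical_sampler(observed_daily_counts, rng)
simulated_week = [draw() for _ in range(7)]
print(simulated_week)
```

Each destination-priority combination would carry its own observed history, so frequent values (here, 5 shipments per day) recur with their observed frequency.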

ABSTRACT (38) The RAND Corporation A. Identification: 1. Agency sponsoring contract: U.S.A.F. 2. Agency monitoring research: U.S.A.F. 3. Official title of contract: Project RAND 4. In-house designation: The particular sub-project is called: STRATEGIC PLANNER 5. Date of initiation: January 1960 6. Completion: Summer 1961 7. Principal investigator: Norman Dalkey B. Description of model: 1. Physical system being modeled: Strategic Forces - Bombers, Tankers, Fleet Ballistic Missiles, ICBMs 2. Principal objective of model: Design of an emergency war plan for total strategic forces. 3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 7090. Time required for one run: 2 hours. 4. Input information: Location and generation rates for strategic weapons Location and characteristics of enemy targets Location and characteristics of enemy defenses Operational characteristics of vehicles and bombs Summary of desired plan How inputs are introduced: Highly specific and detailed lists or tables except for the planner's summary, which is input as a simplified list. 5. Variables in model: Primary variables are the planner's specifications of desired strike - target-weapon match, degree of damage desired, time phasing. Otherwise any of the items mentioned in 4(a) can be varied to study their effect on the war plan and attendant damage. 6. Outputs from model: Detailed war plan - launch time, target, refueling point, etc., for each weapon carrier Detailed damage to enemy target system 7. Criteria to evaluate system performance: Degree to which output plan matches input request Total damage achieved by war plan C. Validation and use of model: 1. Experiments completed: Not in testing stage yet. 95

2. Tests to validate model: None 3. Future experiments and/or tests: Yes. Planning Staff, SAC Headquarters will be responsible for tests. 4. Response of sponsor: Enthusiastic D. Documentation: 1. Published or unpublished reports: Sketchy description in proceedings of Third Michigan Symposium on War Games "Speeding up the Planning Process" by N. Dalkey. More complete write-up in final preparation - available within one month. 2. Logical flow charts available: Flow charts not available at present, should be available in two or three months. 96

ABSTRACT (39) The RAND Corporation A. Identification: 1. Agency sponsoring contract: U. S. Air Force 2. Agency monitoring research: U. S. Air Force (Director of Development Planning) 3. Official title of contract: Project RAND 4. In-house designation: Project RAND. The two versions of the model are known as MUSTARD I and MUSTARD II. 5. Date of initiation: MUSTARD I - January 1959 MUSTARD II - October 1960 6. Completion: MUSTARD I - April 1959 MUSTARD II - January 1961 7. Principal investigator: Marvin Lavin B. Description of model: 1. Physical system being modeled: The vulnerability of urban areas to nuclear weapon prompt effects. 2. Principal objective of model: Estimation of weapon requirements. Efficient allocation of weapons among a set of area targets. 3. Type of model: Analytical, fast time. Type and size of computer: MUSTARD I - 704, MUSTARD II - 7090. Time required for one run: MUSTARD I - Approximately proportional to the number of urbanized areas attacked and to the weapon stockpile allocated (typical run = 30 minutes). MUSTARD II - On the order of 5 minutes. 4. Input information: MUSTARD I - The magnitudes of one or two properties (e.g. population and manufacturing value-added) of a set of area targets, given at intersections of a rectangular grid overlaying each target area. Weapon properties: CEP, radius of destruction, delivery probability, number (maximum) to be allocated. Magnitudes of program "control" parameters — i.e., maximum number of weapons allocated to specified urban areas, maximum level of destruction of specified urban areas, minimum incremental destruction of specified urban areas. MUSTARD II - Single-weapon binomial survival probabilities for U.S. urban areas (expressed as a function of the delivery CEP and the destruction radius and obtained by interpolation in a table of results from MUSTARD I). Grouping of urban areas within which separate weapon allocations are made. Maximum level of destruction of specified urban areas. 
How inputs are introduced: Expected values, deterministic magnitudes. 5. Variables in model: Primarily the weapon characteristics: CEP, destruction radius (i.e., yield), stockpile. 97

6. Outputs from model: An ordered listing of each delivered weapon - the ordering being in terms of the incremental destruction of the specified urban resource. Corollary information includes: the urban area to which each weapon is delivered, the incremental destruction, the cumulative destruction, the GZ within the urban area (MUSTARD I only), the cumulative destruction for the complete group of urban areas. 7. Criteria to evaluate system performance: Destruction of urban resources. C. Validation and use of the model: 1. Experiments completed: None 2. Tests to validate model: None 3. Future experiments and/or tests: 4. Response of sponsor: Some application has apparently been made of the outputs. 5. (a) Agency receiving results: USAF, Dept. of Defense, and defense contractors. (b) Use of model and significant results: Development Planning. D. Documentation: 1. Published or unpublished reports: MUSTARD I - RAND Research Memoranda 2331, 2434, 2504 (Secret) RAND Research Memorandum 2429 (Confidential) MUSTARD II - RAND Research Memorandum 2737 (Secret). To be distributed. 2. Logical flow charts available: None 98
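The single-weapon survival probabilities that MUSTARD II takes as input are described as functions of delivery CEP and destruction radius. The sketch below is a hedged modern illustration of that relation using the standard circular-normal model (CEP = median miss distance); the actual MUSTARD II tables come from MUSTARD I, and the values here are purely illustrative.

```python
def survival_probability(cep, lethal_radius, p_delivery=1.0):
    """P(target survives one weapon). With circular-normal aiming error,
    P(kill | delivered) = 1 - 0.5 ** (R / CEP) ** 2, since the CEP is
    the median miss distance."""
    p_kill = 1.0 - 0.5 ** ((lethal_radius / cep) ** 2)
    return 1.0 - p_delivery * p_kill

# When the destruction radius equals the CEP, half of delivered weapons kill.
print(survival_probability(cep=1.0, lethal_radius=1.0))  # 0.5
```

Multiplying by the delivery probability, as in the MUSTARD I input list, gives the binomial survival probability per weapon delivered against an urban area.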

ABSTRACT (40) The RAND Corporation A. Identification: 1. Agency sponsoring contract: U.S. Air Force 2. Agency monitoring research: U.S. Air Force 3. Official title of contract: Project RAND 4. In-house designation: Project RAND (This particular model is called SAFE) 5. Date of initiation: 1960 6. Completion: May 1960 7. Principal investigator: Mr. Olaf Helmer B. Description of model: 1. Physical system being modeled: SAFE, whose name stands for "Strategy And Force Evaluation," is a strategic air-war planning game, concerned with force composition, procurement strategy (including R&D planning), deployment, and operational strategy for a possible central strategic air war toward the end of the present decade. 2. Principal objective of model: System evaluation. The principal objective in playing SAFE is to stimulate ideas in the field of air strategy and of new or improved weapon systems and to obtain a preliminary, intuitive evaluation of such ideas. 3. Type of model: Simulation by human players of top-level air force planning. 4. Input information: Performance characteristics and cost estimates of existing and future weapon systems. How inputs are introduced: In terms of distributions, when required in view of uncertainties. 5. Variables in model: Variants on existing weapon systems, new weapon systems, locations of new fixed installations, deployment of mobile equipment, air war strategy. 6. Outputs from model: Outcomes of simulated central strategic air wars, reflecting - at least on an intuitive level - an assessment of the above inputs as chosen by the players. 7. Criteria to evaluate system performance: An informal assessment of the relative position of the two sides at the end of a simulated strategic air war. C. Validation and use of the model: 1. Experiments completed: Two trial runs at RAND. A tentative tryout in the Senior class at the Air Force Academy. Robert Bickner and Captain William Lackman were responsible for these experiments. 2. 
Tests to validate model: None. 3. Future experiments and/or tests: Yes. Robert Bickner and Olaf Helmer will be responsible for these tests. 4. Response of sponsor: None, except for a favorable reception by the Air Force Academy. 99

5. (a) Agency receiving results: So far only a preliminary version has been turned over to the Air Force Academy (see above). (b) Use of model and significant results: Cadet training. D. Documentation: 1. Published or unpublished reports: None 2. Logical flow charts available: In preparation 100

ABSTRACT (41) The RAND Corporation A. Identification: 1. Agency sponsoring contract: Air Force 2. Agency monitoring research: Air Force 3. Official title of contract: One of many projects being done under Project RAND 4. In-house designation: Alternative Central War Strategies 5. Date of initiation: Part-time effort for 6 months (October 1960) 6. Completion: None - results of study investigation would be fed into various studies or projects where applicable. 7. Principal investigator: One man at the present time, on a voluntary basis: Jay T. Wakeley B. Description of model: 1. Physical system being modeled: An attempt is being made to develop a more useful description of the environment of strategic force mixes to better identify and study interactions pertinent to alternative central war strategies. 2. Principal objective of model: Provide a framework for identifying critical uncertainties and an opportunity to exercise our ingenuity in designing better systems and mixes of systems. The problem is to structure the study in such a way that we can observe interactions and recognize the sources of variability. An attempt is being made to adapt elementary statistical design methods to the problem. 3. Type of model: Computer simulation, fast time. Type and size of computer: Not ready for computer; investigations limited to hand calculation to date. 4. Input information: Sets of force objectives, each reflecting different force mixes by type and number, specific offensive weapon systems and the manner in which they are based. (These represent gross technical, political and economic contingencies over time "in the large"), and operational and tactical alternatives during conflict ("time in the small"); "Jobs" to be done are also an important variable. How inputs are introduced: By specification to delimit differences of interest and to attain credibility. By distributions of rate-of-use of weapon systems. By expected value. 
Derived ideally from results of other study efforts (defense effectiveness, penetration capabilities, etc.) 5. Variables in model: Commensurate with selected force objectives and estimated budget levels for both the offense and the defense (two-sided play): Offensive systems; their numbers and types, basing posture and launch rate capability, and destructiveness in terms of radiation and blast (Yield, CEP and Rel.), aggregated. Degradation of capability as function of time (increasingly hostile environment). Longevity of command and control capability. Degree and type of civil protection (active and/or passive). 101

Targets. Warning. 6. Outputs from model: City targets hit Military targets hit Collateral damage to cities Ratio of force remaining Hopefully these measures of outcome might be combined in some meaningful way so results can be more simply presented. 7. Criteria to evaluate system performance: Cost-Effectiveness in getting a "job" done Insensitivity to different force objectives Complementarity of offensive systems Insensitivity of outcomes to operational and tactical contingencies during conflict. C. Validation and use of the model: 1. Experiments completed: None 2. Tests to validate model: None 3. Future experiments and/or tests: None 4. Response of sponsor: Too early to report D. Documentation: 1. Published or unpublished reports: None available at this stage; limited to "internal" papers at the moment. More work essential; no validation, etc. 2. Logical flow charts available: None 102

ABSTRACT (42) Stanford Research Institute A. Identification: 1. Agency sponsoring contract: United States Army Ordnance 2. Agency monitoring research: Army Research Office, U.S. Army Research and Development 3. Official title of contract: An Economic Analysis of Cost, Availability and Capability of Air Defense Systems 4. In-house designation: Project 2351 5. Date of initiation: 1 June 1960 6. Completion: 31 October 1960 7. Principal investigator: Arthur C. Christman, Jr. B. Description of model: 1. Physical system being modeled: NIKE - HAWK - REDEYE mixes in both clear and ECM environments. 2. Principal objective of model: System evaluation, capability analyses, i.e., an evaluation of the active air defenses of a field army against aircraft and missile attacks and penetrations. 3. Type of model: Computer simulation; fast time, Monte Carlo. Type and size of computer: IBM 704 with 32K memory. Time required for one run: Running time is to a great extent dependent on the number of batteries, the number of targets (aircraft and/or missiles) and the flight profiles of aircraft. At this time not enough runs have been made to allow an assessment of an average running time. 4. Input information: Battery Parameters Detection Ranges Tracking Ranges Number of Trackers Dead Zones Radar Jamming Data Tactical Control Delays Fire Control Delays Launcher Reloading Times Launcher Clearing Delay Off Launch Reliability Missile Parameters Time-Range Curves In-flight Reliability Probability of Target Kill Target Parameters Aircraft Flight Profiles Missile Trajectories Identification Data Probability of Battery Damage Probabilities of Damage or Kill against other targets, e.g., communications centers 103

Other Parameters Deployment Coordination Linkages, i.e., Level of Control Target Assignment Missile Supply Terrain Masking How inputs are introduced: Parameters such as detection ranges and fire control time delays are introduced as distributions. (They are, of course, dependent on the target, the environment, etc.) Time-range curves are introduced as piece-wise linear curves which vary with altitude and off-range. Probabilities are introduced as specified functions of other parameters. 5. Variables in model: The variables are the same as the inputs listed in (4) above. 6. Outputs from model: (1) A listing of all events in time-sequential order (2) Total enemy aircraft and/or missiles destroyed by each battery type (3) Batteries damaged by bombs and/or missiles (4) Misfires by battery type (5) In-flight aborts by battery type (6) Overkills by battery type (7) Communication centers and other passive targets destroyed by bombs and/or missiles 7. Criteria to evaluate system performance: Outputs (2), (3), and (7) may be considered as primary measures of effectiveness. By running many cycles of an identical situation (the only changes being random phenomena) means and confidence levels may be obtained for these measures. C. Validation and use of the model: 1. Experiments completed: None, but all test battle simulation cases have been completed. Eugene Levine, Systems Research Group, Inc., and Arthur Christman, Stanford Research Institute, were responsible for these cases. 3. Future experiments and/or tests: None, but various attacks and penetrations (aircraft and/or missiles) against various air defenses (NIKE, and/or HAWK, and/or REDEYE) in different environments will be simulated. Later, Mauler and FABMDS will be simulated. Eugene Levine and Arthur Christman will be responsible. 4. Response of sponsor: Favorable. 5. (b) Use of model and significant results: To be used in deployment evaluation and other areas such as fire doctrine and basic load. D. Documentation: 1. 
Published or unpublished reports: No, but Weapons Systems Laboratory Research Memorandum WSL 27 "Some Aspects of Field Army Air Defense Capabilities - 1963" by A. C. Christman, Jr., March 1960 describes the use of an earlier model. 2. Logical flow charts available: None. 104
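Item 7's evaluation method, many replications of an identical situation with only the random draws changing, yields means and confidence levels for each measure of effectiveness. The sketch below illustrates the statistics; the engagement model inside it is a stand-in (each of n_shots kills independently with probability p_kill), not the SRI simulation:

```python
import math
import random

def replicate(n_runs, n_shots, p_kill, seed=0):
    """Monte Carlo replications of one engagement; returns the sample
    mean of 'targets destroyed' and an approximate 95% confidence
    interval based on the sample standard deviation."""
    rng = random.Random(seed)
    results = [sum(rng.random() < p_kill for _ in range(n_shots))
               for _ in range(n_runs)]
    mean = sum(results) / n_runs
    var = sum((x - mean) ** 2 for x in results) / (n_runs - 1)
    half = 1.96 * math.sqrt(var / n_runs)
    return mean, (mean - half, mean + half)

mean, ci = replicate(n_runs=1000, n_shots=20, p_kill=0.6)
print(round(mean, 2), ci)  # mean should be near 20 * 0.6 = 12
```

The interval narrows as the square root of the number of replications, which is what makes many fast-time cycles of an identical situation worthwhile.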

ABSTRACT (43) Strategic Air Command, Hq. Operations Analysis A. Identification: 1. Agency sponsoring contract: War game prepared within the Operations Analysis Office (Plans and Tactics), Headquarters, SAC 2. Agency monitoring research: JSTPS War Game Group 4. In-house designation: The machine model title is "Event Sequenced Program" -- ESP 5. Date of initiation: 7 January 1961 6. Completion: 20 March 1961 7. Principal investigator: William B. Kennedy B. Description of model: 1. Physical system being modeled: SIOP via the ESP war game 2. Principal objective of model: System evaluation 3. Type of model: Computer simulation, fast time. Type and size of computer: IBM-704 or IBM 7090 with 32K core and 8 magnetic tapes. Time required for one run: Time required for full-scale game (two-sided): 20 minutes per replication (play/game) using the IBM-704; estimated 10 minutes on IBM 7090. 4. Input information: Core "status files" (delivery vehicles, DGZ's, bases, tanker bases); these files are updated as the game progresses. Total DGZ's and defense sites per area considered. Number of replications required (1-99). Table of natural logarithms. Total "targeted" DGZ's per area. Attrition function parameters. All Events File (time and event ordered): type and time of occurrence; location codes (base, target, etc.); expected attrition factors; dud, gross error and identification factors; abort factors; fuel flight life. How inputs are introduced: Expected values for attrition are entered with each record of the All Events File. One record (tape) describes one event; these are read and studied sequentially. These expected values for combat attrition are modified, during the game run, according to preceding mission successes (or failures) as compared with the plan expectation. 5. Variables in model: The attrition function has several variables used to modify the expected attrition (from the plan) in accordance with game progress. Relative "starting times" of the two combatants. Force deployment 105

During the game run status files are built which provide information on bombing successes, by area, to "now" (current event being analyzed), bomber flight life, and refueling-area status. 6. Outputs from model: All updated status files (see 4 and 5 above) Weapons delivered by target type and target priority. Magnetic tapes, besides JEOPT (core dump); reprint of All Events File event record plus supplemental information: weapon failure and delivery vehicle removal; weapon success; core dump at prescribed time intervals, GCT elapsed time 7. Criteria to evaluate system performance: As is characteristic of war games in general, such analysis requires post-game study by knowledgeable, astute personnel. Criteria must be firmly established, in objective terms, before adequate machine comparisons can be made of the results of one game with those of another; these have not yet been defined. C. Validation and use of the model: 1. Experiments completed: 5 small-scale test runs using "created" test data. 2 full-scale test runs using actual SIOP information. F. O'Meara, P. O'Neil, A. Prochnow, and W. Kennedy were responsible for these tests. 2. Tests to validate the model: The tests listed above have been performed to validate the model. 3. Future experiments and/or tests: Input tests are being planned. 5. Agency receiving results: JSTPS E. General Discussion: Abstracts included from SAC illustrate only a sample of programs of the organization. 106

ABSTRACT (44) Strategic Air Command, Hq. Operations Analysis A. Identification: 1. Agency sponsoring contract: Air Force, SAC (This is a within-Air Force study) 2. Agency monitoring research: Air Force, SAC 3. Official title of contract: Project Rothamsted 4. In-house designation: Project Rothamsted 5. Date of initiation: June 1960 7. Principal investigator: Gerald D. Berndt B. Description of model: 1. Physical system being modeled: The model is that of the associations and interactions of Minuteman missiles under attack by RED missiles. 2. Principal objective of model: The objective is to evaluate the effects of various BLUE tactics under the concomitant variation of the factors: BLUE tactics, RED tactics, silo and LCC hardness, RED weapon yield, weapon CEP, weapon ratio. 3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 704 and IBM 7090. Time required for one run: 20 hours. 4. Input information: Classes of RED and BLUE tactics, and numerical values for the other factors listed in 2. How inputs are introduced: Missile launch and impact time parameters; specified DGZs for RED, and AGZs which are random bivariate normal vectors; reliability factors for RED which are random rectangular variables. Account is taken of the effect of each RED missile. 5. Variables in model: The six factors listed in 2, plus the response: the number of surviving BLUE missiles. 6. Outputs from the model: The response, the number of surviving BLUE missiles. 7. Criteria to evaluate system performance: The evaluation is of tactics and the criterion is the response already mentioned. 107
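The input mechanism in item 4, actual ground zeros (AGZs) as random bivariate normal deviations about the designated ground zeros (DGZs) and reliability as random rectangular, i.e., uniform, variables, can be sketched as below. All names and values are assumptions, and the CEP-to-sigma conversion uses the standard circular-normal relation CEP ≈ 1.1774 sigma:

```python
import random

def sample_agz(dgz, cep, reliability, rng):
    """One RED shot: uniform reliability draw, then a bivariate normal
    actual ground zero about the designated ground zero."""
    if rng.random() > reliability:      # rectangular (uniform) reliability
        return None                     # missile failed
    sigma = cep / 1.1774                # circular-normal dispersion
    return (dgz[0] + rng.gauss(0.0, sigma),
            dgz[1] + rng.gauss(0.0, sigma))

rng = random.Random(42)
shots = [sample_agz((0.0, 0.0), cep=0.5, reliability=0.8, rng=rng)
         for _ in range(1000)]
print(sum(s is None for s in shots))  # about 200 of 1000 shots should fail
```

Each surviving BLUE silo would then be scored against the sampled AGZs, yielding the response variable (number of surviving BLUE missiles) per replication.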

ABSTRACT (45) Strategic Air Command, Hq. Operations Analysis A. Identification: 1. Agency sponsoring contract: Operations Analysis, Hq, SAC 2. Agency monitoring research: Operations Analysis, Hq, SAC 3. Official title of contract: A Comparison of GAM-77 and GAM-87 4. In-house designation: Same as above. 5. Date of initiation: Summer 1959 6. Completion: Fall 1959 7. Principal investigator: H. W. Jarrett, Jr. B. Description of model: 1. Physical system being modeled: GAM-77, GAM-87 2. Principal objective of model: System evaluation 3. Type of model: Analytical 4. Input information: Weapons scheduled (missiles and bombs) Total targets Bomber and missile abort rate Bomber and missile survival Weapon yield, CEP Target description Cost How inputs are introduced: Expected values. 5. Variables in model: The inputs and variables are the same. 6. Outputs from model: Number of targets receiving missile hits Number of targets receiving bomb hits Missile coverage of targets Bomb coverage of targets 7. Criteria to evaluate system performance: Targets destroyed by total force Targets destroyed per unit of cost Targets destroyed with fixed budget Cost to achieve equal target destruction C. Validation and use of the model: 1. Experiments completed: One run completed in the fall of 1959. H. W. Jarrett, Jr. was responsible for this experiment. 108

4. Response of sponsor: Favorable 5. (a) Agency receiving results: D/Plans Hq SAC (b) Use of model and significant results: Input data on questions regarding the effectiveness of GAM-77/GAM-87. 109
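The expected-value criteria in item 7 (targets destroyed by the total force, and targets destroyed per unit of cost) reduce to simple products of the listed inputs. A hypothetical sketch, with all numbers assumed rather than taken from the study:

```python
def expected_destroyed(n_weapons, p_abort, p_survive, p_kill):
    """Expected targets destroyed under an expected-value model."""
    return n_weapons * (1 - p_abort) * p_survive * p_kill

def destroyed_per_unit_cost(n_weapons, unit_cost, p_abort, p_survive, p_kill):
    """Targets destroyed per unit of total force cost."""
    total_cost = n_weapons * unit_cost
    return expected_destroyed(n_weapons, p_abort, p_survive, p_kill) / total_cost

# 100 missiles, 10% abort rate, 80% survival, 70% kill probability.
print(expected_destroyed(100, 0.1, 0.8, 0.7))  # 50.4
```

Running the same products for each weapon type at a fixed budget gives the comparisons listed: equal-cost destruction, or cost to achieve equal destruction.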

ABSTRACT (46) Strategic Air Command, Hq. Operations Analysis A. Identification: 1. Agency sponsoring contract: SAC in-house study. 2. Agency monitoring research: Operations Analysis Plans and Tactics Division 4. In-house designation: 59A War Game 5. Date of initiation: January 1959 7. Principal investigator: Robert D. Castater B. Description of model: 1. Physical system being modeled: War game to evaluate EWO plan. 2. Principal objective of model: System evaluation. 3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 704 with 32K core and 8 tape units. Time required for one run: 25 hours. Built-in interrupt features (manual and automatic). 4. Input information: Defense force posture - numbers, locations, capabilities by types -- fighters, controls, missiles, AAA Offense plan - SAC strike plan - penetrator profiles, ECM load, target, times, weapon-use and types Game parameters - blast tables, range tables, kill tables, ECM value tables, velocity tables, constants for fighter search How inputs are introduced: Kill values are expected values. Blast tables are distributions associated with weapon type. 5. Variables in model: Tactics or penetrator profiles ECM Bomb damage to defense Fighter control Radius of coverage for defenses etc. 6. Outputs from model: Time pulse by time pulse (5 minutes), location and status of each penetrator with his ECM environment, any targets hit this time pulse, accumulated kill potential, dead, alive, mission accomplished. Time pulse (or hourly) status of defense effectiveness remaining for those less than 100 per cent. Box score by time pulse (5 minutes) of penetrators by type, killed, by what type defense killed, high altitude or low altitude, number of bombs on targets, number of missions accomplished, number still active. Totals 110

7. Criteria to evaluate system performance: Complete simulation of penetration and attrition associated with a one-sided war. C. Validation and use of the model: 1. Experiments completed: Test runs were made prior to production runs. All possible profiles and defense set-ups were tested. Robert D. Castater, Stanley J. Brocky and George Schmidt were responsible for these tests. 2. Tests to validate model: 4 production runs and 2 test runs. 4. Response of sponsor: Favorable. Need for a two-sided game demonstrated. 5. (a) Agency receiving results: SAC, Director of Operations (b) Use of model and significant results: Planning purposes. 111

ABSTRACT (47) System Development Corporation A. Identification: 1. Agency sponsoring contract: United States Air Force 2. Agency monitoring research: United States Air Force 3. Official title of contract: AF 33(600)-37684 4. In-house designation: ADC Contract (WARM) 5. Date of initiation: December 1959 6. Completion: July 1961 7. Principal investigator: Mr. Robert A. Totschek B. Description of model: 1. Physical system being modeled: The Semi-Automatic Ground Environment (SAGE) system is being modeled within one of the six United States Air Defense Divisions; its associated environment and several enemy missile and manned bomber threats are also modeled. 2. Principal objective of model: The principal objective of the model is to compare and evaluate the effects of several different weapons assignment logics within SAGE. 3. Type of model: The model may be described as a fast-time digital computer Monte Carlo model. WARM contains approximately 10,000 instructions and uses 12,000 registers for tables and items. The system input and output routines are an integral part of the program. As many as six tape drives may be used. Two drums of 2,048 words are used for storage. The program is coded so that it can be run on the AN/FSQ-7 computer with 65,000 words of core storage. The time for one run on the computer varies with the size of the threat, the SAGE environment, and the assignment model being tested. 4. Input information: In order to fully utilize the capability of the WARM system the following input parameters are necessary: Geographic Parameters These describe the location of sector boundaries, radar sites, FIS bases, SAGE Direction Centers, SAC bomber bases, Missile Master/BIRDIE complexes, Missile bases and other target points and weapons within a SAGE division. Threat Script The threat script describes the armament, speed, altitude, flight path, time of appearance and ground objectives of each hostile aircraft. 
SAGE Armament These parameters include the numbers, type, armament and location of manned and unmanned interceptors in the SAGE environment as well as the number and type of weapons at Missile Master/BIRDIE complexes. Communications The communication links between SAGE Direction Centers and their associated radar sites and FIS bases are defined.

Probabilities These include detection probabilities for radar equipment based upon range, target size, target speed, target altitude and the presence or absence of jamming; probabilities of detection, conversion and kill (Pdck) for Nike and manned or unmanned interceptors based upon range, altitude, speed and type of weapon. 5. Variables in model: All input parameters are completely variable to allow for a variety of threats, defense postures, and modes of operation. 6. Outputs from model: Standard output consists of a symbolic tape containing a statistical evaluation of the case just concluded. (Each simulated air battle is termed a "run". A statistically significant number of runs is termed a "case".) For each run the following ratios are computed at five-minute intervals (subjective time): Bomber Attrition Assignment Model Efficiency Ground Damage Assessment Rate of Ground Damage Percentage of Available Weapons Remaining SAGE Degradation Weighted Comprehensive Measure At the termination of a case the mean and standard deviation are computed for each ratio and time interval. A frequency distribution of each ratio per time interval is written on the output tape. In addition to this standard output, detailed dynamic recording of one or more particular runs may be requested. This permits close examination of those events considered most important to the air battle. In the present model of WARM the dynamic run recording includes: Assignment Recording Frame Number Squadron Designation and Airbase Number and Type of Interceptor Scrambled Predicted Intercept Time Predicted Pdck Target Designation and Flight Size SAGE Degradation Frame Number Number and Location of Radar Sites Destroyed Number and Location of Control Centers Destroyed Number and Location of Airbases Destroyed Name, Number and Location of other Ground Objectives Destroyed Battle Status Frame Number Number of Tracks Destroyed Number of Tracks in System Number of Weapons Available for Assignment 7.
Criteria to evaluate system performance: Number of bombers destroyed, amount of ground damage caused by bombers, number of interceptors destroyed, total environment resulting from missile and bomber attack. D. Documentation: 1. Published or unpublished reports: A user's manual will be published at a later date. 2. Logical flow charts available: Flow charts will be available at a later date.
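The case-level reduction WARM performs (per-run ratios at five-minute intervals, reduced at case termination to a mean and standard deviation per ratio per time interval) can be sketched as follows. This is an illustrative modern Python sketch; the ratio values are hypothetical.

```python
import statistics

# runs[r][t] = value of one ratio (e.g., bomber attrition) at five-minute
# interval t of run r; a "case" is a set of such runs. Numbers hypothetical.
runs = [
    [0.10, 0.25, 0.40],
    [0.20, 0.35, 0.50],
    [0.15, 0.30, 0.45],
]

def case_statistics(runs):
    """At case termination: mean and (population) standard deviation of the
    ratio, computed separately for each time interval across all runs."""
    return [(statistics.mean(v), statistics.pstdev(v)) for v in zip(*runs)]

stats = case_statistics(runs)
```

The same regrouping-by-interval would feed the frequency distributions that WARM writes to its output tape.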

ABSTRACT (48) System Development Corporation A. Identification: 3. Official title of contract: AF 33(600)-37684 4. In-house designation: ADC Contract (GUISE) 5. Date of initiation: March 1960 6. Completion: July 1961 7. Principal investigator: Mr. William R. Hutcherson, Jr. B. Description of model: 1. Physical system being modeled: The system being modeled is the SAGE weapons guidance function. 2. Principal objective of model: The principal objective of the model is to provide a means of evaluating the computing time requirements for the different guidance logics used by SAGE, and also to provide a simple and flexible vehicle for testing new developmental ideas in guidance. 3. Type of model: The model may be described as a fast-time computer model. The guidance programs contain 3,400 instructions and use 2,200 locations for tables and temporary storage. The track generation, input, output, control and evaluation programs contain 4,850 instructions and use the same 2,200 locations for tables and storage. Thus, the entire program requires approximately 10,450 words of core storage. No tapes or drums are used except for one tape drive for output. The program is coded for the AN/FSQ-7. 25 simultaneous interceptions lasting for 100 frames can be run. 4. Input information: Target flight path parameters, interceptor flight characteristics, type of tactic, and guidance equation parameters. These inputs are constants. 5. Variables in model: Guidance parameters Equation threshold parameters Target maneuver parameters Frame time parameters Interceptor parameters Position parameter Vector heading parameter Bank angle parameter Target parameters Position parameter Vector heading parameter Bank angle parameter Evaluation parameters Output parameters Operation parameters Control parameters Tactic parameters

6. Output from model: Standard output consists of a printout of the summary information for each interception. This information contains the x and y dispersion between target and interceptor at predicted time of intercept, the time and distance error from actual intercept, track crossing angle, number of new headings calculated, number of different turn combinations used, total number of frames, total computing time used, and whether this intercept was considered to be a success. In addition, a printout of each frame's flight information may be obtained. This information contains positions, headings, turn combination, time until intercept, phase indicator, and computing time. 7. Criteria to evaluate system performance: The primary criterion employed is the determination of a successful intercept conditioned by a feasible output of the parameters listed in B.6. C. Validation and use of the model: 1. Experiments completed: About 20 runs, each containing 25 interceptions, have been made comparing the computing time required by different guidance logics. Mr. William R. Hutcherson was responsible for these experiments. 2. Tests to validate model: Some of the tests have been checked by hand calculations and also compared to comparable SAGE DCA program results. 3. Future experiments and/or tests: More tests are planned to show the effects of different target maneuvers and speed ratios upon the computing time requirements of SAGE guidance logics. Mr. William R. Hutcherson will be responsible for these tests. 4. Response of sponsor: The sponsor has shown interest in this effort and hopes that the model will be used to develop improvements to guidance. 5. Use of model and significant results: The model is presently being used to compare the computing time requirements for double and single turn guidance logics. D. Documentation: 1. Published or unpublished reports: TM-624, "An Operational and Program Description of the Guidance System Evaluation Model (GUISE)," by W. Hutcherson, J.
Denning and M. Olin. 2. Logical flow charts available: Logical flow charts are available in the document cited above.
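Item B.7 of Abstract (48) judges an interception successful from the quantities output in B.6. A minimal sketch of such a test, using the x and y dispersion at predicted intercept time and the time error from actual intercept, follows; the thresholds and names are hypothetical, since GUISE's actual criteria are not given here.

```python
import math

# Hypothetical success thresholds; GUISE's actual criteria are not stated.
MAX_DISPERSION = 2.0   # n. mi.
MAX_TIME_ERROR = 10.0  # seconds

def intercept_success(tgt_xy, int_xy, predicted_t, actual_t):
    """Judge an interception from the target-interceptor dispersion at the
    predicted intercept time and the predicted-vs-actual time error."""
    dx = tgt_xy[0] - int_xy[0]
    dy = tgt_xy[1] - int_xy[1]
    dispersion = math.hypot(dx, dy)
    time_error = abs(actual_t - predicted_t)
    return dispersion <= MAX_DISPERSION and time_error <= MAX_TIME_ERROR

ok = intercept_success((10.0, 5.0), (9.0, 4.0), 300.0, 304.0)
miss = intercept_success((10.0, 5.0), (4.0, 4.0), 300.0, 304.0)
```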

ABSTRACT (49) System Development Corporation A. Identification: 1. Agency sponsoring contract: Air Defense Command 2. Agency monitoring research: Air Defense Command 3. Official title of contract: AF 33(600)-37684 4. In-house designation: ADC Contract 5. Date of initiation: September 1960 6. Completion: June 1961 7. Principal investigator: B. Description of model: 1. Physical system being modeled: Air Surveillance Tracking Subsystem of the SAGE Air Defense System 2. Principal objective of model: System evaluation 3. Type of model: Computer simulation, fast time. Type and size of computer: AN/FSQ-7 computer with 65K memory. Time required for one run: 20 minutes plus 20 minutes for data processing. 4. Input information: The program is adapted to the SAGE Sector environment -- radar types and site locations are selected. Aircraft flight paths are designed appropriate to the evaluation -- flight paths will incorporate various maneuvers representing situations encountered by operating SAGE sites. Single, double, or triple radar coverage may be used. Several aircraft types may be represented, varying in speed, altitude, maneuver capabilities, reflectivity, etc. A "clutter density" (returns per unit area) is selected for generation of simulated noise-clutter returns. How inputs are introduced: For a given aircraft-radar configuration, simulated returns are generated based on a normal distribution of error and a probability of return. Clutter returns are locally random, uniformly distributed over a chosen area according to the input "clutter density." 5. Variables in model: Variables in the model are the various track-while-scan logics chosen for evaluation. 6. Outputs from model: Outputs are mean, variance, and percentile values of the deviations in position and velocity between the designed aircraft flight paths and the "as tracked" flight paths. The data processor allows for a maximum of 100 flights and about 45 minutes real time. 7.
Criteria to evaluate system performance: There are no evaluation criteria within the model. Evaluation is based on the output statistics. C. Validation and use of the model: 2. Tests to validate model: Standard program check-out has been performed to verify the logics involved.

3. Future experiments and/or tests: Evaluation of four logics will be performed using extensive flight path and clutter environment simulations. The Requirements Analysis Group of SDC will be responsible for these experiments. 4. Response of sponsor: Favorable D. Documentation: 1. Published or unpublished reports: A report will be published as testing proceeds. 2. Logical flow charts available: SDC Document FN-5476 (to be published)
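The return-generation scheme described in Abstract (49) (simulated returns with normally distributed error and a probability of return, plus uniformly distributed clutter at a chosen density) can be sketched as follows. This is an illustrative modern Python sketch; all parameter values and function names are hypothetical.

```python
import random

def simulated_returns(true_positions, p_return, sigma, rng):
    """Each true aircraft position yields a return with probability p_return,
    displaced by a normally distributed position error of spread sigma."""
    returns = []
    for x, y in true_positions:
        if rng.random() < p_return:
            returns.append((x + rng.gauss(0, sigma), y + rng.gauss(0, sigma)))
    return returns

def clutter_returns(density, area_bounds, rng):
    """Uniformly distributed clutter; count = density * area of the region."""
    x0, y0, x1, y1 = area_bounds
    n = round(density * (x1 - x0) * (y1 - y0))
    return [(rng.uniform(x0, x1), rng.uniform(y0, y1)) for _ in range(n)]

rng = random.Random(7)
hits = simulated_returns([(10.0, 20.0), (30.0, 40.0)], 0.9, 0.5, rng)
clutter = clutter_returns(0.01, (0.0, 0.0, 100.0, 100.0), rng)
```

A track-while-scan logic under evaluation would then be fed the union of `hits` and `clutter`, scan by scan, and its output compared against the designed flight paths.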

ABSTRACT (50) System Development Corporation A. Identification: 1. Agency sponsoring contract: Air Defense Command (ADC) 2. Agency monitoring research: Air Defense Command (ADC) 3. Official title of contract: AF 33(600)-37684 4. In-house designation: System Training Program (STP) 5. Date of initiation: August 1954 6. Completion: In progress B. Description of model: 1. Physical system being modeled: The entire aircraft control and warning system in the United States, Canada and Alaska, as well as portions of this system in Europe (USAFE) and the Pacific (PACAF). 2. Principal objective of model: The training of aircraft control and warning crews and their battle and support staffs. 3. Type of model: Computer simulation, real time. Type and size of computer: Training materials are produced using the IBM 7090 computer; training missions are conducted using the military AN/FSQ-7 computer. Time required for one run: Computer time required varies with the size of training problems. 4. Input information: The 15 groups of variables listed in Item 5 below, which represent several hundred variables, are required for the representation of all inputs received at air defense direction and control centers, as well as for the mechanics of training materials production. How inputs are introduced: The program system used for the production of training materials incorporates many "models," e.g., aircraft models, flight plan models, radar models, etc. Characteristics of inputs are usually in terms of interval scales of real values; in some cases, expected values are used, as in flight vs. flight plan discrepancies, and in the case of radar, certain probabilities of detection are assumed. 5.
Variables in model: Basic Data Location Identifiers Aircraft Characteristics Flight Plan Parameters Radar Parameters Boundary Definitions Map Parameters Jammer Parameters Problem Data Problem Definition Aircraft Assignment Path Formation Flight Plans Flight Selection Flight Deletion Noise Parameters Crosstell Parameters

6. Outputs from model: A set of training aids, consisting of magnetic tapes and 9-mil cards for input to the AN/FSQ-7, 70 mm films for input to the AN/GSP-T2, and a large variety of maps and lists. 7. Criteria to evaluate system performance: The purpose of the STP training materials, and of the training missions conducted with them, is the improvement of air defense, i.e., detection, tracking, and identification of airborne objects, interception of unknown objects, and destruction of hostile aircraft and missiles. C. Validation and use of the model: 1. Experiments completed: Several experiments, i.e., repetitions of training missions with control over crew participation, have given evidence of improved air defense performance as a result of STP. 2. Tests to validate model: The simulated air pictures and related communications presented by STP are being "validated," i.e., compared with live air pictures on an ongoing basis. 4. Response of sponsor: Favorable. 5. (a) Agency receiving results: Air Defense Command (U.S.); Air Defense Command (RCAF); North American Air Defense Command; Alaskan Air Command; Pacific Air Force; U.S. Air Force, Europe. (b) Use of model and significant results: Training of aircraft control and warning crews and their battle and support staffs. D. Documentation: 1. Published or unpublished reports: Contact the Editorial Liaison Office of SDC's Technical Communications Department. 2. Logical flow charts available: Same as above.

ABSTRACT (51) Systems Research Group A. Identification: 1. Agency sponsoring contract: Office of Naval Research, Wash., D. C. 2. Agency monitoring research: Office of Naval Research 3. Official title of contract: Contract No. 2936 (00) 4. In-house designation: MILITRAN I 5. Date of initiation: July 1959 6. Completion: June 1960 7. Principal investigator: Dr. Harold N. Shapiro B. Description of model: 1. Physical system being modeled: MILITRAN I is a compiler for generating computer simulations of a variety of weapon systems. 2. Principal objective of model: The principal objective is to achieve a compiler system which will generate specific models of weapon system simulations. 3. Type of model: Computer simulation, fast time. Type and size of computer: 709/7090, 32K memory with drums and tape units. Time required for one run: The time is heavily influenced by the model being compiled (see B.1); the information above relates to a fixed model which has been compiled. 4. Input information: The information necessary is solely a description of the model to be converted to a computer simulation. Here, one does not need specific parameter values, but only their description as well as that of the dynamics of the model. How inputs are introduced: Object types in terms of categories that are recognized by the compiler; linkages that specify coordination and other similar types of interaction between objects in the model; motion categories recognized by the compiler; etc. 5. Variables in model: The variables that arise are those parameters which are pertinent to weapon system simulations, i.e., kill probabilities, time-range curves, detection ranges, tracking ranges, etc. These variables are treated by the compiler as either density functions or stochastic parameters, etc. 6. Outputs from model: The output of the compiler is an object program which is now ready to simulate the given model.
In addition to the object program, the compiler produces operating instructions, clarifications of the model, and other information necessary to properly utilize the object program. 7. Criteria to evaluate system performance: Such criteria vary with each model that is compiled.

C. Validation and use of the model: 1. Experiments completed: Various illustrative models have been compiled, including surface-to-air simulations, amphibious operation simulations, submarine warfare simulations, etc. Systems Research Group, Inc., 244 Mineola Blvd., Mineola, L.I., N.Y. was responsible. 2. Tests to validate model: The compilations referred to above exhibit the validity of the compiler. 3. Future experiments and/or tests: See C.1. 4. Response of sponsor: Based upon the results of the completed effort, the sponsor has requested further investigations in the area of simulation compilers. 5. (a) Agency receiving results: Office of Naval Research (b) Use of model and significant results: Various simulations for specific models have been generated by the compiler. D. Documentation: 1. Published or unpublished reports: The following SRG reports document MILITRAN I: Beginnings of MILITRAN I (SRG report H-160960) Preliminary MILITRAN Pre-Coding Forms (SRG report H-181060) Illustrative Examples of the Applicability of MILITRAN I (SRG report H-210161) Technical Aspects of MILITRAN I (SRG report H-230361) 2. Logical flow charts available: Although flow charts are not available, the logical flow of the compiler system is discussed in SRG report H-230361. E. General Discussion: A second version of MILITRAN is in preparation. In conducting the survey, four organizations were contacted at which it was asserted that general programming tools were being developed: Douglas Aircraft Company, Hughes Aircraft Company, Systems Research Group, and Technical Operations, Inc. Of these programs, the only reports obtained were from Technical Operations, Inc., on the CL-I and CL-II programming systems.
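MILITRAN I's central idea, a declarative model description turned into a running simulation, can be caricatured in a few lines. The sketch below interprets a specification rather than compiling one, and none of its names, numbers, or categories reflect actual MILITRAN syntax; it is only meant to show objects, linkages, and motion driving a generic engine.

```python
# Toy stand-in for the compiler idea: the model is described declaratively
# (object types, motion, interactions) and a generic engine runs it.
# All names and numbers are illustrative, not MILITRAN constructs.
MODEL_SPEC = {
    "objects": [
        {"type": "ship", "pos": 0.0, "speed": 0.0},
        {"type": "raider", "pos": 100.0, "speed": -10.0},
    ],
    "interactions": [
        # (attacker type, target type, engagement range)
        ("ship", "raider", 30.0),
    ],
}

def run_model(spec, steps):
    """Advance motion each step; report the first step an interaction fires."""
    objs = [dict(o) for o in spec["objects"]]
    for step in range(1, steps + 1):
        for o in objs:
            o["pos"] += o["speed"]
        for atk, tgt, eng_range in spec["interactions"]:
            a = next(o for o in objs if o["type"] == atk)
            t = next(o for o in objs if o["type"] == tgt)
            if abs(a["pos"] - t["pos"]) <= eng_range:
                return step
    return None

first_engagement = run_model(MODEL_SPEC, 20)
```

The point of the real compiler was that only the specification changes from model to model, while the generated object program supplies the mechanics.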

ABSTRACT (52) Systems Research Group A. Identification: 1. Agency sponsoring contract: Office of Naval Research 2. Agency monitoring research: Office of Naval Research 3. Official title of contract: Fleet Marine Force Air Defense System Study 4. In-house designation: Marine Air Defense Simulation Model 5. Date of initiation: May 1958 6. Completion: April 1959 7. Principal investigator: Dr. Harold N. Shapiro B. Description of model: 1. Physical system being modeled: Hawk-Superhawk weapon mixes and Hawk-Terrier weapon mixes in both clear and ECM environments. 2. Principal objective of model: System evaluation 3. Type of model: Computer simulation, fast time. Type and size of computer: IBM 704, 32K memory with drums and ten tape units. Time required for one run: Mean time of approximately 3 minutes. 4. Input information: Detection Ranges Tracking Ranges Acquisition Delays Fire Control Delays Time-Range Curves Coordination Linkages Probabilities of Kill/Damage Motion Data for Incoming Raids Position of Defense Weapons and Targets Environment Conditions Misfire Probabilities Launcher Reloading Time Launcher Clearing Delay Number of Trackers per Battery Tracker Dead Zones How inputs are introduced: Such parameters as tracking range are introduced as cumulative distributions. Probabilities are introduced as constants but are utilized stochastically. Time-range curves are introduced as piecewise planar curves which vary with altitude and off-range. 5. Variables in model: The input parameters listed above which are used stochastically are variables in the model. Similarly, various outputs (see below) may be considered as variable. Other variables which arise internally include: missile flight times; target assignments; remaining weapon supplies; etc.

6. Outputs from model: A listing of all events in a time-sequential order. Total enemy aircraft destroyed. Total number of aircraft killed/damaged. Number of enemy aircraft attacking primary targets which are killed before bomb-drop. Number of enemy aircraft attacking missile batteries killed before bomb-drop. Total number of bomb-drops, etc. 7. Criteria to evaluate system performance: Many of the outputs listed above may be looked upon as measures of effectiveness. By running many cycles of identical situations (the only changes being random phenomena) one may obtain means and confidence levels for these measures. The latter may be used for system evaluation and determination of optimal weapon mixes. C. Validation and use of the model: 1. Experiments completed: Using the computer simulation that evolved from the model, over 300 machine runs were completed. The result of these experiments is documented in Naval Analysis Report No. 18, July 1959, Secret, Chapters 12 and 13. Systems Research Group, Inc., 244 Mineola Blvd., Mineola, L.I., N.Y. was responsible. 2. Tests to validate model: Parameter values were taken from manufacturers' estimates. 3. Future experiments and/or tests: See (1) and (2) above. 4. Response of sponsor: The sponsor of the work is the Office of Naval Research, which used the resulting data from the machine runs, together with other independent assessments of the weapon systems under consideration, to assess their pertinence to the needs of the Navy. 5. (a) Agency receiving results: Office of Naval Research (b) Use of model and significant results: See (4) D. Documentation: 1. Published or unpublished reports: A full documentation may be found in: Naval Analysis Report (SECRET) No. 18, July 1959, Chapters 12 and 13. 2. Logical flow charts available: A complete set of logical flow charts is available in the report referenced above.
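Two mechanics described in Abstract (52), introducing a parameter such as tracking range as a cumulative distribution and running many cycles of identical situations to obtain means and confidence levels, can be sketched as follows. This is a modern Python sketch; the tabulated distribution and the normal-approximation interval are illustrative assumptions, not values from the study.

```python
import random
import statistics

# Hypothetical cumulative distribution of tracking range: (n. mi., cum. prob.)
TRACKING_CDF = [(20.0, 0.1), (40.0, 0.5), (60.0, 0.9), (80.0, 1.0)]

def sample_tracking_range(rng):
    """Inverse-transform draw from the tabulated cumulative distribution."""
    u = rng.random()
    for value, cum_p in TRACKING_CDF:
        if u <= cum_p:
            return value
    return TRACKING_CDF[-1][0]

def replicate(n_cycles, seed=0):
    """Repeat identical situations, varying only the random draws, and report
    the sample mean with an approximate 95 per cent confidence interval."""
    rng = random.Random(seed)
    draws = [sample_tracking_range(rng) for _ in range(n_cycles)]
    mean = statistics.mean(draws)
    half = 1.96 * statistics.stdev(draws) / n_cycles ** 0.5
    return mean, (mean - half, mean + half)

mean, interval = replicate(300)
```

Replacing the sampling step with the distribution's expectation would recover the expected-value shortcut whose consequences this report recommends examining.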

ABSTRACT (53) Technical Operations, Inc. Project OMEGA A. Identification: 1. Agency sponsoring contract: USAF 2. Agency monitoring research: Air Battle Analysis Division, Headquarters USAF 3. Official title of contract: Contract AF 33(600)-35190 4. In-house designation: Air Battle Model II 5. Date of initiation: August 1958 7. Principal investigator: Under the direction of James L. Jenkins, Director of Project OMEGA. B. Description of model: 2. Principal objective of model: The purpose of this model is to simulate the first several days of a two-sided global air war, taking into account the resources and plans of the opposing forces. 3. Type of model: Computer simulation, fast time. Type and size of computer: Originally programmed for an IBM 709 with 32K storage and 12 magnetic tape units; now programmed for an IBM 7090 with the same storage. 4. Input information: There are three types of inputs required by the model: lists, parameter tables, and plans. Lists - The lists are a record of the resources of each side and contain: a record of the location, facilities, weapon supplies, capabilities, and present status of the missile bases, bomber bases, local defense sites, radar defense sites, and area defense sites; a record of the location and present status of each bomber and tanker; and a target list. Parameter Tables - The parameter tables specify the operating characteristics of the resources; they contain such information as: the velocity of each type of aircraft in each mode of flight; the weight of each type of aircraft and each type of bomb; and the probability of a specified type of fighter aircraft killing a specified type of bomber under certain attack conditions. Plans - The plans specify the strategy and tactics to be employed by the offensive forces of each side. The plans are: Initiating Plans - An initiating plan provides directions for selecting and grouping the manned aircraft other than tankers.
An initiating plan may specify the selection of more than one aircraft. The aircraft are grouped into a cell, which is a group of identical aircraft selected from the same unit (operational group) and flown to the same place, at the same time, and in the same mode of operation. In-Flight Plans - The flight of a cell which has been formed by an initiating plan is governed by a series of in-flight plans; the number of the first in-flight plan is specified by the initiating plan. An in-flight plan gives detailed instructions for a leg of flight and specifies an operation (such as targeting, releasing air-to-surface missiles or decoys, aerial refueling, or landing) to be performed at the end of the leg. The course is given by specifying the terminal point of the leg, and the plane will fly to this point along the great circle course. The in-flight plan also specifies the mode

of flight, the use of electronic countermeasures, whether or not the leg is over enemy territory, and the number of the next in-flight plan. Missile Plans - A missile plan specifies the launching of only one missile. The plan specifies the missile base and target. After the missile has been launched, it becomes a cell and its flight is controlled by a series of in-flight plans similar to the in-flight plans of manned aircraft. Tanker Plans - Tanker plans specify the source, selection, flight, destination (refueling area), and recovery base of a tanker. How inputs are introduced: All of the plans, many of the lists, and some of the parameter tables required for ABM II are prepared by Plan Converter I (PC I) from data that are input in operational terminology. The Air Battle Model II consists of a master control routine and seven major routines, which are called complexes. 6. Outputs from model: The information generated during the running of the Air Battle Model II is recorded on output tapes. These tapes present a complete picture of the history of the action and interaction of each aircraft, missile, weapon, offensive and defensive site, target, and plan of both sides. D. Documentation: 1. Published or unpublished reports: TR59-1 Users and Operators Manual, 3 volumes TR59-2 Gross Flow Charts, 2 volumes TR59-3 Detail Flow Charts, 3 volumes TR59-4 Card Format Manual, 3 volumes TR59-5 Glossary 2. Logical flow charts available: See documents cited above. E. General Discussion: The above brief description of the inputs to Air Battle Model II was abstracted from TR59, Vol. 2. As in the case of the SCRAMBLE Model (10), Air Battle Model II is well documented. In addition to the documents listed in D.1., a summary of the basic features of the model is described in the following paper: Adams, R. and J. Jenkins, "Simulation of Air Operations with the Air Battle Model," OPERATIONS RESEARCH, Vol. 8, No. 5 (1960).
Other large models constructed by Technical Operations, Inc., are briefly described in the following paper: Jenkins, J., "A Collection of Simulation Models for the Study of Air War," A paper given at the Nineteenth National Meeting of the Operations Research Society of America, May 26, 1961. Preliminary reports on two of these models (Simulation of Total Atomic Global Exchange - STAGE and the Missile Battle Model - MBM) are included in the bibliography in Appendix D. 125
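The in-flight plan chaining described for Air Battle Model II, in which each plan gives the instructions for one leg and names the number of the next plan, amounts to traversing a linked structure. A sketch follows; the plan numbers, terminal points, and operations are hypothetical, not taken from ABM II inputs.

```python
# Illustrative in-flight plan chain in the style described above: each plan
# gives a leg terminal point, an operation performed at the end of the leg,
# and the number of the next in-flight plan (None ends the chain).
IN_FLIGHT_PLANS = {
    101: {"to": (55.0, -10.0), "op": "refuel", "next": 102},
    102: {"to": (60.0, 30.0), "op": "release_decoys", "next": 103},
    103: {"to": (62.0, 40.0), "op": "target", "next": None},
}

def fly_cell(first_plan):
    """Follow a cell through its chain of in-flight plans, collecting the
    operation performed at the end of each leg."""
    ops = []
    number = first_plan
    while number is not None:
        plan = IN_FLIGHT_PLANS[number]
        ops.append(plan["op"])
        number = plan["next"]
    return ops

mission = fly_cell(101)
```

In the model itself the initiating plan supplies the number of the first in-flight plan, and launched missiles become cells governed by similar chains.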

ABSTRACT (54) Headquarters, USAF A. Identification: 1. Agency sponsoring contract: Internal 2. Agency monitoring research: Internal 3. Official title of contract: Not applicable 4. In-house designation: Simplified penetration model 5. Date of initiation: March 1960 6. Completion: Several versions have been completed. Other versions may be completed, but there is no target date. 7. Principal investigator: Mr. Clayton J. Thomas B. Description of model: 1. Physical system being modeled: The model represents the interaction of offensive missiles and bombers of one Air Force with the air defenses of an opposing force. 2. Principal objective of model: To serve as one tool in the investigation of future force posture. 3. Type of model: Computer simulation, fast-time expected value. Type and size of computer: IBM 7090 - 32K. Approximate time required for one run of model: 2 minutes. 4. Input information: In general, for each bomber of type K (1 < K < 10) the number of bombers of type K, the speed, maximum penetration range, number of bombs and number of air-to-surface missiles must be known. For the defense, quantities such as the number of fighter bases, the number and type of surface-to-air missiles fired from each zone of defense and the associated expected kill probabilities for each type of missile against each type of bomber must be known. How inputs are introduced: This is an expected value model using fixed time interval updating and zone defenses. 5. Variables in model: The variables in the model are basically the current tallies by zone which describe the progress of the attack in time. These variables include such quantities as interceptors available for scramble, interceptors currently being deployed, counters for missile types, etc. 6. Outputs from model: A detailed history by time and zone of the progress of the war.
This includes quantities such as launch times and impact times, expected numbers of bombers killed, expected numbers of warheads delivered, etc. 7. Criteria to evaluate system performance: Warheads delivered, interacting targets destroyed, surviving penetrators. C. Validation and use of the model: 1. Experiments completed: Over 100 runs of the model have been made. AFCOA, Headquarters USAF was responsible for these experiments. 2. Tests to validate model: None 3. Future experiments and/or tests: None

4. Response of sponsor: The work has been sponsored by our own office. Those who have heard briefings based on the work have responded favorably. 5. (a) Agency receiving results: Air Staff of Headquarters USAF. (b) Use of model and significant results: Results have been used as ingredients in Air Staff studies of force posture. D. Documentation: 1. Published or unpublished reports: OA Working Paper 98. 2. Logical flow charts available: A very gross flow chart together with FORTRAN program listings is given in OA Working Paper 98.
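The expected-value, zone-defense bookkeeping described in Abstract (54) can be sketched in a few lines: no random draws are made; each defense zone reduces the expected number of surviving bombers by the expected kill fraction for that zone. This is a modern illustrative sketch, and every number in it is hypothetical.

```python
# Expected-value zone attrition in the spirit of the simplified penetration
# model: survivors carry forward as (fractional) expected counts, never as
# sampled outcomes. All probabilities and shot counts are hypothetical.
def penetrate(bombers, zone_kill_probs, shots_per_zone):
    """Return expected survivors and the expected kills in each zone."""
    survivors = float(bombers)
    kills_by_zone = []
    for p_kill, shots in zip(zone_kill_probs, shots_per_zone):
        p_survive = (1.0 - p_kill) ** shots  # survive every shot in the zone
        killed = survivors * (1.0 - p_survive)
        kills_by_zone.append(killed)
        survivors -= killed
    return survivors, kills_by_zone

survivors, kills = penetrate(100, [0.1, 0.2], [1, 2])
```

The two-minute run time quoted for the model is plausible for exactly this reason: an expected-value sweep over zones replaces the many replications a Monte Carlo treatment would require.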

ABSTRACT (55) The University of Michigan A. Identification: 1. Agency sponsoring contract: United States Army Signal Air Defense Engineering Agency (USASADEA) 2. Agency monitoring research: USASADEA 3. Official title of contract: Design of Experiments Study for Antiaircraft 4. In-house designation: Project 03903 5. Date of initiation: July 1959 6. Completion: July 1960 7. Principal investigator: Mr. Jack M. Miller B. Description of model: 1. Physical system being modeled: The system being modeled is the Missile Master (AN/FSG-1) Air Defense System operating in the ATABE (Automatic-Target-to-Battery-Evaluation) option, wherein batteries are selected for assignment to targets by means of the SAGE computer's programmed selection logic. 2. Principal objective of model: The principal objective of the model is to provide a means of evaluating the performance of the Missile Master system when operating in the ATABE option. 3. Type of model: The model may be described as a fast-time digital computer Monte Carlo model. The ATABE-Missile Master model contains approximately 4,000 instructions and uses 5,600 locations for tables, fire unit masks, and temporary storage. The system input and output routines use an additional 14,500 words of storage. Thus, the entire program requires approximately 24,100 words of core storage. No tapes or drums are used except for output. The program is coded so that it can be run on either an IBM 709 or a 7090 with 32,000 words of core storage. A 15-target raid with no maneuvering targets takes approximately 50 seconds and with maneuvering targets takes approximately 1 minute. A 10-target raid with maneuvering targets takes approximately 45 seconds, and a 20-target raid, also with maneuvering targets, takes approximately 1 1/4 minutes. 4.
Input information: Input to the model consists of parameters describing the system geometry, missile flight time parameters, radar error distributions, time delay distributions, target flight path parameters, ATABE logic parameters, and various probabilities. 5. Variables in model:
Raid Parameters
Identification Parameters
Authorization Parameters
Path Parameters
Target Parameters
SAGE Direction Center Parameters
General System Parameters
Tracking Parameters 128

Radar Error and Tab Error Parameters
ATABE Geometry Parameters
ATABE Logic Parameters
Grid System Extrapolation Times
Priority Table Parameters
Fire Scale Parameters
Miscellaneous ATABE Logic Parameters
Battery Parameters
Battery Geometry and Bookkeeping Parameters
Battery Radar Parameters
Battery Operation Parameters
Non-stochastic Times
Battery Time Distributions and Probabilities
Missile Parameters
6. Outputs from model: Standard output consists of a target card for each target listing such things as the number of batteries designated to the target, the number of batteries which locked on to the target during the engagement, the number of missiles fired at the target, whether the target was killed, and the number of the battery which killed the target; a battery card for each target listing the number of each kind of missile fired at the target; and a summary card listing items such as the total number of targets killed, the total number of lock-ons, the total number of missiles fired, and the mean and standard deviation of target distance from the center of the defended area at kill. In addition to the standard output, a special output may be obtained which lists all designations, lock-ons, no lock-ons, kills, and the times at which these events occurred. SAGE (X,Y) coordinates, Tab (X,Y) coordinates, and True (X,Y) coordinates of the targets associated with these events are available, as well as reasons why unsuccessful engagements occurred. The special output permits study of unsuccessful engagements by any given ATABE-assigned battery and allows detailed examination of the sequence of events in the raid. 7. Criteria to evaluate system performance: The primary criterion employed is the number of targets killed before penetration of the defended area. Other criteria can easily be employed using the printouts mentioned in Section B.6. C. Validation and use of the model: 1. Experiments completed: About 75 air raid types have been simulated with the model.
These have been used mainly to obtain the preliminary results needed to plan more sophisticated experiments with the model. These experiments were designed by Mr. Morton Goldberg of the Operations Research Department. 2. Tests to validate model: No tests have been performed as of this date to validate the model. 3. Future experiments and/or tests: Data from the Phase II and Phase III NORAD SAGE/Missile Master test program are to be used to validate the mathematical model. These are large-scale system tests employing SAC and ADC aircraft. The field test program is the responsibility of the NORAD Joint Test Force stationed at Stewart Air Force Base. The responsibility for comparing field test results and model results rests with the Operations Research Department. 4. Response of sponsor: The sponsor appears to be well pleased with the research effort and has asked that the results be disseminated widely among various R and D agencies. 5. (a) Agency receiving results: USASADEA (b) Use of model and significant results: No use has been made of the model as yet; however, NORAD, U.S. Army Air Defense Board, U.S. Army Air Defense Command, and other agencies have indicated that they wish problems to be run on this model. 129

D. Documentation: 1. Published or unpublished reports: Michigan Institute of Science and Technology Technical Memorandum 2354-35-T by R. E. Lewkowicz and J. M. Miller. 2. Logical flow charts available: Logical flow charts are available in the document cited above. E. General Discussion: The purpose of the ATABE-Missile Master system model is the study of the performance of the Missile Master system when the decision making at the AADCP is replaced by the decision making of the ATABE logic, with batteries not permitted to select their own targets for engagement. The choice of which aspects of the SAGE-Missile Master system were to be represented in the greatest detail and which were to be represented more simply was largely guided by adherence to this purpose and partly forced by lack of information which can only be obtained through tests. Certain simplifications, assumptions, and omissions had to be made concerning both battery operation and activity at the SAGE Direction Center. In the real system the ATABE logic depends not only on track information stored in the SAGE computer and the programmed ATABE decision rules but also on switch-actions taken by personnel at the SAGE Direction Center. The system model, as programmed at present, idealizes the SAGE Direction Center-ATABE-Battery interaction in the sense that decision making at the SAGE Direction Center is almost entirely pure programmed ATABE logic with little account taken of switch-actions. The actions that do not directly affect the ATABE logic, or that have the nature of supplementing or countermanding ATABE decisions, are for the most part ignored for the following reasons: (1) the SOP's for many of the actions are not known, (2) the times needed to perform the actions are not known, and (3) it was felt sufficient, for the present, to study Missile Master operation in the ATABE option without permitting intervention by personnel at the SAGE Direction Center.
The only actions represented in the System Model (and then only simply) are: (1) the actions of the SAGE track monitors, (2) the actions of the authority who decides that a target in the system has been destroyed, and (3) the action of the authority who authorizes nuclear fire for the sector. In contrast to the simplicity with which the actions of SAGE Direction Center personnel are represented, the representations in the System Model of the BC's actions and of battery operations are more elaborate. More is known about battery operations from tests and experience gained from previous studies. Although the SOP's for the action of the BC in the face of ATABE input are not known, and although the data from the ATABE tests have not yet produced a clear picture of time-distributions and of SOP's employed, enough is known that a reasonably detailed and plausible SOP can be postulated for the BC when he is faced with ATABE input, and provision has been made to utilize information when it becomes available to construct a more realistic representation of battery operation. Therefore, because the primary object under present study is the performance of the Missile Master system with SAGE-ATABE inputs and not the SAGE system per se, the greatest degree of realism is to be found in the routines that represent battery operation and in the evaluation event processing routine where the ATABE logic, apart from switch-action interpretation and its consequences, is represented in full. 130
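The fast-time Monte Carlo structure outlined in items B.3 through B.6 above can be sketched in a modern language. Everything in the sketch below (the uniform distance distribution, the per-battery lock-on and kill probabilities, the one-missile-per-lock-on rule, and all numeric values) is a hypothetical stand-in chosen only to show the shape of such a simulation; it is not a reproduction of the ATABE-Missile Master logic or its distributions.

```python
import random
import statistics

def simulate_raid(n_targets, n_batteries, p_lock_on, p_kill, rng):
    """One replication of a raid: each target is engaged in turn by each
    battery until killed or until it runs out of engagement opportunities.
    Returns per-target records loosely resembling the model's 'target cards'."""
    records = []
    for tgt in range(n_targets):
        # Hypothetical distance (miles) from the center of the defended area.
        distance = rng.uniform(10.0, 60.0)
        killed, killer, fired = False, None, 0
        for battery in range(n_batteries):
            if rng.random() < p_lock_on:   # stochastic radar lock-on
                fired += 1                 # one missile per lock-on (simplification)
                if rng.random() < p_kill:
                    killed, killer = True, battery
                    break
        records.append({"target": tgt, "missiles": fired,
                        "killed": killed, "killer": killer,
                        "kill_distance": distance if killed else None})
    return records

rng = random.Random(1961)
reps = [simulate_raid(15, 4, p_lock_on=0.8, p_kill=0.6, rng=rng)
        for _ in range(200)]

# A 'summary card' in the spirit of item B.6: totals plus the mean and
# standard deviation of target distance from the defended area at kill.
kills = [sum(r["killed"] for r in rep) for rep in reps]
dists = [r["kill_distance"] for rep in reps for r in rep if r["killed"]]
print("mean targets killed per 15-target raid:", statistics.mean(kills))
print("kill distance mean/std:", statistics.mean(dists), statistics.pstdev(dists))
```

Averaging the summary cards over many replications is what turns the stochastic raid outcomes into the mean-and-deviation statistics the abstract describes as standard output.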

ABSTRACT (56) Weapon Systems Evaluation Group A. Identification: 1. Agency sponsoring contract: Internal 2. Agency monitoring research: Internal 6. Completion: Report dated 15 October 1959 7. Principal investigators: G. E. Pugh and R. J. Galiano B. Description of model: 1. Physical system being modeled: Close-in deposition of fallout. 2. Principal objective of model: To calculate the distribution of fallout activity under a given wind condition. Also included are a climatologic model and a model of biological effect. 3. Type of model: Analytical, with a computer employed to generate tables corresponding to various combinations of parameter settings. 4. Input information: Data to determine a function, g(t), that represents the fraction of the total radioactivity which arrives on the ground per unit time. How inputs are introduced: The functional form g(t) = K exp[-(t/T)^n] is assumed, where T and n are parameters adjusted to fit the data. By a subsequent derivation, in which F_o is the fraction of total radioactive debris to be accounted for in the fallout pattern and Γ is the complete gamma function, the expression g(t) = [F_o / (T Γ(1/n + 1))] exp[-(t/T)^n] is derived. Next, the horizontal distribution of activity reaching the ground, p(x,y), is approximated by the two-dimensional Gaussian distribution p(x,y) = [1 / (2π σ_o²)] exp[-(x² + y²) / (2 σ_o²)], in which σ_o is a measure of effective cloud radius. Subsequently, a deposition d(x), as a function of x, the distance downwind, is obtained. Approximations give d(x) in the same exponential form, d(x) ≈ [F_o / (L' Γ(1/n + 1))] exp[-(x/L')^n], where L = WT (W a uniform wind), L' is an effective downwind length formed from L and σ_o, n = 2 is taken, and M, the cumulative normal, enters the approximation. 5. Variables in model: Subsequent refinements of the model to take into account climatologic information (wind shear) and also biological-effect relations were introduced. 6. Outputs from model: The output of this model was a series of curves which were compared with available data for goodness of fit. 7.
Criteria to evaluate system performance: Not applicable. 131

D. Documentation: 1. Published or unpublished reports: WSEG Research Memorandum No. 10, "An Analytic Model of Close-In Deposition of Fallout for Use in Operational Studies," by George E. Pugh and Robert Galiano. WSEG Research Memorandum No. 10, Supplement, "Revision of Fallout Parameters for Low-Yield Detonations," by George E. Pugh. E. General Discussion: This work was preceded by a paper prepared by WSEG employees, "Simple Formulas for Calculating the Distribution and Effects of Fallout in Large Nuclear Weapon Campaigns (With Applications)." This earlier paper reports on an independent mathematical study conducted by the author. 132
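The normalization of the arrival-rate function g(t) in item B.4 of the abstract above can be checked numerically. The sketch below assumes illustrative values of F_o, T, and n (with n = 2, the value taken in the deposition approximation); these values are stand-ins and are not taken from WSEG Research Memorandum No. 10.

```python
import math

def g(t, F_o, T, n):
    """Fraction of total radioactivity arriving on the ground per unit time:
    g(t) = [F_o / (T * Gamma(1/n + 1))] * exp(-(t/T)**n)."""
    return F_o / (T * math.gamma(1.0 / n + 1.0)) * math.exp(-(t / T) ** n)

# Illustrative parameters: 80% of the debris deposited close-in (F_o = 0.8),
# time scale T = 3 (arbitrary units), n = 2.
F_o, T, n = 0.8, 3.0, 2.0

# Trapezoidal integration of g over a wide interval (the tail beyond
# t = 10 T is negligible for n = 2) should recover F_o.
dt = 0.001
total = 0.5 * g(0.0, F_o, T, n) * dt
total += sum(g(k * dt, F_o, T, n) for k in range(1, 30000)) * dt
print(round(total, 4))  # prints approximately 0.8
```

Since the integral of exp[-(t/T)^n] over all t is T Γ(1/n + 1), dividing by that factor makes g(t) integrate to exactly F_o, i.e., the arrival-rate curve accounts for precisely the fraction of debris assigned to the fallout pattern.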

APPENDIX D: Report List CORPORATION FOR ECONOMIC AND INDUSTRIAL RESEARCH Webb, K. W., THE MATHEMATICAL THEORY OF SENSITIVITY, Corporation for Economic and Industrial Research, Arlington 2, Va., October 1960. Webb, K. W., SIMULATION OF THE NAVY AIRCRAFT OVERHAUL OPERATION, Report No. 169-2R, Corporation for Economic and Industrial Research, Arlington 2, Va., May 1959. DEPARTMENT OF NATIONAL DEFENCE, CANADA Jamieson, D. M., AN INTERIM REPORT OF THE RESULTS OF SIMULATION OF THE OPERATION OF BOMARC IN SAGE, Report No. COR/DSE-M-58-10, Department of National Defence, Canada, Ottawa, Ontario, December 1958, SECRET. Jamieson, D. M., SIMULATION OF A BOMARC INTERCEPT IN A SAGE GROUND ENVIRONMENT, Report No. COR/DSE-M-58/1, Department of National Defence, Canada, Ottawa, Ontario, February 1958, SECRET. Slingerland, F. and W. McBride, EFFECTIVENESS COMPARISON OF VARIOUS RAIDS AND DEFENCES, Report No. ORG Internal Memo No. 55/5, Department of National Defence, Ottawa, Ontario, Canada, February 1955, SECRET. ELECTRONIC DEFENSE LABORATORY Albert, A. and F. Proschan, INCREASED RELIABILITY WITH MINIMUM EFFORT, Report No. EDL-M210, Electronic Defense Laboratory, Mountain View, Calif., September 1959. Barlow, R. E. and L. C. Hunter, MATHEMATICAL MODELS FOR SYSTEM RELIABILITY, Report No. EDL-E35, Electronic Defense Laboratory, Mountain View, Calif., August 1959. Barlow, R. E. and L. C. Hunter, PERFORMANCE OF A ONE UNIT SYSTEM, Report No. EDL-M288, Electronic Defense Laboratories, Mountain View, Calif., May 1960. Black, G. and F. Proschan, SPARE PARTS AT MINIMUM COST, Report No. EDL-M154, Electronic Defense Laboratories, Mountain View, Calif., December 1959. Proschan, F., OPTIMAL SYSTEM SUPPLY, Report No. EDL-E38, Electronic Defense Laboratory, Mountain View, Calif., January 1960. INSTITUTE FOR AIR WEAPONS RESEARCH Bialek, S. T., A STRATEGIC PENETRATION STUDY FOR THE 1965-67 TIME PERIOD, Report No. 
IAWR 59R18, Institute for Air Weapons Research, Chicago, Illinois, September 1959, SECRET. Feurzeig, W., et al., THE DETAILED FLOW DIAGRAMS OF THE SCRAMBLE INTERACTION PROGRAM, Report No. 59-D-30M, Institute for Air Weapons Research, Chicago, Illinois, February 1960. 133

Feurzeig, W., A GENERAL DESCRIPTION OF THE SCRAMBLE SUBMODEL PROGRAMS, Report No. IAWR 59R19, Institute for Air Weapons Research, Chicago, Illinois, November 1959, CONFIDENTIAL. Feurzeig, W., THE PROCEDURES OF THE SCRAMBLE INTERACTION: SUBMODEL PROCESS FLOW DIAGRAMS, Report No. IAWR 59R20, Institute for Air Weapons Research, Chicago, Illinois, December 1959, SECRET. Hesse, H. R., et al., FINAL REPORT, TECHNOLOGICAL FORCE STRUCTURE STUDY, TASK I: STRATEGIC OFFENSIVE WEAPONS SYSTEMS 1965-1970, PART IX, CONCEPTUAL TECHNIQUES, Report No. 60-R-15, Institute for Air Weapons Research, Chicago, Illinois, December 1960. Hesse, H. R. and G. W. Morgenthaler, REMARKS ON THE OPTIMAL COMMITMENT OF DEFENSIVE INTERCEPTORS AND SURFACE-TO-AIR MISSILES, Report No. 57-6, Institute for Air Weapons Research, Chicago, Illinois, January 1958. Painter, N. H., CHARTS USED IN SCRAMBLE SEMINAR, Report No. IAWR 58-28, Institute for Air Weapons Research, Chicago, Illinois, December 1958, SECRET. Painter, N. H., SCRAMBLE MODEL - DEFENSE CONCEPTS, Report No. IAWR 59R14, Institute for Air Weapons Research, Chicago, Illinois, November 1960, CONFIDENTIAL. Painter, N. H., SCRAMBLE MODEL - INTRODUCTION, Report No. IAWR 59R13, Institute for Air Weapons Research, Chicago, Illinois, December 1960. Painter, N. H., SCRAMBLE MODEL - OFFENSE CONCEPTS, Report No. IAWR 59R15, Institute for Air Weapons Research, Chicago, Illinois, September 1959, CONFIDENTIAL. Painter, N. H., SCRAMBLE MODEL - PRE-COMPUTER INPUTS, Report No. IAWR 59R16, Institute for Air Weapons Research, Chicago, Illinois, September 1959, CONFIDENTIAL. Painter, N. H., SCRAMBLE MODEL - PROCESSOR INPUTS, Report No. IAWR 59R17, Institute for Air Weapons Research, Chicago, Illinois, September 1959, CONFIDENTIAL. Ross, S., A DESCRIPTION OF THE OUTPUTS AND EDITORS AVAILABLE IN SCRAMBLE, Report No. IAWR 59R21, Institute for Air Weapons Research, Chicago, Illinois, February 1960, CONFIDENTIAL. Saxon, D. and H.
Miller, OPERATION HOUND DOG: A STUDY OF THE EMPLOYMENT OF AN AIR-TO-SURFACE MISSILE SYSTEM, Report No. 58-23, Institute for Air Weapons Research, Chicago, Illinois, November 1958, SECRET. Strauss, W. J., A RELATIVELY SIMPLE ESTIMATED VALUE CAMPAIGN MODEL, Report No. 58-R-6, Institute for Air Weapons Research, Chicago, Illinois, July 1958, SECRET. Strauss, W. J., A SATURATION AND ATTRITION MODEL, Report No. 58-5, Institute for Air Weapons Research, Chicago, Illinois, July 1958. Wells, W. J., R & D TEAM CAMPAIGN STUDY, VOLUME I: DESCRIPTION OF CAMPAIGNS, Report No. IAWR 59-1, Institute for Air Weapons Research, Chicago, Illinois, January 1959, SECRET. 134

Wells, W. J., R & D TEAM CAMPAIGN STUDY, VOLUME II: METHOD OF ANALYSIS AND APPLICATION TO BASIC CAMPAIGN, Report No. IAWR 59-2, Institute for Air Weapons Research, Chicago, Illinois, March 1959, SECRET. Wells, W. J., et al., R & D TEAM CAMPAIGN STUDY, VOLUME III: RESULTS OF CAMPAIGN VARIATIONS, Report No. IAWR 59-3, Institute for Air Weapons Research, Chicago, Illinois, March 1959, SECRET. Wester, J. W., FINAL REPORT: CONTRACT AF 33(616)-3274, Report No. IAWR 60R2, Institute for Air Weapons Research, Chicago, Illinois, February 1960, SECRET. GE-TECHNICAL MILITARY PLANNING OPERATION Bovaird, R. L. and H. I. Zager, AN ANALYTICAL TECHNIQUE FOR DETERMINING THE OPERATIONAL AVAILABILITY OF COMPLEX ELECTRONIC EQUIPMENT, Report No. RM 59TMP-58, GE-Technical Military Planning Operation, Santa Barbara, California, December 1959. Bozovich, J. F., OPERATIONAL AVAILABILITY ANALYSIS -- POLARIS FIRE CONTROL SYSTEM, Report No. RM 59TMP-60, GE-Technical Military Planning Operation, Santa Barbara, California, December 1959, SECRET. LINCOLN LABORATORY OPERATIONAL AND MATHEMATICAL SPECIFICATIONS FOR RADAR DATA INPUTS IN THE INITIAL SAGE SYSTEM, Report No. 6M-3774-3, Lincoln Laboratory, Massachusetts Institute of Technology, Lexington, Mass., April 1957, CONFIDENTIAL. QUARTERLY PROGRESS REPORT -- DIVISION 5 -- INFORMATION PROCESSING, Report No. 5-QP 9/59, Lincoln Laboratory, Massachusetts Institute of Technology, Lexington, Mass., September 1959. LOCKHEED AIRCRAFT CORPORATION AICBM-INSATRAC SYSTEM STUDY, VOLUME II - APPENDIXES, SEMIANNUAL REPORT: JULY-DECEMBER 1959, Report No. LMSD-288117, Lockheed Aircraft Corporation, Sunnyvale, Calif., January 1960. NORTHROP AIRCRAFT, INCORPORATED Barnum, W. C., et al., COMPARISON OF WEAPON SYSTEMS BY A COMPUTER-PLAYED WAR GAME, VOLUME I, SUMMARY OF ANALYTICAL METHODS AND RESULTS, Report No. NAI 56-238 (I), Northrop Aircraft, Inc., Hawthorne, Calif., July 1956, SECRET. Barnum, W.
C., et al., COMPARISON OF WEAPON SYSTEMS BY A COMPUTER-PLAYED WAR GAME, VOLUME II, DETAILS OF ANALYSIS AND DATA USED FOR EVALUATION, Report No. NAI 56-238 (II), Northrop Aircraft, Inc., Hawthorne, Calif., July 1956, SECRET. Barnum, W. C., R. A. Maxwell, and R. L. Moore, COMPARISON OF WEAPON SYSTEMS BY A COMPUTER-PLAYED WAR GAME, SUPPLEMENT NO. 1, EVALUATION OF 1960 SYSTEMS WITHOUT COUNTERBATTERY, Report No. NAI 56-238 (1), Northrop Aircraft, Inc., Hawthorne, Calif., August 1956, SECRET. 135

Barnum, W. C., R. A. Maxwell, and R. L. Moore, COMPARISON OF WEAPON SYSTEMS BY A COMPUTER-PLAYED WAR GAME, SUPPLEMENT NO. 2, RESULTS OF CALCULATIONS FOR 1956 AND 1960 SYSTEMS WITH COUNTERBATTERY, Report No. NAI 56-238 (2), Northrop Aircraft, Inc., Hawthorne, Calif., September 1956, SECRET. Barnum, W. C., COMPARISON OF WEAPON SYSTEMS BY A COMPUTER-PLAYED WAR GAME, SUPPLEMENT NO. 3, EVALUATIONS OF 1956 AND 1960 SYSTEMS, Report No. NAI 56-238 (3), Northrop Aircraft, Inc., Hawthorne, Calif., October 1956, SECRET. Barnum, W. C., et al., THE FUTURE REQUIREMENTS FOR MANNED AIRCRAFT FOR THE SUPPORT OF FIELD ARMY OPERATIONS, Report No. NAI 56-527, Northrop Aircraft, Inc., Hawthorne, Calif., January 1957, SECRET. Barnum, W. C., et al., THE FUTURE REQUIREMENTS FOR MANNED AIRCRAFT FOR THE SUPPORT OF FIELD ARMY OPERATIONS, SUPPLEMENT I, AN ANALYSIS OF THE COUNTER-AIR WAR, Report No. NAI 56-527, Northrop Aircraft, Inc., Hawthorne, Calif., December 1957, SECRET. Kibbey, M. and J. Watson, MISSILE SPARES PROVISIONING STUDY, Report No. NOR 58-478, Northrop Aircraft, Inc., Hawthorne, Calif., October 1959. MATHEMATICAL MODEL FOR COMPARISON OF WEAPON SYSTEMS, Report No. NAI 54-816, Northrop Aircraft, Inc., Hawthorne, Calif., December 1954, SECRET. A METHOD FOR PREDICTING WEAPON AND SUPPORT SYSTEM MAINTAINABILITY, Report No. NAI 57-848, Northrop Aircraft, Inc., Hawthorne, California, September 1957. NORAIR WEAPON SYSTEMS ANALYSIS GROUP CAPABILITIES, Report No. NB 59-137, Northrop Aircraft, Inc., Hawthorne, Calif., October 1959. Toeppen, M. K., THE FUTURE REQUIREMENTS FOR MANNED AIRCRAFT FOR THE SUPPORT OF FIELD ARMY OPERATIONS, APPENDIX B, ACTIVATION AND OPERATION COSTS OF WEAPON SYSTEMS IN A PENTANA TYPE ARMY IN ETO, Report No. NAI 56-527, Northrop Aircraft, Inc., Hawthorne, Calif., January 1957, SECRET. TERMINAL AIR BATTLE STUDY ADVANCED (1960-1970) WEAPON SYSTEMS, VOLUME I, INTRODUCTION, TERMINAL AIR BATTLE, Report No.
NAI 57-1240 (I), Northrop Aircraft, Inc., Hawthorne, California, May 1958, SECRET. Armstrong, W. A. and J. F. Paris, TERMINAL AIR BATTLE STUDY ADVANCED (1960-1970) WEAPON SYSTEMS, VOLUME II, CONDITIONS OF ENGAGEMENT, Report No. NAI 57-1240 (II), Northrop Aircraft, Inc., Hawthorne, Calif., May 1958, SECRET. Nadel, A. and W. A. Armstrong, TERMINAL AIR BATTLE STUDY ADVANCED (1960-1970) WEAPON SYSTEMS, VOLUME III, ADVANCED DETECTION AND TRACKING SYSTEMS, Report No. NAI 57-1240 (III), Northrop Aircraft, Inc., Hawthorne, Calif., May 1958, SECRET. Lynch, A. F. and A. Nadel, TERMINAL AIR BATTLE STUDY ADVANCED (1960-1970) WEAPON SYSTEMS, VOLUME IV, ADVANCED AIRBORNE ECM EQUIPMENT, Report No. NAI 57-1240 (IV), Northrop Aircraft, Inc., Hawthorne, Calif., May 1958, SECRET. 136

Armstrong, W. A. and J. F. Paris, TERMINAL AIR BATTLE STUDY ADVANCED (1960-1970) WEAPON SYSTEMS, VOLUME V, TERMINAL AIR BATTLE - NORTAM, Report No. NAI 57-1240 (V), Northrop Aircraft, Inc., Hawthorne, Calif., May 1958, SECRET. Brigham, O. R. and A. R. Stutz, TERMINAL AIR BATTLE STUDY ADVANCED (1960-1970) WEAPON SYSTEMS, VOLUME VI, TERMINAL AIR BATTLE-BARRIER FORMULATION, Report No. NAI 57-1240 (VI), Northrop Aircraft, Inc., Hawthorne, Calif., May 1958, SECRET. Taylor, J. L., TERMINAL AIR BATTLE STUDY ADVANCED (1960-1970) WEAPON SYSTEMS, VOLUME I, SUMMARY OF RESULTS, Report No. NAI 57-1301 (I), Northrop Aircraft, Inc., Hawthorne, Calif., January 1959, SECRET. Armstrong, W. A. and J. F. Paris, TERMINAL AIR BATTLE STUDY ADVANCED (1960-1970) WEAPON SYSTEMS, VOLUME II, ANALYSIS OF RESULTS, Report No. NAI 57-1301 (II), Northrop Aircraft, Inc., Hawthorne, Calif., January 1959, SECRET. TERMINAL AIR BATTLE STUDY ADVANCED (1960-1970) WEAPON SYSTEMS, VOLUME III, DATA APPENDICES, Report No. NAI 57-1301 (III), Northrop Aircraft, Inc., Hawthorne, Calif., January 1959, SECRET. OPERATIONS EVALUATION GROUP INDEX OF PUBLICATIONS OF THE OPERATIONS EVALUATION GROUP (U), 1 JANUARY 1960, Operations Evaluation Group, Washington, D. C., 1 January 1960, SECRET. OPERATIONS RESEARCH OFFICE DEFENSE OF THE US AGAINST ATTACK BY AIRCRAFT AND MISSILES (U), APPENDICES A AND C, TARGET STUDIES, Report No. ORO-R-17, Operations Research Office, The Johns Hopkins University, Chevy Chase, Maryland, June 1957, SECRET. DEFENSE OF THE US AGAINST ATTACK BY AIRCRAFT AND MISSILES (U), APPENDICES D, E, F, WEAPONS STUDIES-PART I, Report No. ORO-R-17, Operations Research Office, The Johns Hopkins University, Chevy Chase, Maryland, March 1957, SECRET. DEFENSE OF THE US AGAINST ATTACK BY AIRCRAFT AND MISSILES (U), APPENDICES G AND H, WEAPONS STUDIES-PART II, Report No. ORO-R-17, Operations Research Office, The Johns Hopkins University, Chevy Chase, Maryland, March 1957, SECRET. Meal, H. C. and R. G. 
Hendrickson, DEFENSE OF THE US AGAINST ATTACK BY AIRCRAFT AND MISSILES (U), APPENDIX S, NIKE AND TALOS DEFENSES AGAINST BOMBER ATTACKS (U), Report No. ORO-R-17, Operations Research Office, The Johns Hopkins University, Chevy Chase, Maryland, December 1957, SECRET. Pettee, G. S., COMMENTS UPON "AN APPRAISAL OF ORO's DEFENSE STUDY, ORO-R-17" (RM-2120), Report No. ORO-SP-59, Operations Research Office, The Johns Hopkins University, Bethesda, Maryland, May 1958, SECRET. Yahraes, H., DEFENSE OF THE US AGAINST ATTACK BY AIRCRAFT AND MISSILES, A DIGEST OF ORO-R-17, Report No. ORO-SP-50, Operations Research Office, The Johns Hopkins University, Bethesda, Maryland, April 1958, SECRET. 137

PLANNING RESEARCH CORPORATION AN ANALYSIS OF MOBILITY FOR BALLISTIC MISSILE SYSTEMS, VOLUME I, Report No. PRC R-130, Planning Research Corporation, Los Angeles, Calif., December 1959, SECRET. AN ANALYSIS OF MOBILITY FOR BALLISTIC MISSILE SYSTEMS, VOLUME II, Report No. PRC R-130, Planning Research Corporation, Los Angeles, Calif., December 1959, SECRET. Howard, W. J., R. R. Howard, and R. A. Hadden, A STUDY OF THE DISTRIBUTIONS AND FACTORS INFLUENCING DOWN TIME AND AVAILABILITY OF OPERATIONAL MILITARY EQUIPMENTS, Report No. PRC R-53, Planning Research Corporation, Los Angeles, California, October 1957, CONFIDENTIAL. Mehl, R. F. and A. Wylly, TANK LOGISTICS (THE EFFECT OF TANK CHARACTERISTICS ON THE EFFORT REQUIRED TO SUPPORT A TANK BATTALION), APPENDIX A, THE TACTICAL MODEL, Report No. PRC R-25, Planning Research Corporation, Los Angeles, Calif., February 1956. MILITARY REPORTS AND DOCUMENTS, Report No. PRC AD-103, Planning Research Corporation, Los Angeles, Calif., June 1960, SECRET. SOVIET AERIAL RECONNAISSANCE OF THE ATOMIC DELIVERY UNITS OF A U. S. FIELD ARMY, Report No. PRC R-148, Planning Research Corporation, Los Angeles, Calif., December 1959, SECRET. A STUDY OF DATA COLLECTION AND HANDLING IN A GLOBAL SURVEILLANCE SYSTEM, Report No. PRC R-125, Planning Research Corporation, Los Angeles, Calif., May 1960, SECRET. Wylly, A. and L. P. Holliday, TANK LOGISTICS (THE EFFECT OF TANK CHARACTERISTICS ON THE EFFORT REQUIRED TO SUPPORT A TANK BATTALION), SUMMARY REPORT, Report No. PRC R-24, Planning Research Corporation, Los Angeles, Calif., February 1956, SECRET. THE RAND CORPORATION Bean, E. E. and W. H. McGlothlin, AN ANALYTICAL MODEL FOR DEVELOPING OPTIMAL BALLISTIC MISSILE MAINTENANCE PROCEDURES, Report No. P-1696, The RAND Corporation, Santa Monica, Calif., May 1959. Beverly, R. S., COMMUNICATIONS ANALYSIS IN LP-1, Report No. P-1457, The RAND Corporation, Santa Monica, Calif., August 1958. Clark, A. J. and R. M.
Paulson, IMPLEMENTING LOGISTICS POLICIES IN LABORATORY PROBLEM I (LP-1), Report No. RM-2220, The RAND Corporation, Santa Monica, Calif., June 1958. Dalkey, N. and L. Wegner, THE STRATEGIC OPERATIONS MODEL -- A SUMMARY REPORT (U), Report No. RM-2221, The RAND Corporation, Santa Monica, California, July 1958, CONFIDENTIAL. Dalkey, N. C. and L. H. Wegner, THE STRATEGIC OPERATIONS MODEL (U), Report No. RM-2250, The RAND Corporation, Santa Monica, Calif., September 1958, CONFIDENTIAL. 138

Enke, S., ON THE ECONOMIC MANAGEMENT OF LARGE ORGANIZATIONS: A CASE STUDY IN MILITARY LOGISTICS INVOLVING LABORATORY SIMULATION, Report No. P-1368, The RAND Corporation, Santa Monica, Calif., May 1958. FIRST TOOLING-UP EXERCISE FOR LOGISTICS SYSTEMS LABORATORY (OCTOBER-NOVEMBER 1956), Report No. RM-1924, The RAND Corporation, Santa Monica, Calif., July 1957. Geisler, M. A., COMMUNICATIONS AND CONTROL REQUIREMENTS IN THE AIR FORCE LOGISTICS SYSTEM, Report No. P-1615, The RAND Corporation, Santa Monica, Calif., February 1959. Geisler, M. A., DEVELOPMENT OF MAN-MACHINE SIMULATION TECHNIQUES, Report No. P-1945, The RAND Corporation, Santa Monica, Calif., March 1960. Geisler, M. A., A FIRST EXPERIMENT IN LOGISTICS SYSTEM SIMULATION, Report No. P-1415, The RAND Corporation, Santa Monica, Calif., March 1960. Geisler, M. A., INTEGRATION OF MODELING AND SIMULATION IN ORGANIZATIONAL STUDIES, Report No. P-163[, The RAND Corporation, Santa Monica, Calif., March 1959. Geisler, M. A., LOGISTICS RESEARCH AND MANAGEMENT SCIENCE, Report No. P-1868, The RAND Corporation, Santa Monica, Calif., December 1959. Geisler, M. A., MAN-MACHINE SIMULATION PROGRESS, Report No. P-2086, The RAND Corporation, Santa Monica, Calif., August 1960. Geisler, M. A., THE SIMULATION OF A LARGE-SCALE MILITARY ACTIVITY, Report No. P-1555, The RAND Corporation, Santa Monica, California, March 1959. Geisler, M. A., SIMULATION TECHNIQUES, Report No. P-1808, The RAND Corporation, Santa Monica, Calif., September 1959. Geisler, M. A. and W. A. Steger, THE USE OF SIMULATION IN THE DESIGN OF AN OPERATIONAL CONTROL SYSTEM, Report No. P-1986, The RAND Corporation, Santa Monica, Calif., May 1960. Geisler, M. A., THE USE OF MAN-MACHINE SIMULATION IN THE DESIGN OF CONTROL SYSTEMS, Report No. P-1780, The RAND Corporation, Santa Monica, Calif., August 1959. Geisler, M. A., THE USE OF MAN-MACHINE SIMULATION FOR SUPPORT PLANNING, Report No. P-1823, The RAND Corporation, Santa Monica, Calif., October 1959. Haythorn, W.
W., SIMULATION IN RAND'S LOGISTICS SYSTEMS LABORATORY, Report No. P-1075, The RAND Corporation, Santa Monica, Calif., April 1957. Haythorn, W. W., SIMULATION IN RAND'S LOGISTICS SYSTEMS LABORATORY PROBLEM 1, Report No. P-1456, The RAND Corporation, Santa Monica, Calif., September 1958. Haythorn, W. W., THE USE OF SIMULATION IN LOGISTICS POLICY RESEARCH, Report No. P-1791, The RAND Corporation, Santa Monica, Calif., September 1959. 139

Haythorn, W. W., THE USE OF SIMULATION IN ESTIMATING INTRASQUADRON LOGISTICS REQUIREMENTS: A DESCRIPTION OF LP-II, PHASE 1.1, Report No. P-1656, The RAND Corporation, Santa Monica, Calif., February 1959. Jones, W. M., M. B. Shapiro and N. Z. Shapiro, THE FLIGHT OPERATIONS PLANNER, Report No. RM-2415, The RAND Corporation, Santa Monica, Calif., July 1959. Labiner, K. H. and J. D. Tupac, EXPERIENCE IN THE USE OF A SIMULATION LABORATORY IN THE DESIGN OF A MANAGEMENT INFORMATION SYSTEM, Report No. P-2115, The RAND Corporation, Santa Monica, Calif., October 1960. McGlothlin, W. H., et al., THE SIMULATED AIRCRAFT AND ITS FAILURE MODEL IN LP-1, Report No. RM-2177, The RAND Corporation, Santa Monica, Calif., May 1958. McGlothlin, W. H., THE SIMULATION LABORATORY AS A DEVELOPMENTAL TOOL, Report No. P-1454, The RAND Corporation, Santa Monica, Calif., August 1958. Mendershausen, H., PUBLICATIONS OF THE LOGISTICS DEPARTMENT, Report No. RM-2155-2, The RAND Corporation, Santa Monica, Calif., April 1958. Nelson, H. Wayne, and J. W. Petersen, THE DESIGN AND OBJECTIVES OF LP-III, Report B-127, The RAND Corporation, Santa Monica, Calif., November 1959. Pickrel, E. W., SUPPORT RESOURCES, Report No. P-1981, The RAND Corporation, Santa Monica, Calif., May 1960. Rauner, R. M., LABORATORY EVALUATION OF SUPPLY AND PROCUREMENT POLICIES, THE FIRST EXPERIMENT OF THE LOGISTICS SYSTEMS LABORATORY, Report No. R-323, The RAND Corporation, Santa Monica, Calif., July 1958. Rauner, R. M. and W. A. Steger, SIMULATION AND LONG-RANGE PLANNING FOR RESOURCE ALLOCATION, Report No. P-2223-1, The RAND Corporation, Santa Monica, Calif., February 1961. Schamberg, R., GENERALIZED ANALYSIS OF AERIAL CAMPAIGNS AGAINST STRATEGIC TARGETS, Report No. RA-1558, The RAND Corporation, Santa Monica, Calif., December 1953, CONFIDENTIAL. SECOND TOOLING-UP EXERCISE OF LOGISTICS SYSTEMS LABORATORY (JANUARY-FEBRUARY 1957), Report No. RM-1961, The RAND Corporation, Santa Monica, Calif., August 1957. Sweetland, A. and W. W.
Haythorn, AN ANALYSIS OF THE DECISION-MAKING FUNCTIONS OF A SIMULATED AIR DEFENSE DIRECTION CENTER, Report No. P-1988, The RAND Corporation, Santa Monica, Calif., September 1960. Winestone, R. L., COST AND PERFORMANCE DATA FROM LP-I: THE FIRST EXPERIMENT IN SIMULATION BY THE LOGISTICS SYSTEMS LABORATORY, Report No. RM-2200, The RAND Corporation, Santa Monica, Calif., July 1958. 140

RAMO-WOOLDRIDGE Bromberg, R., et al., PROTECTION OF BALLISTIC MISSILES FROM ENEMY COUNTERMEASURES, Report No. GM-TR-155, The Ramo-Wooldridge Corporation, Los Angeles, Calif., March 1957, SECRET. Moravec, A. F., AUTOMATIC ARTILLERY FIRE PLANNING -- A SYSTEMS ENGINEERING APPROACH, Ramo-Wooldridge, Inc., Canoga Park, Calif., July 1960. Moravec, A. F., METHODS DESIGN FOR DEVELOPING AN AUTOMATIC DATA PROCESSING SYSTEM, Ramo-Wooldridge, Inc., Canoga Park, Calif., March 1960. STANFORD RESEARCH INSTITUTE Dixon, H. L., D. G. Haney, and P. S. Jones, A SYSTEM ANALYSIS OF THE EFFECTS OF NUCLEAR ATTACK ON RAILROAD TRANSPORTATION IN THE CONTINENTAL UNITED STATES, Stanford Research Institute, Menlo Park, Calif., April 1960. Shaeffer, K. H., A METHODOLOGY FOR MODELING IN GENERAL SYSTEM STUDIES, Stanford Research Institute, Menlo Park, Calif., August-September 1960. Shapero, A. and C. Bates, Jr., A METHOD FOR PERFORMING HUMAN ENGINEERING ANALYSIS OF WEAPON SYSTEMS, Report No. WADC Technical Report 59-784, Stanford Research Institute, Menlo Park, Calif., September 1959. Singleton, R. C., SELECTED MATHEMATICAL MODELS RELATED TO ARMY COMBAT UNIT LOGISTICS PROBLEMS FOR THE FORWARD AREA, Stanford Research Institute, Menlo Park, Calif., January 1960. Thayer, S. B. and W. W. Shaner, THE EFFECTS OF NUCLEAR ATTACKS ON THE PETROLEUM INDUSTRY, Stanford Research Institute, Menlo Park, Calif., July 1960. SYSTEM DEVELOPMENT CORPORATION Bellman, W., WARM THREAT SCRIPT INPUT PROGRAM (STS) DESCRIPTION, Report No. FN-3786, System Development Corporation, Santa Monica, Calif., June 1960. Totschek, R., MATHEMATICAL SPECIFICATIONS FOR THE SAGE EVALUATION PROGRAM (SEV) FOR THE WEAPONS ASSIGNMENT RESEARCH MODEL (WARM), Report No. FN-3936, System Development Corporation, Santa Monica, Calif., July 1960. Totschek, R., THE MODIFIED HUNGARIAN METHOD, Report No. FN-3317, System Development Corporation, Santa Monica, Calif., March 1960.
Totschek, R., OPERATIONAL SPECIFICATIONS FOR THE SAGE EVALUATION PROGRAM (SEV) FOR THE WEAPONS ASSIGNMENT RESEARCH MODEL (WARM), Report No. FN-3935, System Development Corporation, Santa Monica, Calif., July 1960. 141

Totschek, R., A WEAPONS ASSIGNMENT EVALUATION MODEL, Report No. FN-2676, System Development Corporation, Santa Monica, Calif., December 1959, CONFIDENTIAL. Totschek, R., WEAPONS ASSIGNMENT LINEAR PROGRAMMING (WALP) MANUAL OF OPERATION, Report No. TM-454, System Development Corporation, Santa Monica, Calif., January 1960. TECHNICAL OPERATIONS INC. Batten, D., R. Dion, and M. Neary, CONCEPT PAPER, MISSILE BATTLE MODEL II, Report No. TO-B 60-54, Technical Operations, Inc., Burlington, Mass., November 1960. Behnke, R. B., 433L SYSTEM SIMULATION MODIFICATION REVISED WEATHER MODEL, Report No. TO-B 61-433-3, Technical Operations, Inc., Burlington, Mass., March 1961. CARD FORMAT MANUAL, VOLUME I: AIR BATTLE MODEL II AND PLAN CONVERTER I, Report No. TR 59-4, Technical Operations, Inc., Washington, D. C., January 1960. CARD FORMAT MANUAL, VOLUME II: AIR BATTLE MODEL II AND PLAN CONVERTER I, Report No. TR 59-4, Technical Operations, Inc., Washington, D. C., January 1960. Cheatham, T. E., A PROGRAMMING SYSTEM FOR INFORMATION PROCESSING PROBLEMS, Report No. CL-1, Technical Operations, Inc., Burlington, Mass. DETAIL FLOW CHARTS, VOLUME I: AIR BATTLE MODEL II, Report No. TR 59-3, Technical Operations, Inc., Washington, D. C., March 1960. DETAIL FLOW CHARTS, VOLUME II: PLAN CONVERTER-I, Report No. TR 59-3, Technical Operations, Inc., Washington, D. C., January 1960. Dion, R., et al., AN INTERIM REPORT ON THE MISSILE BATTLE MODEL, Report No. TO-B 60-5, Technical Operations, Inc., Burlington, Mass., February 1960. Dion, R., MISSILE BATTLE MODEL CARD FORMAT MANUAL, Report No. TO-B 60-12, Technical Operations, Inc., Burlington, Mass., July 1960. Dion, R., MISSILE BATTLE MODEL CARD FORMAT MANUAL, Report No. TO-B 60-12, Revision B, Technical Operations, Inc., Burlington, Mass., May 1960. Fain, C. G. and D. Schimelfenyg, DESCRIPTION OF THE 433L SYSTEMS SIMULATION (With Input Specifications), Report No. TO-B 61-433-18, Technical Operations, Inc., Burlington, Mass., January 1961. Friedman, J.
and M. Lautzenheiser, USERS AND OPERATORS MANUAL, VOLUME III: OUTPUT, Report No. TR 59-1, Technical Operations, Inc., Washington, D. C., January 1960. GLOSSARY, Report No. TR 59-5, Technical Operations, Inc., Washington, D. C., February 1960. 142

Greenstone, R., SIMULATION CONCEPTS AND AIR WARFARE, Report No. SM 58-3, Technical Operations, Inc., Washington, D. C., February 1958, CONFIDENTIAL.
GROSS FLOW CHARTS, VOLUME I: AIR BATTLE MODEL II, Report No. TR 59-2, Technical Operations, Inc., Washington, D. C., April 1960.
A GUIDE TO THE AIR BATTLE MODEL II, CARD FORMAT MANUAL, VOLUME III: OUTPUT, Report No. TR 59-4, Technical Operations, Inc., Washington, D. C., November 1959.
Jacoby, J. E. and S. Harrison, EFFICIENT EXPERIMENTATION WITH SIMULATION MODELS, Report No. TR 60-2, Technical Operations, Inc., Washington, D. C., June 1960.
Lautzenheiser, M. and J. Friedman, DETAIL AND GROSS FLOW CHARTS, VOLUME III: OUTPUT, Report No. TR 59-3, Technical Operations, Inc., Washington, D. C., February 1960.
OPERATING INSTRUCTIONS (MAGNETIC TAPE SYSTEM) SUPPLEMENT TO "USERS AND OPERATORS MANUAL" (TR 59-1, VOLUME I), Report No. TR 60-3, Technical Operations, Inc., Washington, D. C., September 1960.
Peters, C. B., GROSS FLOW CHARTS, VOLUME II: PLAN CONVERTER I, Report No. TR 59-2, Technical Operations, Inc., Washington, D. C., February 1960.
Peters, Carol, A GUIDE TO THE SIMULATION OF PLANS IN THE AIR BATTLE MODEL, Report No. SM 57-2, Technical Operations, Inc., Washington, D. C., December 1957.
Peters, C. B., USERS AND OPERATORS MANUAL, PLAN CONVERTER IA, Report No. TR 60-4, Technical Operations, Inc., Washington, D. C., December 1960.
Peters, C. B., USERS AND OPERATORS MANUAL, VOLUME II: PLAN CONVERTER-1, Report No. TR 59-1, Technical Operations, Inc., Washington, D. C., May 1960.
Peters, C. B., THE TECHNICAL MANUAL FOR THE AIR BATTLE MODEL, Report No. TM 58-4, Technical Operations, Inc., Washington, D. C., August 1958, CONFIDENTIAL.
Petree, B. R., OUTPUT IA, A REVISED OUTPUT PROGRAM FOR THE AIR BATTLE MODEL II, Report No. TR 60-1, Technical Operations, Inc., Washington, D. C., August 1960.
Pond, J., CONCEPT FOR ATTRITION IN STAGE, Report No. CP 60-4, Technical Operations, Inc., Washington, D. C., June 1960.
USERS AND OPERATORS MANUAL, VOLUME I: AIR BATTLE MODEL II, Report No. TR 59-1, Technical Operations, Inc., Washington, D. C., June 1960.

WEAPONS SYSTEMS EVALUATION GROUP

Everett, H. and G. E. Pugh, SIMPLE FORMULAS FOR CALCULATING THE DISTRIBUTION AND EFFECTS OF FALLOUT IN LARGE NUCLEAR WEAPON CAMPAIGNS (WITH APPLICATIONS), Report No. WSEG-RM-5, Weapons Systems Evaluation Group, Washington, D. C., January 1958.

Pugh, G. E., REVISION OF FALLOUT PARAMETERS FOR LOW-YIELD DETONATIONS, Report No. RM-10, Suppl., Weapons Systems Evaluation Group, Washington, D. C.
Pugh, G. E. and R. J. Galiano, AN ANALYTICAL MODEL OF CLOSE-IN DEPOSITION OF FALLOUT FOR USE IN OPERATIONAL-TYPE STUDIES, Report No. WSEG-RM-10, Weapons Systems Evaluation Group, Washington, D. C., October 1959.

WRIGHT AIR DEVELOPMENT CENTER

ATTRITION OF HIGH-ALTITUDE HIGH-SPEED AIRCRAFT PENETRATING EUROPEAN SOVIET-BLOC DEFENSES WITHOUT PENETRATION AIDS, Report No. AD 319899, Wright Air Development Division, Ohio, September 1960, SECRET.
Robinson, A. C., SIMULATION STUDY OF SIDEWINDER CAPABILITIES, Report No. WADC-TN-56-401, Wright Air Development Center, Wright-Patterson Air Force Base, Ohio, May 1958, SECRET.

THE UNIVERSITY OF MICHIGAN

ANALYSIS OF RADIO INTERFERENCE, Report No. 4095-1-T, The University of Michigan, Ann Arbor, Michigan, October 1960.
Appel, K., et al., MODEL OF A TYPE AIR DEFENSE SYSTEM, Report No. 2354-13-R, Willow Run Laboratories, Ann Arbor, Michigan, July 1959.
Bacon, F. R., R. E. Cline, and P. C. Pedrick, AN INVESTIGATION OF THE PHASE-IN AND PHASEOUT PROBLEMS FOR CONTRACT FIELD TECHNICIANS (U), Report No. 2796-43-T, Willow Run Laboratories, Ann Arbor, Michigan, July 1960, CONFIDENTIAL.
Cline, R. E., INTERIM TECHNICAL REPORT, Report No. 3681-10-L, Institute of Science and Technology, The University of Michigan, Ann Arbor, Michigan, December 1960.
Doss, H., et al., A DESCRIPTION OF COORDINATE CONVERSION AND TRANSFORMATION IN THE SAGE AND MISSILE MASTER SYSTEMS, VOLUME I, Report No. 2354-30-T, The University of Michigan, Ann Arbor, Michigan, January 1961.
Doss, H. W., A DESCRIPTION OF COORDINATE CONVERSION AND TRANSFORMATION IN THE SAGE AND MISSILE MASTER SYSTEMS (U), VOLUME II, Report No. 2354-31-T, Institute of Science and Technology, Ann Arbor, Michigan, March 1961, CONFIDENTIAL.
THE EVALUATION PROGRAM FOR THE AN/FSG-1 ANTIAIRCRAFT DEFENSE SYSTEM, VOLUME I, PRELIMINARY TEST PLAN FOR THE AN/FSG-1 FIELD TEST, Report No. 2354-8-T, Engineering Research Institute, The University of Michigan, Ann Arbor, Michigan, October 1957.
THE EVALUATION PROGRAM FOR THE AN/FSG-1 ANTIAIRCRAFT DEFENSE SYSTEM, VOLUME II, THE AN/FSG-1 SYSTEM FIELD TEST, Report No. 2354-9-T, Willow Run Laboratories, The University of Michigan, Ann Arbor, Michigan, December 1958, CONFIDENTIAL.
Hoagbin, J. E., et al., THE EVALUATION PROGRAM FOR THE AN/FSG-1 ANTIAIRCRAFT DEFENSE SYSTEM (U), VOLUME III, Report No. 2354-14-S, Willow Run Laboratories, The University of Michigan, Ann Arbor, Michigan, July 1959, SECRET.

Miller, J. M., THE EVALUATION PROGRAM FOR THE AN/FSG-1 ANTIAIRCRAFT DEFENSE SYSTEM, VOLUME IV, MISSILE MASTER MODEL, Report No. 2354-29-T, Institute of Science and Technology, The University of Michigan, Ann Arbor, Michigan, January 1961, CONFIDENTIAL.
Lewkowicz, R. E. and J. M. Miller, THE EVALUATION PROGRAM FOR THE AN/FSG-1 ANTIAIRCRAFT DEFENSE SYSTEM (U), VOLUME V, ATABE-MISSILE MASTER MODEL (U), Report No. 2354-35-T, Willow Run Laboratories, The University of Michigan, Ann Arbor, Michigan, October 1960, SECRET.
Johnson, M. S., REPORT OF PROJECT MICHIGAN, MILITARY LEADERSHIP AND WAR GAMING, Report No. 2144-249-S, Engineering Research Institute, The University of Michigan, Ann Arbor, Michigan, February 1958.
PROCEEDINGS OF THE SYMPOSIUM ON DIGITAL SIMULATION TECHNIQUES FOR PREDICTING THE PERFORMANCE OF LARGE-SCALE SYSTEMS (U), 23, 24, 25 MAY 1960, Report No. 2354-33-X, Willow Run Laboratories, The University of Michigan, Ann Arbor, Michigan, June 1960, CONFIDENTIAL.
PROCEEDINGS, THIRD WAR GAMES SYMPOSIUM (U), 6, 7, 8 OCTOBER 1960, ANN ARBOR, MICHIGAN, Report No. 36943-18-X, Willow Run Laboratories, The University of Michigan, Ann Arbor, Michigan, November 1960, SECRET.
SYMPOSIUM ON PREDICTION OF PERFORMANCE OF LARGE-SCALE SYSTEMS, 3, 4, 5 SEPTEMBER 1958, VOLUME I, Report No. 2354-11-S, Willow Run Laboratories, The University of Michigan, Ann Arbor, Michigan, January 1959.

MISCELLANEOUS PUBLICATIONS

Adams, R. and J. Jenkins, "Simulation of Air Operations with the Air-Battle Model", OPERATIONS RESEARCH, Vol. 8, No. 5 (1960).
Barlow, R. and L. Hunter, "Optimum Preventive Maintenance Policies", OPERATIONS RESEARCH, January-February 1960, Vol. 8, No. 1, pp. 90-100.
Blachman, N. and F. Proschan, "Optimum Search for Objects Having Unknown Arrival Times", OPERATIONS RESEARCH, September-October 1959, Vol. 7, No. 5, pp. 625-638.
Black, G. and F. Proschan, "On Optimal Redundancy", OPERATIONS RESEARCH, September-October 1959, Vol. 7, No. 5, pp. 581-588.
Weiss, H. K., "Lanchester-type Models of Warfare", PROCEEDINGS OF THE FIRST INTERNATIONAL CONFERENCE ON OPERATIONAL RESEARCH, September 1957.
Weiss, H. K., "Some Differential Games of Tactical Interest and the Value of a Supporting Weapon System", OPERATIONS RESEARCH, March-April 1959, Vol. 7, No. 2, pp. 180-196.

APPENDIX E: SUMMARY OF CONFERENCE ON WEAPON SYSTEM EVALUATION TECHNIQUES

Perhaps the most interesting session of the conference was devoted to five-minute summaries by attendees of the modeling and simulation work of their respective organizations. These summaries provided general background information for much of the subsequent discussion. A number of specific points were discussed relative to the technology and use of models. These were as follows:

(1) Computer simulation is only one of a number of tools employed as techniques for weapon system evaluation. (This is probably the only point on which there was unanimous agreement.)

(2) It was generally agreed that efficient construction of a model requires careful planning, and that in this planning both the uses of the model and the output required should be considered. Moreover, a number of attendees noted that the time elapsed from the planning stage to the completion of a modeling program was exceedingly long and that, when completed, the outputs of the model often did not satisfy the user's requirements.

(3) Models are frequently constructed for educational purposes. For such purposes it may be more appropriate to construct game-type simulations in which the people to be educated can participate.

(4) In large models, some law of large numbers may make extensive replication unnecessary. This is particularly true where saturated cases are being studied; it was asserted by a number of attendees that saturation in large models produces damping effects on one or more variables. One technique suggested for reducing the number of replications required was to examine only the values of the parameters which describe the worst condition.
A heuristic justification for this procedure is that the military is interested in winning the next war, not merely an average war; similarly, in a non-military application, an air traffic control system is needed which will work well under the worst conditions, not only on the average.

(5) Complete validation of a model is probably never possible. It is possible, however, to increase the level of confidence in some models. Since information concerning a real-life situation, if known, would itself be represented in some form of model, validation may ultimately reduce to comparisons of models. Thus, it was generally agreed that if results obtained using two conceptually different models agree, the model builder has increased confidence in both. Such comparisons can serve as a partial validation of models. For example, results obtained with a computer model may be compared with information collected in a field test. In this discussion a number of ideas were set forth regarding such comparisons. Specifically, it was suggested that one might profitably begin by attempting to model the field tests rather than a hypothetical real-life situation. Moreover, it was observed that business games, employed in conjunction with data collected concerning an organization, would provide an excellent area in which the concepts of testing and evaluating a computer simulation model against a model of a real-life situation can be examined.

(6) Although not a point of unanimous agreement, the opinion was expressed that a greater interchange of information among modelers would be beneficial. Such an interchange can be facilitated by better documentation of programs and the establishment of standards for preparing reports. Further, it was observed that some interchange of information about unsuccessful applications in modeling and simulation would also be helpful. If such a program were established, the development of a modular approach to programming, so that subroutines from various programs can be interchanged, would also be beneficial. It was observed, however, that major difficulties are to be encountered in attempting to facilitate this interchange. These difficulties result from security regulations and proprietary interests among military services or among commercial organizations, and from the general philosophy that the model builder does not have time to worry about documentation.

LIST OF REFERENCES

1. A COMPREHENSIVE BIBLIOGRAPHY ON OPERATIONS RESEARCH, Through 1956 with Supplement for 1957, Publications in Operations Research No. 4, Case Institute of Technology, Cleveland, Ohio.
2. Shubik, M., BIBLIOGRAPHY ON SIMULATION, GAMING, ARTIFICIAL INTELLIGENCE AND ALLIED TOPICS, American Statistical Association Journal, December 1960.
3. AN EVALUATION OF THE AN/GSG-2 ANTIAIRCRAFT DEFENSE SYSTEM WITH NIKE, Volume I - The Model Program, Report No. 2354-5-T, The University of Michigan, Ann Arbor, Michigan, August 1957, CONFIDENTIAL.
4. Batten, D., R. Dion, and M. Neary, CONCEPT PAPER, MISSILE BATTLE MODEL II, Report No. TO-B 60-54, Technical Operations, Inc., Burlington, Massachusetts, November 1960.
