THE UNIVERSITY OF MICHIGAN COMPUTING RESEARCH LABORATORY1

SYMPOSIUM ON COMPUTER APPLICATIONS TO CARDIOLOGY: INTRODUCTION
and
AUTOMATED ELECTROCARDIOGRAPHY AND ARRHYTHMIA MONITORING

Janice M. Jenkins

CRL-TR-20-83

APRIL 1983

Room 1079, East Engineering Building
Ann Arbor, Michigan 48109 USA
Tel: (313) 763-8000

1Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of the funding agency. This report also appeared in Progress in Cardiovascular Diseases, Vol. XXV, No. 5 (March/April), 1983.

ABSTRACT

This paper informs us that the electrocardiogram was the first physiologic signal to be processed and analyzed by digital computer, a technique which dates back to 1957. Early attempts at computer-derived measurement of the waveforms progressed to pattern recognition methods for discrimination and classification, and eventually to diagnostic interpretation of the electrocardiogram (ECG). A historical review is presented of the evolutionary stages of computer interpretation of the diagnostic ECG, and the reader is acquainted with various methods employed for signal processing, techniques for waveform and contour analysis, and algorithms for diagnostic interpretation. Commercial systems which evolved from those developed in research settings are described and distinctive features are pointed out. Computer-based arrhythmia monitors for coronary care were a natural outgrowth of the interpretive system, but ineffective P-wave recognition continues to be the major limiting factor in rhythm analysis. The development of a miniaturized esophageal electrode for computer detection of P-waves provided a dramatically improved registration of atrial activity and enabled for the first time automated analysis of complex arrhythmias. The two-lead system which recognizes P-waves on the esophageal signal and QRS complexes on the surface lead, and computes PP, RR, and PR intervals, is described in Section V of the paper. Initial attempts at evaluation of the diagnostic ECG have been reported, but no consistent scheme has emerged which can be applied in general for comparative purposes. Proposals for the development of a test library with clinical documentation from non-ECG sources have been advanced, but the complexity of the task has served to inhibit its implementation. A promising move in this direction is the creation of an annotated data base for arrhythmia monitors developed under the auspices of the American Heart Association for purposes of testing automated detection systems. The introduction of computers into clinical electrocardiography has not resulted in any widespread improvement of diagnostic accuracy or dramatically altered the delivery of medical care, but the technique of computer analysis and storage of large numbers of ECGs within a single system provides a powerful tool for epidemiologic studies and possible early detection of cardiovascular disease. Technologic advances include remote acquisition and telephonic transmission of ECGs to a central computer facility and advanced electrocardiographs with internally contained microprocessors which instantly deliver a diagnostic report at the patient site.

Progress in Cardiovascular Diseases, Vol. XXV, No. 5, March/April 1983

Symposium on Computer Applications to Cardiology

Introduction

Janice M. Jenkins, Ph.D., Guest Editor

From the Department of Electrical and Computer Engineering, The University of Michigan, Ann Arbor, Mich. Reprint requests should be addressed to Janice M. Jenkins, Ph.D., Department of Electrical and Computer Engineering, The University of Michigan, Ann Arbor, Mich. 48109. © 1982 by Grune & Stratton, Inc. 0033-0620/83/2505-0001$01.00/0

THE IMPACT of computers in the field of medicine has been a dramatic one. The computer has emerged as a major technical tool in the acquisition, analysis, interpretation, and storage of medical data both in the research laboratory and in the clinical setting. Cardiology represents a discipline where some of the earliest computer applications occurred, particularly in the area of electrocardiography. With the advent of integrated circuit technology, computing power has expanded dramatically as the physical size of computers has shrunk. Microprocessors no larger than a postage stamp are an integral part of many medical devices, and larger computer systems are interfaced to a variety of clinical instruments. As the use of these computerized devices increases, the physician will find it essential to know and understand their operation. This symposium examines eight areas in cardiology in which the computer has become a significant feature, ranging from electrocardiography and cardiovascular imaging through drug infusion. The paper, "Automated Electrocardiography and Arrhythmia Monitoring," apprises us that the electrocardiogram was the first physiologic signal to be processed and analyzed by digital computer. The technique dates back to 1957, when Hubert Pipberger developed a system for automatic recognition of electrocardiographic waves by computer. Early attempts at computer-derived measurement of the waveforms progressed to pattern recognition methods for discrimination and classification, and eventually to diagnostic interpretation of the electrocardiogram (ECG). In the first section a historical review is presented of the evolutionary stages of computer interpretation of the diagnostic ECG. The reader is acquainted with various methods employed for signal processing, techniques for waveform and contour analysis, and algorithms for diagnostic interpretation. Commercial systems which evolved from those developed in research settings are described and distinctive features are pointed out. The article presents a chronicle of early attempts at rhythm analysis (Section III) and the continuing difficulties associated with this problem. Pattern recognition and measurement techniques applied with success to analysis of the QRS complex perform poorly in P-wave detection due to the small amplitude of the signal and its low frequency content. Computer-based arrhythmia monitors for coronary care were a natural outgrowth of the interpretive system, but ineffective P-wave recognition continues to be the major limiting factor in rhythm analysis. The development of a miniaturized esophageal electrode for computer detection of P-waves

provided a dramatically improved registration of atrial activity and enabled for the first time automated analysis of complex arrhythmias. The two-lead system which recognizes P-waves on the esophageal signal and QRS complexes on the surface lead, and computes PP, RR, and PR intervals, is described in Section V of the paper. Computer methods for ambulatory monitoring, high-speed playback and analysis are discussed in Section VI. The problems associated with evaluation and testing of computer ECG systems remain a major consideration and are presented in Section VII. Initial attempts at evaluation of the diagnostic ECG have been reported, but no consistent scheme has emerged which can be applied in general for comparative purposes. Proposals for the development of a test library with clinical documentation from non-ECG sources have been advanced, but the complexity of the task has served to inhibit its implementation. A promising move in this direction is the creation of an annotated data base for arrhythmia monitors developed under the auspices of the American Heart Association for purposes of testing automated detection systems. The introduction of computers into clinical electrocardiography has not resulted in any widespread improvement of diagnostic accuracy or dramatically altered the delivery of medical care, but the technique of computer analysis and storage of large numbers of ECGs within a single system provides a powerful tool for epidemiologic studies and possible early detection of cardiovascular disease. Technologic advances include remote acquisition and telephonic transmission of ECGs to a central computer facility and advanced electrocardiographs with internally contained microprocessors which instantly deliver a diagnostic report at the patient site. One paper in our series deals with mapping of the electrical activation of the epicardial and endocardial surfaces of the heart, a procedure which has become an important technique in clinical cardiology and also in basic research. There have been striking advances in the use of mapping in the surgical treatment of drug-resistant arrhythmias, and mapping during surgery has provided increased knowledge of cardiac activation patterns during sinus rhythm, artificial pacing, and spontaneous arrhythmias. The authors, Smith and Ideker, review the computer techniques which have arisen and contributed to the advancement of mapping technology. Early traditional methods with limited data acquisition and analysis capabilities have been replaced by modern systems for recording, manipulation, and data display. Their paper, "Computer Techniques for Epicardial and Endocardial Mapping," covers all aspects of the mapping operation from the application of electrodes to the graphical presentation of results. Two major phases of cardiac mapping are discussed: data acquisition and data analysis. The authors describe both computer and noncomputer methods for data acquisition, but point out that computer methods are essential for analysis purposes because of the vast quantity of data produced by the mapping technique. Another review in this symposium presents "Cardiac Catheterization and Angiographic Analysis Computer Applications." From the middle 1960s efforts have been made to utilize computer systems for the analysis of data from cardiac catheterization.
This paper presents an overview of work done in this area by numerous investigators and describes in detail the features of a computerized cardiac catheterization system developed at the University of Alabama in Birmingham Medical Center. The system has been in clinical use since 1973 and currently supports two adult catheterization laboratories. Four analog signals (three hemodynamic pressure measurements and the electrocardiogram) can be monitored simultaneously. The completely automated system computes all hemodynamic and electrocardiographic parameters, determines measurement of cardiac output by the thermodilution technique, stores calculated data, and provides reports of the procedure to the physician. An additional program which is not part of the routine pressure analysis system is capable of computing isovolumic indices of muscle function. These algorithms are described by Zissermann et al. in Section II of the paper. In Section III, an angiographic analysis system is described in which on-line data acquisition and quantification of left ventricular (LV)

dimensions is computer-controlled. Instrumentation in the angiographic data laboratory for digitizing LV silhouettes and computer processing techniques for quality control and report generation are presented in detail. Stroke volume and ejection fraction measurements are determined by computer, and results of volumetric analysis and segmental wall motion analysis are provided in the form of a clinical report. A paper in this series on Computer Applications in Cardiology describes the "Construction and Interpretation of Body Surface Maps." The authors, Barr and Spach, examine the procedure of deriving maps of cardiac electrophysiologic activity at the body surface. Such maps, which portray the variation in space of potentials generated by the heart, are constructed for a single instant in time, and a sequence of such maps depicts the spatial distribution during the cardiac cycle. The fact that spatially distributed sources within the heart produce corresponding surface maps instant by instant is the basis for the interpretation of surface maps. The principal advantage of examining potential variation in space, rather than in time, is that the relationship between the spatially distributed electrical sources within the heart and potential distributions produced on the body surface for any instant in time can be considered independent of the relationship that exists at any other instant in time. In particular, there is a more or less direct relationship between the spatial distribution of the cardiac sources and the observed spatial distribution of potentials on the body surface. The authors describe the measurements necessary for a body surface map and discuss the problems of establishing baseline voltages for each of the multiple electrodes, and of precise time alignment between leads. For better visualization of the voltage pattern, maps are generally presented in the form of contour patterns with the lines on the map corresponding to isopotentials. Barr and Spach utilize 24 measurements to construct a map and demonstrate that this number of measurements is sufficient to determine the entire map within a tolerance close to the noise level. The relative accuracy of maps constructed from 150 versus 24 measurements is determined both by theoretical and operational methods. Interpretation of surface maps by evaluation of the features of the waveform relative to past experience has not become widely used due to the fact that it is more difficult technically to obtain maps; thus only small numbers are available for analysis and the precise format varies from place to place. Interpretation of maps can be accomplished by mathematical estimation. Epicardial potentials computed inversely from body surface potentials are compared to measured epicardial surface maps in dog studies, and the inverse maps appear to be sufficiently good approximations to the measured maps. The authors conclude that the widespread use of mapping as a clinical or research tool will await development of standard commercial devices for map processing, so that potential users are not required to begin by taking on the construction of a surface mapping device. They predict the advent of such systems and a continued increase in the use of mapping procedures. One review in this symposium is entitled "Computerized Cardiovascular Imaging," authored by Bulawa and Meyers.
The application of computer technology to image analysis in cardiology is surveyed and current techniques are described for automated extraction of data from imaging modalities common in cardiovascular laboratories. The derivation of physiologically relevant information from an image through computer processing can be described as a two-step process: object characterization and parameter extraction. The purpose of object characterization is to retrieve from the image those features which are needed for subsequent analysis. Bulawa and Meyers discuss digital methods for image enhancement, computer-determined object characterization, and calculation and display of object functional parameters. Computer applications to radiological studies for cardiac volume and ejection fraction determination, left ventricular wall motion analysis, and arteriography are presented in detail. Digital image processing is emerging as an increasingly important factor in the evolution of radiological image acquisition and analysis and it has become an integral element in the technologies that are being developed and explored. The digital image offers an alternative to film as the

primary recording medium for radiography, and interactive image processing methods can be applied which make real time analysis a reasonable prospect for the near future. The paper written by Jelliffe presents a review of "Computer-Controlled Administration of Cardiovascular Drugs." The author concentrates on those aspects of cardiovascular drug administration which actually have employed a computer in the process of planning, monitoring, or adjusting the delivery of the pharmaceutical agent. Two general types of application have developed: one which achieves automatic control of a system in a data-rich situation, and another which operates in a data-poor situation whereby the response has been sampled only infrequently. In data-rich situations, computer-controlled infusion strategies have been developed for control of blood pressure, cardiac arrhythmias, and blood sugar. Pharmacokinetic models have been advanced which determine the desired concentration of drug and are used in simulation of closed-loop adaptive control systems for drug delivery. Open-loop (nonfeedback) procedures are almost entirely dependent upon pharmacokinetic models, and one of the earliest clinical applications of such a method was for digitalis therapy. These concepts have been incorporated into a computer program for developing loading and maintenance dosage regimens of glycosides adjusted to body weight and renal function. The program has been in use since 1973 by community hospitals over internationally accessible time-sharing facilities, and results have illustrated the utility of applying pharmacokinetic concepts in clinical situations in a quantitative manner. A computer-assisted regimen for achieving and maintaining therapeutic serum levels of lidocaine was developed which overcame the therapeutic hiatus of low concentrations during the first 10 minutes of therapy typically seen with conventional regimens. Applications of feedback procedures in open-loop control of drug therapy, in which the patient's clinical response is compared to the model's computed body concentrations and therapeutic adjustments made accordingly, resulted in more accurately achieved serum levels than those achieved by physicians armed with similar pharmacokinetic knowledge. The development of optimal strategies for monitoring serum concentration to yield the greatest accuracy in the computation of each patient's pharmacokinetic values is described by Jelliffe. However, he points out that closed-loop strategies consider only how to use information that has already been obtained and make no provision for the fact that future data will also be obtained. A truly optimal closed-loop strategy would compute an optimal balance between the past information already available and that to be obtained from other data in the future. This promises to be the next stage in computer-controlled drug infusion. The paper in this symposium which examines the role of "Computing in Echocardiography" provides insight into difficulties and limitations surrounding computer interpretation of ultrasound data. The author, Gibson, describes the predictable adoption by echocardiographers of image enhancement techniques applied earlier to other modalities, but reminds us that mechanisms underlying image degradation must be clearly understood before algorithms can be developed to reverse them.
Greyscale manipulation, smoothing, and integration of information from several beats are methods which have been borrowed from other imaging systems and applied to echocardiographic images. The first application of computers in the echocardiographic field was the analysis of M-mode records in which routine measurements were made automatically, and later, analysis was done of cardiac motion throughout the cardiac cycle. Although this information was the same as that available from direct manual measurements, automation increased the speed and convenience with which it could be extracted. Two-dimensional echocardiography with its comprehensive display of left ventricular cavity size, shape, and wall motion was a logical candidate for the next stage in the application of computer techniques. The promise of computer-derived quantitative measurements of ventricular volume, wall thickness, and regional wall motion suggested an appealing new feature that could be added to an already valuable clinical tool. Early techniques required the manual outlining by an operator of endo- and epicardial

surfaces for subsequent computer digitization and measurement, but more recent methods employ automated methods of boundary recognition. Two distinct problems exist with this technique: the location of a pictorial boundary on the image, and determining the relation of this pictorial boundary to an anatomic one. The latter problem is much more complex since the relation between the two depends closely on ill-understood mechanisms underlying ultrasound image formation; thus even the best edge detection is imprecise. The ability to gain complex information about cardiac anatomy and motion requires a corresponding complexity in the methods used for its display. Gibson turns to the works of early military cartographers and a nineteenth century geographer for solutions to the problem of display of three-dimensional information. The writer suggests the most satisfactory means of demonstrating complex anatomy is likely to be a stereoscopic approach, citing a number of most compelling images still extant in photographic atlases of anatomy produced at the turn of the century. Other display methods are described as well. A major section of the work is dedicated to the theme of tissue characterization, which refers to the possibility of using echocardiography to gain information about tissues beyond simply their position and motion. Unlike other fields in which diagnostic ultrasound is used, relatively little effort has been made to characterize the ultrasonic properties of cardiac tissues themselves. Nevertheless, the idea has proved a stimulating one to cardiologists in view of its possible clinical significance. Gibson suggests that the detection of collagen, the relationship of ultrasonic attenuation to myocardial ischemia, and textural analysis for identification of myocardial infarction comprise new areas which might be fruitful to explore. The final paper in this symposium is entitled "Computer Analysis of Cardiac Radionuclide Data," and treats the relatively young specialty of nuclear cardiology. The authors, Froelich, Thrall, et al., have chosen to review selected areas of nuclear cardiology which are in the process of greatest evolution and assuming more important roles. These areas require the application of quantitative techniques to the evaluation of nuclear cardiologic data and are thus dependent upon computational resources and developments. The first application of computer technology to radionuclide ventriculography was acquisition of gated end-diastolic and end-systolic frames into computer memory a frame at a time. Manual tracing of ventricular contours was done, after which area-length formulas were used for calculating volumes and ejection fraction. Newer computer methods bypass the more laborious geometric calculations and derive ejection fraction from the net ventricular count rate of the intravascular tracer. This automated procedure results in more reproducible and more accurate measurements than hand analysis and has served to move the technique from the laboratory into the clinical setting. Other important breakthroughs in nuclear cardiology have been the development of new computer data acquisition techniques for gated list mode ventriculography and multigated frame mode ventriculography. Following a brief historical review, Froelich presents a complete description of equipment required for nuclear cardiology procedures and discusses various hardware components which can be interfaced to a gamma camera.
Functional descriptions are given of current software techniques found in commercial systems, and a section is devoted to a description of Fourier phase analysis as a method for demonstrating subtle regional differences in cardiac contractility. Image processing methods employed in quantitative myocardial perfusion are reported and promising results as well as limitations are presented. The final section deals with tomographic imaging in which serial slices through the long axis of the heart are obtained from multiple projections, thus providing a means of three-dimensional reconstruction and measurement of the distribution of radionuclides in the body in a full three-dimensional fashion. Tomographic imaging provides a means of determining not only ventricular chamber volumes but also the size of infarcted or ischemic tissue. With the refinement of tomographic imaging and the

development of new radiopharmaceuticals it may soon be possible to measure regional myocardial blood flow and metabolic rates noninvasively. These reviews provide the clinician with some insight into those areas where computers have played a significant role. Commercial systems are described and research systems still under development are presented. Each of the authors has provided an extensive bibliography which should serve the interested reader with a comprehensive road map should the pathway prove inviting.

Automated Electrocardiography and Arrhythmia Monitoring

Janice M. Jenkins

From the Department of Electrical and Computer Engineering, The University of Michigan, Ann Arbor. Reprint requests should be addressed to Janice M. Jenkins, Department of Electrical and Computer Engineering, The University of Michigan, Ann Arbor, MI 48109. © 1982 by Grune & Stratton, Inc. 0033-0620/83/2505-0002$05.00/0

COMPUTER PROCESSING of the electrocardiographic signal (ECG) began over two decades ago when Dr. Hubert V. Pipberger undertook a project for the Veterans Administration in which he employed a digital computer for the automated detection of ECG waveforms. The original attempt at automation merely delineated and measured the P, QRS, and T waves, detecting the onset and termination of each wave, and measured intervals between waves. Contour analysis of the waveforms followed and, as this technique became more highly developed, Pipberger and others began to apply decision tree logic to the results in order to arrive at a specific diagnostic interpretation. At a later stage, second generation programs were designed that employed statistical methods for diagnosis. Clinical implementation of computerized electrocardiography occurred in the early 1970s and has continued to develop at a rapid rate. Computerized electrocardiography falls into two broad categories: computer-assisted interpretation of the diagnostic ECG and computer monitoring of cardiac arrhythmias. In the first category, pattern recognition techniques are applied to an ECG signal that has been previously acquired and stored in a digital computer for examination at length. In arrhythmia monitoring the dynamic ECG signal is analyzed on-line, as in coronary intensive care monitoring, or long-term recordings are processed at speeds faster than real time, as in Holter analysis. In both categories, a feature extraction stage detects waveforms, determines boundaries, examines morphology, computes amplitudes and durations, and measures interwave intervals. A contour-analysis stage applies clinical criteria to these measurements to arrive at a diagnostic classification, and a contextual string is then examined for rhythm analysis. This paper will describe the data acquisition and signal processing techniques that have been applied to computer-assisted electrocardiography since its advent in 1957. Methods for contour analysis and interval measurement will be described, and the application of diagnostic criteria to these results will be examined. Rhythm analysis and serial comparison, which are undergoing further development, will be discussed. A historical review of the evolutionary stages of computerized electrocardiography will be presented, leading to a discussion of the present state of the art and future trends. Rhythm analysis represents a particularly difficult aspect of computer interpretation. Because the QRS complex is the most easily detected waveform of the ECG, QRS morphology and RR interval measurements constitute the major features for rhythm determination in both the computer-assisted diagnostic ECG and computerized arrhythmia monitoring. Logic exists in most systems for P-wave information to be incorporated into the rhythm decision, but the frequent failure of P-wave detection represents a serious flaw in the accuracy of rhythm interpretation. New techniques for reliable P-wave measurements, such as more optimally located electrodes, particularly those which hold promise for improved arrhythmia classification, will be presented. The problem of testing and evaluation of existing ECG systems continues to present difficulties and will be examined in light of recommendations that have been advanced.
A library of tape recorded arrhythmias has been collected, diagnosed, and annotated by experts to serve as an instrument for the assessment of rhythm monitoring systems, but, to date, no such data exist for testing diagnostic systems.

COMPUTER INTERPRETATION OF THE DIAGNOSTIC ECG

Lead Systems

The heart beat is an electrical process; that is, currents flowing within the heart are the cause of the beat itself.1 This electrical impulse is initiated in the sinoatrial (S-A) node, spreads through the atria, and coincidentally produces an atrial contraction. The impulse after traversing

the atrial chambers reaches the atrioventricular (A-V) node, which constitutes a pathway between the electrically insulated upper and lower chambers. The A-V node temporarily inhibits the impulse, after which it is rapidly transmitted via the His-Purkinje network to all regions of the ventricles. The global depolarization of the ventricles causes a synchronous muscular contraction that propels blood into the arteries of the body. These macroscopic events result from ionic currents operating microscopically at the cellular level.2 The electrocardiogram is a recording or graphical representation of the electrical activity of the heart. The electric current distributions within the human body that result from the spontaneous depolarization of the heart can be detected by sensors located at various positions on or within the body. Electrocardiography had its birth in 1903 when Einthoven, a Dutch physiologist, developed a special string galvanometer for recording minute variations of current or electric potential. He employed three leads in his electrocardiographic investigations, and required that the galvanometer be connected such that the string be deflected upwards when the "base of the heart was negative with respect to the apex." There are two fundamental lead systems employed for conventional electrocardiography as well as for computer processing: (1) the standard 12-lead configuration consisting of the Einthoven limb leads (I, II, and III), the augmented leads (aVR, aVL, and aVF), and the precordial leads (V1 through V6); and (2) the 3-lead orthogonal sets such as the Frank leads (X, Y, and Z) or corrected sets (McFee or Schmitt).4,5 The classical electrocardiogram is plotted sequentially with vertical deflections representing the potential difference between electrodes, and a horizontal axis that represents time. The deflections that are registered on the electrocardiogram are each associated with a particular electrical event in the heart, and the overall tracing provides a wealth of diagnostic clues to cardiac structural and functional abnormalities. During the two decades of computer-assisted electrocardiography, numerous investigators have studied the accuracy of the 12-lead versus the 3-lead sets.6 In general it was concluded that neither the 12-lead nor 3-lead set demonstrated a significant improvement of performance over the other. The orthogonal system, given that it contains the same information as the 12-lead system, offers an advantage for computer analysis in that it effects data reduction by a factor of 4:1.9 Nevertheless, at present, systems that employ 12 leads constitute 95% of all computer-assisted ECGs, while 3-lead systems comprise less than 4%.11 This is probably due to the familiarity most physicians have with the 12-lead set.

Signal Processing

The standard direct-writing electrocardiograph is the primary and most widely used instrument in electrocardiography. The device consists of multiple electrode connections to an amplifier that provides a signal to drive a strip chart recorder or hot stylus recorder. Electrocardiographs are generally one channel or three channel with switching mechanisms for selecting a particular lead or lead set. To comply with American Heart Association recommendations,11 an electrocardiograph must have a frequency response of 0.05 Hz to 100 Hz (3 dB down) in order to insure accurate reproduction of clinically used measurements such as wave amplitudes and durations.
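To make the bandwidth requirement concrete, the sketch below band-limits a digitized ECG to the recommended 0.05-100 Hz passband. It is an illustrative addition, not part of the original text: the sampling rate, filter order, and choice of a Butterworth design are assumptions.

```python
# Illustrative sketch only: band-limiting a digitized ECG to the
# AHA-recommended 0.05-100 Hz passband. The 500-Hz sampling rate and
# second-order Butterworth design are assumptions, not from the paper.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 500.0  # assumed sampling rate, Hz

def aha_bandpass(ecg, fs=FS, order=2):
    """Zero-phase Butterworth band-pass with 0.05- and 100-Hz corner
    frequencies (the 3 dB points named in the text)."""
    nyq = fs / 2.0
    b, a = butter(order, [0.05 / nyq, 100.0 / nyq], btype="band")
    return filtfilt(b, a, ecg)  # forward-backward filtering, no phase shift

# Example: one second of a synthetic 1 mV signal with additive noise.
t = np.arange(0, 1.0, 1.0 / FS)
raw = np.sin(2 * np.pi * 1.3 * t) + 0.05 * np.random.randn(t.size)
filtered = aha_bandpass(raw)
```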
Data acquisition methods for computer-processed ECGs include the electrocardiographic instrument as a first stage. The analog signal is detected and amplified, generally with a gain factor of 1000. This transforms the electrocardiographic signal, which falls in the 1 to 10 mV range, into a 1 to 10 V range for further processing. A variety of storage media have appeared throughout the years for automated electrocardiography. Early methods included manually determined measurements keypunched into computer cards, storage of the entire ECG in analog form on magnetic tape or on magnetic strips bonded to computer cards, microfilm storage on computer aperture cards, and the most common method in present use, storage in digital form on magnetic disks. The availability of disk packs that contain three million bytes of storage with access times of nanoseconds has made instant retrieval and serial comparison of multiple ECGs a reasonable task. Since it is not feasible to have all electrocardiograms recorded in a location that is adjacent

to the computer facility, the second stage of the system generally consists of magnetic tape recording of the data or telephone transmission to a central computer. The analog signal is frequency modulated and, in the case of telephone transmission, three channels of data are multiplexed onto the standardized carrier frequencies of 1075, 1935, and 2365 Hz. This analog transmission over voice-grade lines can be plagued by noise that badly distorts the original signal. (The signal-to-noise level is about 40 dB.) Even in optimal FM transmission the bandwidth of the transmitted signal is limited to 100 Hz due to the frequency response of the electrocardiographic recording device. Digital transmission has frequently been advanced as an alternative to analog transmission. While this provides an improved signal-to-noise ratio (about 52 dB), bandwidth limitations of voice-grade lines have made widespread use of digital transmission of electrocardiograms unfeasible at present. At 12-bit conversion precision and typical 500-Hz sampling rates, each 1-sec segment of 3-channel ECG data would constitute 18,000 bits of data. At baud rates of 2400, which are common, a 1-sec segment would require over 7.5 sec to transmit. At 9600 baud this would be reduced to less than 2 sec but still cannot be achieved in real time. New techniques are emerging that hold promise for rapid development in the direction of improved digital transmission. These include schemes for data compression before transmission and dedicated transmission lines and transmission links with broader bandwidth capabilities. Prior to any computer processing or analysis, the analog signal must be converted to a digital representation by an analog-to-digital converter. Techniques for direct digital data acquisition are being developed,12 but at present the analog signal is presented via magnetic tape or telephone transmission to the remote computer site. The electrocardiographic signal (a voltage varying as a function of time) is converted at uniform, preselected time intervals into discrete numerical values representing the magnitude of the signal at each sampling point. There are two components that determine whether the original signal can be accurately represented and reconstructed: the sampling rate and the number of quantizing levels. Typical analog-to-digital (A/D) converters range from 8-bit to 12-bit precision. If the dynamic range of the input signal is 10 mV, an 8-bit A/D converter would have a precision (or maximum quantization error) of 80 μV with respect to electrode potential. A 10-bit converter provides a precision of 20 μV, and a 12-bit converter, a precision of 5 μV. While the major information content of the signal falls within a 0-100 Hz bandwidth, high frequency components are sometimes present. Sampling rates in all of the current computerized ECG systems range from 100 Hz to 1000 Hz. Although a 250-Hz sampling rate may be adequate for an electrocardiographic signal that has already been bandlimited by the electrocardiograph or the limitations of telephone transmission, the Nyquist sampling theorem holds that a sampling rate at least twice the highest frequency content in the signal is necessary for accurate reproduction.
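The quantization and transmission figures quoted above follow from simple arithmetic, reproduced in the sketch below. The 20 mV converter span (a plus-or-minus 10 mV input range) is an assumption chosen so that the computed step sizes agree with the values in the text.

```python
# Illustrative arithmetic for the quantization and transmission figures
# quoted in the text. The 20 mV converter span (+/-10 mV input range) is
# an assumption made so the step sizes match the stated values.
SPAN_UV = 20_000.0  # assumed A/D input span in microvolts

for bits in (8, 10, 12):
    step_uv = SPAN_UV / 2**bits
    print(f"{bits:2d}-bit A/D: quantization step about {step_uv:.0f} uV")

bits_per_sec = 12 * 500 * 3  # 12-bit samples, 500 Hz, 3 channels = 18,000
for baud in (2400, 9600):
    print(f"1 sec of ECG at {baud} baud takes {bits_per_sec / baud:.2f} sec")
```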
The American Heart Association recommendations11 for digitally sampled data are as follows: "As a minimum requirement, reconstruction of the original electrocardiographic waveform with a fidelity comparable to that of a direct writer can be accomplished with equal interval sampling of 500 per second, digitized with a precision of 10 microvolts, referred to electrode potential." Thus an 11-bit A/D converter and a 500 Hz sample rate should be provided as a minimal configuration for accurate computer processing of the electrocardiogram.

Waveform Detection

The beginning and end of a significant waveform of the electrocardiogram (P, QRS, T) can be defined numerically by the rate of voltage change. This rate of voltage change can be expressed by first differences between consecutive ECG data points.13 If, for instance, input data are sampled at 500 points per second (a temporal resolution of 2 msec), then the rate of voltage change can be expressed in terms of digital conversion units. Suppose an analog-to-digital converter has 12-bit precision with an input range of 10 mV. In this case the electrocardiogram can span 4096 conversion units with a resultant resolution of 5 μV. Requiring a voltage change that exceeds a preset threshold (specified in conversion units) is a method commonly employed for locating waveforms within the ECG.
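A minimal sketch of this thresholding method follows. The threshold and refractory values are illustrative assumptions and are not taken from any of the systems reviewed here; at 500 samples per second, a refractory period of 100 samples corresponds to 200 msec.

```python
import numpy as np

def detect_waveform_onsets(samples, threshold=40, refractory=100):
    """Flag candidate waveform (chiefly QRS) locations where the first
    difference exceeds a preset threshold, expressed in A/D conversion
    units per sample. The refractory period, in samples, suppresses
    repeated detections within a single complex. Both parameter values
    are illustrative assumptions. `samples` is a 1-D NumPy array of
    conversion units."""
    first_diff = np.abs(np.diff(samples.astype(float)))
    onsets, last = [], -refractory
    for i, d in enumerate(first_diff):
        if d > threshold and i - last >= refractory:
            onsets.append(i)
            last = i
    return onsets
```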

Obviously the threshold in conversion units is directly related to sampling rate and A/D precision. The QRS complex is relatively simple to detect by this method, while P and T waves, because of their lower amplitude and slower rate of change, are more difficult to recognize. Figure 1 shows a geometrical representation of a first difference, and Fig. 2 demonstrates the type of measurements that can be obtained through the application of digital differentiation to the electrocardiographic signal.

Fig. 1. Geometrical representation of the first difference: the slope of the curve is approximated by (y(t+Δt) - y(t)) / Δt. (Reproduced by permission from Charles C. Thomas Publishers.13)

Fig. 2. Examples of points that can be detected on the electrocardiographic signal through the application of digital differentiation: point of steepest positive slope, point of steepest negative slope, maximum, minimum, corner, and zero crossing. (Reproduced by permission from Charles C. Thomas Publishers.13)

Pattern Recognition

All computer programs for diagnosis of electrocardiograms contain two major stages: a waveform recognition section which, after detection of significant waves, measures the amplitudes and durations of complexes and durations of intervals; and a diagnostic section that classifies the ECG into normal or various disease or arrhythmia states.14 There are two general approaches to pattern recognition of the ECG waveforms.13 In the first, a cardiologist or group of cardiologists determines the desired pattern of amplitudes and durations that represent normal and abnormal states, and then associates specific combinations of these measurements with certain diagnostic statements. In the second approach, pattern matching is established by mathematical techniques such as cross correlation, or by fitting a mathematical expression such as a Fourier series to the ECG waveform.

Diagnostic Interpretation

After the initial waveform detection, pattern recognition, and measurement algorithms have been applied, the diagnostic stage is entered. There are two major strategies employed to arrive at the diagnostic interpretation: decision tree logic (a deterministic approach), or maximum likelihood (a statistical approach). In the first scheme, measurements fall within or without certain ranges, and Boolean combinations of each of these results determine whether the criteria for a certain diagnostic state are met. As an example, a QRS deflection exceeding a certain value in a V lead might elicit a diagnosis of left ventricular hypertrophy. In the second method, Bayesian statistical techniques are applied, and the outcome, or diagnostic statement, has a probability associated with it. The diagnosis with the highest probability is selected, and this is determined not only by the indices of the electrocardiographic measurements, but by the prior probability of the condition existing within the population under observation. The results are highly dependent upon a priori probabilities; thus a large population that accurately represents the incidence of disease states is required in order for a priori probabilities to be determined. The Bayesian approach is sometimes combined with decision tree logic for final classification.
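The two strategies can be contrasted schematically as below. The voltage criterion, class statistics, and prior probabilities are all invented placeholders used only to show the shape of each approach; they are not clinical criteria.

```python
import math

# Deterministic approach: Boolean criteria over measurement ranges.
def decision_tree(meas):
    # Hypothetical voltage criterion for left ventricular hypertrophy (LVH).
    if meas["r_amplitude_mv"] > 2.6:
        return "LVH"
    return "normal"

# Statistical approach: maximize prior probability times likelihood,
# here with one Gaussian-distributed measurement per class.
CLASSES = {  # class: (prior, mean R amplitude in mV, standard deviation)
    "normal": (0.90, 1.4, 0.5),
    "LVH": (0.10, 2.9, 0.7),
}

def bayes(meas):
    x = meas["r_amplitude_mv"]
    def log_posterior(cls):
        prior, mu, sd = CLASSES[cls]
        return math.log(prior) - math.log(sd) - 0.5 * ((x - mu) / sd) ** 2
    return max(CLASSES, key=log_posterior)

print(decision_tree({"r_amplitude_mv": 2.8}))  # -> LVH
print(bayes({"r_amplitude_mv": 2.8}))          # priors also influence this
```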

Historical Review

In 1957 the electrocardiogram was chosen for a pilot study in automatic processing of medical data because of its widespread use as a diagnostic aid.9 A system developed by Pipberger at the Veterans Administration Hospital, Washington, D.C., was capable of automatic recognition of electrocardiographic waves by digital computer. The original system for sampling and converting ECG data into digital form for entry into a computer was developed specifically for this project.15,16 The computer program was capable of accurate determination of the beginning and end of P waves, QRS complexes, and the end of T waves. The Frank orthogonal lead system was employed for the majority of the ECGs and the Schmitt SVEC III lead system for the remainder. Technically poor records were deliberately selected for processing from a tape recorded electrocardiogram library containing 2500 cases because it was felt that an automatic wave recognition program should be tested with tracings such as might be encountered under unfavorable clinical conditions. A total of 395 electrocardiograms were analyzed by digitizing the signal at 1000 Hz, applying a digital filter with a high frequency cutoff of 60 Hz, and computing the spatial velocity. It was found that spatial velocities exceeding 3 μV per msec occurred only in the significant waveforms: P, QRS, and T. Computation time for the entire wave recognition program averaged 15 sec per record. The beginning and end of each electrocardiographic wave was identified and durations of waves and intervals were determined. Failures in measurement were encountered only in cases with cardiac arrhythmias. The program represented a major advance in computer technology and served as a model for many programs which followed. Caceres17 began work on computer analysis of electrocardiograms in 1959 at the Medical Systems Development Laboratory in Washington, D.C. The intention was to demonstrate the feasibility of a computer program to extract clinically useful measurements of electrocardiographic parameters. The selection of the ECG for computer analysis was based on the availability of a backlog of electrocardiographic data on subjects known to be normal or abnormal, thus providing the capability of statistical analysis of results. The initial system utilized tape recorded data that was analog-to-digital (A/D) converted at 625 Hz. The data were converted to punched cards for input to the computer. Thirty-six leads were analyzed and each lead took 64 min of computer time to process. Parameters selected for measurement were P, Q, R, S, T, and U waves and the PQ, ST, QT, and RR intervals. Three characteristics were determined: amplitude, duration, and slope. The system was later rewritten in numerous languages and modified to run on a variety of computers. The version eventually distributed by the U.S. Public Health Service was known as ECAN (ECg ANalysis program) and employed the 12 classical leads.
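The spatial velocity measure lends itself to a compact illustration. The sketch below assumes orthogonal-lead samples expressed in microvolts at the 1000-Hz digitizing rate mentioned above; the 3 μV/msec figure is the threshold reported in the text.

```python
import numpy as np

def spatial_velocity(x, y, z, fs=1000.0):
    """Spatial velocity of an orthogonal-lead (XYZ) ECG in uV/msec: the
    magnitude of the rate of change of the heart vector. The leads are
    assumed to be sampled in microvolts at fs Hz."""
    dt_ms = 1000.0 / fs
    dx, dy, dz = (np.diff(lead) / dt_ms for lead in (x, y, z))
    return np.sqrt(dx**2 + dy**2 + dz**2)

# Samples where the velocity exceeds about 3 uV/msec would then be taken
# to lie within a significant waveform (P, QRS, or T).
```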
These two early attempts at computer measurement of electrocardiographic waveforms provided the basis for a second stage in which pattern recognition techniques were applied to the results of the waveform detection; these pattern recognition techniques were combined with the employment of diagnostic criteria in order to arrive at an electrocardiographic interpretation. The first successful program to interpret an ECG diagnostically was an outgrowth of the early Pipberger work."8 In 1966 another computer program was reported that was capable of diagnosing the electrocardiogram.19 Decision tree logic applied to measurements of waveform amplitude and duration produced reports of right and left ventricular hypertrophy, bundle branch block, intraventricular conduction disturbance, and posterior and anterior myocardial infarction. Unlike the Pipberger and Caceres programs, the ECG measurements were done manually by technicians and keypunched onto IBM cards (four cards per ECG) for computer input. The 3-yr project to write a workable program, test its validity, and establish its usefulness in a private nonuniversity hospital was undertaken to demonstrate that computer-assisted interpretation

could be valuable especially to the noncardiologist physician. In a total of 4469 electrocardiograms processed by the system, all but 19 were in agreement with the cardiologist who overread them. At the same time, a hybrid computer system for both measurement and interpretation of electrocardiograms appeared.20 Threshold detectors were employed by an analog editor to detect P, QRS, and T waves. A template representing a typical heart cycle was generated to serve as a standard with which to compare successive heart cycles. For each heart cycle, a binary word (match word) was produced, which described how well the heart cycle matched the template. Figure 3 shows a schematized version of an electrocardiographic passage and the associated template. The numbered points in the representation of the template are the significant points of the waveform that are recognized and flagged by the analog circuit.

Fig. 3. Schematized version of an electrocardiographic passage and the associated template of a normal beat. The numbered points on the template are the significant points of the waveform which are recognized and flagged by an analog circuit. (Reproduced by permission from the Annals of the New York Academy of Sciences.29)

The program was divided into two parts: contour and arrhythmia. The contour interpretations were "electrical" rather than clinical in nature, and rhythm analysis posed problems because of inaccuracy of waveform measurements, particularly P waves. But the system advanced computerized electrocardiography dramatically and served as a predictor of things to come. During this early period, while numerous systems emerged using decision tree logic for analysis, Klingeman and Pipberger turned to statistical classification techniques for the assignment of electrocardiograms into various diagnostic categories.21 An initial study in which ECGs were classified into normal and left ventricular hypertrophy (LVH) types was reported in 1967. Four statistical methods were applied to measurements taken from orthogonal electrocardiographic record samples. Amplitude measurements of various selected points were summed, and vector differences in three-dimensional space were calculated both with and without weight factors. A class-separating transformation was tested as well. These 4 statistical techniques were applied to 3 sample groups containing 100 ECGs from normal subjects and 100 from patients with clinically documented left ventricular hypertrophy (LVH). Best results were obtained with weighted vector differences based on 8 amplitude measurements (84% correct classification). The class-separating procedure produced an 80% result, while the method of summed amplitudes led to 67% separation. Table 1 shows results of separating normal (N) from left ventricular hypertrophy (LVH) for each of the four procedures tested. These early results demonstrated the practicality of utilizing the computer for the automatic measurement of numerous ECG amplitudes and the application of complex statistical procedures for data analysis. Table 2 summarizes the systems that were in use in the late 1960s and some of the characteristics of each.
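Classification by weighted vector differences (procedure 3 in Table 1) reduces to a nearest-mean rule, sketched below. The class means and weights are invented placeholders; the actual eight measurement points and weight factors are not reproduced in this text.

```python
import numpy as np

# Sketch of classification by weighted vector differences. All numbers
# are invented placeholders; the paper's actual measurement points and
# weight factors are not given here.
rng = np.random.default_rng(0)
MEAN_NORMAL = rng.normal(1.0, 0.3, 8)             # hypothetical class mean (N)
MEAN_LVH = MEAN_NORMAL + rng.normal(1.0, 0.3, 8)  # hypothetical class mean
WEIGHTS = np.ones(8)                              # per-measurement weights

def classify(amplitudes):
    """Assign a record of 8 amplitude measurements to the class whose
    weighted mean vector lies nearer in feature space."""
    d_normal = np.sum(WEIGHTS * (amplitudes - MEAN_NORMAL) ** 2)
    d_lvh = np.sum(WEIGHTS * (amplitudes - MEAN_LVH) ** 2)
    return "N" if d_normal < d_lvh else "LVH"

print(classify(MEAN_LVH + rng.normal(0.0, 0.1, 8)))  # -> LVH
```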

Table 1. Classification of ECG Records Based on Record Samples

                                 Correctly Classified (%)    Estimate of Equal Error
Classification Procedure*           N           LVH          Classification (%)

Experiment 1 (100 N and 100 LVH records)
    1                               74          75            74.5
    2                               85          76            80.5
    3                               93          78            85.5
    4                               83          78            82

Experiment 2 (new samples of 100 N and 100 LVH records)
    1                               69          59            64
    2                               86          69            77.5
    3                               90          73            81.5
    4                               89          69            79

Experiment 3 (new sample of 100 N records, LVH sample of experiment 2 retained)
    1                               89          59            74
    2                               88          69            78.5
    3                               95          73            84
    4                               86          69            77.5

Average recognition rates derived from experiments 1 to 3
    1                               77.3        64.3          70.8
    2                               86.3        71.3          78.8
    3                               92.7        74.7          83.6
    4                               87          72            79.5

Evaluation of four statistical methods applied to measurements taken from orthogonal electrocardiograms for the purpose of separating normal subjects from those with left ventricular hypertrophy. *Procedure 1 = sum of amplitude measurements; procedure 2 = vector differences; procedure 3 = weighted vector differences; procedure 4 = class-separating transformations. The weight factors for the vector differences were determined on the basis of all records. The matrices for procedure 4 were obtained from the samples of experiment 1. For further details see text. Adapted and reproduced by permission from Computers and Biomedical Research.21

Table 2. A Summary of the Characteristics of the Early Systems for Computer Detection and Analysis of Electrocardiograms

Pipberger
    Analog data recorded on FM tape
    Analog-to-digital conversion at 1000 Hz
    15 sec ECG, XYZ leads
    IBM 704 computer
    Spatial velocity to detect waveforms

Caceres
    Analog data recorded on FM tape
    Analog-to-digital conversion at 625 Hz
    Data stored on punched cards
    Control Data 160A computer and DEC PDP8
    64 min to process one lead

Staples
    Human measurements of waveforms
    Decision logic for RVH, LVH, BBB, ICD, IRBBB, CRBBB, CIBBB, PMI, AMI, T-wave changes
    IBM 1441 computer
    4469 ECGs processed

Wortzman
    Analog data recorded on FM tape
    Sampled only at points of significant changes in ECG
    IBM 1401 computer
    Low pass filtering at 60 Hz

Two separate groups of investigators, each collaborating with International Business Machines (IBM), began serious development of computerized electrocardiography in the late 1960s. Bonner and Schwetman from the Advanced Systems Development Division at IBM worked with Pordy at Mt. Sinai Hospital, New York City, on the development of a 12-lead computer system.22-25 The ECG signal was sampled at 400 Hz by an A/D converter with 10-bit resolution. An analog preprocessor employed noise rejection features and flagged points of exceptional interest. Digital filtering was applied to each flagged point using 20 adjacent points before and after the point of interest. The filtered value of the flagged point was compared to the unfiltered amplitude. If it fell within 0.025 mV of the original point it was retained; if not, the original value was retained. The rationale was that any point that was not greatly affected by filtering was assumed to be in a region of low frequency. If there was a large discrepancy between the filtered and unfiltered value, it was assumed that the point fell within a region of high frequency, i.e., QRS, and filtering was not justified. The waveform was divided into a series of overlapping segments which were defined as starting or ending whenever the slope-difference changed sign. The high slope parts of the waveform were taken to be reliable indicators for identification of the QRS complex.
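The selective filtering rule can be written compactly; the sketch below is an interpretation under stated assumptions, since the exact digital filter used by the preprocessor is not specified in the text (a simple moving average stands in for it).

```python
import numpy as np

def selectively_filter(signal_mv, flagged, half_width=20, tol_mv=0.025):
    """Sketch of the rule described above: smooth each flagged point over
    its 20 neighbors on either side, and keep the smoothed value only when
    it stays within 0.025 mV of the original (a low-frequency region);
    otherwise keep the raw value (a high-frequency region such as QRS).
    The moving-average kernel is an assumption."""
    out = signal_mv.copy()
    for i in flagged:
        lo = max(0, i - half_width)
        hi = min(len(signal_mv), i + half_width + 1)
        smoothed = float(np.mean(signal_mv[lo:hi]))
        if abs(smoothed - signal_mv[i]) <= tol_mv:
            out[i] = smoothed
    return out
```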
The frequency distribution of segment slope-differences for each lead was found and a threshold applied to produce a slope constant (8 converter units per millisecond). A segment with a slope-difference greater than this was considered to represent a QRS complex. Since the highest slope on aberrant QRS complexes might be markedly lower than that of predominant QRSs, bizarre beats were often overlooked.

Additional logic to avoid this was employed, which lowered the threshold to include aberrant beats. This threshold caused peaked T waves to be detected as well; therefore, logic was included that detected the proximity of a new waveform to the preceding wave and permitted rejection of T waves. The widths of QRS complexes were determined by a complicated logic, after which all beats were compared. For each beat found to match a certain type, points were added to that type's rating, and the type with the highest rating was considered to represent the dominant rhythm. All QRS complexes not identical to the dominant type were remeasured to correct possible measurement errors. This procedure was important so that arrhythmia analysis would not be in error. Each segment in the interval between a pair of QRS complexes was given a P or T rating depending upon weighting factors applied to slope, amplitude, and duration. The segments with the largest P and T ratings were candidates for P and T waves. If these segments coincided, the P rating was compared to the T rating and the larger value was the determinant. In all, a total of 288 measurements were extracted from the 12 leads and stored in a measurement matrix for further analysis. The arrhythmia analysis program examined the information from the measurement matrix, i.e., a table of the time of onset and termination of each complex and its type. For arrhythmia analysis, the information was organized as shown in Fig. 4. The widths, positions in time, and types of all the waves were known in addition to their measurements. As the first step in the determination of rhythm, a P-train test was performed in which a search was made for a series of regularly spaced P waves. The representative interval of a group of P waves was used as the hypothetical P spacing in order to search for possible P waves buried in T or QRS complexes. If no dominant interval could be found, it was judged that there was no discernible P train. Each interval flanked by two adjacent QRS complexes was considered to be a unit, and was classified by the left-hand-side and right-hand-side QRS types [first QRS type (Q1), second QRS type (Q2), etc.] and the interval between. Units which contained only the dominant QRS type were used to determine the dominant rhythm. Each lead was examined separately for rhythm. Two kinds of basic rhythms were presumed possible: type A, which one would expect to find exhibited in many leads (such as a sinus rhythm); and type B, which could be expected to appear in only a few leads (such as the Wenckebach phenomenon or atrial flutter). A sum was accumulated for each lead in which a type A diagnosis was seen, with an additional four points added for a diagnosis derived from the rhythm strip. The rhythm with the largest total, provided it exceeded six, was selected. If a type B rhythm was found in more than two leads, that diagnosis would supersede the type A diagnosis unless the type A diagnosis accumulated 10 or more points. Table 3 shows the rhythms that comprise type A and type B diagnoses. The special statements reflect arrhythmias that are difficult to distinguish; therefore a composite list of possibilities was provided for a cardiologist to review. Results of rhythm analysis from this system were reasonably successful for commonly seen rhythms (sinus rhythm, sinus tachycardia, sinus bradycardia), but not so promising in other arrhythmias.
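The lead-voting logic lends itself to a short sketch. The point values follow the figures quoted above (one point per lead, four for the rhythm strip, a winning total above six, and type B superseding type A seen in more than two leads unless type A reaches ten points); the data structures and the partial type B set are invented for illustration.

```python
# Sketch of the lead-voting rhythm selection described above. Point values
# follow the text; the data structures and this partial type B set are
# invented for illustration.
TYPE_B = {"atrial flutter", "second-degree A-V block"}  # partial list

def select_rhythm(lead_diagnoses, strip_diagnosis=None):
    """lead_diagnoses: one rhythm statement per lead. Each lead adds one
    point to its diagnosis; the rhythm-strip diagnosis adds four more."""
    scores = {}
    for dx in lead_diagnoses:
        scores[dx] = scores.get(dx, 0) + 1
    if strip_diagnosis is not None:
        scores[strip_diagnosis] = scores.get(strip_diagnosis, 0) + 4
    type_a = {d: s for d, s in scores.items() if d not in TYPE_B}
    type_b = {d: s for d, s in scores.items() if d in TYPE_B}
    best_a = max(type_a, key=type_a.get, default=None)
    best_b = max(type_b, key=type_b.get, default=None)
    # A type B rhythm seen in more than two leads supersedes type A,
    # unless the type A diagnosis accumulated 10 or more points.
    if best_b and type_b[best_b] > 2 and (best_a is None or type_a[best_a] < 10):
        return best_b
    if best_a and type_a[best_a] > 6:
        return best_a
    return "no dominant rhythm"

leads = ["normal sinus rhythm"] * 7 + ["atrial flutter"] * 5
print(select_rhythm(leads, strip_diagnosis="normal sinus rhythm"))
```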
The diagnostic errors were seen to arise from the inability of the measurement program to find P waves superimposed on T waves, or in discriminating against noise that might mimic a P wave.

Fig. 4. Passage of an electrocardiogram that has been coded for arrhythmia analysis. Each QRS type is designated Q1, Q2, and so on, where Q1 represents the dominant beat and Q2 an ectopic beat. P and T waves of different morphologies are similarly coded. This example shows atrial fibrillation with a ventricular ectopic beat. (Reproduced by permission from Computers and Biomedical Research.24)

Initial testing25 on 2060 electrocardiograms that were processed over a period of 1 yr showed a 91% success rate in contour analysis.

Table 3. Rhythm Statements from the Bonner Program*

Type A rhythms:
Normal sinus rhythm
Sinus tachycardia
Sinus bradycardia
Atrial fibrillation with A-V block
Atrial fibrillation
Atrial fibrillation with A-V dissociation
Nodal tachycardia
Supraventricular paroxysmal tachycardia
Nodal rhythm
Nodal paroxysmal tachycardia
Nodal paroxysmal tachycardia with I-V block
Ventricular paroxysmal tachycardia
Ventricular tachycardia
Ventricular rhythm
Normal sinus rhythm with the Wolff-Parkinson-White syndrome

Type B rhythms:
Atrial flutter
Atrial flutter with 2:1 response
Atrial flutter with 4:1 response
Atrial flutter with 2:1 to 4:1 response
Atrial flutter with A-V block
Wandering pacemaker between the S-A and A-V node
Third-degree or complete A-V block
Isorhythmic A-V dissociation
Second-degree A-V block

Special statements:
"This tracing could reflect any of the following: ventricular paroxysmal tachycardia; supraventricular tachycardia with I-V block or aberrant ventricular conduction; atrial fibrillation with WPW. Deciding between these alternatives is important but difficult. A cardiologist should be consulted."
"Supraventricular tachycardia. The tachycardia could be due to: paroxysmal supraventricular tachycardia; atrial flutter with regular 2:1 A-V conduction; sinus tachycardia. Consult a cardiologist."

*Two types of rhythms which were recognized by the early Bonner program. See text for details. Reproduced by permission from Computers and Biomedical Research.24

Initial testing25 on 2060 electrocardiograms that were processed over a period of 1 yr showed a 91% success rate in contour analysis. Of the contour statements that were in error, 48% were due to measurement logic and the remainder to logic following the measurement program. Computer analysis of rhythm resulted in an 8% false positive rate (124 of 1731 normals were misclassified as abnormal) and a 9% false negative rate (26 of 329 abnormal rhythms were classified normal). In the case of rhythm abnormalities other than basic rhythm (i.e., ectopic beats), 50% were missed completely. As a screening device to separate normal from abnormal electrocardiograms, the system performance was found to be 91% accurate using both contour and rhythm.

At this same period in the late 1960s, Smith and Hyde26 at Mayo Clinic collaborated with IBM in the development of a computerized ECG system that processed data acquired from three simultaneously recorded orthogonal leads. The lead set used was a modified Frank set in which (1) a neck electrode was used instead of the Frank head electrode, (2) V4 and V6 were used instead of the C and A electrodes specified by Frank, and (3) the patient was supine instead of sitting. The leads were recorded for 8 sec on FM magnetic tape and transmitted by telephone to the central computer (IBM 7040). A/D sampling at a rate of 350 Hz was performed at one-eighth the recording speed and digital filtering applied. A time difference function was derived and a fiducial mark established when the function exceeded a preset threshold for 15 msec. Data from the 250 msec following the fiducial mark were considered to contain the onset, peak, and offset of R, and were stored for further analysis. The region after the R wave was searched for a T wave, and the region from the R back to the previous T wave was examined for P waves and other indications of atrial activity.
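The fiducial-mark logic just described (a time-difference function held above threshold for 15 msec at a 350-Hz sampling rate) can be sketched as follows. This is a minimal illustration assuming a single digitized lead; the threshold value and all names are hypothetical, not those of the Mayo-IBM program.

```python
import numpy as np

FS = 350                    # sampling rate, Hz, as reported for the system
HOLD_MS = 15                # the difference function must stay above threshold this long
HOLD_SAMPLES = int(FS * HOLD_MS / 1000)   # about 5 samples at 350 Hz

def fiducial_marks(ecg, threshold):
    """Return sample indices where the time-difference function first
    exceeds `threshold` and remains above it for HOLD_MS milliseconds."""
    diff = np.abs(np.diff(ecg))       # first-difference approximation of slope
    above = diff > threshold
    marks, run = [], 0
    for i, flag in enumerate(above):
        run = run + 1 if flag else 0
        if run == HOLD_SAMPLES:
            marks.append(i - HOLD_SAMPLES + 1)  # start of the sustained run
            run = 0   # a real detector would also apply a refractory period here
    return marks
```

Data in the 250 msec following each mark would then be taken as the R-wave region, with the T and P searches conducted forward and backward from it, as the text describes.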
Regular rhythm was defined as that having intervals within a 10% tolerance of normal, and the following rhythms were reportedly recognized: sinus, ventricular, nodal, artificial pacemaker, and second- or third-degree block. Irregular rhythms diagnosed by the system were atrial or ventricular premature complexes, sinus arrhythmia, and atrial fibrillation. The system reported an overall agreement with cardiologists in 85% of the tracings, with a 98% agreement in normals and a 14% variation in descriptions of abnormals. At this stage there was serious deliberation about the need for objective analysis of the newly emerging systems for ECG analysis. Caceres17 had proposed a central pooling of all data in order to establish uniform national criteria. Pordy22 discussed the 12-lead system versus the Frank orthogonal system, and concluded that the 12-lead was probably preferable but suggested simultaneous utilization of both. Standards were simply nonexistent, and evaluation of available systems was not possible, since there existed no library of ECG records with a suitable variety of cardiac abnormalities, and no method for testing existing systems against diagnoses established by non-ECG sources.

Another program for automatic interpretation of electrocardiograms utilizing the Frank orthogonal leads27 was developed and implemented at Latter-Day Saints Hospital, Salt Lake City, Utah, and was run routinely on all elective admissions by 1969. Interpretive statements were made concerning only the QRS and ST-T waves and consisted of 11 QRS categories and five ST categories. The diagnoses that were possible are shown in Table 4. The system sampled the X, Y, and Z leads at 200 Hz and used decision logic to arrive at a classification. Preliminary results on 287 ECGs produced a 1% false positive rate (2 out of 195) and a 9% false negative rate (8 out of 92) in the QRS analysis. In the ST analysis the false positives ranked 4% (7 of 197) and false negatives 16% (14 of 89). The underdiagnosis tendency was the result of a deliberate emphasis imposed by the desire of the cardiologists using the program. The criteria applied for the diagnostic logic were admittedly simplistic. Additional development of more sophisticated and appropriate criteria was planned, as was the proposed development of logic for P-wave determination and arrhythmia diagnosis.

Table 4. Diagnostic Statements Made by the Pryor Program*

QRS diagnosis:
Normal
Left bundle branch block
Right bundle branch block
Intraventricular conduction defect
Left ventricular hypertrophy
Right ventricular hypertrophy
Anterior myocardial infarction
Inferior myocardial infarction
Lateral wall myocardial infarction
Left axis deviation
Right axis deviation

ST diagnosis:
Normal
Injury pattern
Subendocardial injury pattern
Ischemia pattern
Digitalis effect pattern

*The Pryor program analyzed electrocardiograms from patients admitted electively at the Latter-Day Saints Hospital, Salt Lake City, 1969. Reproduced by permission from Computers and Biomedical Research.27

By 1969 computer recognition of complex waveform patterns such as those seen in electrocardiography was considered routine. It was recognized that the major task facing further clinical application lay in the area of establishing uniform diagnostic criteria. The difficult pattern recognition problem of arrhythmia diagnosis was as yet unsolved. During this year the Medical Systems Development Laboratory (MSDL) was absorbed into the newly formed National Center for Health Services and Development. The ECG analysis system developed during the previous 10 yr by MSDL under the auspices of the U.S. Public Health Service (Caceres program) was translated by the National Center into computer languages with more universal application and released publicly as ECAN Version D in October 1969.
Table 5. Listing of Certified Programs: Health Care Technology Division EKG Data Pool, EKG Analysis Programs (Version "D") Certified by the Health Care Technology Division

System | Received by | Initial certification date
CDC 160-A | Medical Systems Development Laboratory | 10/1/69
CDC 8090 | Medical Systems Development Laboratory | 10/1/69
CDC 1700 | Control Data Corporation | 11/21/69
Universal FORTRAN IV (minus I/O) | Data Pool Task Force | 12/18/69
Sigma 5/6/7 FORTRAN IV-H | Xerox Data Systems | 1/9/70
IBM 360 (Model 40 and larger) | Beckman Instruments | 8/11/70
Sigma 2/3 FORTRAN IV | Xerox Data Systems | 10/1/70
DEC PDP 8 (pseudo assembly language)* | Berkeley Scientific Laboratories | 11/3/70
DEC PDP 9* | MEDAC | 4/14/71
DEC PDP 8 | Searle Medidata | 5/14/71
DEC PDP 8 | Phone-A-Gram Systems | 10/5/71
IBM 360/50 (DOS) | Touro Infirmary | 11/1/71
IBM 360/30 (DOS) | Space Age Computer Systems | 4/5/72

Adaptations of the ECAN Version D program which were certified by July 1972. Certification verified that the modified versions functioned identically to the original program. *Segmented programs. Reproduced by permission from Academic Press.28

The program was adapted by academic and industrial organizations to execute on a variety of computer systems, and a certification system was established to verify that modified versions functioned identically to the original program. Table 5 lists programs which were certified as of July 1972. Following the absorption of MSDL by the National Center for Health Services and Development, a demonstration project was undertaken in 1969 by the Community Electrocardiographic Interpretative Service (CEIS), Denver, Colorado, to evaluate the ECAN analysis system and to make recommendations about further refinements of diagnostic criteria.28 The ECAN program was deemed to be a dynamic program which would require periodic updating. The 1-yr study revealed complete agreement between electrocardiographer and computer in 42% of the electrocardiograms and some sort of disagreement in 58%. A panel of 35 cardiologists was convened to propose tentative criteria changes designed to improve computer-cardiologist agreement. This procedure was intended to be repeated yearly in order to refine and improve the program on a continuing basis.

In the year 1970, over 200,000 ECGs were commercially processed by computer in the United States. Programs were achieving an accuracy of greater than 90%, and the value of the technique as a screening device for separating normals from abnormals was recognized and accepted. Serial comparison was not feasible in 1970 because the capability of storing a large number of ECGs in digital form on a direct-access device was not yet a reality. The technology existed, but the cost of such storage was prohibitive. Magnetic tape storage was possible, but access times were not reasonable for comparative purposes. A synopsis of programs that were currently available was published in 197029 and excerpts are shown in Table 6.

The new Bonner program that appeared in 1972 was developed by IBM in conjunction with three cardiologists: Crevasse, Ferrer, and Greenfield.30 Changes from the previous Bonner-Schwetman program23-25 included the following features: (1) A sampling rate of 250 Hz was employed in the new version. (2) Five sets of three simultaneously recorded leads were used as input to the program. (3) Digital filtering and data compression (average ratio 4:1) replaced the analog preprocessing of the earlier program. (4) The new preprocessing program selected only those data points necessary to define the waveform and discarded the remainder. (5) The search for P waves took place in the interval from the end of T to the start of QRS. (6) A noise factor was computed for each lead and, since the measurement program operated on three simultaneously recorded leads, the less noisy leads dominated the analysis. (7) All of the waves in the entire record were measured, and the measurements used to represent the normal beat were appropriate averages of these, suitably screened to prevent erroneous results from entering the average. This new version of the IBM (or Bonner) program had a false positive rate of 6.5% and a false negative rate of 1.2% on a test set of 1435 electrocardiograms (427 normal and 1008 abnormal). Overall, the program demonstrated a 97% capability of differentiating normal from abnormal tracings (40 errors in 1435 records). Thus a significant improvement was achieved over the earlier version that reported a 91% success rate.25
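Item (6) above suggests a simple weighting idea: each lead's contribution to a combined measurement can be scaled down by its estimated noise. The sketch below is a hypothetical illustration of that idea, not the IBM program's actual code; the noise estimate (baseline variability outside detected beats) and all names are assumptions.

```python
import numpy as np

def noise_factor(lead, baseline_regions):
    """Estimate lead noise as the standard deviation of the signal
    over regions believed to be free of P-QRS-T activity."""
    samples = np.concatenate([lead[a:b] for a, b in baseline_regions])
    return np.std(samples)

def combine_leads(leads, baseline_regions):
    """Weight three simultaneously recorded leads so that the quieter
    leads dominate any measurement taken on the composite signal."""
    noise = np.array([noise_factor(l, baseline_regions) for l in leads])
    weights = 1.0 / (noise + 1e-9)     # quiet lead -> large weight
    weights /= weights.sum()
    return sum(w * l for w, l in zip(weights, leads))
```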
Table 6. Synopsis of Computer ECG Programs That Were Available for Public Distribution in 1970

Developer | Leads | Digital available | Computer | Sampling rate (Hz)
Caceres | Classical | Yes | CDC 1700, CDC 3200 | 500
Smith | XYZ | Yes | IBM 1800, IBM 360, CDC 1700 | 250
Pordy | Classical/XYZ | Soon | IBM 1800, IBM 360 | 250
Pipberger | XYZ | Yes | CDC 3200 | 1000
Pryor | XYZ | Yes | CDC 3200, CDC 1700 | 200

Adapted and reproduced by permission from Marquette Electronics Technical Publication.29

Rhythm analysis continued to present major difficulties. New logic was applied that evaluated five separate rhythm interpretations, one from each of the five lead sets, and arrived at a decision based on majority vote. The test data for unusual arrhythmias did not constitute a large enough sample to allow any conclusive results.

The joint development project of Mayo Clinic and IBM (Smith-Hyde program) was terminated in 1970, but an extensive study of the performance of the computer vectorcardiograph (VCG) analysis was undertaken in 1971. The results of computer versus physician diagnosis of 6162 ECGs were reported in 1973.31 Errors discerned in basic functions (detection, measurement, and typing of waves) showed that missed P waves outnumbered all other errors combined. Rhythm diagnosis, which depends heavily on P-wave detection, ranged from 31.4% accuracy (supraventricular premature complexes with aberration) to 93.6% (ventricular premature complexes) for rhythms other than sinus mechanism.

The first attempt at comparative analysis of serial electrocardiograms by computer appeared in 1972.32 The serial comparison technique became part of the routine, automated interpretation at Latter-Day Saints Hospital and delivered a comparative analysis of the parameters measured from a current ECG and the most recent previous tracing. The report indicated what changes, if any, were found. The technique proved successful in detecting clinically significant changes but encountered difficulties that resulted from errors in the interpretation of one or both of the ECGs. Errors found in the measurement of a parameter or in the final diagnostic decision were compounded in the comparative program, since a subsequent tracing would report changes relative to the erroneous "changes" previously reported. An initial (1972) assessment of performance found the program successful in detecting all changes that occurred in a sample of 50 ECGs; only ECGs that did not contain initial diagnostic or measurement errors were submitted for analysis.

The years 1972-73 probably represent the stage at which computerized electrocardiography passed from infancy into a state of mature adolescence. A decade and a half had elapsed since the early, primitive attempts at automation of the ECG had begun. A number of major programs had emerged and some were finding their way into commercial distribution. The ECAN Version D, Pordy-IBM, and Bonner-IBM programs utilized the standard 12 leads, while the Pipberger-VA and Mayo-IBM programs employed vector leads. Macfarlane, meanwhile, had developed a program that separately employed the standard 12 leads and the 3 orthogonal leads (McFee modified) and compared computer-processed results from the same ECG population. He found, in a total of 1093 records studied, that there was no clinically significant difference between the two lead systems. A subsequent study was performed by Talbot et al.33 on 4019 patients, using a system that utilized information from 15 leads (the standard 12-lead system and the orthogonal 3-lead set). This hybrid system showed greater diagnostic accuracy than either lead set alone. The 3-lead system gave the most satisfactory results for left ventricular hypertrophy and left bundle branch block, and the 12-lead analysis was the most satisfactory for the diagnosis of ischemia, axis deviation, and left anterior hemiblock.
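The serial-comparison technique described above reduces to comparing stored measurements from the previous confirmed tracing with those of the current one and reporting any differences that exceed tolerance. A minimal sketch follows; the parameter names and tolerances are hypothetical, not those of the Latter-Day Saints program.

```python
# Hypothetical serial-comparison sketch: report parameters that changed
# between the most recent stored ECG and the current one.

TOLERANCES = {              # assumed per-parameter tolerances
    "qrs_duration_ms": 10,
    "pr_interval_ms": 20,
    "frontal_axis_deg": 15,
    "st_level_uv": 50,
}

def compare_serial(previous, current):
    """previous/current: dicts mapping parameter name -> measured value."""
    changes = []
    for param, tol in TOLERANCES.items():
        if param in previous and param in current:
            delta = current[param] - previous[param]
            if abs(delta) > tol:
                changes.append(f"{param} changed by {delta:+g}")
    return changes or ["no significant change"]

print(compare_serial(
    {"qrs_duration_ms": 88, "pr_interval_ms": 160},
    {"qrs_duration_ms": 124, "pr_interval_ms": 158},
))   # -> ['qrs_duration_ms changed by +36']
```

As the text notes, any measurement error in either tracing propagates directly into the reported "changes."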
The investigators concluded that both the 12- and 3-lead systems (and a combination of the two) were essential for the computer-aided diagnosis of ECGs.

The commercial firm Hewlett-Packard entered the field with a system that originally offered the user a choice between the IBM-Bonner program and ECAN-D. By 1972 an in-house program had been developed for dedicated use on the HP system.34 The 12-lead electrocardiogram was the data source, with a sampling rate of 250 Hz, an A/D precision of 10 bits, and 2.2-5.1 sec of data recorded per lead group. The innovative feature offered by the HP program was the capability of user-defined criteria: medical criteria were accessible for modification by the user without requiring any sophisticated programming skills. In addition, a serial comparison was done by comparing the current report to the previous one that had been confirmed by a physician.

Another entry into the commercial field of computerized electrocardiography was the service bureau Telemed. The system received transmissions of the 12-lead ECG from hospitals, clinics, and physicians' offices, and provided a computerized report within 5 min, with the option of physician overread.35

Telemed used the ECAN-D program for its initial system but shortly afterwards began developing its own software. The Telemed version used derived parameters as well as measured features. The rhythm analysis used statistical deductions, examining the following values: (1) RR regularity (three types of rhythm were identified: regular, regular with compensatory pauses, and irregular); (2) degree of confidence that atrial activity was present (consistent PR duration, P-wave amplitude, and P-wave shape); (3) ancillary information (pacemaker present, coarse atrial fibrillation wave, presence of a dropped beat). Rhythms were classified into 1 of 9 basic rhythm groups: sinus (with or without block), regular rhythm without consistent P waves, atrial fibrillation, AV dissociation, Wenckebach rhythm, atrial flutter, atrial fibrillation with slow regular response, and electronic pacemaker. Little or no information about details of the program logic has been published, and performance results are considered proprietary and are generally unavailable. The bureau has grown rapidly and at present processes a large percentage of the total ECGs analyzed by computer in the USA.

During this period in the early 1970s, the major difficulty in assessing the accuracy of computerized ECG analysis, as well as the performance claims of developers and users, continued to be the lack of standardized diagnostic criteria to employ for evaluation purposes. There was inadequate standardization of definitions and of measurement rules, and each program functioned differently. For instance, some programs used decision-tree logic that resulted in a finite number of interpretive statements, while others were based on multivariate analysis that could yield an infinite number of combinations of measurements. A proposal for objective evaluation procedures was advanced by Helppi et al.,36 and performance specifications were set out for precision of measurements in both logical-decision and statistical-decision programs, and in rhythm diagnosis. In general, a test library was suggested that would contain a sufficient number of cases of normal and each abnormal morphology and rhythm, for use as a test population. A standardization of diagnostic criteria was called for, as well as instrumentation standards.

Participants at the conference on Computerized Interpretation of the Electrocardiogram in 197537 called for the development of a test set of electrocardiograms to be used for evaluation of existing programs. The test set would ideally contain 300 to 1000 ECGs encompassing a wide patient age distribution and a wide range of waveform types, including rhythm and conduction disturbances from commonly encountered pathology. The recommended sample rate was 1000 samples per second in order to satisfy both ordinary and research applications. The medium suggested for the data base was 9-track digital tape with 800 bit-per-inch density. The proposal has at present not yet been realized, despite almost unanimous agreement regarding its value as a test set for evaluation of pattern recognition algorithms.
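Returning to the Telemed rhythm logic described earlier in this section: although the program logic itself was never published, the three decision values (an RR-regularity class, a confidence that atrial activity is present, and ancillary flags) suggest the general shape of such a classifier. The sketch below is a hypothetical reconstruction in that style; every threshold and name is an assumption.

```python
# Hypothetical sketch of rhythm-group assignment from derived values,
# in the style attributed to the Telemed system (actual logic unpublished).

def classify_rhythm(rr_class, p_confidence, pacemaker=False, flutter_waves=False):
    """rr_class: 'regular' | 'regular_with_pauses' | 'irregular'
    p_confidence: 0..1 confidence that consistent atrial activity is present."""
    if pacemaker:
        return "electronic pacemaker"
    if flutter_waves:
        return "atrial flutter"
    if rr_class == "irregular" and p_confidence < 0.3:
        return "atrial fibrillation"
    if rr_class == "regular" and p_confidence < 0.3:
        return "regular rhythm without consistent P waves"
    if rr_class == "regular" and p_confidence >= 0.7:
        return "sinus (with or without block)"
    return "unclassified; refer for physician overread"
```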
In the interest of developing a "gold" standard for ECG criteria and terminology, proposals at this same conference were advanced for criteria for diagnosis of left ventricular hypertrophy,38 atrial enlargement and infarct size,39 and bundle branch block or ventricular conduction delay.40 The continuing need for standards for diagnostic criteria, and for a standardized test set of data with which to evaluate commercial and developmental systems, became the rallying cry of numerous researchers during the last half of the 1970s. Each group or system reported performance based on internally designed evaluations because no uniform method had emerged.

Meanwhile, a second-generation program for computer interpretation of the electrocardiogram was unveiled in 1975.41 The terminology of an advanced "generation" in computer science generally refers to a significant advancement in degree of complexity, and indeed, the Pipberger program broke new ground by employing a statistical approach to electrocardiographic classification. Since the ranges for normal and disease states frequently overlap and no distinct cutoffs exist between states, a method for determining the probability that a certain disease state exists offered a giant step forward in ECG analysis. The new Pipberger scheme used Bayesian classification procedures to compute the probabilities for all diagnostic categories that might be encountered in a given record. The Frank orthogonal leads were employed for analysis. Computer results were compared with interpretations of 12-lead recordings.

Pipberger had amassed, and continued to expand, what is considered to be the largest data base of electrocardiograms with associated nonelectrocardiographic documentation of cardiac disease states (i.e., catheterization data, echocardiographic data, and other clinical assessment). A total of 1192 electrocardiograms was selected in 1975 in which the clinical diagnosis could be positively established by means other than electrocardiography. In 21% of the cases there was no evidence of cardiac disease; in 79% various cardiac disorders were present. The diagnostic electrocardiographic classifications were considered correct when they were in agreement with documented clinical diagnoses. The results of this study revealed an 86% correct classification (5% partially correct, 9% misclassified). A separate noncomputer analysis of the standard 12-lead ECG demonstrated a 68% agreement (4% partially correct and 28% misclassified). The multivariate classification scheme was found to function best when prior probabilities were adjusted according to the diagnostic problem under consideration.

By 1975 it had become generally accepted that computers were expected to play a large role in clinical electrocardiography. There was no question that the extraction of various measurements from ECG signals, the determination of relevant probabilities of disease states, the computation of spatial angles and magnitudes, and the storing of extensive files of previous data were techniques that were feasible only with computer assistance. Two types of data remained in vogue: the standard 12-lead ECG and the Frank orthogonal (XYZ) lead system.

A new serial analysis program was reported in 1975 by a third group of investigators, Macfarlane et al.,42 in which a modified orthogonal lead system (I, aVF, and V2) replaced X, Y, and Z. The ECGs were processed on a PDP-8, and the method employed was the storage of a small selection of wave measurements together with a coded form of the interpretation. Three electrocardiograms could be stored for each patient: a primary ECG, a secondary one (usually that most recently acquired), and the ECG under current analysis. This afforded a comparison of up to three serial ECGs, the current one plus two in storage. The technique was initially applied mainly to detection of sequential changes, specifically ST-T changes, following myocardial injury. An important feature was an attempt to establish diagnostic criteria for sequential changes in orthogonal leads. The criteria developed at the Royal Infirmary of Glasgow are shown in Table 7.

Table 7. Diagnostic Criteria for Myocardial Infarction*

Location of Myocardial Infarction | T | T | 6/8 ST-T | 6/8 ST-T
Inferior | 30° | - | 70° | -
Anteroseptal | - | 30° | - | 70°
Anterior (or anterolateral) | - | 30° | - | 70°
Widespread | 30° | 30° | 70° | 70°
(Entries are degrees of clockwise or counterclockwise rotation.)

*Note that rotation can be clockwise or counterclockwise since this depends on the initial polarity of the T wave in leads not affected by the infarct. Reproduced by permission from Computers and Biomedical Research.42

The search for optimal lead systems for computer analysis of the electrocardiogram continued. Kornreich et al.43 made 5 pairwise comparisons for each of 4 lead systems in the determination of normal (N) versus myocardial infarction (MI), coronary heart disease (CHD), bilateral ventricular hypertrophy (BVH), left ventricular hypertrophy (LVH), and right ventricular hypertrophy (RVH).
The lead systems examined were the Frank and McFee 3-lead sets, the standard 12-lead set, and a new 9-lead system designed to register all significant electrocardiographic information without redundancy. Overall results of diagnostic accuracy were 75% for the orthogonal sets, 79% for the 12-lead set, and 87% for the new 9-lead set. A later multivariate analysis of four diagnostic statements (N, MI, LVH, RVH) demonstrated that the diagnostic performance of the new 9-lead set exceeded that of the Frank leads by 14%.44

In the meantime, Willems45 attempted a comparative analysis of measurement results obtained from 4 distinct programs: AVA (Pipberger), TNO, HP-5 (Hewlett-Packard), and ECAN-D. Digital data from 252 consecutive Frank and 12-lead ECG recordings were submitted to each of the 4 computer programs for an evaluation of basic measurement results. It was found that substantial differences in time measurement results were present when identical ECG records were analyzed by the various ECG computer programs, along with systematic differences in the reporting of small Q and R waves.

The 2 programs employing the 3-lead sets applied strategies for location of fiducial points on simultaneously recorded leads, and produced significantly greater measurement reliability and reproducibility than the ECAN-D and HP-5 programs, which performed single-lead analysis. The need for the establishment of common standards for definition of waves and measurements was highlighted by the study, particularly in view of the rapid growth in the application of computerized electrocardiography. This growth was 35-fold over the period from 1970 to 1978, increasing from 200,000 to over 7,000,000 ECGs processed per year.

RHYTHM ANALYSIS

Pattern recognition and measurement techniques that were applied with success to analysis of the QRS complex have not performed as well in P-wave detection. Numerous authors46-55 have called attention to the difficulties in computer detection of P waves from surface leads. The problem is twofold: (1) normal P waves are slow and of small amplitude, and therefore difficult to separate from baseline shifts and noise; and (2) abnormal P waves are often coincident with QRS complexes or T waves and cannot be separated even by manual means. Rhythm analysis is highly dependent upon accurate P, QRS, and T waveform distinction, and thus the failure of automated systems in diagnosis of rhythm is most often associated with failure to discern P waves. The two conventional lead systems employed for computer-assisted electrocardiography (the standard 12-lead and the orthogonal 3-lead) do not provide complete information on atrial activity in a manner that can be easily extracted.

The early Pipberger waveform measurement program9 used spatial velocity for QRS detection and, after determining the location of the QRS, initiated a backward search for a spatial velocity that exceeded 3 µV/msec in order to find the P wave. If no such value was found, it was assumed that no P wave was present. In a manner similar to finding the P-wave location, the end of the record was searched in retrograde fashion for the T wave. This automatic recognition of electrocardiographic waves by digital computer was a notable accomplishment, yet determination of P waves was consistent only in normal sinus rhythm, i.e., when the P wave appeared in the expected location with reference to the QRS. No P waves were recognized in the 10 instances in which PVBs appeared, none was recognized in 9 cases of atrial fibrillation, and in 3 cases of nodal rhythm the P was buried in the QRS and not detected.
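Spatial velocity, as used in the early Pipberger program, combines the instantaneous rates of change of the three orthogonal leads into a single detection function. The sketch below illustrates the idea; the function names are assumptions, and the threshold simply echoes the figure given in the text.

```python
import numpy as np

def spatial_velocity(x, y, z, dt_ms):
    """Combined rate of change of the three orthogonal leads X, Y, Z:
    sv(t) = sqrt(dx^2 + dy^2 + dz^2) / dt, one value per sample interval."""
    dx, dy, dz = np.diff(x), np.diff(y), np.diff(z)
    return np.sqrt(dx**2 + dy**2 + dz**2) / dt_ms

def find_p_backward(sv, qrs_onset, threshold=3.0):
    """Search backward from QRS onset for the last point where the
    spatial velocity exceeds `threshold`; None means no P wave assumed.
    The threshold of 3 (per msec) follows the description in the text."""
    for i in range(qrs_onset - 1, -1, -1):
        if sv[i] > threshold:
            return i
    return None
```

As the text explains, a detector of this kind can only confirm a P wave in its expected location; it has no way to recover one buried in a QRS or T wave.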
The package of computer programs produced by the collaborative efforts of Mt. Sinai Hospital, New York City, and IBM contained a measurement program,23 an arrhythmia analysis program,24 and a program for contour analysis with clinical results of rhythm and contour interpretation.25 The measurement program examined 6-sec samples of groups of 4 of the standard 12 leads, while the rhythm program included an additional 20-sec segment of leads CR2 and II. Of the 2060 cases subjected to computer analysis, only 4.4% contained abnormalities of rhythm, yet results showed incorrect rhythm analysis in 6% of all cases. "Extra statements," which should have been printed for abnormalities such as ectopic beats and aberrant ventricular conduction, were missing 20% of the time and were incorrect in 49% of those reported. The basic measurement and contour analysis, by contrast, was found to be correct in 91% of cases.

During the same period in the late 1960s, Miyahara and colleagues46 were pursuing computerized arrhythmia diagnosis and chose to concentrate on the temporal location of P and R waves without regard for contour information. They analyzed records 60 beats in length and determined interval lengths by human observation. Their rationale was that, "at present, the ability of a human observer to detect indefinite or hidden P waves seems far better than that of currently available machine techniques." Their logic assumed that all P and R waves had been detected, although they conceded that, in clinical practice, the detection can in some cases (hidden P wave) be quite difficult. A scheme encompassing complicated logic was used: an initial histogram identified the dominant rhythm, after which P waves were sought that bore a relationship to R waves (thus exhibiting a PR or RP relation); next, a tachogram check examined beat-to-beat relationships, looking for events of marked arrhythmias.

Four general categories were classified: regular sinus rhythm, sinus arrhythmia, marked sinus arrhythmia, and complicated arrhythmia. Occasional abnormalities were also recognized and reported. Reasonable success was found in recognizing arrhythmias belonging to the sinus rhythm groups and in segregating complicated arrhythmias from sinus rhythms. Occasional abnormalities could only be recognized when the dominant rhythm had been specifically diagnosed (categories 1 and 2). The investigators proposed the eventual addition of morphological information in an effort to improve the technique.

At about the same time, Stark and colleagues undertook development of computerized remote real-time diagnosis of clinical electrocardiograms using a multiple adaptive matched filter technique. They devised a series of typical filter patterns for the P, QRS, and ST-T segments of the electrocardiogram.47 The signal of ten successive beats was time-aligned and averaged, providing a smoothed version for pattern recognition. The method for determining P waves involved subtraction of an averaged QRS template, thus leaving a residual signal containing P waves. This imaginative scheme was the first to appear that provided for P-wave recognition when the P wave occurred anywhere other than the isoelectric T-Q interval. The technique supposedly afforded accurate determination of independent atrial rhythm, P-wave distribution, and P-wave polarity with respect to QRS complexes. The adaptive filter process is described theoretically, but no results appear to have been published. In the case of rhythm analysis, Stark admits that thorough analysis of rhythmicity must include P-wave as well as QRS-complex information, and concedes that the averaging process did not work for irregular P waves because the nonsynchronized wave was cancelled out. The system supposedly yields 20 mutually exclusive rhythm interpretations, but no further results have been published since the original paper appeared in 1966.

Wortzman and colleagues20 were concurrently developing a hybrid computer system for the measurement and interpretation of electrocardiograms. The realization that nonarrhythmic ECGs might be interpreted by examining a single heart cycle, but that arrhythmia interpretation would require the characterization of a consecutive string of cycles, suggested the necessity of analog pre-editing, selective averaging, measuring, monitoring, and interpreting. The measurement portion of the program utilized a template that was a statistical average of a string (up to ten) of normal QRS complexes, and from this were determined amplitudes and intervals relative to the beginning of the QRS. A novel technique for the monitoring segment of the program was devised: a binary word (match word) was produced for each heart cycle and compared to a heart-cycle template. Each bit position corresponded to a particular point in the template; points of the match word that agreed with the corresponding point of the template were coded 0 and mismatches were coded 1. A similar arrangement was made with masks for individual waves, i.e., subsets of the original template word. This monitoring program was developed as an input to an arrhythmia analysis program that was to examine a consecutive string of heart cycles.
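The match-word idea maps naturally onto bitwise operations. The following is a minimal sketch, assuming a fixed set of comparison points and a per-point amplitude tolerance; the names and the tolerance value are hypothetical, not Wortzman's.

```python
# Hypothetical sketch of a Wortzman-style "match word": one bit per
# comparison point, 0 = agrees with the template, 1 = mismatch.

TOLERANCE_UV = 100   # assumed per-point amplitude tolerance, microvolts

def match_word(cycle, template, points):
    """cycle/template: amplitude arrays; points: sample indices to compare.
    Returns an integer whose bit i is 1 if comparison point i mismatches."""
    word = 0
    for i, p in enumerate(points):
        if abs(cycle[p] - template[p]) > TOLERANCE_UV:
            word |= 1 << i
    return word

def wave_mismatch(word, wave_mask):
    """Apply a mask (a subset of bits) covering one wave, e.g., the QRS,
    and report whether any of its points mismatched."""
    return (word & wave_mask) != 0

# Example: a mask whose set bits cover comparison points 3..8 (say, the QRS).
QRS_MASK = 0b0111111000
```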
Unfortunately, the arrhythmia program was unfinished at the time of the original report, and appears not to have been reported later in the literature.

In 1972 Willems and Pipberger reported the addition of an arrhythmia section to the earlier program that did automatic analysis of the P-QRS-T complex.48 They took pains to clarify the distinction between determination of cardiac arrhythmias for routine ECG analysis and that for continuous rhythm monitoring, and conceded that complex arrhythmias can hardly be recognized on short strip recordings of surface ECG leads. Nevertheless, the original program was modified to include automatic interpretation of some of the common arrhythmias that could be recognized in 6- and 10-sec ECG recordings. A decision tree method was used for classification, with logic that attempted to compensate for pattern recognition failures. Two 6-sec segments from the main diagnostic program were merged to form a longer, 10-sec strip for the rhythm analysis segment. In a discussion of the limitations of the arrhythmia detection program, the authors attribute numerous difficulties to problems with P-wave recognition.

The program makes no distinction between ventricular beats and supraventricular beats with aberrant conduction, and searches for P waves occur only in the interval between the end of the T wave and the onset of the QRS. The investigators recognized the need for "special techniques" for P-wave detection since "the human ability for recognition of low-voltage P waves... is indeed far superior to the performance of all presently available ECG wave recognition programs."

In the second Bonner program for diagnosis of the electrocardiogram, which appeared in 1972,30 5 sets of 3 simultaneously recorded leads were used as input to the program, and either X, Y, Z or V1, II, V6 were used for rhythm analysis. Each 3-lead segment was 5 sec long, and a separate decision as to the nature of the basic rhythm was made for each lead set. A "combining" program then evaluated the 5 rhythm interpretations and made a decision as to the final rhythm statement. If inconsistencies existed, the program printed "undetermined rhythm." Results showed a 93% accuracy in rhythm analysis, but 91% of the total 1435 ECGs that were diagnosed had normal sinus rhythm, which is easily diagnosed by any program. Bonner claimed that the rhythm analysis section of any computer program will always have difficulty because of the complexity of certain rhythm disturbances and the problems associated with P and T wave differentiation. He concluded that his test data for unusual arrhythmias were not sufficient for detailed analysis other than to illuminate the difficulty in the diagnosis of varying A-V block and complex arrhythmias.

Caceres, known for the development of one of the first complete systems for automated electrocardiographic analysis, evaluated the state of electrocardiographic automation in a presentation before the American College of Cardiology in 1972.49 He favored the further development of computer-based analysis but pointed out several areas in which the computer-interpreted ECG could not yet compete with ECGs read by conventional methods. Arrhythmias, said Caceres, are an example of the need for a physician's presence, especially those that are unusual, since computers pick up only the very common ones.

By 1975 the Pipberger program had become a "second-generation" program41 that used Bayesian classification, a classical pattern recognition scheme in which the probability that an entity belongs to a disease class is based on the prior probabilities of the incidence of that disease class. Applying Bayes theorem to the feature vector of measurements extracted from a given record, the probability that the record belonged to 1 of 7 classes (normal, plus 6 disease classes) was computed, and the class exhibiting the highest probability was given as the diagnosis. Accuracy of the technique exceeded conventional readings by 20%. Unfortunately, computer analysis of cardiac rhythm was not included in the report.
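The classification rule just described is standard Bayesian decision theory: combine class priors with class-conditional likelihoods of the measured feature vector and pick the class with the highest posterior. The sketch below shows the rule under an assumed Gaussian likelihood model; the class names, the model choice, and all parameters are illustrative, not Pipberger's.

```python
import numpy as np

# Hypothetical Bayesian classifier over 7 diagnostic classes.
# priors[k]: prior probability of class k (adjusted to the clinical
# setting, as the text notes); means[k], covs[k]: an assumed Gaussian
# model of the feature vector under class k.

def log_gaussian(x, mean, cov):
    d = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.solve(cov, d) + logdet + len(x) * np.log(2 * np.pi))

def classify(x, priors, means, covs, classes):
    """Return the class with the highest posterior P(class | x)."""
    log_post = [np.log(p) + log_gaussian(x, m, c)
                for p, m, c in zip(priors, means, covs)]
    return classes[int(np.argmax(log_post))]

# Illustrative class list only: normal plus six disease classes.
CLASSES = ["normal", "LVH", "RVH", "BVH", "MI", "pulmonary disease", "other"]
```

Adjusting the priors to the diagnostic problem at hand is exactly the tuning the study found most effective.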
LeBlanc and colleagues50 entered the field of arrhythmia detection in automated ECG analysis with a treatise in 1975, charging that no interpretation program would be acceptable to cardiologists without a rhythm interpretation that performed as well as shape analysis. Although the technique developed by LeBlanc et al. falls within the category of diagnostic ECG (as opposed to rhythm monitoring), they initially record a continuous 90-sec segment of the ECG from the X, Y, and Z leads and use a portion of it for rhythm analysis, rather than the short, segmented sections used by other systems. The program has the traditional two main sections: a measurement section and interpretation logic. The measurement section calculates intervals (RR and PP) and measures waveform durations and amplitudes (P, QRS, and T). The interpretation logic uses fixed diagnostic rules to determine the type of arrhythmia. LeBlanc et al. were well aware of the limitations concerning detection and measurement of the P wave and used part of the interpretation logic in an attempt to correct possible errors. The signals from all three channels were processed simultaneously, with the intention of locating, first, a "good QRS candidate" and then determining the RR interval measurement. A nice scheme was advanced for P-wave detection, using computer logic based on the maximum amount of information that can be obtained concerning the atrial rate. The procedure to obtain "good P wave candidates" first established an approximate S-Q interval for the P-wave search, filtered the signal with what amounts to a moving-average filter while retaining information about the slowly varying P and T waves, and finally distinguished the T wave from the remaining waveforms. (A T waveform is the first one found chronologically.)
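A minimal sketch of that candidate-screening pipeline might look as follows, assuming a digitized lead and known QRS locations; the window lengths and the acceptance test are hypothetical stand-ins for the rules the authors actually used.

```python
import numpy as np

def smooth(signal, width):
    """Moving-average filter: suppresses high-frequency activity while
    retaining the slowly varying P and T waves."""
    kernel = np.ones(width) / width
    return np.convolve(signal, kernel, mode="same")

def p_candidates(signal, s_end, q_onset, fs):
    """Find local maxima of the smoothed signal inside the S-Q interval;
    the first one found chronologically is taken as the T wave and
    excluded, and the rest are P-wave candidates."""
    seg = smooth(signal[s_end:q_onset], width=int(0.04 * fs))
    peaks = [i for i in range(1, len(seg) - 1)
             if seg[i - 1] < seg[i] >= seg[i + 1]]
    return [s_end + p for p in peaks[1:]]     # drop the first peak (T wave)

def screen_by_intervals(candidates, pp_expected, tol):
    """Retain only candidates whose spacing fits the expected PP interval,
    so the surviving waves form a consistent P wavetrain."""
    train, last = [], None
    for c in candidates:
        if last is None or abs((c - last) - pp_expected) <= tol:
            train.append(c)
            last = c
    return train
```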

Upon establishment of a P-wave location, the waves identified are subjected to a screening based on an examination of PP and PR intervals, and only those P waves that fit into a given interpretation context are retained to constitute the P wavetrain. The approach adopted by LeBlanc and colleagues takes cognizance of all of the shortcomings of other programs with regard to P-wave detection, yet it provides no solution for finding P waves in unexpected places, i.e., buried in the QRS or falling on top of T waves. An original technique for QRS detection was employed and QRS morphology was determined. LeBlanc states, "Conceptually, this program may be presented as a process extracting the information from two channels: atrial and ventricular.... The outputs of the two channels contain the three pertinent features of an ECG record which permit identification of the type of rhythm: the QRS wavetrain, the QRS morphological characteristics, and the P wavetrain." He admits the most difficult feature to obtain is the P wavetrain, and he claims some success in arrhythmia diagnosis, with the exception of second-degree A-V block, Wenckebach, and A-V dissociation. This is to be expected considering the limited portion of the cardiac cycle to which the P-wave search was restricted. This protocol appeared to be universal.

Rhythm analysis as part of the diagnostic ECG procedure followed a similar method at CIMHUB in Brussels, according to a report by Sajet et al.51 Spatial velocity was employed for QRS detection but was considered ineffective for detection of P waves, since the noise that often masks P waves would be increased by this method. For P-wave detection, the segment between the previous T wave and the current QRS was filtered and linear interpolation applied to eliminate baseline drift. Noise was measured during the 30 msec preceding the onset of the QRS complex, then subtracted by a method that is undisclosed. The resulting signals of the 3 leads of each group were rectified and added. Four 4.8-sec groups of 3 leads were processed. The system successfully enhanced P waves, but only, of course, when they fell within the T-QRS segment.

Computer detection of P waves was attempted by Hengeveld and van Bemmel, and 2 algorithms were presented in 197652 for analysis of 10- to 15-sec ECG recordings. A search for P waves occurred only before dominant types of QRS complexes. The range of the PR-interval distributions was computed, and coupled P waves were assumed and searched for. A second, alternate algorithm handled uncoupled P waves using shape information. No attempt was made to find P waves associated with nondominant-type beats (i.e., PVBs). The system was relatively primitive, and rhythm classification results were not given.

Miyahara et al. examined the performance of rhythm analysis by two commercial systems (Telemed and IBM) and reported results in 1979.53 The investigators considered arrhythmia diagnosis to be inferior to contour diagnosis because detection and location of abnormal P waves were unsatisfactory and the length of the record was too short for analysis of complicated arrhythmias. Results of sensitivity and specificity of the two programs are shown in Table 8.
Table 8. Evaluation of Rhythm Analysis of Two Computer ECG Programs

Rhythm Diagnosis Including Artificial Pacemakers | Rhythm Diagnosis Excluding Artificial Pacemakers
Telemed
Sensitivity | 98.4% (2003/2036) | 98.4% (2003/2036)
Specificity | 79.0% (214/271) | 84.7% (194/229)
IBM
Sensitivity | 98.5% (2153/2185) | 98.5% (2153/2185)
Specificity | 88.0% (219/249) | 86.4% (190/220)

From Miyahara H, et al.53

While the investigators did not consider one program to be more acceptable than the other, they concluded that the detection and identification of complex rhythms by both programs was still in need of improvement.

The pattern recognition techniques applied to automated ECG interpretation during the 1960s and early 1970s included a variety of schemes that attempted a rudimentary rhythm analysis. All relied upon surface leads, where P-wave registration is of small amplitude and slow velocity, and frequently obscured by noise. In almost all cases the search for P waves was restricted to the location where a P wave could be expected given that normal sinus rhythm is present. Naturally, aberrant rhythms exhibit P waves in erratic relationships and locations with respect to the QRS. Since these unusual sequences are precisely what is of interest in arrhythmia analysis, the determination of P waves only in their normal habitat is mostly trivial and clinically uninteresting.

COMPUTER-BASED ARRHYTHMIA MONITORS

The application of computer technology to pattern recognition of the electrocardiogram caught the attention of those who sought improvement in coronary care instrumentation. The computer appeared to be a logical candidate for relieving the staff of the tedious chore of monitoring a multichannel oscilloscope, if reliable pattern recognition techniques could be developed. The real-time analysis of a dynamic physiologic signal was a difficult and demanding problem, and a variety of solutions began to emerge. This section surveys the computer-based arrhythmia monitoring schemes that have appeared from the early 1960s until the present time.

A developmental program for computer monitoring of the electrocardiogram in a coronary care unit appeared in 1969, in which the ECG was transmitted by telephone line and at fixed time intervals the interpretation was transmitted to a teletype receiver. The system was capable of monitoring two beds.56 At about the same time the Bonner program was rewritten for CCU monitoring purposes with a single-patient capability.57 Only initial reports appeared on these developmental systems, which had not at that time been clinically implemented. Watanabe58 devised a four-patient monitor on a small-scale computer in which classification of arrhythmias was based on X, Y, and Z lead input. He reported successful recognition of P waves from a highly amplified and filtered Y lead, but a special procedure had to be invoked to detect P waves superimposed on T waves or the ST segment. Limited success was reported in PVB detection, and Watanabe thought improvement of diagnostic accuracy depended upon the development of an adequate lead system in which the atrial activity would be sufficiently manifest for computer measurement.

Haywood and associates59 engaged in some preliminary work in 1970 in the development of an on-line system that was capable of uninterrupted monitoring while a stable rhythm was present. The method of analysis required finding significant reference points on the QRS complex. The parameters measured by the computer for each cycle were the cycle duration and four specific parameters for each wave within the cycle. Values detected during each cycle were compared to startup values acquired during a stable rhythm state, usually sinus rhythm. An interrupt occurred when R-R intervals exceeded normal values. The system was reported to be the beginning stage of a reliable continuous real-time monitor for detection of intra- and intercycle variations from a stable rhythm for the purpose of arrhythmia detection. The system detected all arrhythmia onsets but was unable to analyze a basically irregular rhythm.
Gersh et al.60 considered RR interval measurements to contain all the relevant information required for arrhythmia classification. They devised "prototypic models characteristic of different cardiac disorders" and compared a string of symbols representing 100-beat intervals to the models to determine the prototype corresponding to the largest probability of the observed sequence. The symbols were S, N, and L, representing short, normal, or long RR intervals. They modelled six arrhythmias with prototypes by computing an average transition matrix from the available sample of each disorder. They reported error-free classification, but the training set that was used to develop the prototypic patterns was later used to test the classification, so these results are not too surprising.
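This is recognizably a first-order Markov model: each disorder is summarized by a matrix of transition probabilities among the symbols S, N, and L, and a new symbol string is assigned to the prototype under which its likelihood is highest. A minimal sketch, with made-up matrices purely for illustration:

```python
import numpy as np

SYMBOLS = {"S": 0, "N": 1, "L": 2}   # short, normal, long RR intervals

def log_likelihood(sequence, transition):
    """Log-probability of a symbol string under one prototype's
    first-order transition matrix (the initial symbol is taken as given)."""
    idx = [SYMBOLS[s] for s in sequence]
    return sum(np.log(transition[a, b]) for a, b in zip(idx, idx[1:]))

def classify(sequence, prototypes):
    """prototypes: dict of disorder name -> 3x3 transition matrix
    (rows sum to 1, estimated as averages over labelled samples)."""
    return max(prototypes, key=lambda name: log_likelihood(sequence, prototypes[name]))

# Illustrative prototypes only; real matrices would be estimated from data.
prototypes = {
    "sinus rhythm": np.array([[.10, .80, .10], [.05, .90, .05], [.10, .80, .10]]),
    "bigeminy":     np.array([[.05, .05, .90], [.10, .30, .60], [.85, .10, .05]]),
}
print(classify("NSLSLSLS", prototypes))   # -> 'bigeminy'
```

Testing on the same strings used to estimate the matrices, as the study did, naturally inflates the apparent accuracy.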

In an effort directed toward improved P-wave detection, Rey and colleagues61 monitored patients from the three Frank leads and, after preprocessing which differentiated the vectorcardiogram and computed the spatial velocity, applied threshold methods to detect P, QRS, and T waves. Initial results reported for seven patients showed no improvement over a similar system used for diagnostic ECGs. Large, well-defined P waves were reliably detected; the remainder were not consistently identified.

A more basic approach was adopted by Feldman et al.62 in the design of a real-time monitor. The system concentrated on PVB detection because episodes of ventricular fibrillation were thought to be frequently preceded by the incidence of PVBs. The eight-patient monitor performed a beat-by-beat analysis of each QRS complex. Prematurity was defined as less than 90% of the average RR interval. The QRS was sampled and subjected to a normalized cross-correlation with a standard normal beat. If a beat was both abnormal in shape and premature, it was recognized as a ventricular ectopic. The system correctly identified 71% of the PVBs. By concentrating on premature ventricular ectopics the system could rapidly process large amounts of data on-line, but specificity was sacrificed by the simplicity of the logic applied to PVB detection. Other investigators chose the same route, that of searching for PVBs only.

Oliver and colleagues developed a similar on-line system for analysis of a single-lead electrocardiogram. A method of transforming the sampled ECG data to a more compact waveform representation was developed at Washington University.63 The data compression scheme, AZTEC, produced a sequence of flat lines and sloping segments characterizing the ECG with approximately 25 elements per second.64 The AZTEC algorithm was the first of a series of processing stages, each of which reduced redundant information. The second stage provided detection, delimitation, and description of ECG waves, and the third stage grouped together waves of similar shape. The fourth stage produced a clinical rhythm diagnosis, and the system, known as ARGUS, was implemented in a coronary care unit in 1969.65 Morphological evaluation of the QRS complex plus a clustering technique to group identical complexes resulted in an overall detection rate of 78%. Late PVBs (fusions) were frequently missed. Other systems limited to PVB detection only were designed by Gerlings66 and by Geddes and Warner67 with slightly different logic but similar levels of performance.
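AZTEC's output, a string of horizontal plateaus and slopes in place of raw samples, is easy to sketch. The following is a simplified illustration of the plateau-producing part of the idea only (the published algorithm also emits explicit slope elements and has more elaborate rules); the aperture value is an assumption.

```python
def aztec_plateaus(samples, aperture=10):
    """Simplified AZTEC-style compression: emit (duration, value) pairs,
    closing the current flat-line element whenever the signal leaves a
    band `aperture` units wide. The published algorithm also converts
    short plateaus into slope elements; that refinement is omitted."""
    elements = []
    lo = hi = samples[0]
    start = 0
    for i in range(1, len(samples)):
        s = samples[i]
        if max(hi, s) - min(lo, s) > aperture:        # band exceeded
            elements.append((i - start, (hi + lo) / 2))
            lo = hi = s
            start = i
        else:
            lo, hi = min(lo, s), max(hi, s)
    elements.append((len(samples) - start, (hi + lo) / 2))
    return elements
```

At roughly 25 elements per second in place of hundreds of raw samples, the compression the text cites follows directly, and the downstream stages operate on the element string rather than the waveform.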
A statistical method was advanced by Haisty and colleagues68 for on-line arrhythmia diagnosis as a solution to economical monitoring of several patients on a small computer. The current techniques were thought to be excessively demanding of CPU time for analog-to-digital conversion and for exacting pattern recognition algorithms. Haisty et al. suggested reducing the patient's ECG to a sequence of intervals and applying multivariate discriminant function analysis to RR intervals. Only three arrhythmias were classified, based on somewhat oversimplified models of the rhythms, but careful steps were taken to segregate the training set, which produced the means, standard deviations, and autocorrelation coefficients for each rhythm, from the testing set upon which the analysis was performed. Results of 85% correct classification were considered impressive for a program that evaluated simple measurements, when compared to results reported by elaborate pattern recognition schemes.

Another inexpensive and simply contrived design for on-line arrhythmia analysis was reported by Sasmor and King.69 An analog preprocessor was employed to add identification markers to the beginning and end of the QRS complex of the ECG signal fed to the digital computer. The algorithm performed simple measurements to determine QRS width, area, polarity, and interval. No specific information was published on the arrhythmia analysis algorithm, but it is apparent that beats with aberrancy and prematurity represent the focus of the system.

Dell'osso70 restricted his attention to anomalous beats, defined by the following criteria: area increase, QRS width increase, or polarity reversal. His monitoring system delivered an alarm if the number of premature beats or the number of successive anomalous beats exceeded a preset level, if an "early" premature beat occurred, or if a multiform beat was detected. System performance that was 99.99% free of false positives and completely free of false negatives was reported. Only anomalous beats were sought, and no effort was made to distinguish between types. In the interest of accomplishing reliable QRS detection, P and T wave deemphasis was employed.
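Alarm logic of the kind Dell'osso describes is essentially a handful of counters and flags over the beat stream. A minimal sketch, with hypothetical threshold values:

```python
# Hypothetical sketch of Dell'osso-style alarm conditions over a beat
# stream. Each beat is a dict with flags set by upstream detection logic.

PREMATURE_LIMIT = 6      # assumed alarm threshold per observation window
RUN_LIMIT = 3            # assumed limit on successive anomalous beats

def check_alarms(beats):
    premature_count, run = 0, 0
    for beat in beats:
        run = run + 1 if beat["anomalous"] else 0
        premature_count += beat["premature"]
        if beat["premature"] and beat["early"]:
            return "alarm: early premature beat"
        if beat["multiform"]:
            return "alarm: multiform beat"
        if run > RUN_LIMIT:
            return "alarm: run of anomalous beats"
        if premature_count > PREMATURE_LIMIT:
            return "alarm: premature beat count exceeded"
    return "no alarm"
```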

In 1973, with the appearance of a variety of computer arrhythmia monitors, some of which had found their way into commercial systems, Romhilt71 undertook an evaluation of conventional monitoring versus a computer monitor (Hewlett-Packard) to determine efficacy. During a period of conventional monitoring of 31 patients, tape recordings were made for subsequent submission to the computer monitor. Although postcoronary patients were expected to show a high incidence of arrhythmias, it was found in this study that 100% experienced an arrhythmia during the 5 days of observation, but only 64.5% of these were noticed by the coronary care staff. Serious arrhythmias accounted for 93.5% of incidents, and conventional monitoring recognized only 16.1% of these. Premature atrial contractions were detected in 96.8% of the patients, but only 45.2% were discovered by conventional means. It was apparent that many arrhythmias occurred after myocardial infarction (MI) that were not recognized by the conventional electrocardiographic monitoring commonly in use in the CCU. Romhilt reported that the failure of conventional monitoring to detect arrhythmias was greatest for serious ventricular arrhythmias, which have been reported to be premonitory for ventricular fibrillation. Results suggested that "virtually all patients, even those considered good risks, have ventricular arrhythmias after an acute myocardial infarction," and he emphasized the need for automated real-time arrhythmia detection performed on-line in the coronary care unit to achieve reliable recognition of all arrhythmias.

Yanowitz et al.72 echoed these sentiments while admitting that cardiac rhythm has been fairly difficult to diagnose by digital computer. Their system at the University of Chicago employed the AZTEC preprocessor63 for QRS detection but employed simpler logic for diagnosis of various arrhythmias than the AZTEC designers. This modified system demonstrated 90% accuracy in recognition of PVBs, with most errors of the fusion and late-PVB type. Arrhythmias identified by the system included ventricular tachycardia, bigeminy, PABs, paroxysmal supraventricular tachycardia, and escape beats, but only QRS and RR interval information was taken into account.

By the year 1974 there were 20 hospitals around the world that had full-fledged computer-based ECG arrhythmia detection systems, of which half were research and development systems, while the remainder were commercial systems sold for purely clinical use.73 The five major commercial systems at this time were: the Mennen-Greatbatch system developed at Washington University, the Electronics for Medicine system developed at Worcester Polytechnic Institute, the Gould Medical Electronics system developed at Palo Alto Veterans Administration Hospital, the American Optical system developed at George Washington University, and the Hewlett-Packard system developed at Stanford University.

Harrison and colleagues74 discussed the state of the art of commercial arrhythmia detectors at a conference in 1974 and submitted a list of 16 arrhythmias that a system should detect. They concluded that "while all systems have concentrated on PVC and its quantification, only a few systems permit monitoring of the other premonitory arrhythmias outlined.... In addition, since atrial activity is not detected by most systems, many of the interesting arrhythmias... are not detected. They will be detected only when adequate P wave detection becomes possible." It thus appeared that only with a nonstandard lead especially designed for high atrial discrimination would effective arrhythmia analysis by computer be possible.
LeBlanc and Roberge75 stated, "The detection of auricular activity is the most important limiting factor of existing arrhythmia analysis programs. No good criteria have been found to permit reliable identification of P waves on a beat-to-beat basis.... Furthermore, since the timing of the P wave relative to the QRS complex cannot be assumed a priori, a systematic search for a possible P wave must cover the entire RR interval."

Bernard and associates76 came close to this goal with their use of an intra-atrial lead simultaneously with a surface lead. Unfortunately, their selection of a unipolar atrial electrode produced a signal with not only a large atrial deflection but a large QRS as well. In order to segregate the two deflections, the QRS "artifact" on the intra-atrial lead was blanked out for the entire duration of the surface-lead QRS complex. Bernard et al. thus detected almost all of the P waves, the exception being those that occurred concurrently with the QRS. The technique is, of course, highly invasive.
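The blanking maneuver is simple to express: P-wave detection on the atrial channel is suppressed during a window tied to each surface-lead QRS. A minimal two-channel sketch, assuming QRS times are already known; the window width and all names are hypothetical.

```python
def atrial_p_times(atrial_peaks, qrs_times, blank_ms=120):
    """Keep atrial-channel deflections as P waves only if they fall
    outside a blanking window around each surface-lead QRS.
    As in the Bernard system, any P concurrent with a QRS is lost."""
    def blanked(t):
        return any(q <= t < q + blank_ms for q in qrs_times)
    return [t for t in atrial_peaks if not blanked(t)]

# Example: a P at 430 ms survives; one at 510 ms (inside a QRS window) is lost.
print(atrial_p_times([430, 510, 950], qrs_times=[500, 1000]))  # -> [430, 950]
```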

logic applied to the ECG signal to extract the maximum of information, and development of more sophisticated alarm systems to placate hospital personnel. The Stanford arrhythmia monitoring system was updated to include a graded set of alarms based on priority, severity, and association with other alarms.77

Four new developmental systems for coronary intensive care appeared. Frankel and colleagues78 reported a developmental system that detected PVBs, PABs, and runs of PVBs by use of a pattern recognition algorithm utilizing first and second derivatives of the QRS and RR intervals. No attempt was made to recognize P or T waves. An on-line system in use at Louvain, Belgium,79 added measurements of systolic, diastolic, and mean pressures to a monitor that separated premature and abnormal beats. The software for the system was designed originally for the monitoring of isolated perfused organs.

Bussman and associates80 effected on-line arrhythmia analysis by coding each beat in 5 bits of a binary word. Bits were set for the RR interval preceding the beat, for the interval following the beat, and for the shape of the beat. Short or long intervals could be designated this way, and abnormal shape could be indicated. The assignment of codes to particular arrhythmias was kept in a program table that could be changed by the operator; their intention was high accuracy in the discrimination of PABs and PVBs (a sketch of such a bit-coding scheme appears at the end of this passage). The code yielded the following categories: premature ventricular extrasystole, supraventricular extrasystole, block (missed beat), ventricular ectopic, paired ectopics, no diagnosis, artifact, QRS too late, muscle potentials, and low QRS amplitude. Only QRS and RR interval analysis was done to create the beat code, and thus the system, like almost all of the others, called all aberrantly shaped early beats PVBs no matter what their origin.

Swenne and van Hemel81 devised a system in Utrecht, Netherlands, that incorporated an interactive clustering scheme (such as those used by automated ambulatory monitoring analysis systems) into a coronary care monitor. An observer is required: when the pattern recognition algorithm analyzes the ventricular contraction (QRS), the observer must allocate the waveform to an existing class or create a new class. Clusters are enlarged or created by man-machine interaction. Fusion beats or alternating conduction disturbances cause difficulties for the human observer because of the differing shapes of the waveform. The only totally automated portion is an alarm system based on heart rate.

Despite the limitations of the computer-based arrhythmia monitors, a study conducted by Vetter and Julian82 in 1975, in which conventional monitoring techniques were compared with an automated monitor, revealed the computer to be overwhelmingly superior to conventional methods. The on-line arrhythmia computer with a series of graded alarms detected more than 99% of potentially serious arrhythmias, and 95% of patients with these arrhythmias were treated immediately. In those monitored by conventional means, a large proportion of such arrhythmias went undetected and only 17% of affected patients received antiarrhythmic therapy. They commented, "In the CCU, the best criterion is the speed with which antiarrhythmic therapy is initiated. With conventional monitoring, treatment was frequently delayed or not given at all. In the computer-monitored patients, treatment was given immediately to all those for whom it was indicated."
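The Bussman beat code lends itself to a compact illustration. The following is a minimal sketch of how such a 5-bit descriptor might be arranged; the bit layout, interval thresholds, and diagnosis table here are illustrative assumptions, not the published values.

```python
# A minimal sketch of a Bussman-style 5-bit beat code. Bit layout,
# thresholds, and the diagnosis table below are assumptions for
# illustration only.

SHORT_BEFORE = 1 << 0   # RR interval preceding the beat is short
LONG_BEFORE  = 1 << 1   # RR interval preceding the beat is long
SHORT_AFTER  = 1 << 2   # RR interval following the beat is short
LONG_AFTER   = 1 << 3   # RR interval following the beat is long
ABNORMAL     = 1 << 4   # QRS shape differs from the stored normal beat

def code_beat(rr_before, rr_after, mean_rr, shape_is_normal,
              short=0.85, long=1.15):
    """Pack one beat into a 5-bit word from its two RR intervals
    (seconds) and a binary shape judgment."""
    code = 0
    if rr_before < short * mean_rr: code |= SHORT_BEFORE
    if rr_before > long  * mean_rr: code |= LONG_BEFORE
    if rr_after  < short * mean_rr: code |= SHORT_AFTER
    if rr_after  > long  * mean_rr: code |= LONG_AFTER
    if not shape_is_normal:         code |= ABNORMAL
    return code

# Operator-editable table mapping codes to diagnostic labels, in the
# spirit of the changeable program table described in the text.
DIAGNOSIS = {
    SHORT_BEFORE | LONG_AFTER | ABNORMAL: "premature ventricular extrasystole",
    SHORT_BEFORE | LONG_AFTER:            "supraventricular extrasystole",
    LONG_BEFORE:                          "block (missed beat)",
}

# An early, abnormally shaped beat followed by a long pause:
print(DIAGNOSIS.get(code_beat(0.55, 1.1, 0.8, False), "no diagnosis"))
```

Because only intervals and a binary shape judgment enter such a code, any aberrantly shaped early beat maps to the PVB entry regardless of origin, which is exactly the limitation noted above.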
False alarms were found to be a problem in both conventional and automated systems, but the graded alarms delivered by the computer were found preferable by the nursing staff.

The arrhythmia system of Knoebel,83 which appeared in 1976, resembled many of its predecessors in that PVBs were the primary interest. Beats were assessed by the criteria of QRS configuration, T-wave configuration, and timing (RR), and four classifications were assigned: normal, premature ventricular, interpolated or late ventricular, and atrial premature. Difficulties encountered in false positive classifications fell into two categories: noise and premature atrial beats. Knoebel purposely eliminated those sections from consideration and reported false negative and false positive rates of less than 2% and 3%, respectively. The exclusion of PABs raises questions regarding the specificity of the classification. Another system employing an intra-atrial lead was devised by Mantle et al.84 for the purpose of monitoring the bipolar electrogram, the pulmonary arterial pressure, the systemic arterial pressure, and the thermodilution cardiac output. A

surface electrocardiogram was also monitored, and the investigators reported the advantage of an atrial signal in the analysis of complex arrhythmias. They reported a large A-wave to baseline-noise ratio that greatly facilitated computer monitoring of the atrial rhythm. The AZTEC preprocessing system was employed for waveform detection in the two-channel electrocardiographic part of the system.63 Mantle and colleagues found identification of A waves to be more reliable than identification of R waves because of the low baseline noise on the intracavitary signal, yet no comprehensive logic had been developed for complex rhythm diagnosis at the time of their report.

In the following year, 1977, the software for the ARGUS/H85 rhythm monitor was refined and incorporated into a microcomputer-based arrhythmia-monitoring system for use as a stand-alone patient monitor or for connection to a host computer for research data collection. Efforts to overcome the shortcomings of on-line CCU monitors led Zencka et al.86 to utilize the ARGUS/H and AZTEC programs in a system that allowed considerable human interaction. They believed that parameter modification by nurses would enhance the performance of a computer program designed for PVB detection. Results were disappointing in that nurses' responsibility for direct patient care did not permit time for properly adjusting the system.

A new design using a statistical approach for detection and classification of arrhythmias was presented in 1977 by Gustafson et al.87 and described more fully in 1978.88,89 The system used a set of dynamic models that described the sequential behavior of RR intervals and applied two statistical techniques for the identification of persistent and transient arrhythmias. A distinction is made between persistent arrhythmias, i.e., those in which the RR interval pattern possesses some regularity, and transient arrhythmias characterized by abrupt changes or irregularities, since these two classes lend themselves to somewhat different statistical analysis techniques. In the first class, persistent arrhythmias, four models were presented: (1) small variation, a category that includes sinus rhythm, sinus tachycardia, and sinus bradycardia; (2) large variation, covering sinus arrhythmia and atrial fibrillation; (3) period-two oscillator, RR intervals that are alternately long and short, i.e., bigeminy or A-V block of every third impulse; and (4) period-three oscillator, an RR interval sequence that repeats over a period of three beats. Four Kalman filters, which represent the four models, operate on the data and compute likelihoods or probabilities of the various potential diagnoses.88 In the transient rhythm classification scheme, a generalized likelihood ratio technique is used in which the rhythm category is identified by means of a maximum-likelihood hypothesis test. Four simple models of transient phenomena are used in this classification: rhythm jump, noncompensatory beat, compensatory beat, and double noncompensatory beat.89 The overall scheme is imaginative and promising, although only simple models of easily defined arrhythmias are presently employed. The authors selected the RR interval as the single measurement variable "since most arrhythmias do manifest themselves in some way by altering the RR pattern."
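The flavor of this multiple-model idea can be conveyed in a few lines. The sketch below collapses each persistent-rhythm model to a Gaussian residual likelihood rather than a full Kalman filter, and every variance in it is an illustrative assumption rather than a value from the published system.

```python
import numpy as np

# A minimal sketch of multiple-model rhythm classification on RR
# intervals. The published system ran four Kalman filters in parallel;
# here each persistent-rhythm model is reduced to a Gaussian residual
# likelihood, and all sigmas are assumed values for illustration.

def log_likelihood(rr, period, sigma):
    """Score how well the RR sequence repeats with the given period:
    residuals are deviations from the mean of each phase position."""
    rr = np.asarray(rr, dtype=float)
    resid = np.concatenate(
        [rr[k::period] - rr[k::period].mean() for k in range(period)])
    return float(-0.5 * np.sum((resid / sigma) ** 2)
                 - resid.size * np.log(sigma))

def classify(rr):
    models = {
        "small variation (sinus rhythms)":        log_likelihood(rr, 1, 0.03),
        "large variation (sinus arrhythmia, AF)": log_likelihood(rr, 1, 0.15),
        "period-two oscillator":                  log_likelihood(rr, 2, 0.03),
        "period-three oscillator":                log_likelihood(rr, 3, 0.03),
    }
    return max(models, key=models.get)   # maximum-likelihood model

bigeminy = [0.45, 0.95] * 6              # alternately short and long RRs
print(classify(bigeminy))                # -> "period-two oscillator"
```

For the alternating long-short RR sequence of bigeminy, the period-two model scores highest; that is the sense in which a bank of model filters "votes" for a diagnosis.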
Although the authors' observation is true, RR interval sequences do not uniquely describe every rhythm; thus, a number of arrhythmic patterns producing the same RR interval characteristics would be placed in the same category. The authors concede this limitation and plan to incorporate additional information, such as P-wave location, into the system. The extension of the single-beat analysis using the Arzbaecher esophageal electrode90 into a contextual-analysis algorithm with capabilities for recognizing 14 complex arrhythmias was reported in 1977.91 The details of this esophageal technique for arrhythmia monitoring will be described in the following section.

A NEW TECHNIQUE FOR P-WAVE DETECTION

With the exception of the two schemes that used an intra-atrial catheter lead, and are therefore impractical for most applications, all of the systems described in the above section treat the QRS as the only significant electrocardiographic event, focusing attention on QRS morphology or RR interval or a combination of both. Thus computerized arrhythmia monitors recognize only a few of the significant arrhythmias and generally fail to detect arrhythmias of supraventricular origin. This is because conventional surface leads, which are sufficient for QRS recognition,

Fig. 5. Drawing of the swallowable esophageal electrode encased in a gelatin capsule. The pill-electrode is a closely spaced bipolar pair, as shown in the inset, suspended from a pair of thin flexible wires. The electrode passes into the esophagus with a swallow of water, followed by additional swallowing until it is positioned immediately posterior to the left atrium. (Reproduced by permission from the American Heart Association.94)

are highly inadequate for automated P-wave detection. The development by Arzbaecher92 of a miniaturized esophageal electrode provided a dramatically improved registration of atrial activity. The electrode, which resembles a small pill and can be easily swallowed, has caused a revival of esophageal electrocardiography, especially for purposes of computer analysis. The bipolar device consists of a pair of closely spaced (1 cm) electrodes attached to two threadlike stainless steel wires for connection to an electrocardiograph (Fig. 5). The pill-electrode is encased in a gelatin capsule for swallowing, and within seconds the electrode descends to a position below the left atrium. After a few minutes the capsule dissolves and the electrode is slowly withdrawn a few centimeters until its electrocardiographic registration shows a P-to-QRS ratio of 3:1 or more. The wire is then taped to the patient's chin to prevent further peristaltic lowering. The procedure is painless, and because the wire is so lightweight and flexible, the electrode is tolerated comfortably for extended periods of time without restricting normal activity, eating, and sleeping. Figure 6 shows a two-channel ECG in which the esophageal signal is seen on the top channel and the signal from a standard surface lead appears on the lower channel.

This pill electrode (Arzco Medical Electronics, Inc., Chicago) is the basis for the completely new system of Jenkins, Wu, and Arzbaecher93 that distinguishes a variety of complex arrhythmias based on P-wave as well as QRS information. The algorithm was intended originally for a two-channel system in which R waves were detected from the surface electrocardiogram and P waves from an intracardiac electrogram. The thrombogenic possibilities inherent in a lead passed percutaneously into the right atrium and maintained for extended periods of time prompted the investigators to discard the intra-atrial method as unfeasible and use the pill electrode, which provided an almost identical signal to the intra-atrial one with none of the potential hazards.

Fig. 6. A two-channel ECG with the esophageal tracing on the top channel and the surface lead on the lower channel. "A" represents atrial depolarization recorded from the esophageal lead, and "R" represents the QRS complex recorded from the surface lead (see text for discussion). (Reproduced by permission from the American Heart Association.94)

Computer Arrhythmia Detection

Based on this new two-lead system, which includes the swallowable capsule-electrode for esophageal monitoring of P waves and a surface lead for QRS, we have developed an on-line arrhythmia monitor.90,91 Three interval measurements (AA, AR, and RR) and a QRS shape measurement provide the foundation for a detailed interpretation of each beat. Using the single-beat analysis as a basis, the contextual diagnostic algorithm recognizes and reports on-line the following arrhythmias: couplets, bigeminy, trigeminy, ventricular tachycardia, supraventricular tachycardia, atrial flutter, atrial fibrillation, ventricular tachycardia with retrograde conduction to the atria, first-degree block, second-degree block, Wenckebach periodicity, advanced block, third-degree block, and sinus bradycardia.

For computer processing, the esophageal ECG is amplified 2 to 4 times and bandlimited to 5-100 Hz to eliminate low-frequency artifact due to respiration, peristalsis, and cardiac contraction. The signal is differentiated and passed to a threshold detector that produces a trigger signal followed by a blanking period sufficient to prevent multiple detection during multiphasic waveforms. The surface-lead ECG is recorded at the standard bandwidth of 0.05-100 Hz, rectified to produce its absolute value, differentiated, and applied to a threshold detector for a trigger output with suitable blanking. The two trigger signals reflecting atrial and ventricular occurrences are delivered to a computer with appropriate analog-to-digital channels. Upon receiving the trigger signals corresponding to P waves (A) from the esophageal channel and QRS complexes (R) from the surface channel, the computer automatically measures the AA, AR, and RR intervals. The surface ECG is digitized at 500 Hz for analysis of the shape of the QRS.93 Because the shape of the A wave is not used in this scheme, the esophageal channel is not digitized.

Arrhythmia analysis is done in real time by detecting each incoming QRS complex as it occurs and comparing it and its three associated intervals with a previously stored normal beat. If the QRS complex has abnormal morphology, the shape measure is assigned a code of 0 for purposes of computer analysis; if the shape is normal, it is assigned a code of 1. The intervals (AA, AR, and RR) associated with the beat are graded short, normal, or long and assigned a code of 0, 1, or 2, respectively. The codes for these 4 features are combined into a 4-digit number with the following format: AA-AR-RR-CC, where CC represents the coefficient of correlation, or shape index (Fig. 6). The 4-digit code for each beat is computed upon the appearance of the QRS complex. Therefore, if the ventricular impulse precedes the expected A wave, the AA and AR intervals cannot be calculated for that beat, and the first 2 digits of the code are set to 3, indicating that the measurements are indeterminate. With this scheme for coding the 4 features, 60 distinct 4-digit combinations exist.

Contextual Arrhythmia Analysis

The individual beat codes are the foundation for the contextual recognition of more complex arrhythmias. The contextual analyzer examines sequences of beat codes for patterns that reflect specific arrhythmias.
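As a concrete illustration of the single-beat code just described, the following is a minimal sketch; the +/-12.5% interval tolerance and the 0.9 correlation cutoff are assumptions chosen for the example, not the system's published thresholds.

```python
# A minimal sketch of the single-beat code described above: AA, AR, and
# RR intervals graded 0 (short), 1 (normal), or 2 (long), set to 3 when
# no A wave precedes the QRS, plus a 0/1 shape digit. The tolerance and
# correlation cutoff are illustrative assumptions.

def grade(interval, normal, tol=0.125):
    """Return 0, 1, or 2 for a short, normal, or long interval."""
    if interval < (1 - tol) * normal: return 0
    if interval > (1 + tol) * normal: return 2
    return 1

def beat_code(aa, ar, rr, shape_corr, ref, corr_cutoff=0.9):
    """Build the 4-digit AA-AR-RR-CC code against a stored normal beat
    `ref` with fields aa, ar, rr (intervals in seconds)."""
    cc = 1 if shape_corr >= corr_cutoff else 0       # 1 = normal QRS shape
    if aa is None or ar is None:                     # QRS preceded the A wave
        return f"33{grade(rr, ref['rr'])}{cc}"       # AA, AR indeterminate
    return (f"{grade(aa, ref['aa'])}{grade(ar, ref['ar'])}"
            f"{grade(rr, ref['rr'])}{cc}")

ref = {"aa": 0.80, "ar": 0.16, "rr": 0.80}
print(beat_code(0.80, 0.16, 0.80, 0.98, ref))   # "1111": sinus beat
print(beat_code(None, None, 0.52, 0.41, ref))   # "3300": PVB-type beat
```

Codes such as 1111 (a normally conducted sinus beat) and 3300 (an early, abnormally shaped beat with no preceding A wave) correspond to the single-beat labels visible in the figures that follow.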
Rhythm patterns that can be identified by the program are normal sinus rhythm (NSR), couplets, bigeminy, trigeminy, ventricular tachycardia, supraventricular tachycardia, atrial flutter, atrial fibrillation, ventricular tachycardia with retrograde conduction to the atria, first-degree block, second-degree block, Wenckebach periodicity, advanced block, third-degree block, and sinus bradycardia. As each QRS is detected, the computer delivers a one-line report to a display terminal: the first entry is the beat number; the second entry is the four-digit individual beat code; the third entry is the single-beat diagnostic report for that code; the fourth entry is a contextual diagnosis derived from a serial evaluation of recent beats, and appears only when one of the previously listed arrhythmias is recognized.

Computer Logic and Decision Rules

Initially each beat is assigned to a general category: normally conducted; ventricular premature; ventricular aberrant; atrial premature; conducted with A-V delay; escape (junctional or ventricular); or bradycardic. Within each of these categories certain arrhythmias are likely. Thus we restrict our secondary pattern recognition

search to specific rhythms that might reasonably be expected given our preliminary classification. This eliminates an exhaustive search of all possible arrhythmic combinations. The specificity of the first stage of diagnosis, i.e., the precise recognition of all pertinent P and QRS relationships, simplifies the second-stage contextual analysis. Given the detailed atrial and ventricular information available from the two-lead system, computer analysis of complex arrhythmias is now possible with greatly improved accuracy. Table 9 shows the two-level hierarchical scheme in which the final diagnosis is determined following an initial assignment of each beat into a tentative rhythm class.

Table 9. Rhythm Classes and Computer Contextual Diagnoses*

I. Normal sinus beats: normal sinus rhythm
II. Occasional abnormal beats: none
III. Frequent premature ventricular beats: couplet; bigeminy; trigeminy; ventricular tachycardia
IV. Frequent premature atrial beats: supraventricular tachycardia; atrial fibrillation; atrial flutter; supraventricular tachycardia with bundle branch block or ventricular tachycardia with retrograde conduction to the atria
V. Delayed or dropped beats: first-degree block; second-degree block; Wenckebach; advanced block
VI. Bradycardic beats: bradycardia
VII. Junctional or ventricular escape beats: third-degree block

*Sequences of beats belonging to the rhythm class shown on the left may lead to the contextual diagnostic statements shown on the right, provided other criteria are met. Reproduced by permission from the American Heart Association.94
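A minimal sketch of this second-stage, contextual pass is shown below: it scans a sliding window of recent single-beat labels for patterns like those in Table 9. The window length and matching rules are simplified assumptions; the actual program applies additional interval and rate criteria.

```python
# A minimal sketch of second-stage contextual analysis: scan the recent
# history of single-beat labels for patterns such as those in Table 9.
# The rules below are simplified assumptions, not the program's actual
# decision logic.

def contextual_diagnosis(history):
    """history: most-recent-last list of single-beat labels such as
    'SINUS', 'PVB', 'PAB', 'BLOCKED'. Returns a contextual statement
    or None."""
    h = history[-6:]                       # sliding window of recent beats
    if len(h) >= 3 and all(b == "PVB" for b in h[-3:]):
        return "VENTRICULAR TACHYCARDIA"   # three consecutive PVBs
    if len(h) >= 4 and h[-4:] == ["PVB", "SINUS", "PVB", "SINUS"]:
        return "BIGEMINY"                  # alternating PVB / sinus
    if len(h) >= 6 and h[-6:] == ["PVB", "SINUS", "SINUS"] * 2:
        return "TRIGEMINY"                 # PVB every third beat
    if "BLOCKED" in h:
        return "2ND DEGREE BLOCK"          # dropped beat in recent history
    return None

beats = ["PVB", "SINUS", "PVB", "SINUS", "PVB", "SINUS"]
print(contextual_diagnosis(beats))         # -> BIGEMINY
```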
Results from computer processing94 of atrial bigeminy are seen in Fig. 7, and ventricular bigeminy is shown in Fig. 8. Ventricular tachycardia with a 2:1 V-A ratio is computer-detected as a continuing sequence of PVBs, owing to the alternating relationship of each R wave to the prior A wave (Fig. 9). Atrial flutter as exhibited by the esophageal signal is easily diagnosed by computer as a sequence of PABs with a 2:1 A-V relationship and a rate in the range of 250 to 350/min; the contextual diagnosis is shown in Fig. 10. First- and second-degree atrioventricular block are seen in Fig. 11. Note that dropped beats are recognized and reported upon the subsequent QRS, and that Wenckebach periodicity is detected.

Although the database of patient recordings is too small for a statistical analysis of diagnostic accuracy, 29 12-sec passages of the episodes analyzed by computer were chosen at random for diagnosis by an electrocardiographer who was not told the computer diagnosis. Of the single-beat diagnoses, 95.5% were confirmed by the cardiologist as correct, 4.5% were found to be false negatives (the abnormality was detected by both the computer and the cardiologist, but the diagnostic statements disagreed), and there were no false positives. In these same tracings there were 38 distinct arrhythmic events, consisting of three or more beats, that required contextual analysis for identification. Of these sequences, two of the contextual analyses delivered by the computer (6%) were found to be in error when compared with the diagnosis of the electrocardiographer. One passage was diagnosed as NSR by the computer while the reader called it low atrial rhythm; in the second instance a computer report of NSR was given for a passage interpreted by the cardiologist as atrial escape rhythm. This new technique combining esophageal electrocardiography and computer analysis95,96 can differentiate supraventricular from ventricular arrhythmias, measure all PR intervals, and detect dropped beats.

COMPUTER ANALYSIS OF AMBULATORY ECG RECORDINGS

The technique of long-term recording of ambulatory patients with a miniature portable tape recorder (Holter) to detect transient arrhythmias constitutes a clinical procedure of major significance in electrocardiology.97 The recorder, which uses either a cassette or reel-to-reel magnetic tape, typically can record up to 24 hr of two channels of electrocardiographic data. In this way, transient arrhythmic events that occur unpredictably can often be captured during the 24-hr period in which the patient engages in normal activity. The long-term recording is essential if one is to observe a

Fig. 7. Atrial bigeminy recorded from the esophagus. The esophageal ECG is shown on the top channel, with a simultaneous signal from lead II beneath it. What appears to be sinus bradycardia when viewed from the surface lead is easily recognized when the esophageal lead is included. Precise computer detection of both the normal and the premature atrial beat (PAB) is possible. Each line of text is the diagnosis of a single beat as it appears on a video screen at the occurrence of a QRS complex. Beat numbers are indicated on the lower tracing. The diagnostic report, which is printed in real time, contains the beat number, a single-beat code representing interval measurements and QRS shape, and the diagnosis associated with that beat code. BL = blocked. (Reproduced by permission of the American Heart Association.)

Fig. 8. Computer arrhythmia classification with contextual diagnosis. This figure and each of the following figures is ordered as in Fig. 7. A beat-by-beat diagnosis of alternating premature ventricular beats (PVBs) and sinus beats with compensatory pause generates a further contextual report of bigeminy. Upon recognition of each PVB in a bigeminal pattern, the contextual diagnosis is repeated. The second consecutive sinus beat at beat 15 interrupts the bigeminy report, and at beat 16 a diagnosis of trigeminy appears. (Reproduced by permission from the American Heart Association.94)

Fig. 9. A sequence of premature ventricular beats (PVBs) that elicits a diagnosis of ventricular tachycardia (V-TACH). The PVBs occur in a 2:1 ratio with respect to atrial activation; thus, the beat codes are affected by the relationship of the QRS to the preceding A wave. Both cases are recognized as PVB types, however, and three consecutive such beats produce a diagnosis of V-TACH. (Reproduced by permission from the American Heart Association.94)

Fig. 10. Atrial flutter registered from the esophageal lead. A string of beats recognized as premature atrial beats (PABs) is further examined for atrial rate to distinguish specific atrial arrhythmias. This patient's atrial rate of 320 beats/min is recognized by the contextual analyzer as atrial flutter; this diagnosis is reported upon the appearance of each QRS complex. (Reproduced by permission from the American Heart Association.94)

Fig. 11. First-degree atrioventricular (A-V) block, second-degree block, and Wenckebach episodes. This patient exhibited a wide spectrum of arrhythmias indicative of A-V conduction disturbance. (A) AR intervals exceeding 300 msec are evident, and first-degree block is the diagnosis reported with each beat. The appearance of a blocked beat at beat 15 causes the contextual statement to be modified to second-degree block. (B) The AR interval progressively grows until a blocked beat appears after beat 15, eliciting a Wenckebach diagnosis at beat 16. This statement reverts to second-degree block, which continues to be reported if a blocked beat was present within the past five beats. (Reproduced by permission of the American Heart Association.94)

patient's electrocardiographic response during a complete spectrum of physical activities, including sleep and exercise, and during emotional stress. The recordings are later reviewed at 60 or 120 times real time in order to observe episodes of rhythm abnormalities that may be present. High-speed analysis is necessary in order to process the approximately 100,000 QRS complexes present in a typical 24-hr recording.

Early manual systems (scanners) required an operator to detect all incidents of ectopic activity and generate a report based on this manual assessment. Such analyses were qualitative rather than quantitative in nature, and results were highly dependent on the skill of the technician who scanned the tape. More recently, computer techniques for high-speed scanning have been developed for purposes of distinguishing normal from abnormal QRS complexes and for cataloging abnormal activity. The rate of error in quantification of arrhythmic events is expected to be reduced significantly as computer-assisted systems become more common. It should be noted that while commercial versions of these systems are commonly designated as automated analysis systems, this is a misnomer in that all present systems depend on the interaction of a human operator.98 In the more sophisticated systems the operator-technician preselects the parameter settings for various detection algorithms. For instance, normal QRS morphology must be pre-identified, QRS widths or interval lengths specified, and other thresholds adjusted for the particular ECG under analysis. The ability of the system to distinguish PVBs from aberrantly conducted supraventricular complexes is highly operator-dependent. This is because none of the currently available lead systems adequately or reliably records P waves.

Computer techniques applied to high-speed analysis of Holter electrocardiograms roughly parallel the systems developed earlier for diagnostic ECG and coronary care monitoring. A method for segregating normal QRS complexes from abnormal must be employed and, in addition, RR intervals are evaluated. Clustering of similarly shaped QRS complexes is a feature of most computer-based systems. Work in computer-assisted Holter scanning during the early 1970s consisted mostly of PVB detection, with efforts directed toward quantification of ectopic activity.99-103 Improved versions appeared in the mid-1970s,104,105 and at this time interactive systems were developed.106 Computer-analyzed ambulatory recordings were employed for evaluating antiarrhythmic drugs,107 and a finite-state-machine approach was advanced for rapid analysis of ventricular arrhythmias.108 Continued improvements in the accuracy of quantification of Holter scanning results were seen in the late 1970s110-113 as microprocessors and digital logic became standard on most playback units.

Selective recorders, which were automatically activated upon arrhythmia detection or patient-activated, replaced some of the continuous models as integrated circuitry became less costly and more miniaturized. These "smart" recorders are capable of real-time analysis and store only significant events. Limitations include missed episodes of arrhythmia, noise-produced activation, and an incomplete record for later analysis. Continuous recorders remain the more commonly accepted device, and playback units continue to become more sophisticated as microprocessor technology is applied to the problem.
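The review-speed numbers above imply a demanding processing rate; here is a quick back-of-the-envelope check, using the beat count and playback speedups quoted in the text.

```python
# A quick check of the throughput implied above: a 24-hr Holter tape
# holds on the order of 100,000 QRS complexes, so at 60x or 120x
# playback the analyzer must keep up with a steeply compressed data rate.

beats = 100_000
record_hr = 24
for speedup in (60, 120):
    review_min = record_hr * 60 / speedup                 # wall-clock review time
    beats_per_sec = beats / (record_hr * 3600 / speedup)  # classification rate
    print(f"{speedup:3d}x playback: {review_min:4.0f} min of review, "
          f"~{beats_per_sec:.0f} QRS complexes/sec to classify")
# 60x: 24 min, ~69/sec;  120x: 12 min, ~139/sec
```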
A computer interfaced to a Holter scanner that automatically digitized and stored each 10-sec episode upon arrhythmia detection was reported in 1979.114 Off-line rhythm analysis was envisioned as a long-term goal, but editing and plotting of the 10-sec passages were the only features available at the time of the report. Analysis of ST-T intervals was added to a high-speed playback system that digitized data at 60x real time115 and to the ARGUS/H system during the late 1970s. Recent innovations in Holter monitoring include microprocessor-controlled analysis and high-speed transtelephonic report generation,116 an almost "operator-free" automated system in which only the initialization stage requires operator interaction,117 and a two-channel preprocessor that detects QRS complexes independently on each channel with improved accuracy in the face of signal dropout.118

The limitations of complete automation of Holter monitoring and analysis remain much the same as those facing coronary care monitors. Separation of supraventricular from ventricular events (frequently the reason for ordering the test) is not possible without accurate recognition of atrial impulses. The shortcomings of surface

Fig. 14. Idioventricular rhythm as seen from the esophagus. This figure is ordered as in Fig. 12. The patient exhibits an accelerated idioventricular rhythm with retrograde conduction to the atria. After three beats the sinus node recovers the atria and the ventricles are dissociated. In the last three beats shown the sinus beats are conducted with aberrancy; in particular, the first of the conducted QRSs is almost imperceptible.

In the early 1960s clinical implementation became a reality. Despite the fact that 20 yr have elapsed since the initiation of automated analysis of ECGs, the question of performance evaluation raised early in this history remains essentially unanswered and continues to becloud the future of this sophisticated technique. Caceres49 pointed out one aspect of the problem in 1973 when he discussed the lack of consistency among ECG readers, or "observer variability," which he called a polite term for error. The advent of automation, he claims, forces us to consider standardization and requires us to examine our individualistically styled criteria. Studies done to evaluate computer quality in the past focused on comparing the computer results with cardiologist results without examining the manual system to determine its accuracy. Thus performance was computed vis-a-vis agreement with an expert or panel of experts, without regard for a standard criterion against which to evaluate all ECG readers, human or machine. Automation of the electrocardiographic process forces a rigid adherence to defined criteria. Since we can apply these criteria in order to evaluate the computer objectively and thus readily determine performance, Caceres asks whether we should not impose a similar evaluation upon the manual system. The computer is never plagued with day-to-day variability or the obvious lack of agreement between individual cardiologists. Indeed, the clinical accuracy of a computer interpretation is often a reflection of the ability of the computer to follow religiously the defined criteria, while a human observer is free to ignore strict adherence when he chooses.

In a 20-yr overview of computerized electrocardiography published recently, Caceres123 pointed out that past experience shows that once a computer has interpreted an electrocardiogram on the basis of defined criteria, the program designers rapidly receive feedback from those who have other data from other sources, and he suggests that data on electrocardiographic criteria and clinical correlation may be far more complete at any large commercial ECG processor's office than in any single academic heart station. In our quest to develop standards for evaluating programs, he recommends that we develop standards for human readers first; then the computer can duplicate whatever it is that we wish a human being to do, and probably improve the performance.
Evaluation of Diagnostic ECG Systems

Attempts to provide some quantitative evaluation of systems were made by developers and, eventually, by manufacturers of ECG systems. The methods of analysis varied broadly but invariably utilized the interpretation of a cardiologist (or team of cardiologists) as the "gold standard" for purposes of arriving at some figure of merit. Bailey et al.14 introduced a method for evaluating ECG computer programs that avoided the pitfalls of previous studies by separating disagreements between computer program and readers into criteria differences and program errors. The reasoning was

statements that can be made by human readers or computer programs.

Type A. Statements that refer to the diagnosis of an anatomic lesion or pathophysiologic state determined from nonelectrocardiographic evidence, i.e., cardiac catheterization, serum enzyme levels, ventriculograms, echocardiograms, scintigrams, and autopsy findings.

Type B. Statements that refer to the diagnosis of an electrophysiologic state that is primarily detected by the electrocardiogram itself. (If these are later confirmed by invasive methods, they could become type A statements.)

Type C. Purely descriptive statements of electrocardiographic features, the meaning of which cannot be precisely defined, i.e., flat T waves.

To measure the accuracy of type A statements it is necessary to gather a large test library in which a particular non-ECG method has been established as a fixed criterion for a given diagnosis. The accuracy of type B statements would require a similar database of common and uncommon arrhythmias and a sufficient number of conduction defects, such as bundle branch block and pacemaker rhythm. The "gold standard" for type B statements might be the consensus of a group of expert electrocardiographers, except in difficult cases where invasive techniques may be required. Type C statement accuracy is particularly difficult to evaluate because purely descriptive statements may or may not carry diagnostic significance and may reflect nomenclature common to local institutions or centers. Resolving semantic equivalencies is of major importance, as is determining a fixed criterion against which to test the statements. It is clear that construction of type A and B databases constitutes a difficult, tedious, time-consuming task and, if done, should include a spectrum of disease states of all severities and contain a representative number of difficult conditions or combinations of diseases.128

In the meantime, investigators continue to attempt evaluation by a variety of methods. A system (TELAVIV) for automated analysis and verification of long-term 3-channel ECG recordings has been designed by a group in Israel,129 and an annotated digital ECG database comprising 48 half-hour 2-channel ECG excerpts containing a variety of arrhythmias, conduction abnormalities, and artifact has been collected by a team in Boston.130 Willems131 recently examined two vectorcardiograph (VCG) and four 12-lead ECG programs in order to assess what measurement discrepancies existed. He found considerable measurement differences between programs, particularly in time measurements. The differences were considered not too surprising since "various programs apply different algorithms and references for wave recognition, beat selection, and parameter extraction, leading to significant differences in measurement results and diagnoses." A plea was made by the investigator for common standards as well as for standardization of the definitions of waves and measurement references.

Evaluation of Arrhythmia Monitors

The American Heart Association has sponsored the preparation of an annotated arrhythmia database for the purpose of evaluating arrhythmia monitors.132-134 The development of the database for evaluation of ventricular arrhythmia detectors is complete after 10 yr of planning, including 3 yr of collection and annotation of data.
The database was compiled for purposes of development and evaluation of automated detection systems, specifically computerized systems using digital data.135 The compendium consists of 160 digitized 3-hr segments of 2-channel ECG recordings. The last half hour of each has been annotated on a beat-by-beat basis by a panel of expert electrocardiographers. Eight arrhythmia classes are represented: (1) no PVBs, (2) isolated uniform PVBs, (3) isolated multiform PVBs, (4) bigeminy, (5) R-on-T, (6) couplets, (7) ventricular rhythms, and (8) ventricular fibrillation. Computer processing on the ARGUS/2H was used in almost all phases of the project, from initial processing of candidate recordings to reconciliation of the beat-by-beat diagnoses of individual electrocardiographers. The database is available through an official distributor selected by the American Heart Association. Work is currently underway on the design of a methodology for performance evaluation of arrhythmia detectors. There will be an associated catalog that will briefly describe the contents of each tape, including the source of the data, rhythm, arrhythmia

content, noise content, electrode placement, pertinent morphologies, and relevant clinical data. The catalog will also include a statement of policy regarding database usage.

SUMMARY AND DISCUSSION

A study of the use of computer-assisted ECG was performed for the federal government in 1975 by Arthur D. Little136 which cataloged the major ECG programs in routine clinical use in the United States: Cro-Med, ECAN, Hewlett-Packard, IBM-Bonner, Mayo '74, Phone-a-Gram, Telemed, and VA. The Telemed, Cro-Med, and Phone-a-Gram programs were available only through a service contract with the individual vendors; the Hewlett-Packard program was available only with the H-P dedicated system; the ECAN program was available to general users and was also implemented on the Roche commercial system; the IBM-Bonner program was an option offered by Hewlett-Packard, Marquette, TELEMED, and Roche; the Mayo program was available on the Marquette system; and the VA program was generally available. The VA and Mayo programs performed analysis on the orthogonal (X, Y, Z) leads; the remaining systems employed the standard 12-lead system.

The volume of computer-assisted ECGs in the United States in 1975 was roughly 4 million, a 4-fold increase from the 1 million figure in 1972 and a 20-fold increase from the 1970 figure of 200,000. Thirty-four percent of all computer-assisted ECGs were processed using the IBM-Bonner program and 29% were processed with the Telemed program. The statistics for 1976 reported by Task Force III for Optimal Electrocardiography127 can be seen in Table 11. There, the number of ECGs processed using the IBM program equals the number processed using the Telemed program (roughly 28% of the total). These figures do not agree precisely with those published by Arthur D. Little136 because they include some non-USA processors, but the chart reveals that 89% of the total volume of ECGs processed by computer is being processed by proprietary commercial programs, some of which have not disclosed any information about measurement or diagnostic logic. Rautaharju137 comments, "Perhaps for the first time in history, a medical diagnostic laboratory test is widely distributed without a full disclosure of the procedure." Concerns that many programs have been released without adequate testing, without documentation of test results, and without disclosure of criteria and key features further complicate the issue of standardization.

Table 11. Utilization of Computer Electrocardiographic Programs in 1976

Program: Annual Volume*
Proprietary programs
  IBM: 1,400,000
  TELEMED: 1,400,000
  Mayo: 325,000
  Cro-Med: 500,000
  Hewlett-Packard: 600,000
  Phone-A-Gram: 140,000
  Siemens: 150,000
Public domain
  ECAN: 195,000
  VA: 160,000
  CEIS: 110,000
  LDS: 18,000
  Glasgow: 12,000
  Hannover: 18,000
  CIMHUB: 22,000

*Annual volume is given for programs for which the reported volume exceeds 10,000 electrocardiograms. Reproduced by permission from the American Journal of Cardiology.127

A later study by Arthur D. Little10 revealed the trend in the use of ECG analysis programs from 1975 to 1978 (Table 12).

Table 12. Use of ECG Analysis Programs*

Program: % of all computer-assisted ECGs (1975 / 1977 / 1978)
  IBM/Bonner: 34 / 44 / 42
  TELEMED: 29 / 26 / 36
  Cro-Med: 7 / 7 / -
  Hewlett-Packard: 5 / 7 / 9
  Phone-a-Gram: 3 / 5 / 6
  ECAN (USPHS): 8 / 3 / <1
  Mayo-Smith: 8 / 4 / 3
  VA: 1 / <1 / <1
  Other: 4 / 3 / 2

*The overall growth in use of computer-assisted reading in the U.S. is shown in Figure 15. The growth rate in 1978 (13% annual increase in volume) was lower than in the previous 2 yr (21% annual increase). Reproduced by permission from IEEE Press.10

Cro-Med had virtually disappeared from the market, and

ECAN had slipped to less than 1% of the total. Those systems that processed X, Y, Z leads only (Mayo and VA) constituted less than 4% of the total. The commercially available program developed by IBM-Bonner accounted for 42% of the total market, with the bulk of the volume due to its use in the computer ECG system marketed by Marquette Electronics. The proprietary programs used by Telemed (36%), Hewlett-Packard (9%), and Phone-a-Gram (6%) constituted 51% of the total. Figure 15 shows the growth of computer-assisted ECG reading in the United States during the last decade; the depicted growth rate of about 1 million ECGs per year is expected to continue unabated over the next 2 to 5 yr.

Fig. 15. Growth of computer-assisted electrocardiograms processed in the U.S. during the past decade: volume of ECGs (millions) by year, from 1970 onward. (Reproduced by permission of IEEE Press.10)

The introduction of computers into clinical electrocardiography has had a variety of effects. There is no question that computers are ready to replace and should replace some of the functions of the electrocardiographer,137 and given that event, the very nature of the function of the electrocardiographer will undergo fundamental changes. While the use of computer assistance in electrocardiography has not resulted in any widespread improvement in diagnostic accuracy or dramatically altered the delivery of medical care, the inherent inadequacy of current ECG classification criteria has been made evident to users and developers.

The availability of computer-assisted electrocardiography may help to promote compliance with a recommendation advanced by Task Force IV of the American College of Cardiology: that a baseline electrocardiogram be obtained on all adults.138 Certainly advances can be expected in the early detection of cardiovascular disease if the procedure is broadly used on a routine basis for those patients who are at high risk of developing heart disease. The technique of computer analysis and storage of large numbers of ECGs within a single system provides a powerful tool for epidemiologic studies and also for amassing large databases of ECGs for purposes of further refinement of diagnostic criteria. Indeed, the creation of computer systems for ECG analysis has not only revealed the lack of standard and universal diagnostic criteria but most probably will be the mechanism by which improvements in this area will be realized.

Future directions of electrocardiography appear to be closely linked to future advances in related computer techniques. Task Force VI of the American College of Cardiology139 listed the following categories in clinical electrocardiography in which major work would develop: (1) body surface mapping, (2) direct cardiac mapping, (3) intracardiac electrocardiography, (4) signal-averaging techniques, (5) conventional electrocardiography, (6) magnetocardiography, and (7) stress testing. The development of many of these specialized techniques is possible only

with the utilization of computers for data acquisition, mass storage, and efficient analysis and interpretation of results.

Major technologic advances are emerging that will further enhance computer techniques in electrocardiography. Microprocessors are being included in the design of electrocardiographs and essentially convert the device to a sophisticated stand-alone computer. Preprocessors condition the signal prior to display or transmission to a central computer. Such preprocessing can digitally filter the signal to eliminate excessive noise, effect optimal trace positioning, and eliminate baseline wander. Some advanced electrocardiographs are stand-alone computers that complete all analysis within internally contained microprocessors, deliver a diagnostic report on a cart-mounted printer, and write the electrocardiographic data to floppy disk for archival storage.140 A commercial system unveiled in 1978 (IBM Corporation) executes the Bonner program on a microprocessor-based electrocardiographic cart and delivers an immediate diagnosis. Patient information can be entered via a keyboard and monitored on the display. The system generates a comprehensive report and records the ECG on a floppy disk for subsequent storage. The central system, with the capability of telephonic reception of ECG data from system-compatible nondiagnostic carts, provides a central database facility for instant diagnosis and serial evaluation of ECGs. Remotely acquired ECGs are temporarily stored on floppy disks in digital form and later hand-carried to the central facility.

New leads and lead systems for specialized purposes can be accommodated by computer systems without creating great computational problems. The esophageal lead for arrhythmia monitoring is one example,141 and leads for body surface mapping (the registration of electrocardiographic activity from an array of multiple electrodes located across the entire thoracic region) are another. The computer provides a mechanism for integrating information from new leads into existing systems142 and, in the case of multiple surface leads, can be employed to assess redundancy and thus provide a more sharply focused view of cardiac activity.

Other new technologic advances that are predicted to have an impact on the field of computerized electrocardiography are occurring in the area of digital transmission and the related processes of data compression and reduction. Past limitations of low-speed transmission devices (300 baud) and the need for faithful reproduction of 10 to 20 sec of multichannel data sampled at rates of 200 to 1000 per second (48,000 to 720,000 bits) made digital transmission unfeasible. New trends in high-speed modems (2400 to 9600 baud), coupled with data compression techniques providing compression ratios of 3:1 to 30:1,143-154 offer the distinct possibility of real-time digital transmission. The improved signal-to-noise ratio of digital versus analog transmission, plus error detection and correction algorithms, should serve to move digital transmission techniques from the experimental laboratory to clinical use within the near future. Shortly to be introduced by one manufacturer of computerized ECG systems is an electrocardiographic cart capable of analog-to-digital conversion at 250 Hz, digital preprocessing, and digital transmission to the central processor at 2400 baud with a standardized protocol.
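The transmission arithmetic behind these figures is easy to check. The snippet below assumes one baud carries one bit, as it did for the simple modems of the period; the compression ratio applied is one value from the 3:1 to 30:1 range quoted above.

```python
# A worked example of the transmission figures quoted above, assuming
# one baud = one bit per second (true for the modems of the period).

for bits in (48_000, 720_000):           # 10-20 sec of multichannel ECG
    for baud in (300, 2_400, 9_600):
        seconds = bits / baud
        print(f"{bits:7d} bits at {baud:5d} baud: {seconds:7.1f} s")

# Even the smallest record takes 160 s at 300 baud -- far from real
# time -- while 2,400 baud plus 3:1 compression brings it under 7 s:
print(48_000 / 3 / 2_400, "s with 3:1 compression at 2,400 baud")
```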
A major modification in the format of the unit record has been incorporated in order to effect efficient data storage at the central computer. Signal averaging is employed to extract a median cycle (a representative single beat) for each of 12 leads (a sketch of this computation appears at the end of this section). The direct writer of the electrocardiograph produces a record with each of the 12 median cycles and a 10-sec rhythm strip derived from leads V1, II, and V5.155 Eventually the signal averaging will be done on the digital cart rather than at the central computer.

Future trends in computerized electrocardiography will also be seen in the further automation of high-speed analysis of the ambulatory electrocardiogram, in the realm of exercise electrocardiography, in signal-averaging techniques for His bundle recordings from the surface and esophagus, and in further development of improved techniques for comparison of serial electrocardiograms. The computer will continue to play a major role in clinical electrocardiography as a powerful device for efficient data acquisition and high-speed storage and retrieval, as well as serve as an adjunct to the physician in measurement, comparison, correlation, logical and statistical decision-making, and analysis of the electrocardiographic waveforms.
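As promised above, a minimal sketch of median-cycle extraction follows. The 250-Hz sampling rate matches the cart described in the text; the window lengths, alignment method, and synthetic test data are illustrative assumptions.

```python
import numpy as np

# A minimal sketch of median-cycle extraction: align a fixed window
# around each QRS fiducial point and take the per-sample median, which
# suppresses noise and outlier beats. The 250-Hz rate comes from the
# text; window lengths and test data are assumptions.

def median_cycle(ecg, qrs_samples, fs=250, pre=0.3, post=0.5):
    """ecg: one lead as a 1-D array; qrs_samples: fiducial sample
    indices. Returns the median beat (a representative single cycle)."""
    a, b = int(pre * fs), int(post * fs)
    windows = [ecg[q - a:q + b] for q in qrs_samples
               if q - a >= 0 and q + b <= len(ecg)]
    return np.median(np.stack(windows), axis=0)

# Synthetic demonstration: identical beats plus noise recover cleanly.
fs, beat = 250, np.sin(np.linspace(0, np.pi, 50))
ecg = np.zeros(fs * 10)
fiducials = np.arange(fs, fs * 9, int(0.8 * fs))   # one beat every 0.8 s
for q in fiducials:
    ecg[q:q + 50] += beat
ecg += 0.05 * np.random.randn(ecg.size)
print(median_cycle(ecg, fiducials, fs).shape)      # (200,) samples
```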

REFERENCES

1. Noble D: The Initiation of the Heartbeat. London, Oxford, 1975
2. Hoffman BF, Cranefield PF: Electrophysiology of the Heart. Mt. Kisco, NY, Futura, 1960
3. Einthoven W: The different forms of the human electrocardiogram and their significance. (A paper read before the Chelsea Clinical Society on March 19, 1912.) Am Heart J 40:195, 1950
4. Frank E: An accurate clinically practical system for spatial vectorcardiography. Circulation 13:737, 1956
5. McFee R, Parungao A: An orthogonal lead system for clinical electrocardiography. Am Heart J 62:93, 1961
6. Pipberger HV, Bialek SM, Perloff JK, et al: Correlation of clinical information in the standard 12-lead ECG and in a corrected orthogonal 3-lead ECG. Am Heart J 61:34, 1961
7. Simonson E, Tuna N, Toshima H, et al: Diagnostic accuracy of the vectorcardiogram and electrocardiogram. A cooperative study. Am J Cardiol 17:829, 1966
8. Macfarlane PW, Lorimer AR, Lawrie TDV: 3 and 12 lead electrocardiogram interpretation by computer. A comparison in 1093 patients. Br Heart J 33:226, 1971
9. Stallman FW, Pipberger HV: Automatic recognition of electrocardiographic waves by digital computer. Circ Res 9:1138, 1961
10. Drazen EL, Garneau EF: Use of computer-assisted ECG interpretation in the United States, in Computers in Cardiology (Conference Proceedings). New York, IEEE Press, 1979, p 83
11. Pipberger HV, Arzbaecher RC, Berson AS, et al: Recommendations for standardization of leads and of specifications for instruments in electrocardiography and vectorcardiography. Circulation 52:11, 1975
12. Jokinen Y, Ahokas S, Joutsiniemi S-L: Data selection and data reduction for storage and retrieval of the ECG for serial comparison. IFIP-TC9 Working Conference on Optimization of Computer-ECG Processing, Nova Scotia, Canada, 1979, p 1
13. Wartak J: Computers in Electrocardiography. Springfield, IL, Charles C. Thomas, 1970, p 132
14. Cox JR Jr, Nolle FM, Arthur RM: Digital analysis of the electroencephalogram, the blood pressure wave, and the electrocardiogram. Proc IEEE 60:1137, 1972
15. Pipberger HV, Freis ED, Taback L, et al: Preparation of electrocardiographic data for analysis by digital electronic computer. Circulation 21:413, 1960
16. Taback L, Marden E, Mason HL, et al: Digital recording of electrocardiographic data for analysis by digital computer. IRE Trans Med Electron 6:167, 1959
17. Caceres CA, Steinberg CA, Abraham S, et al: Computer extraction of electrocardiographic parameters. Circulation 25:356, 1962
18. Pipberger HV: Computer analysis of the electrocardiogram, in Stacy RW, Waxman B (eds): Computers in Biomedical Research, vol 1. New York, Academic, 1965, p 377
19. Staples LF, Gustafson JE, Balm GJ, et al: Computer interpretation of electrocardiograms. Am Heart J 72:351, 1966
20. Wortzman D, Gilmore B, Schwetman HD: A hybrid computer system for the measurement and interpretation of electrocardiograms. Ann NY Acad Sci 128:851, 1966
21. Klingeman J, Pipberger HV: Computer classification of electrocardiograms. Comput Biomed Res 1:1, 1967
22. Pordy L, Jaffe H, Chesky K, et al: Computer analysis of the electrocardiogram: A joint project. J Mt Sinai Hosp 14:69, 1967
23. Bonner RE, Schwetman HD: Computer diagnosis of the electrocardiogram II. Comput Biomed Res 1:366, 1968
24. Bonner RE, Schwetman HD: Computer diagnosis of the electrocardiogram III. Comput Biomed Res 1:387, 1968
25. Pordy L, Jaffe H, Chesky K, et al: Computer diagnosis of electrocardiograms. IV. A computer program for contour analysis with clinical results of rhythm and contour interpretation. Comput Biomed Res 1:408, 1968
26. Smith RE, Hyde CM: Computer analysis of the electrocardiogram in clinical practice, in Manning GW, Ahuja SP (eds): Electrical Activity of the Heart. Springfield, IL, Charles C. Thomas, 1969, p 305
27. Pryor TA, Russell R, Budkin A, et al: Electrocardiographic interpretation by computer. Comput Biomed Res 2:537, 1969
28. Elliott RV, Simmons RL, Barnes DR: Computer-assisted electrocardiography in community hospitals, in Stacy RW, Waxman BD (eds): Computers in Biomedical Research, vol 4. New York, Academic, 1974
29. Cudahy MJ: Computer EKG studies (ed 3). Marquette Electronics Technical Publication, Milwaukee, WI, 1970, p 29
30. Bonner RE, Crevasse L, Ferrer MI, et al: A new computer program for analysis of scalar electrocardiograms. Comput Biomed Res 5:629, 1972
31. Hu K, Francis DB, Gau GT, et al: Development and performance of Mayo-IBM electrocardiographic computer analysis programs (V70). Mayo Clin Proc 48:260, 1973
32. Pryor TA, Lindsay AE, England RW: Computer analysis of serial electrocardiograms. Comput Biomed Res 5:709, 1972
33. Talbot S, Dreifus LS, Watanabe Y, et al: Diagnostic accuracy of a 15-lead hybrid computer-aided electrocardiographic system. Eur J Cardiol 1:29, 1973
34. Balda RA, Diller G, Deardorff E, et al: The HP analysis program, in van Bemmel JH, Willems JL (eds): Trends in Computer-Processed Electrocardiograms. Amsterdam, North-Holland, 1977, p 197
35. Goetowski CR: The Telemed system, in van Bemmel JH, Willems JL (eds): Trends in Computer-Processed Electrocardiograms. Amsterdam, North-Holland, 1977, p 207
36. Helppi RK, Unite V, Wolf HK: Suggested minimal performance requirements and methods of performance evaluation for computer ECG analysis programs. Can Med Assoc J 108:1251, 1973
37. Sandberg RL: Workshop on pattern recognition. Proceedings of Computerized Interpretation of the Electrocardiogram, Engineering Foundation Conference. Rindge, NH, 1975, p 1
38. Crevasse L, Ariet M: Computer EKG criteria for the diagnosis of left ventricular hypertrophy. Proceedings of Computerized Interpretation of the Electrocardiogram, Engineering Foundation Conference. Rindge, NH, 1975, p 69

AUTOMATED ELECTROCARDIOGRAPHY 405 Engineering Foundation Conference. Rindge, N.H., 1975, p 69 39. Selvester R: Criteria for atrial enlargement and infarct size (applicable to computer diagnostic programs). Proceedings of Computerized Interpretation of the Electrocardiogram, Engineering Foundation Conference. Rindge, N.H., 1975, p 81 40. Milliken JA: Criteria for bundle branch block in ventricular conduction delay. Proceedings of Computerized Interpretation of the Electrocardiogram, Engineering Foundation Conference. Rindge, N.H., 1975, p 101 41. Pipberger HV. McCaughan D, Littman D, et al: Clinical application of a second generation electrocardiographic computer program. Am J Cardiol 35:597, 1975 42. Macfarlane PW, Cawood HT, Lawrie TDV: A basis for computer interpretation of serial electrocardiograms. Comput Biomed Res 8:189, 1975 43. Kornreich F, Snoeck J, Block P, et al: An 'optimal' lead system for the diagnosis of coronary heart disease and hypertrophies using multivariate analyses, in Advanced Cardiology, vol 19. Basel, Karger, 1977, p 198 44. Kornreich F, Block P, Lebedelle M, et al: Multigroup classification by means of multivariate analysis, in Advanced Cardiology, vol 19. Basel. Karger, 1977, p 201 45. Willems JL, Pardaens J: Differences in measurement results obtained by four different ECG computer programs, in Computers in Cardiology, New York, IEEE Press, 1978, p 115 46. Miyahara H, Whipple GH, Teager HM, et al: Cardiac arrhythmia diagnosis by digital computer. Comput Biomed Res 1:277, 1968 47. Stark L, Dickson JF, Whipple GH, et al: Remote real-time diagnosis of clinical electrocardiograms by a digital computer system. Ann NY Acad Sci 128:851, 1966: 48. Willems JL, Pipberger HV: Arrhythmia detection by digital computer. Comput Biomed Res 5:263, 1972 49. Caceres CA: The case against electrocardiographic automation. Computer 6:15, 1973 50. LeBlanc AR, Roberge FA, Nadeau RA: Evaluation of a new e.c.g. measurement program for the detection of rhythm disturbances. Med Biol Eng Comput 13:370, 1975 51. Sajet M, Delcambre Y, Wurzburger J, et al: A new rhythm classification program applied to routine electro- and vectorcardiograms. Adv Cardiol 16:256, 1976 52. Hengeveld SJ, van Bemmel JH: Computer detection of P-waves. Comput Biomed Res 9:125, 1976 53. Miyahara H. Ednow K, Domae A, et al: Arrhythmia diagnosis by the Telemed ECG program. Optimization of Computer-ECG Processing, IFIP-TC4 Working Conference. Halifax, Nova Scotia. 1979, p 161 54. Wartak J, Milliken JA, Karchmar J: Computer program for diagnostic evaluation of electrocardiograms. Comput Biomed Res 4:255, 1971 55. Wolf HK, Macinnis PJ, Stock S, et al: Computer analysis of rest and exercise electrocardiograms. Comput Biomed Res 4:329, 1971 56. Hochberg HW, Wehrer AL. McAllester JW, et al: Monitoring of electrocardiograms in a coronary care unit by digital computer. JAMA 207:2421, 1969 57. Bonner RE: A computer system for ECG monitoring. IBM Technical Report 17-241, Advanced System Development Division, 1969 58. Watanabe Y: Automated diagnosis of arrhythmias by small-scale digital computer. Jpn Heart J 11:223, 1970 59. Haywood LJ, Murthy VK, Harvey GA, et al: On-line real time computer algorithm for monitoring the ECG waveform. Comput Biomed Res 3:15, 1970 60. Gersch W, Eddy DIM, Dong E: Cardiac arrhythmia classification: A heart beat interval Markov chain approach. Comput Biomed Res 4:385, 1970 61. Rey W, Laird JD, Hugenholtz PG: P-wave detection by digital computer. Comput Biomed Res 4:509, 1971 62. 
Feldman CL, Amazeen PG, Klein MD, et al: Computer detection of ventricular ectopic beats. Comput Biomed Res 4:666, 1971 63. Cox JR, Nolle FM, Fozzard HA, et al: AZTEC, a preprocessing program for real-time ECG rhythm analysis. IEEE Trans Biomed Eng 15:128, 1968 64. Cox JR Jr, Fozzard HA, Nolle FM, et al: Some data transformations useful in electrocardiography, in Stacy RRW, Waxman BD (eds): Computers in Biomedical Research, vol 3. New York, Academic, 1969, p 181 65. Oliver GC, Nolle FM, Wolff GA, et al: Detection of premature ventricular contractions with a clinical system for monitoring electrocardiographic rhythms. Comput Biomed Res 4:523, 1971 66. Gerlings ED, Bowers DL, Rol GA: Detection of abnormal ventricular activation in a coronary care unit. Comput Biomed Res 5:14, 1972 67. Geddes JS, Warner HR: A PVC detection program. Comput Biomed Res 4:493,-1971 68. Haisty WK, Batchlor C, Cornfield J, et al: Discriminant function analysis of RR intervals: An algorithm for on-line arrhythmia diagnosis. Comput Biomed Res 5:247, 1972 69. Sasmor L, King G: On line computer arrhythmia analysis using an analog preprocessor. Proceedings of the 24th Annual Conference on Engineering in Biology and Medicine 13:152, 1971 (abstr) 70. Dell'osso LF: An arrhythmia-anomalous beat monitoring system. IEEE Trans.Biomed Eng 20:43, 1973 71. Romhilt DW, Bloomfield SS, Chou TC, et al: Unreliability of conventional electrocardiographic monitoring for arrhythmia detection in coronary care units. Am J Cardiol 31:457, 1973 72. Yanowitz F, Kinias P, Rawling D, et al: Accuracy of a continuous real-time ECG dysrhythmia monitoring system. Circulation 50:65, 1974 73. Feldman CL: Evaluation of arrhythmia detectors, in Computers in Cardiology. New York, IEEE Press, 1974, p 21 74. Harrison DC, Sanders W, Tecklenberg P. et al: State of the art of automated arrhythmia detectors-commercial systems, in Computers in Cardiology. New York, IEEE Press, 1974, p I I 75. LeBlanc R, Roberge FA: Present state of arrhythmia analysis by computer. Can Med Assoc J 108:1239. 1973 76. Bernard R. Rey W, Vainsel H, et al: Computerized dysrhythmia monitoring with an intra-auricular lead, in Computers in Cardiology, New York, IEEE Press, 1974, p 17

77. Sanders WJ, Alderman EL, Harrison DC: Alarm processing in a computerized patient monitoring system, in Computers in Cardiology. New York, IEEE Press, 1975, p 21
78. Frankel P, Rothmeier J, James D, et al: A computerized system for ECG monitoring. Comput Biomed Res 8:560, 1975
79. Pierart M, Bachy JL, Marchand E, et al: Continuous arrhythmia monitoring based on a medium scale nondedicated computer, in Computers in Cardiology. New York, IEEE Press, 1975, p 181
80. Bussmann WD, Voswinckel W, Ameling W, et al: Online analysis of e.c.g. arrhythmias with a digital computer. Med Biol Eng Comput 13:382, 1975
81. Swenne CA, van Hemel NM: An interactive monitoring system for the coronary care unit, in Computers in Cardiology. New York, IEEE Press, 1975, p 187
82. Vetter NJ, Julian DG: Comparison of arrhythmia computer and conventional monitoring in coronary-care unit. Lancet 1:1151, 1975
83. Knoebel SB, Lovelace DE, Rasmussen S, et al: Computer detection of premature ventricular complexes: A modified approach. Am J Cardiol 38:449, 1976
84. Mantel JA, Strand EM, Wixson SE, et al: Simultaneous electrophysiologic and hemodynamic computerized monitoring, in Computers in Cardiology. New York, IEEE Press, 1976, p 157
85. Ritter JA, Thomas LJ, Ripley KL: ARGUS/RT: A microcomputer system for clinical arrhythmia monitoring, in Computers in Cardiology. New York, IEEE Press, 1977, p 79
86. Zencka AE, Lynch JD, Mohiuddin SM, et al: Clinical assessment of interactive computer arrhythmia monitoring, in Computers in Cardiology. New York, IEEE Press, 1977, p 317
87. Gustafson DE, Willsky AS, Wang J, et al: A statistical approach to rhythm diagnosis of cardiograms. Proc IEEE 65:802, 1977
88. Gustafson DE, Willsky AS, Wang J, et al: ECG/VCG rhythm diagnosis using statistical signal analysis-I. Identification of persistent rhythms. IEEE Trans Biomed Eng 25:344, 1978
89. Gustafson DE, Willsky AS, Wang J, et al: ECG/VCG rhythm diagnosis using statistical signal analysis-II. Identification of transient rhythms. IEEE Trans Biomed Eng 25:353, 1978
90. Jenkins JM, Wu D, Arzbaecher RC: Computer-based arrhythmia classification utilizing the PR interval, in Computers in Cardiology. New York, IEEE Press, 1976, p 149
91. Jenkins J, Wu D, Arzbaecher R: Extension of a single-beat algorithm into classification of arrhythmias in context, in Computers in Cardiology. New York, IEEE Press, 1977, p 305
92. Arzbaecher R: A pill electrode for the study of cardiac arrhythmia. Med Instrum 11:13, 1978
93. Jenkins JM, Wu D, Arzbaecher RC: Computer diagnosis of abnormal cardiac rhythms employing a new P-wave detector for interval measurement. Comput Biomed Res 11:17, 1978
94. Jenkins JM, Wu D, Arzbaecher RC: Computer diagnosis of supraventricular and ventricular arrhythmias: A new esophageal technique. Circulation 60:977, 1979
95. Jenkins J, Wu D, Arzbaecher R: The atrial electrogram in automated sorting of arrhythmias, in Antaloczy Z (ed): Modern Electrocardiology. Budapest, Akademiai Kiado, 1978
96. Jenkins JM, Arzbaecher R: On-line computer pattern recognition and classification of ventricular and supraventricular arrhythmias, in Macfarlane PW (ed): Progress in Electrocardiology. Glasgow, Pittman Medical, 1979, p 41
97. Holter NJ: New method for heart studies. Science 134:1214, 1961
98. Wenger NK, Mock MB, Ringquist I: Ambulatory ECG recording, Parts I and II, in Harvey WP (ed): Current Problems in Cardiology. Chicago, Year Book Medical Publishers, 1980, p 1
99. Fitzgerald JW, Clappier RR, Harrison DC: Small computer processing of ambulatory electrocardiograms, in Computers in Cardiology. New York, IEEE Press, 1974, p 31
100. Nolle FM, Oliver GC, Kleiger RE, et al: The ARGUS/H system for rapid analysis of ventricular arrhythmias, in Computers in Cardiology. New York, IEEE Press, 1974, p 37
101. Oliver GC, Kleiger RE, Krone RJ, et al: Application of high speed analysis of ambulatory electrocardiograms, in Computers in Cardiology. New York, IEEE Press, 1974, p 43
102. Hansmann DR: High speed rhythm and morphological analysis of continuous ECG recordings, in Computers in Cardiology. New York, IEEE Press, 1974, p 47
103. Neilson JM: High speed analysis of ventricular arrhythmias from 24 hour recordings, in Computers in Cardiology. New York, IEEE Press, 1974, p 55
104. Hansmann DR, Sheppard JJ: The new Dyna-gram system for high speed analysis of ambulatory ECG, in Computers in Cardiology. New York, IEEE Press, 1975, p 155
105. Hubelbank M, Feldman CL, Lane B, et al: An improved computer system for processing long-term electrocardiograms, in Computers in Cardiology. New York, IEEE Press, 1975, p 15
106. Mead CN, Ferriero T, Clark KW, et al: An improved ARGUS/H system for high-speed ECG analysis, in Computers in Cardiology. New York, IEEE Press, 1975, p 7
107. Fitzgerald JW, Winkle RA, Alderman EL, et al: Computer analyzed ambulatory electrocardiogram for predicting and evaluating responses to antiarrhythmic agents, in Computers in Cardiology. New York, IEEE Press, 1975, p 151
108. Florenz MK, Rolnitzky LM, Bigger JT: A rapid ECG processing computer program using the finite state machine approach, in Computers in Cardiology. New York, IEEE Press, 1975, p 145
109. Bradley JB, Tabatznik B: A new computer system for processing of Holter recordings, in Computers in Cardiology. New York, IEEE Press, 1977, p 187
110. Clark KW, Hitchen RE, Ritter JA, et al: ARGUS/2H: A dual-channel Holter-tape analysis system, in Computers in Cardiology. New York, IEEE Press, 1977, p 191
111. Klein MD, Baker S, Feldman CL, et al: A validation technique for computerized Holter tape processing used in drug efficacy testing, in Computers in Cardiology. New York, IEEE Press, 1977, p 199
112. Sheppard JJ, Hansmann DR: Applications of the Dyna-gram III B Holter ECG analysis system, in Computers in Cardiology. New York, IEEE Press, 1977, p 199
113. Spitz AL, Fitzgerald JW, Harrison DC: Ambulatory arrhythmia quantification by a correlation technique, in Computers in Cardiology. New York, IEEE Press, 1977, p 225
114. Murray A, Campbell RWF, Julian DG, et al: Operator-controlled computer system for analysis of 24 hour electrocardiographic recordings, in Computers in Cardiology. New York, IEEE Press, 1979, p 197
115. Biella M, Contini C, Kraft G, et al: A minicomputer based system for automatic analysis of 24 hour ECG and its evaluation, in Computers in Cardiology. New York, IEEE Press, 1979, p 201
116. Feldman CL, Hubelbank M, Lane B, et al: Performance enhancement of a Holter processing system with microprocessor controlled overreading stations and remote terminals, in Computers in Cardiology. New York, IEEE Press, 1980, p 127
117. Fancott T, Wong D, Lemire J: A software implementation of a high speed Holter ECG tape analysis system for a single small processor, in Computers in Cardiology. New York, IEEE Press, 1980, p 131
118. Bragg-Remschel DA, Harrison DC: A computerized two channel ambulatory arrhythmia analysis system, in Computers in Cardiology. New York, IEEE Press, 1980, p 197
119. Wu D, Denes P, Amat-y-Leon F, et al: Limitation of the surface electrocardiogram in diagnosis of atrial arrhythmias. Am J Cardiol 36:91, 1975
120. Arzbaecher R, Collins S, Jenkins J, et al: Feasibility of long-term esophageal electrocardiography in the study of transient arrhythmias. Biomed Sci Instrum 14:1, 1978
121. Collins S, Jenkins J, Brown D, et al: Rapid analysis of supraventricular arrhythmia from long-term esophageal recordings, in Computers in Cardiology. New York, IEEE Press, 1979, p 189
122. Arzbaecher R, Collins S, Mirro M, et al: Long-term esophageal recording in the analysis of supraventricular arrhythmia. International Symposium on Ambulatory Monitoring, Gent, Belgium, 1981, in press
123. Caceres CA: Present status of computer interpretation of the electrocardiogram: A 20 year overview. Proceedings of the Tenth Bethesda Conference, Optimal Electrocardiography. Am J Cardiol, vol 41, 1978
124. Bailey JJ, Itscoitz SB, Hirshfeld JW, et al: A method for evaluating computer programs for electrocardiographic interpretation. I. Application to the experimental IBM program of 1971. Circulation 50:73, 1974
125. Bailey JJ, Itscoitz SB, Graver LE, et al: A method for evaluating computer programs for electrocardiographic interpretation. II. Application to version D of the PHS program and the Mayo Clinic program of 1968. Circulation 50:80, 1974
126. Bailey JJ, Horton M, Itscoitz SB: A method for evaluating computer programs for electrocardiographic interpretation. III. Reproducibility testing and the sources of program errors. Circulation 50:88, 1974
127. Rautaharju PM, Ariet M, Pryor TA, et al: Computers in diagnostic electrocardiography. Proceedings of the Tenth Bethesda Conference, Optimal Electrocardiography. Am J Cardiol 41:158, 1978
128. Bailey JJ, Harris EK: Evaluation of ECG interpretation: Truth versus beauty. Computerized Interpretation of the Electrocardiogram, Engineering Foundation Conference. Asilomar, California, 1980, p 179
129. Rosenberg NW, Tartakovsky MB: The Tel-Aviv system: Three-channel evaluation of long-term ECG records for atrial and ventricular identification and verification of arrhythmia, in Computers in Cardiology. New York, IEEE Press, 1979, p 29
130. Schluter P, Mark R, Moody G, et al: Performance measures for arrhythmia detectors, in Computers in Cardiology. New York, IEEE Press, 1980, p 267
131. Willems JL: A plea for common standards in computer aided ECG analysis. Comput Biomed Res 13:120, 1980
132. Ripley KL, Oliver GC: Development of an ECG database for arrhythmia detector evaluation, in Computers in Cardiology. New York, IEEE Press, 1977, p 203
133. Ripley KL, Geselowitz DB, Oliver GC: The American Heart Association arrhythmia database: A progress report, in Computers in Cardiology. New York, IEEE Press, 1978, p 47
134. Hermes RE, Arthur RM, Thomas LJ Jr, et al: Status of the American Heart Association database, in Computers in Cardiology. New York, IEEE Press, 1979, p 293
135. Hermes RE, Geselowitz DB, Oliver GC: Development, distribution, and use of the American Heart Association database for ventricular arrhythmia detector evaluation, in Computers in Cardiology. New York, IEEE Press, 1980, p 263
136. Automated Electrocardiography in the U.S. Cambridge, Mass., Arthur D. Little, August 1976. NTIS PD2579183WV
137. Rautaharju PM: The impact of computers on electrocardiography. Eur J Cardiol 8:238, 1978
138. Resnekov L, Fox S, Selzer A, et al: Use of electrocardiograms in practice. Proceedings of the Tenth Bethesda Conference, Optimal Electrocardiography. Am J Cardiol 41:170, 1978
139. Zipes DP, Spach MS, Holt JH, et al: Future directions in electrocardiography. Proceedings of the Tenth Bethesda Conference, Optimal Electrocardiography. Am J Cardiol 41:184, 1978
140. Zywietz C, Joseph G, Grable W: A new intelligent electrocardiograph for stand-alone ECG analysis, in Computers in Cardiology. New York, IEEE Press, 1980, p 173
141. Arzbaecher R, Zurkonis C, Jenkins J, et al: P-wave based microprocessor rhythm module as an adjunct to existing computer ECG systems, in Computers in Cardiology. New York, IEEE Press, 1980, p 169
142. Macfarlane PW, Irving A, Peden J, et al: Progress with C.A.R.E., in Paul J (ed): Symposium on Computing in Medicine. London, Macmillan, in press
143. Whitman J, Wolf HK: An encoder for electrocardiogram data with wide range of applicability. Optimization of Computer-ECG Processing, IFIP-TC4 Working Conference. Halifax, Nova Scotia, 1979, p 43
144. Womble ME, Zied AM: A statistical approach to ECG/VCG data compression. Optimization of Computer-ECG Processing, IFIP-TC4 Working Conference. Halifax, Nova Scotia, June 1979, p 57
145. Power RG: Optimal ECG data compression. Proceedings on Computerized Interpretation of the ECG, Engineering Foundation Conference, California, 1980, p 183
146. Cohn DL, Melsa JL: Application of sequential adaptation to data rate reduction. Proceedings of the 8th Asilomar Conference on Circuits, Systems, and Computers, Asilomar, California, December 1979
147. Stewart D, Dower GE: An ECG compression code. J Electrocardiol 6:175, 1973
148. Cox JR, Fozzard HA, Nolle FM, et al: AZTEC, a preprocessing program for real-time ECG rhythm analysis. IEEE Trans Biomed Eng 15:128, 1968
149. Cady LD, Woodbury MA, Tick LJ, et al: A method for electrocardiogram wave-pattern estimates. Circ Res 9:1078, 1961
150. Womble ME, Halliday JS, Mitter SK, et al: Data compression for transmitting and storing ECG's/VCG's. Proc IEEE 65:702, 1977
151. Ahmed N, Milne PJ, Harris SG: Electrocardiographic data compression via orthogonal transforms. IEEE Trans Biomed Eng 22:484, 1975
152. Pahlm O, Borjesson PO, Johannsen K, et al: Efficient data compression and arrhythmia detection for long-term ECG's, in Computers in Cardiology. New York, IEEE Press, 1978, p 395
153. Ruttimann UE, Berson AS, Pipberger HV: ECG data compression by linear prediction, in Computers in Cardiology. New York, IEEE Press, 1976, p 313
154. Krishnakumar AS, Karpowicz JL, Belic N, et al: Microprocessor based data compression scheme for enhanced digital transmission of Holter recordings, in Computers in Cardiology. New York, IEEE Press, 1980, p 435
155. Mortara D, Marquette Electronics, Inc.: Personal communication
156. Macfarlane PW, Cawood HT, Lawrie TDV: A basis for computer interpretation of serial electrocardiograms. Comput Biomed Res 8:189, 1975