Ledelse og Erhvervsøkonomi/Handelsvidenskabeligt Tidsskrift/Erhvervsøkonomisk Tidsskrift, Bind 33 (1969)

Perspectives on Information Processing in Management Information Systems*).

Charles H. Kriebel **)

SUMMARY

Over the past several years there has been considerable speculation concerning the role, direction, and characteristics of information processing systems in the future. The gap between the technological state of the art in computer-based information processing and today's applied practices in management information systems further clouds the issues involved. It is argued in this essay that information systems in the future will play a considerably expanded role in managerial problem solving processes, augmenting and in some cases replacing the analytical skills which today reside in the decision maker. Some implications of this argument are illustrated by a discussion of DPS (Dialectic Problem Solver), an interactive man-computer decision analysis model currently under development.

1. INTRODUCTION

1.1 Management Information Systems: "The State of the Art"

The phrase "information system" has been used in a variety of connotations.In particular, the topic "management information systems" has served as a gigantic umbrella, encompassing clerical arrangements, electronic equipment and devices, data collection systems, managerial accounting, the



*) Presented at the Scandinavian-G.S.I.A. Joint Faculty Seminar, Aspenäsgården, Lerum, Sweden (August 3-16, 1969).

**) Ass. professor, Management Sciences Research Group, Graduate School of Industrial Administration, Carnegie-Mellon University, Pittsburgh, Pennsylvania 15213. This report was prepared as part of the activities of the Management Science Research Group, Carnegie-Mellon University, under Contract NONR 760(24) NR 047-048 with the U. S. Office of Naval Research. Reproduction in whole or in part is permitted for any purpose of the U. S. Government.


management processes of a firm, and so on. For clarity, a management information system is defined here to be "the configuration of human and capital resources which results in the collection, storage, processing, retrieval, communication, and use of data for management decision making and control." Our focus here is on information processing activities which bear directly on management functions vis-à-vis routine business data processing for local operations or to satisfy legal requirements. It is not necessary, at this point, to attach precise meanings to management functions in organizations. It is important, however, to distinguish between data and information within the context of management information systems.

Data are "'facts" which can take a variety of forms. They are the raw materials-the reports, measurements or images of organizational activitieswhich are collected and stored. Information is the intelligence of retrieved data when put to use in context. It is the output resulting from the conversion of "raw data" into a "product" which enables managers to take action appropriate within a particular frame of reference. The essential distinction, therefore, is that management information systems require focus on management functions (decision making and control) in addition to data processing activities. Said differently, it is impossible to have "good" management information without "good" data processing, however, the former does not automatically follow from the latter. Today, the most conspicious equipment resource in modern information systems is the computer.

In spite of the increases in expenditures on computer equipment, the absolute size of the investment in physical computer resources is still small in comparison with other key decision areas of top management. For every $100 spent on computer hardware, companies spend $187 on systems personnel; that is, equipment costs today range between 35 and 40 percent of the total outlay for information systems development, cf. McKinsey (1968). In 1962 computer equipment accounted for 60 to 70 percent of total costs. Thus, people and experience are increasingly the primary investment in information systems. Moreover, the focus for development in management information systems during the past decade has begun to shift in emphasis from administrative cost reduction in routine data processing to management opportunities in operations and control. That is, as companies have gained experience with computer based information systems they have begun to recognize that the large potential for payoff lies in the mainstream of management activities - operating decisions, planning, and control - and not in the mechanization of clerical activities; cf. Dean (1966), (1968), Garrity (1963), Kriebel (1967), (1968b), McKinsey (1968). Middle management no longer feels threatened by the computer system; rather, the computer system is viewed as a potentially valuable partner to managerial progress; Shaul (1964), Schwitter (1965), Business Week (1966), Myers (1967). Coincidental with this enlightenment has been the perceived need for better understanding of management processes and, more generally, of management information systems design. When one reads of "the management information explosion", it is unrealistic to suppose that "... we can save ourselves from drowning in data by installing faster printing devices"; Simon (1968). We must talk about designing "information processing systems" for management and not just designing "computers."

1.2 "Theory" versus "Practice"

Current prospects for a theory of management information systems are perhaps best exemplified by research on decision theory, which emphasizes the "value of information", e.g. Raiffa (1968), particularly the branch of decision theory dealing with the normative analysis of group decisions, such as the team decision construct of Professors Marschak and Radner; e.g., Kriebel (1968a). Although practitioners and theorists often use quite different language, one anticipates that both groups will identify similar topics of concern for the design of management information systems. A recent conference of systems professionals indicated that no clear consensus exists on management information systems theory; significant differences of opinion still beset the field.1) In the extreme it is argued that there is no theory of management information systems today. That is, current formal theories of information systems do not cope with the real problems facing top management; they do not meet the general requirements for a viable theory as perceived by line management.

One practical consequence of the hypothesis that there is no viable theory of management information systems is the critical need for management's involvement in the planning and development of modern information systems, McKinsey (1968). Top management has remained an outsider in systems design and development in part because the historical emphasis in information systems has been on (electronic) data processing and not on management decision and information processes. Systems professionals have viewed the information system as a well-specified (and often static) black box, rather than as a collection of dynamic processes which can change the technology and the product-market scope of the firm.



1) This conference was held in June 1968 at Carnegie-Mellon University under joint sponsorship of the Office of Naval Research and G.S.I.A.; cf. the forthcoming volume by J. T. Heames, C. H. Kriebel and R. L. Van Horn (1970).


Perhaps the most descriptive characteristic of the technology for current generation management information systems is their capacity for direct interaction between managers and the computer system. During the June 1968 conference at Carnegie, several empirical studies reported on the ways that computers do or should interact with, influence, and complement people. A rare consensus among systems professionals which emerged in this case is the conjecture that interactive systems will continue to grow in importance and may well become synonymous with management information systems.2) The rationale for this consensus is not based simply on the availability of technology; rather, it stems from the fact that the manager is involved in an interactive system. The manager's vested interest in an interactive system is readily apparent to him because of his direct involvement. Here, perhaps, is a first principle for a viable theory of management information systems: managers will use information systems in direct proportion to their perceived self-interest.

The discussion which follows attempts to provide some perspective for
the design of interactive management information and problem-solving
systems.

2. PROBLEM-SOLVING AND DECISION PROCESSES

2.1 Decision Making Under Uncertainty

In traditional terms the basic responsibility of management is to determine the profitable operation of an enterprise through the economic allocation of scarce resources, the factors of production. Management authority and control over operations is generally manifested through the process of decision making. Simon (1960) identifies three principal phases in the decision process: finding occasions for making decisions (the "intelligence" activity), finding possible alternative courses of action or strategies (the "design" activity), and selecting from among the available strategies (the "choice" activity). Other authors contend that the process extends beyond the choice activity and includes the activities of "implementation" and "follow-up", and therefore is a perpetual cycle. Both contentions are insightful, but for the moment we will defer these broader interpretations of decision processes to subsequent discussion, and focus attention on the "choice" activity.



2) Note, this consensus relates to development emphasis and concern. It does not mean that interactive systems will completely replace batch processing data systems. Organizations will continue to utilize both technologies, but in different proportions than today.


A common partitioning of the choice activity in decision making is to identify whether the decision is made by (1) an individual or (2) a group, under conditions of (A) certainty or (B) uncertainty; e.g., Raiffa (1968). Decision-making under certainty is defined as the case where the decision maker has complete knowledge about the elements in the process and each strategy or action alternative is known to lead invariably to a specific outcome. Decision-making under uncertainty is defined as the case where the decision maker has less than complete knowledge about the process elements and any strategy, or several, has as its consequences a set of specified outcomes, but each outcome occurs with a given (or estimated) probability.
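To make the "choice" activity under uncertainty concrete, the following is a minimal numerical sketch; the strategy names, payoffs, and probabilities are invented for illustration and are not drawn from the text. Each strategy maps to a set of outcomes with probabilities, and expected payoffs are compared across strategies.

```python
# Illustrative only: choice under uncertainty by expected payoff.
# Each strategy leads to a set of outcomes, each occurring with a
# given (or estimated) probability; certainty is the special case p = 1.
strategies = {
    "expand_plant": [(0.6, 120_000), (0.4, -40_000)],   # (probability, payoff)
    "subcontract":  [(0.9, 45_000), (0.1, -5_000)],
    "do_nothing":   [(1.0, 0)],
}

def expected_value(outcomes):
    return sum(p * payoff for p, payoff in outcomes)

for name, outcomes in strategies.items():
    print(f"{name:13s} expected payoff = {expected_value(outcomes):>9,.0f}")

best = max(strategies, key=lambda s: expected_value(strategies[s]))
print("choose:", best)   # expand_plant in this hypothetical case
```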

The mechanics for the normative analysis of decision processes are well documented in the literature of decision theory, e. g., Raiffa (1968). The specific details of the analysis in a given situation depend upon the individual assumptions imposed on the problem elements. The general logic of the decision analysis procedure is outlined in Figure 1. Although this diagram is self-explanatory, it is worthwhile to call attention to the first step in the process, i. e., "1. Define the decision". The implication of this step in the "problem statement" phase of the analysis is to answer the question: What decision must be made? If the problematic situation can be influenced by some allocation of resources, a decision (choice) must be made; but if we are only lamenting about circumstances beyond our control, no formal analysis will help.

Critics of the decision analysis procedure most often question its limitations as a descriptive model of decision making behavior. For example, a major question can be raised concerning the existence (let alone identification) of a preference ordering or measurable utility function on outcomes for the decision maker. Similarly, one might question the degree to which a manager can specify rules for actions, particularly when considering "nonprogrammable" decision processes; cf. Simon (1960). It is not our purpose to evaluate these criticisms or others as they pertain to the descriptive power of the "decision analysis" model. The model, or problem representation, is presented here only as a normative procedure for the analysis of the choice activity in decision processes under uncertainty. As a descriptive model, there is no question that it is an idealized oversimplification. In particular, the model requirements (problem element statements) seem to preclude the opportunity for its application to ill-structured problems, which are perhaps the principal sources of concern for upper and top management; Newell and Simon (1958). We explore this issue in more detail below, given the above background.


Figure 1. The general logic of the decision analysis procedure (diagram not reproduced).

2.2 Ill-Structured Problems

Newell and Simon (1958) consider the class of well-structured problems as those "described in terms of numerical variables, scalar or vector quantities" in which "the goals to be attained can be specified in terms of a well-defined objective function" and for which "there exist computational routines that permit the solution to be found and stated in actual numerical terms." They attribute three descriptive characteristics to the complementary class of ill-structured problems. First, many of the essential variables in the problem are symbolic or verbal and not numerical. Second, the objective function or goal is non-quantitative and often vague. Third, computational algorithms for solution are not available. It is clear that this distinction between well-structured and ill-structured problems is not precise in most circumstances, since few real problems have all the properties of one class only; Newell (1969).


Reitman (1964), in his consideration of cognitive processes, proposed a formalism for characterizing "ill-defined" problems. The essentials of Reitman's model specification consist of a problem vector, a problem requirement, and a solution statement. The elements of a problem (vector) at the lowest representation consist of an initial state "A", a terminal state "B", and a procedure (means) "→" which transforms (maps) A into B. These three elements are defined as a problem vector "[A, B, →]"; any given problem may be a single vector or a set of vectors, at different (hierarchical) levels of detail. Problems originate as open statements about the elements of a problem vector (set); these statements are refined to specific statements through successive attention to detail, e.g., by detailing assumptions, constraints, parameters, etc. A problem vector becomes a "problem" by specifying a problem requirement, "[A′, B′, →′]". A solution is defined as a feasible problem vector such that the problem requirement is satisfied; i.e., A′ →′ B′ and [A′, B′, →′] ⊆ [A, B, →]. Reitman classified ill-defined problems within this framework depending upon the degree to which various elements in the problem vector can be identified.
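A minimal sketch of how Reitman's formalism might be rendered as a data structure follows; the class, the requirement, and the toy budget numbers are assumptions introduced for illustration, not part of Reitman's paper.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class ProblemVector:
    """Reitman's [A, B, ->]: an initial state A, a terminal state B,
    and a procedure (means) "->" which transforms A into B."""
    initial: Any                      # state A
    terminal: Any                     # state B
    procedure: Callable[[Any], Any]   # the means "->"

def is_solution(candidate: ProblemVector,
                requirement: Callable[[ProblemVector], bool]) -> bool:
    """A solution is a feasible problem vector [A', B', ->'] such that
    A' ->' B' holds and the problem requirement is satisfied."""
    reaches_terminal = candidate.procedure(candidate.initial) == candidate.terminal
    return reaches_terminal and requirement(candidate)

# Toy example: an open constraint ("spend at most 100") is "closed" into a
# specific requirement against which a candidate vector is tested.
candidate = ProblemVector(initial=140, terminal=95, procedure=lambda budget: budget - 45)
print(is_solution(candidate, requirement=lambda v: v.terminal <= 100))   # True
```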

The framework by Reitman provides a broad descriptive model for problem representation in decision processes. In one sense it can be used as a predecessor of the decision analysis model of choice activity, since it contains a structural representation of Simon's (1960) design activity. More generally, the concept of a problem vector can be introduced at each phase of the model to expand the descriptive power of decision analysis. For example, the concepts of a solution statement and a problem requirement are generalizations of the concept of a measurable utility function or preference ordering of outcomes; the procedure often will not require the explicit specification of a utility function, but only the "closing" of open constraints sufficient to permit an analysis of the attributes of feasible problem requirements; cf. Kriebel (1969). Similarly, the mathematical representation for decision rules is really a subset of Reitman's procedure "→", which includes computer programs, decision protocols, and other paradigms. In this regard an important area of descriptive research on problem-solving procedures that merits further consideration is heuristic programming.

2.3 Heuristic Programming and the Simulation of Cognitive Processes

A heuristic is a rule of thumb (a strategy, an "intelligent" procedure, device, gimmick, etc.) which limits search activity (for alternatives and solutions) in problem spaces. Heuristics do not ensure that optimal (or perhaps any) solutions to a problem will be forthcoming; a useful heuristic will generate "good enough" results most of the time and may solve given problems. A heuristic program is a procedure (typically, a computer program) which employs a set of heuristics for solving complex problems; cf. Feigenbaum and Feldman (1963). Heuristic programs trade off the cost of non-optimality against processing costs (time and complexity) and the availability of alternative algorithms. In this regard, they tend to discover acceptable solutions more efficiently than do exhaustive search methods.

Interest in the theory of human problem solving during the past decade has focused on programming computer models of mental processes through heuristic programs and simulation; e.g., Clarkson (1962), Cyert and March (1963), Feigenbaum and Feldman (1963), Tonge (1961). One of the most interesting models of this kind was the General Problem Solver (GPS) program developed by A. Newell, C. Shaw, and H. Simon. GPS was called "general" because it made no specific reference to the subject matter of the problem - not because it could solve any problem posed to it. Human problem solving as modeled by GPS conformed to the following outline:

The process "... proceeds by erecting goals, detecting differences between present situation and goal, finding in memory or by search tools or processes that are relevant to reducing differences of these particular kinds, and applying these tools or processes. Each problem generates subproblems until we find a subproblem we can solve - for which we have a program stored in memory. We proceed until, by successive solution of such problems we eventually achieve our overall goal-or give up." (Simon, 1960).

This descriptive paradigm of problem-solving by means-ends analysis was to a great degree synthesized from experimental research on humans in the process of solving problems. Realization of the model as a computer program, or internalized process of a machine, was a considerable departure at the time from conventional thoughts on computers as mechanical devices. Although not without critics, e.g., Pierce (1962) and Oettinger (1964), this enlightened view of computers represents perhaps the most significant development in computer systems research during the past decade. The essential aspects of this position on computers were outlined by Clarkson (1962) as follows:

"1. Computers are general-purpose devices that are capable of employing
operations for manipulating symbols. They can accept symbols
as inputs ..., emit symbols as outputs ..., erase symbols ...,
and store symbols. They can copy symbols . . . , and compare symbols.


Finally, and most important, they can behave differently depending on whether a pair of patterns, when compared, turn out to be identical or different ... By virtue of this last capacity, they can follow strategies - that is, make decisions that are conditional upon any kind of symbolic information.

"2. The symbols, or patterns, that computers can input, output, compare,
and process can be interpreted as numbers, as words, as English
sentences, or even as geometric diagrams .. .

"3. A number of computer programs has been written that process non-numerical symbols. At least one of these is designed to be capable of applying means-ends analysis to the solution of a fairly wide range of types of problems." [i.e., GPS above].

"4. A computer can be programmed to modify its program on the basis of its own experience; - that is, to learn . . . Thus, the computer is essentially a determinate system that is free to produce, adaptive, complex, and intelligent behavior."

Much of the creative research engendered by this perspective of the computer has been in the fields of behavioral psychology, computer science, and artificial intelligence (e.g., Feigenbaum and Feldman (1963), Reitman (1965), Kleinmuntz (1968)), with relatively little spillover to the area of management. Even though some of these efforts have been directed at management problem areas, e.g., Tonge (1961) and Clarkson (1962), it has been suggested that the programs developed involve only a few basic ideas and further that the full range of complexity in management decision problems has yet to be explored; Newell (1969). For example, Clarkson's model (1962) of the investment decision process is built upon the problem-solving components of: (1) a memory containing lists of information; (2) search and selection procedures for processing (retrieving) information from memory; and (3) a set of unambiguously defined rules which guide the decision-making process. These components are well-specified within the framework of investment decisions about portfolios and are self-contained as a computer simulation program. The self-contained characteristic of the model is common to much of this literature; that is, nearly all of these models are "closed-end" in that the decision maker is peripheral to the internalized system. In the context of our introductory remarks, these models are not interactive information processing systems and for all practical purposes operate independently of the manager.
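The "closed-end" character of such a model can be caricatured in a few lines: the three components are present, but once the inputs are supplied the program runs to completion with no further interaction from the decision maker. The memory contents, rules, and client profile below are invented and bear no relation to Clarkson's actual program.

```python
# Schematic "closed-end" decision simulation with the three components:
# (1) a memory of lists, (2) search/selection procedures over that memory,
# and (3) unambiguous rules guiding the decision.
memory = {                        # (1) lists of information
    "growth_stocks": ["Alpha Corp", "Beta Inc"],
    "income_stocks": ["Gamma Utilities", "Delta Rail"],
}

def retrieve(category):           # (2) search and selection procedure
    return memory.get(category, [])

def decide(client):               # (3) unambiguous decision rules
    if client["objective"] == "growth" and client["risk_tolerance"] == "high":
        return retrieve("growth_stocks")
    return retrieve("income_stocks")

# The manager is peripheral: once the profile is supplied, no interaction occurs.
print(decide({"objective": "growth", "risk_tolerance": "high"}))
```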

The insight provided by information processing models of human problem-solving could be used to expand the descriptive realism of the decision analysis representation - e.g., as briefly suggested in the preceding discussion. More importantly, perhaps, it might be used to design the interface between a normative computer model (such as the decision analysis representation) and the manager's perception and processing of the "real" problem in an interactive system. In such an interactive framework the computer model can augment the manager's own information processing skills, and the symbiosis can yield a potentially "better" solution than either man or computer could realize independently; cf. Licklider (1960).

3. INTERACTIVE DECISION AND INFORMATION PROCESSING

3.1 Dialectic Programming

The field of artificial intelligence has been primarily concerned with machine (computer) replication and accomplishment of human-like activities. The two main branches of research on artificial intelligence have concentrated on either problem-solving processes and heuristic programming, or pattern recognition, self-organizing and learning systems; Newell (1969). Recently, there has also developed a concern for circumstances in which the computer assists the human who is himself performing intellectual tasks, i.e., so-called computer augmentation of human reasoning; Sass and Wilkinson (1965), Wilcox (1965). The phrase dialectic programming has been used to describe systems for man-computer problem-solving processes that provide interaction at an intellectual level to permit synthesis of a more valuable solution than either man or machine could produce independently; Wilcox (1965).

At a minimum the rationale for dialectic programming can be drawn on the basis of the relative comparative advantages for information processing by man and computer, respectively. Comparative advantage usually implies economies of one form or another, such as lower cost, increased efficiency, simplification, greater speed, etc. One can also imagine the computer or machine as a physical extension of human capabilities, for example, analogous to the mechanical manipulators employed in industry for remote operations in environments hostile to humans - such as where there is high atomic radiation. In this context numerous examples and opportunities currently exist in the outer space exploration program of the United States, e.g., the Apollo Project under NASA.

A more general rationale for dialectic programming is available by considering situations where the computer actively participates in the problem-solving process itself, interacting in much the same way as an intelligent assistant, staff member or colleague would; e.g., Holland (1960), Newell (1960). In this case the computer is contributing much more than the performance of simple tasks at greater speed or lower costs. The computer system is now interacting on an intellectual level with the human; it accepts input from the decision maker, analyzes this data and criticizes or corrects it to the extent required. In the extreme the computer might even argue an opposing point of view with the decision maker, so that through the interaction a broader perspective is provided for the problematic situation than would be available from the human alone.

"Perhaps the highest example of this process in human society lies in the American court system, wherein a plaintiff and a defendant argue their opposing views in detail so that a judge or a jury has the best chance of deducing the true situation. A similar but less formal example, one upon which the progress of science depends, is the discussion and debate which takes place in technical journals and at scientific meetings." Wilcox (1965).

Evidence on the need for augmenting (and correcting) human information
processing in management decision-making is already available; e. g.,
Ackoff (1967), Edwards (1968), Harris (1963).

3.2 Exploratory Research

As a basis of inquiry into the design of more powerful and relevant management information systems a research project has been initiated to develop a prototype man-computer model with dialectic programming capabilities for online, interactive decision analysis and problem-solving. This model has been named DPS, for Dialectic Problem Solver (or Dialectic Programming System). DPS will be implemented on a time-shared computer system and incorporate dialectic and heuristic programming characteristics for the analysis and solution of complex decision problems.

The overall structural characteristics of the DPS model are outlined in Figure 2. The internal environment of the problem-solving system is represented by the information processing programs of the manager and the computer; the DPS model is the interface between these sets of programs which monitors and guides the dialogue. The three principal stages of the problem-solving dialogue within DPS are: problem orientation, problem analysis, and problem evaluation. Referring to Figure 2, the problem orientation stage begins with the manager's perception of a "problem" (or problematic symptoms) in the external environment. The perceived problem may be triggered by a specific event which requires an allocation of resources (e.g., the preparation of a marketing plan), or by a series of observations that suggest remedial action is necessary (e.g., declining profit margins on sales). Given an immediate stimulus the manager initiates a dialogue with the system in the form of a "problem statement".


Figure 2. Structural outline of the DPS model (diagram not reproduced).

The problem statement consists of specifying those identified clues in the external task environment which indicate the existence of a problem. For the generalized input processes of DPS this requires parametrization (including identification of logical and symbolic relationships as well as quantification of variables) of problem vectors, problem requirements, and solution statements. The "conversational" exchange between the manager and the computer continues until an "internal" problem representation (or several) is established for the perceived problem in the external environment.3) For example, the internal problem and solution representation might consist of a set of well-defined conditions on environmental state variables, which provide a description of the existing situation and a desired situation. These conditions may be ordered in terms of a hierarchy of sub-problems or be an assimilated list of diagnosed symptoms that merit further scrutiny. The problem orientation dialogue terminates with a problem image (or model) based on the representation, or at the manager's discretion.4)

The problem analysis stage of the dialogue is initiated by the input of a problem image (or model) based on the problem and solution representation. Two observations on this process are worth noting. First, as outlined above the problem orientation dialogue concerns the diagnosis of symptoms to the point where a "problem", per se, is identified. The identification of the problem requires the mapping of "clues" from the task environment into an information processing image for the computer. That is, the characteristics of the real world problem are translated via some representation into an image, model, or program that is capable of being processed by computer. Perhaps the most common representation for this purpose from the manager's perspective is natural language, i.e., his normal medium of conversation - be it English, Danish, Russian, Greek, or whatever. While natural language is extremely flexible, easy to communicate, and a highly familiar representation, it often is inefficient, lacks rigor, and poses some obvious technical problems for computer processing. Furthermore, several other types of representations, such as decision trees and two-dimensional pictures (or graphs), can store implications about a problematic situation (e.g., environmental "patterns" and inferences) much more compactly than natural language. In particular,

"Knowing more than one is told is a characteristic of human per-



3) As a more familiar analogy on the issue of representation in problem-solving, consider the following exercise in long division: MMCDI ÷ XLIX? For "latter-day Romans" the answer is 49.

4) In the latter case, the manager may no longer require the "assistance" of DPS or else he is unable to specify descriptive conditions in sufficient detail for DPS to continue.


performance which is present in most behaviors which are called intelligent. We have argued that this characteristic is necessary for machines which are to solve the real problem of information retrieval, language translation, and problem-solving. And furthermore, we must find efficient ways to store implications if we are to develop intelligent machines with finite memory capacities; that is, if we are to develop intelligent machines." Feigenbaum and Feldman (1963), p. 219 ff.

A popular form of internal representation in problem-solving systems (e.g., simulation of cognitive processes) has been the use of so-called "list structures". A list structure is a form of associative computer memory, wherein each symbol or data element is labeled with an indicator which tells the machine the location of a related symbol, and each symbol, in turn, may refer to a string of other related symbols, producing a hierarchical organization of memory associations; e.g., Feigenbaum and Feldman (1963). Without belaboring the representation issue further, the medium for problem and solution representation establishes the information processing design for the computer and will influence the efficiency of the dialogue between man and machine. In some instances the representation of the problem will lead naturally to a specific model for analysis; in other instances, the representation will delimit a class of admissible models, the specific analogue to be selected through further analysis - by the manager, the computer, or both. (For example, see Baker (1967), Chapter 5 in Heames, et al. (1970), Joyner and Tunstall (1968), Kleinmuntz (1968), Newell (1960), Newell (1965), and Simon (1966) for elaboration of this issue.)
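A list structure of this kind is easily mimicked with nested references. The sketch below (the symbols are invented) shows an associative memory in which each element points to a list of related elements, each of which may head its own list.

```python
# Each symbol is labeled with pointers to related symbols, and each of those
# may in turn head its own list, giving a hierarchical associative memory.
memory = {
    "problem":          ["symptom", "requirement"],
    "symptom":          ["declining_margin", "late_deliveries"],
    "requirement":      ["margin >= 0.12"],
    "declining_margin": [],
    "late_deliveries":  [],
}

def associations(symbol, depth=0):
    """Walk the list structure, printing the hierarchy of associations."""
    print("  " * depth + symbol)
    for related in memory.get(symbol, []):
        associations(related, depth + 1)

associations("problem")
```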

The second observation on the problem analysis stage of DPS is the fact that under certain circumstances the manager may desire to initiate interaction with the system at this (second) stage, rather than at problem orientation. The requirement for advancing the dialogue is the ability to input a problem image or model in an acceptable representation. Thus, if the problem is "sufficiently well-understood" by the manager, he can begin with problem analysis; if this strategy proves naive, he can, with hindsight, reinitialize the dialogue and return to problem orientation.

Given the input of a problem image or model, problem analysis proceeds to the selection of a method which manipulates the model in an attempt to obtain a solution. This process cycles between the manager (and his store of methods), the computer, and test results, based on successive modification of the problem image and the methods chosen. The process terminates when either the "solution test" is passed, or the test is not satisfied but no more methods are available, or the manager abandons the particular problem image under consideration. If the solution test is satisfied, a "solution
statement" is provided and the dialogue proceeds to problem evaluation.

Problem evaluation begins with the "solution statement" from DPS to the problem image. The manager then interprets this statement relative to the perceived problem in the task environment. If "reasonable correspondence" exists in his judgment between the solution statement and the "problem", he proceeds to effect implementation and observe results. If the correspondence between the model solution and the environmental problem is "poor", or the results from implementation suggest "new problems", he may again return to DPS for additional processing or terminate the interaction.

The preceding overview of the DPS system model, although lacking in detail, suggests the general structural requirements for an interactive management information system that possesses operating characteristics deemed important by line managers. In outline form it also includes features relevant to broader interpretations of the decision making process, e.g., Simon's intelligence, design and choice activities. DPS will also maintain a trace diary of man-computer interactions, similar to the trace procedures employed in CAI (computer-assisted instruction) systems, as a data reference for learning and evaluation. Given the conjecture that information processing technology exists today to permit development of a DPS prototype, the current research program was initiated. In the interests of brevity, an extended discussion of this research is not included here. The program development is being undertaken in three phases, roughly corresponding to the principal stages of the dialogue within DPS. The current project is primarily concerned with the problem orientation portion of the complete model, and is attempting to cope with the diagnosis of task environments, problem representation, and the development of general constructs for problem images. This component of DPS will incorporate the normative structure of "decision analysis" (as a "desired" problem image), as well as the more general descriptive characteristics of Reitman's analogue and heuristic programming. Although computer programming of the model is being performed in Fortran IV, the list structure representation and the list-processing features of other special purpose languages will be included. Further discussion of this research project will be forthcoming in a subsequent report as initial results become available.

4. OPEN ISSUES

The design of management information systems (or, for that matter, of
any structure) requires superposition of objectives, goals, and a performance measure as a prerequisite to the analysis. Said differently, the activity of systems analysis is descriptive by nature; the activity of systems design is normative by definition. Using the black-box analogy, systems design involves the reconfiguring and adjustment of elements within "the box", so that given inputs yield desired outputs. Practical experience in management information systems development today suggests that an important attribute of "successful" systems is the involvement of line management; that is, management establishes or strongly influences the objectives and goals of the design - they provide dimensions on what is "desired" of the system. Managers and systems professionals working as a team can establish a rapport such that each group has an appreciation for the other's value system. Churchman (1968) summarized this viewpoint in slightly different terms in describing "the systems approach"; viz.:

"1. The systems approach begins when first you see the world through
the eyes of another. .. .

"2. The systems approach goes on to discovering that every world
view is terribly restricted. . . .

"3. There are no experts in the systems approach. ...

"4. The systems approach is not a bad idea."

In 1960 Pierce argued that computers are basically "dumb" machines with an extraordinary potential for routine data processing tasks (e. g., performing simple calculations on enormous volumes of data at great speed) and little capacity for the "intelligent" information processing attributable to humans. An extension of this argument says that research which attempts to extend the computer's capacities in the direction of human-like activities does so in error, and ignores the "natural" comparative advantages of man and machine, respectively. This position was essentially repeated by Oettinger (1964) in his "bulling" versus "cowing" classification.

One implication of the Pierce-like rationale is to view the computer as an enabling device for data processing activity alone. This view ignores the distinction between "data" and "information" and, with it, the vital difference between business data processing and management information systems. Several years ago C. N. Parkinson gave society the pragmatic law of business economics that "expenditures rise to meet income." The past fifteen years of experience with computers in administrative organizations suggests a comparable law (sic) for computer applications in management information systems. Since the phenomenon generalizes, I've called it The Law of the Hammer. In its pristine form the Law of the Hammer says, "If you give a five-year old boy a hammer, he will soon discover that there are a lot of things which need hammering." There are over 70,000 computer installations in existence today and an additional 20,000 computers on order throughout the world; A.F.I.P.S. (1966), McGovern (1968). Too often, managers and systems professionals are discovering "there are a lot of things which need computing." While it is important to avail ourselves of advances in information technology, we must be careful not to ignore the end objective of our efforts. More importantly, perhaps, as professionals involved in research on information technology, we have an opportunity to influence the direction and contribute to progress which is sorely needed in the field. From a pragmatic point of view these needs demand an information processing perspective, given past failures with the myopic view of data processing.

REFERENCES

Ackoff, R. L. (1967), "Management Misinformation Systems," Management Science
(December), pp. B147-B156.

A.F.I.P.S. (1966), The State of the Information Processing Industry (American Federation
of Information Processing Societies, New York City, April 1966), 103 + v
pages.

Ansoff, H. I. (1965), Corporate Strategy (McGraw-Hill).

Baker, F. B. (1967), "The Internal Organization of Computer Models of Cognitive
Behavior", Behavioral Science (March), pp. 156-161.

Business Week (1966), "How Computers Liven a Management's Ways" (June 25).

Churchman, C. W. (1968), The Systems Approach (Delacorte Press).

Clarkson, G. P. E. (1962), Portfolio Selection: A Simulation of Trust Investment (Prentice-Hall).

Cyert, R. M. and March, J. G. (1963), A Behavioral Theory of the Firm (Prentice-
Hall).

Dean, N. J. (1968), "The Computer Comes of Age", Harvard Business Review (January-
February), pp. 83-91.

Edwards, W. (1968), "Conservatism in Human Information Processing", pp. 17-52 in
Kleinmuntz (1968).

Epstein, R. A. (1967), The Theory of Gambling and Statistical Logic (Academic Press).

Feigenbaum, E. A. and Feldman, J. (ed.) (1963), Computers and Thought (McGraw-
Hill).

Garrity, J. T. (1963), "Top Management and Computer Profits," Harvard Business
Review (July-August), pp. 6-12.

Harris, Jr., J. G. (1963), "Judgmental Versus Mathematical Prediction: An Investigation
by Analogy of the Clinical Versus Statistical Controversy," Behavioral Science
(October) pp. 324-335.

Heames, J. T., Kriebel, C. H., and Van Horn, R. L. (1970), Management Information
Systems: Progress and Perspectives (Prentice-Hall, forthcoming).

Holland, J. H. (1960), "Iterative Circuit Computers," pp. 259-266 in Proceedings of
the Western Joint Computer Conference (May 3-5).

Joyner, R. C. and Tunstall, W. J. R. (1968), "CONCORD (Conference Coordinator): Computer Assisted Organizational Problem Solving: Initial Development of the Program," C. I. P. Paper No. 4 (October), Faculty of Administrative Studies, York University, Toronto, Ontario, Canada.

Kleinmuntz, B. (ed.) (1968), Formal Representation of Human Judgment (John Wiley).

Kriebel, C. H. (1967), "Operations Research in the Design of Management Information Systems," Chapter 22, pp. 375-390, in John F. Pierce (ed.), Operations Research and the Design of Management Information Systems (Technical Association of the Pulp and Paper Industry, New York City).

Kriebel, C. H. (1968a), "Quadratic Teams, Information Economics and Aggregate
Planning Decisions," Econometrica (July-October), pp.-530—543.

Kriebel, C. H. (1968b), "The Strategic Dimension of Computer Systems Planning,"
Long Range Planning (September), pp. 7-12.

Kriebel, C. H. (1969), "On the Design of Management Operating Systems," Chapter 3
pp. 59-78 in John F. Blood, Jr., (ed.), Management Science in Planning and
Control. (Technical Association of the Pulp and Paper Industry, New York City).

Licklider, J. C. R. (1960), "Man-Computer Symbiosis," IRE Transactions on Human
Factors in Electronics, Vol. HFE-1.

McGovern, P. J. (1967), "The EDP 100," E/D/P Industry and Market Report (February).

McGovern, P. J. (1968), "The EDP 100," E/D/P Industry and Market Review (September).

McKinsey (1963), Getting the Most Out of Your Computer (McKinsey & Company,
Inc., New York City), 20 pages.

McKinsey (1968), Unlocking the Computer's Profit Potential (McKinsey & Company,
Inc., New York City), 38 pages; reprinted in: Computers and Automation (April
1969), pp. 23-44.

Myers, C. A. (ed.) (1967) The Impact of Computers on Management (M. I. T. Press).

Newell, A. and Simon, H. A. (1958), "Heuristic Problem Solving," Operations Research
(January-February), pp. 1-10.

Newell, A. (1960), "On Programming a Highly Parallel Machine to be an Intelligent
Technician," pp. 267-282, in Proceedings of the Western Joint Computer Conference
(May 3-5).

Newell, A. (1965), "Limitations of the Current Stock of Ideas about Problem Solving,"
Chapter 17, pp. 195-208 in A. Kent and O. E. Taulbee (eds.), Electronic Information
Handling (Spartan Press).

Newell, A. (1969), "Heuristic Programming: Ill-Structured Problems," Chapter 10 in
J. S. Aronofsky (ed.), Progress in Operations Research, Vol. 3 (John Wiley and
ORSA).

Oettinger, A. E. (1964), "A Bull's Eye View of Management and Engineering Information
Systems," Proceedings of 1964 National Conference of the Association for
Computing Machinery (Thompson Books), pp. 8.1-1 to 1-14.

Pierce, J. R. (1962), "What Computers Should Be Doing," Chapter 8, pp. 290-325
in M. Greenberger (ed.), Management and the Computer of the Future (John
Wiley).

Raiffa, H. (1968), Decision Analysis (Addison-Wesley).

Rappaport, A. (1968), "Management Misinformation Systems - Another Perspective," Management Science (December).

Reitman, W. R. (1964), "Heuristic Decision Procedures, Open Constraints, and the
Structure of Ill-Defined Problems," in M. M. Shelley and G. L. Bryan (eds.),
Human Judgments and Optimality (John Wiley).

Reitman, W. R. (1965), Cognition and Thought: An Information Processing Approach
(John Wiley).

Sass, M. A. and Wilkinson, N. D. (ed.) (1965), Computer Augmentation of Human
Reasoning (Spartan Books and Macmillan Company, Ltd.).

Schwitter, J. P. (1965), "Computer Effect upon Managerial Jobs," Journal of the
Academy of Management (September), pp. 236 ff.

Shaul, D. R. (1964), "The Effects of EDP on Middle Management," Doctoral Dissertation,
Graduate School of Business Administration, University of California, Los
Angeles.

Simon, H. A. (1960), The New Science of Management Decision (Harper Bros.).

Simon, H. A. (1966), "Representation in Tic-Tac-Toe," Complex Information Processing
Paper, No. 90 (June), Carnegie-Mellon University, Pittsburgh, Pennsylvania.

Simon, H. A. (1968), "Observations and Opinions: The Future of Information Processing
Technology," Management Science (May), pp. 619-624.

Taylor, J. W. and Dean, N. J. (1966), "Managing to Manage the Computer," Harvard
Business Review (September-October), pp. 98—110.

Tonge, F. M. (1961), A Heuristic Program for Assembly Line Balancing, (Prentice-
Hall).

Wiek, A. (1969), "CW Survey: 350 % Growth for Terminals by 1972," Computerworld
(February 19), pp. 8-9.

Wilcox, R. H. (1965), "Computer Augmentation of Human Reasoning," Chapter 21,
pp. 267-276, in A. Kent and O. E. Taulbee (eds.), Electronic Information
Handling (Spartan Press).