Neither Stakhanov Nor Dilbert
In “The Laws of Cool,” Alan Liu suggests we read management literature as literature. We can take this further and read the history of management as history – as an integral and consequential part of the same history that contains social struggles and wars. And if an art is a craft with a theory, management is definitely an art. That theory, as presented in the literature, has changed throughout the history of management. It has a mainstream, its forgotten and neglected traditions, and its heresies.
Management consultant Stafford Beer sat somewhere outside that mainstream. Born in 1926, he discovered the new interdisciplinary field of cybernetics in the 1950’s and set about applying it to managing heavy industry. Cybernetics (the innocent root of “cyberculture,” “cyberpunk,” and “cybersecurity”) claimed to be an interdisciplinary science of feedback loops in machines, people, and society. Its detractors regarded it as a dehumanizing pseudoscience. But for Beer it would become an indispensable tool for pursuing liberty.
During the 1950’s Beer founded the Department of Operations Research and Cybernetics at United Steel. In 1961 he started the SIGMA Consultancy, and in 1966 he joined the IPC publishing house to push for the adoption of computing technology. In 1970 he left to become an independent consultant and lecturer at Manchester University. He wrote about his experiences and ideas in a series of manuscripts, developing his theory of management cybernetics. Beer’s willingness to think big and present his ideas to the public gained him both passionate supporters and detractors who regarded him as a charlatan. By the early 1970’s he’d become frustrated with the bureaucracy and centralization of government and had begun working on more dynamic, feedback-based models of organization and governance: what he called the “Viable System Model” (VSM) and “The Liberty Machine” respectively.
The VSM is a multi-layered feedback loop: System 1 comprises the primary activities; System 2 is the communication system between those activities; System 3 handles planning and control of System 1; System 4 monitors the environment and determines how the system as a whole needs to adapt to remain viable; and System 5 steers the system as a whole. In a body, System 1 would be the limbs and organs, System 2 the nervous system, and so on. In a corporation, System 1 would be the departments; in an economy, the firms. In true cybernetic style it is the relationship between the components of the system that is important, not their gross physical structure. And each layer of the system is part of a feedback loop, not simply controlled by the layer above it.
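The layering can be sketched in code. The following is a hypothetical illustration, not Beer’s formalism: the names `coordinate`, `regulate`, `adapt`, and `steer` are mine, and the numeric “load” is a stand-in for real performance data. What it tries to capture is that the model is recursive (each System 1 unit is itself a viable system) and that higher systems nudge rather than command.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ViableSystem:
    """One recursion level of the VSM (illustrative, not Beer's notation)."""
    name: str
    units: List["ViableSystem"] = field(default_factory=list)  # System 1: primary activities
    load: float = 0.0                                          # stand-in performance signal

    def coordinate(self) -> dict:
        # System 2: damp oscillation by sharing each unit's state with its siblings.
        return {u.name: u.load for u in self.units}

    def regulate(self, target: float) -> None:
        # System 3: internal planning and control, acting on System 2's picture.
        for u in self.units:
            u.load += (target - u.load) * 0.5  # nudge toward the target, don't dictate

    def adapt(self, environment: float) -> float:
        # System 4: watch the environment and propose how the whole must change.
        return environment

    def steer(self, environment: float) -> None:
        # System 5: balance System 3 (inside-and-now) against System 4 (outside-and-then).
        self.regulate(self.adapt(environment))

economy = ViableSystem("economy", units=[ViableSystem("firm_a", load=10.0),
                                         ViableSystem("firm_b", load=2.0)])
economy.steer(environment=6.0)
print(economy.coordinate())  # {'firm_a': 8.0, 'firm_b': 4.0}
```

Each unit moves only halfway toward the shared target: the layer above shapes behaviour without overriding the unit’s own state, which is the feedback-loop relationship the diagrams describe.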
Half a century later, Beer’s hand-drawn diagrams of multi-layered feedback loops inspired by the organization of the human nervous system have an air of the scientific occult, ready for repurposing as part of a Ghost Box album cover. They come from a world in which computer access was measured in hourly fees that would quickly exhaust an average weekly wage. They are not merely diagrams of networks or circuits, or simple flowcharts, but lay out an organization and structure for information exchange. In the 1960’s and early 1970’s the potential of computers, and the knowledge that they would one day be more widespread, had as much of an effect on the popular imagination as their actual use by governments, large corporations, and universities. Cybernetics, system dynamics, and operations research were knit together with computation and applied to everything from the arts, to politics (and political activism), to management – notably by Beer.
Beer had the opportunity to try out his Viable System Model by implementing basic cybernetic management systems for companies such as Warburton’s bakers. He also wrote books about his ideas and gave lectures on his developing theories at Manchester University, but cybernetic management never entered the mainstream. While the Soviet Union saw cybernetics as a way of imposing party control over the economy (see Francis Spufford’s “Red Plenty” for an account of the ambitions and failure of Soviet cybernetic management), Beer’s models were intended to support genuine autonomy (not merely Dilbert-esque “empowerment”) as well as a two-way flow of information and responsibility between parts of an organization. Rather than striving to perfect authority, Beer’s management cybernetics was in fact an impediment to the managerial ego, emphasizing the potential for distributed authority rather than autocracy.
During the Cold War, entire national economies in the Communist “second world” were centrally managed (or “planned”) by the state. So were publicly owned industries and utilities in mixed economies such as the United Kingdom. Even in the United States, private interests were positioned to work symbiotically with vital state-owned infrastructure and the military-industrial complex. Managerial thinking is ubiquitous today, as evidenced by the dramatic interventions of “first world” states following the 2008 financial crisis: avoiding economic collapse through better management. The art of management is big business, and by the mid-1960’s Stafford Beer was making a very comfortable living from it.
In 1971, Beer received a letter from Chile. Fernando Flores, a member of the recently elected Popular Unity coalition, knew of Beer’s earlier work from his books and thought that cybernetics might help the Chilean government’s plans to transition to a socialist economy while maintaining democratic institutions. He was offering Beer the chance to apply his ideas not to a company or a single government department, with all its attendant bureaucracy, but to an entire economy. Beer responded by offering to drop all his existing contracts and fly to Chile at once.
Eden Medina spent a decade researching what happened next: the construction of the Cybersyn (“Cybernetic Synergy”) system, which Beer and Flores, along with teams from Chile and the UK, designed to tackle Chile’s unique economic and political challenges. She interviewed people involved with the project, located extensive archive materials in Chile and the UK, and worked the results into her book Cybernetic Revolutionaries (which I can’t recommend highly enough). Medina explains how Cybersyn was intended to use Chile’s paltry computing and telecommunications resources to build a piece of civil infrastructure unique even among more technologically and economically developed nations – a computer-based system for monitoring economic production in real time and helping workers manage their own production.
Cybersyn grew to consist of several components. Cyberstride was the software system, written by consultants in the UK and Chile in Fortran IV and run first on an IBM and later a Burroughs mainframe, which processed production data to ensure the economy was running smoothly. Cybernet was a teletype network that connected factories and other production sites to the Cyberstride data centre. CHECO was a model of the Chilean economy used to assist in economic planning, written in the DYNAMO programming language (DYNAMO was also used at around the same time to create the economic model used in the still-influential book The Limits To Growth by Meadows et al.). The Opsroom was a futuristic meeting space with data screens and fibreglass chairs fitted with control buttons, designed to display data from Cyberstride. And Cyberfolk was Beer’s unrealised plan for real-time public feedback on the government’s performance using television broadcasts and “algedonic” electronic approval meters. By the end of the project, Cyberstride was processing some data but not in real time; Cybernet had been used to help the government survive industrial action by haulage company owners; but CHECO was hamstrung by the hidden complexities of an economy under siege by foreign intelligence agencies, the Opsroom had only been prototyped, and work on Cyberfolk had hardly begun.
Medina frames Cybersyn (Proyecto Synco in Spanish) as an encounter between politics and technology, mediated and destabilised by the personal and geopolitical stories that overlap it. Cybersyn was abandoned when Augusto Pinochet’s US-backed coup turned Chile into a right-wing dictatorship, marking the start of the neoliberal era championed by Margaret Thatcher and Ronald Reagan. Cybersyn’s unfinished status encourages many to project their hopes and fears about cybernetics, management, and economic policy onto it – which people began doing even before it was publicly announced. Consequently, Cybersyn is exceedingly difficult to evaluate retrospectively. Medina scrupulously tracks the successes and failures of each part of Cybersyn, never letting its unfinished state either diminish the former or excuse the latter (arguing, and capably demonstrating, that the study of historical technology in context can and indeed should inform the study of political history).
Medina thoroughly grounds the technology of Cybersyn in Beer’s management theory and shows how his ideas were changed by his experiences in Chile. Beer’s theory is part of an even longer history, one that is not primarily about technology or geopolitics. Modern management, the stuff of Dilbert cartoons and The Office, is devoted to the efficient allocation of resources in the enterprise. This tradition of “scientific management” is still sometimes referred to as Taylorism, after the early 20th-century management consultant Frederick Taylor who popularized it.
As described in an interview with the Harvard Business Review, Caitlin Rosenthal’s research into plantation management in America and the West Indies from 1750 to 1860 revealed the use of scientific management techniques there, often in more advanced forms than in the factories and slaughterhouses of the Northern United States that are frequently cited as their point of origin in accounts of Taylorism’s later development. Henry Laurence Gantt, inventor of the Gantt chart and therefore patron saint of Microsoft Project, was born on a plantation and was a close associate of Taylor. Whether Taylor took direct inspiration from Gantt’s experiences or not is unknown.
The History of Management
Taylorism’s actual historical influence is surprising – it doesn’t fit into a neat, linear history of management science, and it had no effect on the development of the Fordist economic and social system, which developed independently of Taylor’s contemporaneous work. But the Soviet concept of the Five Year Plan can be traced directly to Taylorism. Dilbert and Stakhanov are both Taylor’s children. Beer’s ideas, similar at first glance, were idiosyncratic and ultimately misunderstood by many – the “control” function from his models that so alarmed some contemporary commentators, for instance, represents homeostatic stability, not social domination.
Taylorism’s themes of highly controlled rational efficiency outlived the heyday of scientific management in the first half of the 20th century. The white heat of logistics in the Second World War carried those themes forward not just into cybernetics but into operations research, the scientific analysis of organizational behaviour, and system dynamics, the mathematical study of complex systems such as modern corporations. Increasing efficiency often meant replacing fallible human performance and initiative with abstract processes and opaque machinery. The emergence of Computer Numerical Control (CNC) in American industry in the 1950’s is contrasted in Manuel de Landa’s “War In The Age Of Intelligent Machines” with more human-centric management ideologies in Europe and Japan.
As Medina emphasizes, Cybersyn was distinguished from both CNC and Soviet cybernetics by placing the knowledge and skills of workers at the heart of the system, at least in theory. Cybersyn and the VSM did not require a different individual for each role in the system: in principle, they could be used as tools for self-management or leaderless organizations, comparable to Edward De Bono’s “Six Thinking Hats.” While Cybersyn was implemented in a centralized way technologically, as Beer explained with some amusement in a lecture later recorded at Manchester University, this was because the project only had access to a single mainframe – all the computing time the Chilean state could afford to give it. Organizationally, the system was distributed.
Like scientific management before it, the ideas of operations research and system dynamics are still present in contemporary management theory. Cybernetics less so, and Beer’s models not at all. But there are two elements of Cybersyn that it is tempting to project contemporary interests onto. The first is the use of data and information graphics by Cyberstride and the Opsroom, which can be taken as a forerunner of contemporary “Big Data.” As Medina explains in Cybernetic Revolutionaries, the City of New York had already implemented a more extensive computerization program inspired by the system dynamics movement. “The New York Mayoral Origins Of Big Data” doesn’t have quite the same ring to it. And as Edward Tufte explains in “The Visual Display of Quantitative Information,” the use of information graphics in statistical analysis goes back to the 19th century. There were certainly more diagrams and there was more data by the time Beer started working at United Steel, but we cannot turn this continuous quantitative increase since the 1800’s into a qualitative shift in the 1970’s. More interestingly, Medina identifies the class and gender content of the Opsroom’s graphic and industrial design. Clerical work, traditionally performed by women, was to be made obsolete by the system’s technology, which mediated between workers and information through visual schematics and graphical displays. But the hand-drawn information graphics projected into the men’s-club-influenced design of the Opsroom were, in fact, drawn by female designers, despite the availability of computer-controlled drawing devices since the 1950’s. Ultimately, Cybersyn’s use of visual displays as a mediating layer was more akin to the flow diagrams used in factories and power plants than to 21st-century information dashboards.
The second element is Cybersyn’s telex/teletype/teleprinter communication network. Teleprinters were electric typewriters connected to analogue phone lines by primitive modems. Cybersyn used them not because they were high tech (they weren’t) but because they were available – the previous government had ordered four hundred of them and forgotten about them in a warehouse. What resulted from hooking them up to the telephone network in 1972 was not a socialist precursor to the Internet, however, as the Arpanet had been online since 1969. The centralized, analogue Cybernet of teletypes was more the last gasp of Victorian telegraph technology than a first glimpse of a Microsoft Exchange-operated future. But even to assemble such a network at all was a major achievement and a moment of technological alterity during the Cold War, shaped by the political and economic realities of Chilean socialism.
The one technological element of Cybersyn that is still thriving in contemporary industrial management theory and practice gets just a single paragraph in Cybernetic Revolutionaries. Beer had been made aware of P.J. Harrison and C.F. Stevens’ 1971 paper “A Bayesian Approach to Short-Term Forecasting” immediately before design work on Cybersyn started. It’s a dry, technical paper that describes a way of deciding whether a new measurement represents a glitch, an interesting change in behavior, or business as usual. Implemented in Fortran IV within Cyberstride, the algorithm would monitor the performance indicators sent by teletype from the factory floor for changes that indicated problems for the workers to investigate (and, if they could not fix them within an agreed timescale, alert the next VSM level).
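Harrison and Stevens’ actual method is a multi-process Bayesian forecaster; the sketch below is a much-simplified illustration of the decision their paper addresses, using an exponentially weighted forecast and a crude “wait for confirmation” rule. All class names, thresholds, and update rules here are mine, chosen for clarity rather than fidelity.

```python
import math

class ChangeMonitor:
    """Classify each new reading as business as usual, a transient glitch,
    or a genuine step change (simplified illustration, not Harrison-Stevens)."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha    # smoothing weight for the forecast
        self.mean = None      # current forecast
        self.var = 1.0        # running estimate of squared forecast error
        self.pending = None   # last surprising value, awaiting confirmation

    def classify(self, x):
        if self.mean is None:          # first reading initialises the forecast
            self.mean = x
            return "as-usual"
        z = abs(x - self.mean) / math.sqrt(self.var)
        if z < 3.0:
            self.pending = None
            verdict = "as-usual"
        elif self.pending is not None and abs(x - self.pending) < 3.0 * math.sqrt(self.var):
            # Two surprising values in a row at the same level: a real shift.
            self.mean = x
            self.pending = None
            verdict = "step-change"
        else:
            # One isolated surprise: provisionally a glitch; don't update the forecast.
            self.pending = x
            return "glitch?"
        # Update the error estimate and forecast only on accepted values.
        self.var = 0.9 * self.var + 0.1 * (x - self.mean) ** 2
        self.mean += self.alpha * (x - self.mean)
        return verdict

m = ChangeMonitor()
print([m.classify(x) for x in [10, 10.2, 9.9, 30, 10.1]])
# ['as-usual', 'as-usual', 'as-usual', 'glitch?', 'as-usual']
```

An isolated spike is flagged as a probable glitch and discarded, while two surprising readings at the same new level would instead be accepted as a step change for the workers to investigate – the same triage Cyberstride was meant to perform on factory indicators.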
This kind of algorithm was an active area of research from the 1960’s onwards. Today its descendants are used by Twitter to identify breakouts and anomalies in their data. But they are far more frequently used as they were in Cybersyn: as Statistical Process Control algorithms. Statistical Process Control is a quality control technique that analyses measurements from a manufacturing process in order to make sure that products meet set standards. No rotten fruit, no broken glass, no malfunctioning MP3 players.
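The core of SPC is the control chart, which can be sketched in a few lines. This is a minimal Shewhart-style illustration assuming a simple three-sigma rule: estimate the process mean and spread from an in-control baseline run, then flag later measurements that fall outside the control limits. Real SPC adds subgrouping, run rules, and separate charts for mean and range; the function names and data here are invented.

```python
import statistics

def control_limits(baseline, k=3.0):
    """Derive lower and upper control limits from an in-control baseline run."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mu - k * sigma, mu + k * sigma

def out_of_control(measurements, limits):
    """Return the measurements that fall outside the control limits."""
    lo, hi = limits
    return [x for x in measurements if not (lo <= x <= hi)]

# Hypothetical fill weights (grams) from a run known to be in control:
baseline = [50.1, 49.9, 50.0, 50.2, 49.8, 50.1, 49.9, 50.0]
limits = control_limits(baseline)
print(out_of_control([50.0, 50.1, 51.5, 49.9], limits))  # [51.5]
```

A point outside the limits signals that the process itself has drifted, not just that one product is bad – the same feedback-into-production logic Cyberstride applied to economic indicators.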
In parallel to the history of scientific management, the quality control tradition stretches back to the first half of the 20th century. In the West, quality control reached its contemporary zenith in the 1980’s with Motorola’s Six Sigma QC tools and the international standard ISO 9001. Six Sigma is a set of techniques and procedures for decreasing waste by increasing quality. The name comes from the objective of having 99.99966% of all products free of defects – around 3.4 defects per million. Like Cyberstride it relies on identifying key metrics in production and then reducing variability in them. ISO 9001 is a generic ontology for quality, a Mad Libs-style system of documents into which an organization can plug its metrics and requirements and then follow to the letter in order to ensure products are made to spec. ISO 9001 came out of British engineering’s encounter with Japanese working practices in the 1970’s (which is a whole other story). Needless to say, Six Sigma and ISO 9001 both have their own jargon, iconography, and literature, and would likely repay a cultural analysis along the lines of Graham Cleverley’s “Managers and Magic.”
Quality control is cybernetic: it feeds back analysis of the products of manufacture (or labour) into the ongoing development of that process of manufacture (or labour). Quality assurance is homeostasis; continuous quality improvement is homeorhesis. But quality control is a closed loop of business processes in the CNC tradition rather than a Cybersyn-style system for balancing self-management with system integrity. Cybernetic analysis of the system as a whole, its social as well as its technical elements, provides us with the intellectual resources to criticize and transform these processes of production in forgotten but useful ways. The rediscovery of cybernetics by some anarchist thinkers, in relation to leaderless revolutions and self-management in particular, should prove interesting in the contemporary techno-political climate.
Cybersyn serves as an inspiration and a warning for some contemporary tendencies. It represents an attempt to transform an economy through technological advancement and the use of simulation techniques (with CHECO). That it got as far as it did should be of interest to Accelerationists, although the project’s incompletion and the usage of similar technology to produce The Limits To Growth should give them pause. As Medina demonstrates with the examples both of Cybersyn’s international reception and the ultimate fate of technological elements of the project, it is very hard to embody politics in technology. Both proponents and critics of Bitcoin and of socialist alternatives to “Stacks” would do well to remember this. And developers of Smart Contracts and Distributed Autonomous Organizations can take Cybersyn’s non-monetary measurements of economic activity as an example of resource management without tokens, but should look closely at the project’s experience of interfacing algorithmic management with the pressures and temptations of the external, social world.
The technology and politics of Cybersyn, and its unique blending of the two, represent the results of a unique historical moment. Cybersyn’s embodiment of Beer’s alternative view of management is a strategy that may be more widely applicable. But we cannot know or exploit this if we do not study the history and theory of management and the history and theory of the post-war intellectual environment that evaporated with the advent of widespread access to computers in the late 1970’s.
Not that we should aim for the Cybernetic equivalent of “Welcome To Scarfolk,” or Hauntological MBAs. Rather, I propose a serious engagement with historical examples of alternative technologically mediated management systems and the conceptions of management that they embody. Algorithms don’t exploit people, people write algorithms that exploit people. But this hasn’t always been the case, and the example of Cybersyn shows us the value of studying the moments when they didn’t.