For this week, I read the article “Understanding Infrastructure: Dynamics, Tensions, and Design.” It is in fact a report of a workshop on the “History & Theory of Infrastructure: Lessons for New Scientific Cyberinfrastructures,” published in January 2007 by the scholars Paul Edwards, Steven Jackson, Geoffrey Bowker, and Cory Knobel. The report summarizes the findings of a workshop held in September 2006 at the University of Michigan: a three-day, National Science Foundation-funded “think tank,” so to speak, that brought together experts in social and historical studies of infrastructure development, domain scientists, information scientists, and NSF program officers. The goal was to distill “concepts, stories, metaphors, and parallels” that might help realize the NSF vision for scientific cyberinfrastructure.
To begin, this workshop and report on cyberinfrastructure is highly technical, so I will attempt to translate some of the work and findings that are directly relevant to our class, LIS 201: The Information Age, as presented by Professor Greg Downey. The authors use Stewart Brand’s notion of the “clock of the long now” to remind us to step back and notice changes occurring before our eyes on a slower timescale than we are used to thinking about. Citing Brand, the authors argue that our current cyberinfrastructure has developed over the past 200 years, during which an exponential increase in information gathering and knowledge workers, together with the accompanying development of technologies to sort that information, has produced what they call a “cyberinfrastructure.” Manuel Castells, a Spanish-born and highly influential sociologist and communications researcher whom Professor Downey mentioned in class, argued that the roots of the contemporary “network society” lie in new organizational forms created in support of large corporations. James Beniger, another scholar Professor Downey mentioned in class, described the entire period from the first Industrial Revolution to the present as an ongoing “control revolution.” As we have seen in class from such examples as the old corporate education films and Charlie Chaplin’s “Modern Times,” the control revolution describes the trend in society toward efficiency, commodification, compartmentalization, specialization, and, of course, control, both of information flow and of how people carry out their work and lives. The authors ultimately define cyberinfrastructure as the set of organizational practices, technical infrastructure, and social norms that together allow science work to proceed smoothly at a distance. If any of those three pillars fails, the cyberinfrastructure collapses.
I find this last thought particularly interesting because the very idea of a functioning modern cyberinfrastructure depends on society’s implicit “buy-in” or cooperation. It reminds me of what the great biologist E. O. Wilson once said: that if all the ants were suddenly removed from the world, our entire ecosystem, and the world as we know it, would collapse. The same is true of human beings’ presumed compliance with the rules, regulations, and norms that make up our modern cyberworld: if we suddenly stopped playing by the rules, the whole house of cards would come crashing down.