Monday, August 24, 2009

Feenberg's Critical Theory of Technology

In Critical Theory of Technology, Andrew Feenberg makes a claim similar to the one he would later pitch in Questioning Technology, published nearly a decade afterward. Basically, he argues for a social conception of technology: any technology carries with it the values of those who had a hand in creating it--values that will inevitably elevate some interests while suppressing others. This book seems more interested in digging into the economic consequences of technology and ultimately argues that for any socialist society to function well, it will need not only to understand the inherent biases technologies carry but also to be able to subvert those technologies to account appropriately for all who will be affected by them.

Feenberg discusses Marxism and modernity at great length in this book, arguing that the fragmentation and deskilling of laborers was greatly responsible for the USSR's inability to establish itself as a successful socialist society. He says, "the distribution of culture is in large part a function of the division of labor. Although society becomes more complex, most jobs remain simple or become even simpler as crafts and professions are deskilled. Despite the growing emphasis on credentialing in management, the gap between the level of culture required to understand the social world grows ever larger. Technological advance not only subordinates workers to capital, but disenfranchises them. Society has no incentive to teach and they have none to learn the knowledge that would qualify them to participate in the social decisions that concern them. This is the knowledge deficit" (28). Though he never comes out and says it, I think Feenberg is criticizing Fordist mass production, in which workers each perform a limited set of tasks that together produce the finished product. For instance, rather than knowing how to assemble an entire vehicle, one worker knows only how to install the bolts on the driver's-side door, and the finished vehicle comes together down the line. No one person has all the information needed to build an entire car, but the theory behind Fordist mass production is that the worker will do that one thing very, very well. Feenberg draws on Marxist theory to point out that this actually disenfranchises the individual worker, who, in times past, may have been able to build his own car but now must hand over his limited skill set to "the capitalist," who compiles everybody's limited abilities into a finished product "the capitalist" can sell at a profit.

Feenberg argues that the USSR failed to develop a successful socialist society because this type of technology reinforced a capitalist structure that collapsed when the powers that be tried to redistribute power, wealth, and knowledge. Eventually, those with the know-how took control of a failing situation, and what resulted was socialist in name only but capitalist through and through. Feenberg argues throughout this book that technology is not inherently biased on its own; rather, the process of developing technology is. We must understand the process of development in order to democratize it. The argument here is similar to that in Questioning Technology: technology is initially developed with primary instrumentalizations, the processes and outcomes the initial developers intended when they crafted the technology, but users of technology have the potential (if they understand the social nature and malleability of technology) to develop secondary instrumentalizations that subvert the technology for their own purposes. Feenberg cannot draw on the development of the Internet in this book, mainly because it had not yet developed to the extent it had by his later book, but he does draw on the development of the computer and the ways users have moved away from the "pure" processing potential of the device to seeing it as a communicative one.

Feenberg argues, as he does in Questioning Technology, that developing technology suitable for a proper socialist society will require the democratization of technological design. When I read this in his other book, I was frustrated that he never really delved into the logistical specifics of the argument. He answers some of my questions here, but the nitty-gritty details still evade me. I asked before whether Feenberg was actually calling for everybody to have an equal say in the development of all technology, and after reading this book, I think his answer would be yes. However, he recognizes one of my earlier concerns, which questioned whether people are adequately educated to have an equal voice in the development of technologies. Being no physicist myself, I can hardly imagine having any productive voice in the new spaceship NASA is developing. Feenberg, acknowledging this, says, "but given the disqualifying effects of the capitalist division of labor, how can workers organize the firm? They need not all be experts to play a role in corporate governance, but they must at least have capacities equivalent to those that enable investors to handle their investments, and work together in shaping policy and selecting managers. Absent these capacities, socialization either remains purely formal, or leads to disastrous mistakes" (151). His answer is education. He argues for an initial over-education of everybody that will balance itself out over time: in order for people to make informed decisions, they will need a wealth of knowledge that has been denied them under the capitalist structuring of technology.

In the abstract, Feenberg presents a beautiful possibility. A utopia. However, even as he defends technology as not inherently biased toward one group over another, he recognizes that technology's design is biased. Technology is designed by people, the selfsame people who would be functioning in any future system. The abstraction is inviting, but Feenberg does not seem to account for the variable that is man (not man as male, but man as humanity). In an ideal world, people would love their jobs, would have the autonomy to decide what they want to be when they grow up, and all jobs would be done by people eager to do them. But there are jobs that, it seems, nobody wants to do--jobs we consider base or below our status, yet jobs essential to our society's functioning. Somebody has to do those jobs, so who gets to decide who will have the misfortune of being assigned the job nobody else wants? Who will be in power? Feenberg (and Marx et al.) strive for an equal society, but I question whether this is really possible. I question whether it's useful to align ourselves with the idea of equality, the illusion of equality, or whether it's more useful to manipulate the current system to our own ends.

As I think about how Feenberg's work can relate to the work I do in First-Year Writing, I question whether it would be useful to point out how helpless many of my students are. The first-generation students who have grown up watching their parents slave away in the capitalist structure so that their children can have a better life--how will it help these students to realize that they have been disenfranchised by this system? Stopping at this realization and moving forward with an idealistic response seems counterproductive to me. Rather, I'd want to acknowledge how the current system disempowers some while elevating others to a privileged position within society, but I would also want students to recognize the moments of subversion they can grab hold of to beat the very system that strives to disenfranchise them. I think there is some beauty to the capitalist structure (gasp!), some hopefulness that offers those who are disenfranchised the opportunity to break out of the mold. Do I think the system takes advantage of that hope? You bet. But this system accounts for power--something that has corrupted man from the beginning of recorded history--and Feenberg's system does not. I think an understanding of how power works can be more beneficial for those striving to make a better life for themselves in this system than reliance on an idealistic maybe. An understanding that technology is inherently social and therefore endowed with social biases, for instance. An understanding that one can take developed technologies and bend them as needed. An understanding of the means by which one comes to power in this system, how one can exploit one's workers, and an understanding of what social justice means so that such exploitation can be prevented. I think Feenberg certainly informs much of this discussion, but I'm not sure I agree with his eventual argument. He discusses only briefly how costly and time-consuming a truly democratic technological design would be, but this is precisely the kind of problem that brought down the USSR. He's aiming for the ideal, but this ideal operates in a vacuum.

Saturday, August 22, 2009

Feenberg's Questioning Technology

I'd initially said Feenberg's primary question in Questioning Technology centered on our relationship to technology: whether it is an autonomous phenomenon (as Feenberg claims Heidegger and Habermas argue) or whether it is socially constructed (determined by the social context in which it is created). While I think he's pursuing this question, I don't think there's any doubt in his mind as to the answer. However, with the understanding that technological design is socially determined, Feenberg also attempts to address how we can gain agency in the development of future technologies.

First, he views technological development as inherently social, claiming "that the choice between alternatives ultimately depends neither on technical nor economic efficiency, but on the "fit" between devices and the interests and beliefs of the various social groups that influence the design process. What singles out an artifact is its relationship to the social environment, not some intrinsic property" (79). There are always alternative developments, Feenberg claims, but the one deemed most beneficial by those with the most power to advance a certain design is the one ultimately followed. These social interactions determine the future of a technology, but this fact is quickly forgotten in history's retelling of the making of said technology. Thus, it is not a question of the most efficient design but rather of which option best suits those who have the power to decide, and this, according to Feenberg, is problematic for those disempowered by the social system who are nonetheless typically most affected by developing technologies. Technological design is demonstrative of the current values of a society. For example, Feenberg discusses the way factory machines were designed to suit the smaller stature of children before child labor was strongly opposed and eventually outlawed. I suppose in this sense, Feenberg is echoing Heidegger to some extent. Heidegger claims that technology is a revealing; in some sense, Feenberg is saying technology (and its design) reveals certain characteristics of the society in which it is developed. Smaller machines used by children in factories are indicative of a society that views children as workers rather than learners.

Feenberg then calls for a more democratic method of developing technology, in which those most affected by a certain design will have an active say in how that technology is developed. Part of this is enabling those most affected to gain a public voice. Feenberg argues that "to be a citizen is to be a potential victim. This is why information plays such a critical role in environmental politics: the key struggles are often decided in the communicative realm by making private information public, revealing secrets, introducing controversy into supposedly neutral scientific fields, and so on. Once corporations and government agencies are forced to operate under public scrutiny, it becomes much more difficult to support dangerous technologies such as nuclear power" (120; emphasis added). Feenberg goes on to argue that expertise is part of what keeps technology from being democratically decided, but I'm not really sure whether he's arguing that expertise is a bad thing or that it should be more widely dispersed. For instance, he notes that "expertise has historically served class power. The bias in favor of representing the interests of a narrow ruling group is strongly entrenched. An undemocratic technical system can offer privileges to its technical servants that might be threatened by a more democratic system" (143) and goes on to argue that "the most important means of assuring more democratic technical representation remains transformation of the technical codes and the educational process through which they are inculcated" (143). I'm not sure if he's arguing for a broader distribution of information here or for something else I haven't quite put my finger on.

Once he gets to the end of the book, though he's been discounting essentialism and calling for a more democratic process, he seems to argue that "everything will be okay" because as a technology is developed, its primary instrumentalization--that which it was designed for--is only one of many uses of said technology. As it enters the social sphere, users will develop a seemingly infinite number of secondary instrumentalizations that conceive of new uses for the technology that suit the society in which it's developed. One example Feenberg provides is the Internet, which was initially developed as a means by which the government could make official documents and data widely available to other governmental organizations (military, research institutions, etc.) but was quickly subverted by its users into a global communicative device. The Internet was not initially designed as a communication tool, but its users saw fit to develop the technology for their own purposes. Feenberg seems to be saying that because this type of activity is going on, a more democratic system of technical design is very possible. He says, "but unexpected struggles over issues such as nuclear power, access to experimental treatment, and user participation in computer design remind us that the technological future is by no means predetermined. The very existence of these struggles suggests the possibility of a change in the form of technical rationality. They prefigure a general reconstruction of modernity in which technology gathers a world to itself rather than reducing its natural, human and social environment to mere resources" (224). The possibility exists, but I think Feenberg is arguing that we need to take advantage of it and become more active in the development of the technologies that influence our lives.

As for how this relates to my work in composition, I'm not really sure I can say at this point, because I'm not sure I fully understand everything Feenberg is arguing for. I certainly buy into his theory that technology is socially constructed, but as far as making the design process more democratic, I'm not sure I understand what he's calling for. Logistically speaking, it's not clear to me. The idea that technology is not predetermined seems especially useful for those whom technology disempowers. I'm thinking specifically of non-native English speakers attempting to use Word, for instance. But is this something I would actively integrate into a course in which I had students struggling with word-processing technologies? Would having a discussion about how socially constructed said program is help them overcome their difficulties using it? Or is my knowledge of the fact that Word privileges SWE enough? I don't think so. I don't think Feenberg would say it's enough either. But what I'm not sure of is whether he's arguing that those students should be able to petition Microsoft and democratically vote for the inclusion of their languages in the design. The idea of disseminating information to make a more democratic society is also especially useful in such a class. I can see intersections between this line of thought and current discussions concerning public writing and service learning. Feenberg's argument for holding corporations to account for the technologies they create by making information public seems well aligned with the idea of public writing. But I'm not sure Feenberg is really adding anything new to this conversation. Maybe he was when this book was written in 1999.

Some questions I had as I engaged Feenberg's text... First and foremost, what is he really arguing for? In concrete terms, what is he calling for when he says we need to democratize the development of technologies? Does he want all people to have a say in the development of all technology? How would this work? Would it be possible in the real world, or is this an ideal we should strive for with the understanding that it will never be fully realized? If the users of technology are already subverting it for their own purposes, aren't they already doing something rather democratic? Even corporations seem to be doing this to some extent--in many cases they're building on existing technologies to develop something for their own purposes--developing secondary instrumentalizations for these technologies much the way individuals do with things like the Internet. Why is the corporation's subversion worse, according to Feenberg?