Saturday, August 22, 2009

Feenberg's Questioning Technology

I'd initially said Feenberg's primary question in Questioning Technology centered on our relationship to technology: whether it is an autonomous phenomenon (as Feenberg claims Heidegger and Habermas seem to argue) or whether it is socially constructed (determined by the social context in which it is created). While I think he's pursuing this question, I don't think there's any doubt in his mind about the answer. However, with the understanding that technological design is socially determined, Feenberg also attempts to address how we can gain agency in the development of future technologies.

First, he views technological development as inherently social, claiming "that the choice between alternatives ultimately depends neither on technical nor economic efficiency, but on the "fit" between devices and the interests and beliefs of the various social groups that influence the design process. What singles out an artifact is its relationship to the social environment, not some intrinsic property" (79). There are always alternative developments, Feenberg claims, but the one deemed most beneficial by those with the most power in advancing a certain design is the one that is ultimately followed. These social interactions determine the future of a technology, but this fact is quickly forgotten in history's retelling of the making of said technology. Thus, it is not a question of the most efficient design but rather a question of which option best suits those who have the power to decide, and this, according to Feenberg, is problematic for those disempowered by the social system, who are typically most affected by the developing technologies. Technological design is demonstrative of the current values of a society. For example, Feenberg discusses the way factory machines were designed to suit the smaller stature of children before child labor was strongly opposed and eventually outlawed. I suppose in this sense, Feenberg is echoing Heidegger to some extent. Heidegger claims that technology is a revealing. In some sense, Feenberg is saying technology (and its design) reveals certain characteristics of the society in which it is developed. Smaller machines used by children in factories are indicative of a society that views children as workers rather than learners.

Feenberg then calls for a more democratic method of developing technology in which those who are most affected by a certain design will have an active say in how said technology is developed. Part of this is enabling those most affected to gain a public voice. Feenberg argues that "to be a citizen is to be a potential victim. This is why information plays such a critical role in environmental politics: the key struggles are often decided in the communicative realm by making private information public, revealing secrets, introducing controversy into supposedly neutral scientific fields, and so on. Once corporations and government agencies are forced to operate under public scrutiny, it becomes much more difficult to support dangerous technologies such as nuclear power" (120; emphasis added). Feenberg goes on to argue that expertise is part of what is keeping technology from being democratically decided, but I'm not really sure whether he's arguing that expertise is a bad thing or whether it should be more fully dispersed. For instance, he notes that "expertise has historically served class power. The bias in favor of representing the interests of a narrow ruling group is strongly entrenched. An undemocratic technical system can offer privileges to its technical servants that might be threatened by a more democratic system" (143) and goes on to argue that "the most important means of assuring more democratic technical representation remains transformation of the technical codes and the educational process through which they are inculcated" (143). I'm not sure if he's arguing for a broader distribution of information here or whether he's arguing for something else I haven't quite put my finger on.

Once he gets to the end of the book, though he's been discounting ideas of essentialism and calling for a more democratic process, he seems to argue that "everything will be okay" because as a technology is developed its primary instrumentalization--that which it was designed for--is only one of many uses of said technology. As it enters the social sphere, users will develop a seemingly infinite number of secondary instrumentalizations that conceive of new uses for the technology that suit the society in which it's developed. One example of this that Feenberg provides is the Internet, which was initially developed as a means by which the government could make official documents/numbers widely available to other governmental organizations (military, research institutions, etc.) but was quickly subverted by its users as a global communicative device. The Internet was not initially designed as a communication tool, but its users saw fit to develop the technology for their own purposes. Feenberg seems to be saying that because this type of activity is going on, it seems very possible we can have a more democratic system of technical design. He says "but unexpected struggles over issues such as nuclear power, access to experimental treatment, and user participation in computer design remind us that the technological future is by no means predetermined. The very existence of these struggles suggests the possibility of a change in the form of technical rationality. They prefigure a general reconstruction of modernity in which technology gathers a world to itself rather than reducing its natural, human and social environment to mere resources" (224). The possibility exists, but I think Feenberg is arguing that we need to take advantage of it and become more active in the development of technologies that influence/affect our lives.

As for how this relates to my work in composition, I can't really say at this point because I'm not sure I fully understand all of what Feenberg is arguing for. I certainly buy into his theory that technology is socially constructed, but as far as making the design process more democratic, I'm not sure I understand what he's calling for. Logistically speaking, it's not clear to me. The idea that technology is not predetermined seems especially useful for those whom technology disempowers. I'm thinking specifically of non-native English speakers attempting to use Word, for instance. But is this something I would actively integrate into a course in which I had students struggling with word processing technologies? Would having a discussion about how socially constructed said program is help them overcome their difficulties using that program? Or is my knowledge of the fact that Word privileges SWE enough? I don't think so. I don't think Feenberg would say it's enough either. But what I'm not sure of is whether he's arguing for those students to be able to petition Microsoft and democratically vote for the inclusion of their languages in the design. The idea of disseminating information to make a more democratic society is also especially useful in such a class. I can see intersections between this line of thought and current discussions concerning public writing and service learning. Feenberg's argument that we can hold corporations accountable for the technologies they create by making information public seems well aligned with the idea of public writing. But I'm not sure Feenberg is really adding anything new to this conversation. Maybe he was when this book was written in 1999.

Some questions I had as I engaged Feenberg's text... Well, first and foremost, what is he really arguing for? In concrete terms, what is he calling for when he says we need to democratize the development of technologies? Does he want all people to have a say in the development of all technology? How would this work? Would it be possible in the real world, or is this an ideal we should strive for with the understanding that it will never be fully realized? If the users of technology are already subverting it for their own purposes, aren't they already doing something rather democratic? Even the corporations seem to be doing this to some extent--they're (in many cases) building on existing technologies to develop something for their own purposes--they're developing secondary instrumentalizations for these technologies much the way individuals do with things such as the Internet. Why is the corporation's technology worse, according to Feenberg?
