Thursday, October 22, 2009

Expert Systems, Engineers, and Anthropologists

In these articles, the observers' interpretations of what they saw made me think about our values in engineering work, our values in life, and the hidden agendas that define engineering and science.
Forsythe's Article

Now, I bring the following observer's interpretations to this "digital table":
1) Engineers regard knowledge acquisition as if it were a "'thing' that can be extracted, like a mineral or a diseased tooth" (p. 47).
AI engineers have transferred the cold, technical terms of computer science and engineering to the human realm, which is in its essence warm. This AI vocabulary is part of the leakage of the idea that humanity can be mechanized, that humans can be considered machines, and that AI systems can mimic the human brain. Saying that the "human brain is a machine" or that "a human brain can be reverse-engineered" is at its core an act of degrading and devaluing humanity.
The idea of uploading and downloading the knowledge of a human brain into a database, and updating that knowledge through a computer, is suspect. Although it seems like science fiction, consider what would happen if it became a reality. This technology would obviously be used by people who can afford it, essentially people who wish to control human populations. The benefits it may have do not justify the dangers it may bring.
Neither humanity nor the brain may be compared to a rigid, inflexible, and cold computer executing commands.
2) The observer pointed out that the engineers lacked communication skills, which made them dislike interviewing. They seemed to prefer the predictability of computers to the complexity of humans.
I would mention here that this "communication skills" problem may be less an individual failing than a problem of modern society's values, beliefs, and attitudes, which shape our relationships with other humans.
3) They commit the error of narrowing knowledge when they focus only on traces of data coming from a few experts.
Other concerns expressed by the author:
1) One of the steps in developing an AI system is to select the "best" knowledge from the whole "universe" of knowledge acquired from encyclopedias, books, interviews, etc.
How do we know whether they missed a particularly important piece of knowledge? How do we know whether that piece was important to a particular group or minority? (pp. 57-58)
2) The author cited Bourdieu (1977), noting that this exercise of drawing boundaries to define what counts as "expert knowledge" is one way in which power manifests itself. Another citation of Bourdieu (1977) noted that the power engineers exercise has political aspects.
Latour and Woolgar's Article

In this article, I would like to draw attention to the following interpretations of the observer:
1) He/she makes sense of "obscure activities [such] as a technician grinding the brains of rats, by realising that the eventual end product of such activity might be a highly valued diagram" (p. 52).
This interpretation could point to a difference between the value system of the natural sciences and engineering cultures and that of the social sciences. The natural sciences' value system includes the justification of using animals in laboratories.
2) The hidden "rat race" becomes visible when the scientists discuss whose name will be placed first below the article's title.
3) Scientists believe in the objectivity of their work.
And here I argue that a "100% pure, extra-virgin, and cold-pressed" scientific objectivity does not exist at all.
4) The scientific field whose identity they protect has itself borrowed instruments from other fields (physics, computer science, statistics, etc.).
5) Some senior lab scientists are arrogant enough to consider novices' questions annoying, because they regard that knowledge as "obvious" (pp. 76-77).
