4. Die Frage

Why does it make me so uncomfortable to be in my dad’s car? Don’t worry, this is not about my relationship with my father. Granted, when my dad tells me to keep the wheel straight, it stresses me out. But that is the kind of annoyance that emotional bonds are built on: a lifetime of caring and of rebelling, summed up in one hand movement whenever I take a turn too sharply. Armies go to war for the right woman, but I think a war could just as easily be started by a father telling his son to steer his war horse more carefully.

No, the relationship I am concerned with here is one that might prove even more important than the one with our parents. This is the relationship between my hands and automated steering, between me and my dad’s car: between humans and technology.


I have only recently started reading the work of Martin Heidegger, specifically The Question Concerning Technology (translated title). Some works are best read when you are already thinking the thoughts they put into words so well, and this was one of them for me. I am still in no position to summarise this work, but allow me to loosely use the framework I interpreted Heidegger as using.

The essential danger in the use of technology is that it becomes a blockade between us and the world. Our understanding becomes clouded behind a veil of technology developed beyond our comprehension, perhaps even beyond our control. Note that this applies to any kind of technology, not only technology as advanced as what we have been discussing before. Heidegger wrote in the 20th century, when talk of technology was partly immensely relevant already and partly a premonition of times to come.

What I took as the main conclusion from reading this small part of his work is that this danger arises from non-meaningful relations. When we implicitly create a duality between ourselves and the technology that is so prevalent in our society, we also commit to an idea of superiority and uniqueness over “the rest”. It is exactly the absence of this sort of meaningful relationship that defines the danger residing in technology.

To cultivate a more meaningful relationship with, for example, machine learning algorithms, it is therefore necessary to allow such a relation to occur. This means viewing the algorithms as part of our direct nature and letting go of the gap between us and that nature. There might be a path of interaction and discussion that opens up an essence of trust and of meaning between us and artificial entities. For this, however, we must first let go of the idea of superiority, of being the only thing that matters.

Exactly what this means I do not know (hence my gratitude for being able to use the word “draft”), but it still defines a promising angle on trustworthy machine learning. To enhance trust and cooperation between us and artificial entities, there is a gap to bridge: the gap of hidden technology (consider specifically Latour’s idea of blackboxing), the gap of the presupposed neutrality of algorithms, the gap that comes from the taboo of being on a different level from whatever we create and interact with.

If my dad’s car tells me to steer between the lines, could we perhaps come to a compromise between the car’s correctness and my own sense of security and well-being? What if we could enter a “discussion”, for lack of a better word, and both learn from each other? To approach the world, in Heidegger’s words, through technology as a mode of understanding, and vice versa for the algorithm? Wouldn’t that be beautiful?
