Christoph Lütge
Christoph Lütge (born November 10, 1969) is a German philosopher and economist notable for his work on business ethics, AI ethics, experimental ethics, and political philosophy. He is a full professor of business ethics at the Technical University of Munich and director of its Institute for Ethics in Artificial Intelligence.
Quotes
- This is a very unique opportunity to work on ethical issues in a new technology on such a scale. It comes at a point in time when AI is on the forefront of a large number of both scientific and public debates. We have the chance to work on AI ethics issues in detail, and not just by doing research behind closed doors, but with an outreach to civil society, politics, and the corporate world.
- First, I always state that there are no obligations whatsoever towards Facebook. The new institute is an independent research institute, which will also have an independent Advisory Board with no members of Facebook sitting on it. The money comes as a gift for research. It will be used to help make AI systems more ethical, not just by putting together some abstract principles, but by working on concrete issues, like algorithms, systems, robots or screening technologies, for example. Therefore, if the money from Facebook can be employed for advancing ethics and bringing (ethical) benefits to the users of AI (which we all either soon will be or already are), it will be beneficial for all sides.
- When sitting on the German federal ethics committee for autonomous driving, I had a lot of discussions with representatives from automotive companies who felt the same way about this kind of technology: They were equally looking for academia and government to help them address ethical problems. There are some problems a single company – or even industry as a whole – cannot address on their own: These range from questions like how to organize AI accountability issues to very fundamental philosophical problems such as: How much dependence on certain technologies are we willing to accept as a society?
- This is not a simple yes or no question. In general however, I believe regulation should be approached with caution at this point. The digitech markets are still very dynamic, and that has to be taken into account. It should also be clear beforehand that specific regulation would achieve the goals it aims at and not be counterproductive: if innovation is considerably stifled, it cannot bring about its ethical potential. Therefore, I favor an approach that relies on ethical guidelines first, and in which all parts of society participate.
- There is still a lot of critical discussion on the GDPR in Europe, and there are some valid arguments in it. I believe however that on the overall, the GDPR can become a tool for improving trust in digital technologies without putting the brakes on them. It could become a sort of blueprint for other regions of the world, as it sets a relatively clear regulatory framework for people to sell their data. This issue is certainly seen in a more liberal way in the US, where the use of data is not considered per se as problematic as in Germany, in particular. But still, there are a lot of critics in the US too, and a – revised and refined – GDPR could address these.
- The digital world is a mirror of society in many ways. Of course, there are a lot of activities going on which people would not openly admit to, some of them illegal, certainly. However, I am not sure that privacy (and in particular, privacy with respect to illegal activities) has increased when compared to the non-digital world. Was it not worse in a time when dictatorships around the world could shield their citizens from information from the outside or when companies could hide their activities easily without having to worry about the power of social networks? So in this regard, I believe we cannot complain about too much privacy in the digital world – in some ways at least, we had more privacy in the old days, with bad consequences sometimes. Still, I agree of course that disclosure of information is essential in many digital contexts and too much privacy can have bad consequences too.
