Jürgen Schmidhuber

From Wikiquote
Jürgen Schmidhuber (born 17 January 1963) is a German computer scientist noted for his work in the field of artificial intelligence, specifically artificial neural networks.

Quotes

Since age 15 or so, the main goal of Professor Jürgen Schmidhuber has been to build a self-improving Artificial Intelligence (AI) smarter than himself, then retire.

You cannot stop it. Surely not on an international level, because one country may have really different goals from another country. So, of course, they are not going to participate in some sort of moratorium. But then I think you also shouldn’t stop it. Because in 95% of all cases, AI research is really about our old motto, which is make human lives longer and healthier and easier. ... It’s just that the same tools that are now being used to improve lives can be used by bad actors, but they can also be used against the bad actors. And I would be much more worried about the old dangers of nuclear bombs than about the new little dangers of AI that we see now.

It is true that AI can be weaponized, and I have no doubt that there will be all kinds of AI arms races, but AI does not introduce a new quality of existential threat. The threat coming from AI weapons seems to pale in comparison to the much older threat from nuclear hydrogen bombs...

I signed this open letter by LAION because I strongly favor the open-source movement. And I think it's also something that is going to challenge whatever big-tech dominance there might be at the moment. Sure, the best models today are run by big companies with huge budgets for computers, but the exciting fact is that open-source models are not so far behind; some people say maybe six to eight months only.

Within a few tens of billions of years, curious self-improving AIs will colonize the visible cosmos in a way that’s infeasible for humans. Those who don’t won’t have an impact. Sounds like science fiction, but since the 1970s I have been unable to see a plausible alternative to this scenario, except for a global catastrophe such as an all-out nuclear war that stops this development before it takes off.

Don’t think of humans as the crown of creation. Instead, view human civilization as part of a much grander scheme, an important step (but not the last one) on the path of the universe from very simple initial conditions toward more and more unfathomable complexity. Now it seems ready to take its next step, a step comparable to the invention of life itself over 3.5 billion years ago. Alas, don’t worry, in the end, all will be good!