30 Mar
What does astronomy teach us about artificial intelligence?
Cosmologists are always listened to. Stephen Hawking was regarded as one of the greatest geniuses of our time. In the Finnish media, cosmologist Esko Valtaoja, a champion of interdisciplinarity, is constantly asked to comment on all kinds of topics. So let’s try to make AI easier to understand by using cosmology.
To many of us, astronomy mainly means scratching our heads during a game of Trivial Pursuit. AI, in its turn, is, depending on the perspective, either an all-encompassing buzzword or a neural network assigned to a specific task. To me it’s the latter, but I also understand why many see it the other way.
I think popularizing, making complicated things easier to understand, is the best thing you can do for a fellow person. That’s why the aim of this post is to offer new information and a new perspective on both cosmology and AI. And because fear sells, let’s start with threats.
Why haven’t we met extraterrestrial life? Is AI a threat?
The question of why we haven’t met intelligent extraterrestrial life comes down to intelligence. Intelligence, you see, isn’t necessarily the most important ability for survival.
In his 2001 book “The Universe in a Nutshell”, Stephen Hawking wrote: “Bacteria are doing great even without intelligence, and they will survive after this so-called intelligent species has destroyed itself with nuclear bombs.” This might also be the reason why we haven’t been contacted from other galaxies, even if life existed there. It might not be intelligent enough. Or maybe it briefly got too intelligent.
We fear, sometimes with reason but quite often pointlessly, what we don’t know. Whether that scary, unknown thing exists or not doesn’t matter. The fear of a Martian invasion is just as rational as the fear of a war against supercomputers. In both cases the fear is the same: something more intelligent than us comes and wipes us off the face of the Earth.
Cosmology and AI have also another thing in common: singularity.
If we trust commonly accepted scientific theories and dismiss the chemtrail-sighting category for what it is, one thing is clear: humanity has a singularity in front of it as well as behind it.
(Nothing in science is ever completely settled. And science corrects itself. And so on. Let’s stick to layman terms here.)
In astrophysics, a singularity is a point where spacetime breaks down. Because I want to express myself in a more non-Marvel-universe way, I’ll quote Wikipedia: a singularity is a phenomenon “where density apparently becomes infinite”.
Put simply: a singularity is a tiny point where an unimaginable amount of matter is concentrated and the known laws of physics no longer apply. This is believed to be the case in black holes and, of course, in the Big Bang.
A singularity is an interesting phenomenon also because it is difficult to research. We do not know what happens at the core of a black hole or what happened before the Big Bang. A singularity is a state where the laws of nature do not hold. That’s why there are only theories.
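To give a rough sense of the densities involved, here is a standard textbook formula (not from the original post, added purely as an illustration): the Schwarzschild radius, the size to which a mass must be compressed before not even light can escape it, with a singularity predicted at the center.

```latex
r_s = \frac{2GM}{c^2}
\qquad\text{For the Sun: }
r_s \approx \frac{2 \cdot (6.67\times10^{-11}) \cdot (2.0\times10^{30}\,\mathrm{kg})}{(3.0\times10^{8}\,\mathrm{m/s})^2} \approx 3\ \mathrm{km}
```

In other words, the entire mass of the Sun squeezed into a sphere a few kilometers across. That is the kind of regime where our equations stop making sense.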
The technological singularity borrows its name from cosmology.
This sounds more complicated than it actually is. Let’s quote Wikipedia again: singularity “is the hypothesis that the invention of artificial superintelligence (ASI) will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization.” And this change is by definition beyond our comprehension.
In other words: just like how we don’t know what happened before the Big Bang (astrophysical singularity), we do not know what happens when AI acquires a level of comprehension greater than ours (technological singularity).
So we can predict the future as much as we can tell about the time before the Big Bang. And it has been estimated that a technological singularity is possible in just over 10 years.
Are we doomed?
Cosmology gives us some perspective for studying AI, but it also gives us a chance to prepare for the threat AI may create.
Because we don’t know what happens after a technological singularity, or whether it’s a threat to all life on Earth, we must set our eyes beyond our lovely planet. The best insurance for humanity is to populate outer space. And for irony’s sake: this, of course, isn’t possible without AI.
Let’s come back to Hawking one more time. In 2016 he considered it probable that we will screw up on this planet in one way or another. Even he couldn’t say for certain whether it will be with nuclear weapons, climate change or genetically modified viruses. Perhaps with AI. But he was sure that we will screw up.
So we are in between two singularities, with no way of seeing on the other sides.
Futures research is just as important as cosmology or neural network research. It helps us understand where we’re headed and how the journey can be made as safe as possible.
Predicting the future is possible, but only up to a point. With our current knowledge we can predict the time after AI about as well as the time before the Big Bang. Anyone who claims otherwise is a snake oil salesman.