Evolving Intelligence is a Knife’s Edge Balance

Fast Evolution vs Slow Evolution
Evolving intelligence too quickly can threaten the survival of an intelligent species, because advanced technology is dangerous without an equivalent level of ethics. Evolving intelligence too slowly can be just as threatening, because natural events can exterminate a species that lacks the technology to avoid them.

Our civilization seems to be threatened by the advance of our technology into areas that can be deadly without a sufficient understanding of our connection with others and everything around us (ethics). On the other hand, it is only in the last few years that we have begun to understand how much danger is presented by asteroid strikes, and we still don’t have a good response. Advanced space-going technology will be needed either to defend against a killer asteroid or to establish a foothold in space as insurance against the obliteration of Earth.

We are fortunate to have made it to a position where we can become aware of the threats and begin to postulate responses to them. But the same technology that can become our salvation has the potential to destroy us. We must grow up fast.

The aliens are silent because they are extinct – [anu.edu]

Life on other planets would likely be brief and become extinct very quickly, say astrobiologists from ANU Research School of Earth Sciences.

In research aiming to understand how life might develop, the scientists realised new life would commonly die out due to runaway heating or cooling on their fledgling planets.

“The universe is probably filled with habitable planets, so many scientists think it should be teeming with aliens,” said Dr Aditya Chopra, lead author on the paper, which is published in Astrobiology.

“Early life is fragile, so we believe it rarely evolves quickly enough to survive.”

A New Filter for Life’s Survival – [centauri-dreams.org]

How do we make out the odds on our survival as a species? Philosopher Nick Bostrom (University of Oxford) ponders questions of human extinction in terms of a so-called Great Filter. It’s one that gives us a certain insight into the workings of the universe, in Bostrom’s view, because it seems to keep the galaxy from being positively filled with civilizations. Somewhere along the road between inert matter and transcendent intelligence would be a filter that screens out the vast majority of life-forms, keeping the population of the galaxy low, and offering us a way to gauge our own chances for survival.

Extinction Alert: Stephen Hawking Says Our Technology Might Wipe Us Out

SEE ALSO:
The Progression of Ethics
Ethics is the Best Measure of Civilization
The L-Factor Limits Contact With ET Aliens
Virtue in Intelligence
The Sequence of Contact