Thursday, November 8, 2018

Extraterrestrial Contact: A Road Map for Humanity


If we are faced with alien First Contact in our solar system someday, we will have to confront more than just the unknown. We will have to confront our inner nature. Do we spasm in fear as a human civilization and seek to protect ourselves at all costs? Do we allow greed and fear to create international chaos through conflict between nations?

It may be tough to get people to take that internal exploration seriously. We will be excited and focused on learning as much as we can about the alien perspective. Dedicated anthropologists and futurists will need to do the internal work and then help to prepare a road map for humanity.

Luckily, there are already science-based groups doing this sort of thinking. The Future of Life Institute (FLI), a nonprofit based in the Boston area, is one such organization. It is led by MIT physics professor Max Tegmark, Skype co-founder Jaan Tallinn, and UC Santa Cruz physics professor Anthony Aguirre. They focus on the human challenges in four major areas: artificial intelligence, biotechnology, nuclear weapons, and climate change.

The FLI scientific advisory board is an example of the range of expertise such think tanks enjoy. Advisors come from fields of study including business, genetics, brain science and artificial intelligence.

Oxford University professor Nick Bostrom serves on the FLI scientific advisory board. His group, the Future of Humanity Institute (FHI) at Oxford, focuses on many of the same topics, with an emphasis on AI governance. They too consider existential risks such as climate change and nuclear proliferation, pointing out that such anthropogenic threats are immediate and widespread for humanity.

Other groups covering the same intellectual territory include the Global Catastrophic Risk Institute (GCRI) and the Centre for the Study of Existential Risk (CSER).

It is interesting to note that these organizations don’t consider the impact of First Contact with extraterrestrials. While I understand that the man-made challenges of climate change, nuclear war, and run-amok artificial intelligence are much bigger threats, it is concerning that so few people are considering First Contact risks.

Groups such as these would be our second responders in a First Contact event. They may not be involved in the initial study, but their expertise in looking forward would be critical After First Contact. It could be disastrous to stumble into the future without an assessment of risk and a plan for positive development. Let’s just hope that world leaders understand this need if First Contact does occur.


Photo by Marc-Olivier Jodoin on Unsplash