Anders Sandberg on Knowing The Unknowable

Monday, October 21, 2013


During a talk at last year's Alcor Conference, Future of Humanity Institute researcher and futurist Anders Sandberg presented a high-level overview of the problems of futurist thinking and what some of the implications might be for the pursuit of cryonics.




Researcher, science debater, futurist, transhumanist, and author Anders Sandberg presented an interesting talk on the possibilities of cryonics and other futurist topics at last year's Alcor conference. He describes his presentation as a "theoretic and meta talk" on the future.

Uncertainty, for Sandberg, is especially prevalent when we extend beyond our animal needs. The solution, therefore, is to approach such problems with rationality. However, as he points out, decision theory is hard and rationality is hard, so approximation is always necessary. These problems affect multiple areas, including economics and artificial intelligence.

Moreover, for humans, cognitive biases affect our decision-making. During his talk, Sandberg discusses how these factors affect the pursuit of cryonics.



Sandberg also points out that certain tools of futurism, like trend lines, can undermine accuracy. He warns that tracking technological trends, as with Moore's Law or the graphs Ray Kurzweil uses to predict the Singularity, does not really take uncertainty into account, so the models may not reflect what will actually happen. For Sandberg, the Singularity itself also represents a potential area of uncertainty.
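
To see why a bare trend line can mislead, here is a minimal sketch (not Sandberg's own analysis; the data and growth rate are invented) that fits a Moore's-Law-style exponential trend and then propagates the fit's uncertainty forward. The point forecast looks precise, but the spread of plausible futures widens quickly, and even that understates the real uncertainty, since the trend itself may simply stop holding.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Moore's-Law-style data: log2(transistor count) by year,
# doubling roughly every two years, plus some observation noise.
years = np.arange(1990, 2014)
true_log2 = 20 + 0.5 * (years - 1990)
observed = true_log2 + rng.normal(0, 0.4, years.size)

# Fit a straight line in log space and keep the coefficient covariance.
coeffs, cov = np.polyfit(years, observed, 1, cov=True)

# Naive extrapolation: a single precise-looking number for 2030.
target = 2030
point_forecast = np.polyval(coeffs, target)

# Uncertainty-aware extrapolation: sample plausible trend lines from the
# fit's covariance and look at the spread of their 2030 predictions.
samples = rng.multivariate_normal(coeffs, cov, size=10_000)
forecasts = samples[:, 0] * target + samples[:, 1]
low, high = np.percentile(forecasts, [5, 95])

print(f"Point forecast for {target}: 2^{point_forecast:.1f} transistors")
print(f"90% interval from fit noise alone: 2^{low:.1f} to 2^{high:.1f}")
# Even this interval only reflects statistical noise in the fit; it says
# nothing about the deeper uncertainty of whether the trend continues at all.
```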

Looking at cryonics Sandberg comments, "Cryonics is actually fun, because we have so much uncertainty in it."  There is uncertainty about the future situation(s), uncertainty about storage, uncertainty about the future of medical advances, uncertainty about the reanimation situation and uncertainty about identity recovery (especially considering how little we know about what defines identity now).  Statistically, for Sandberg, "not getting suspended poses the greatest risk of all."
Sandberg's model of cryonics (image source: Anders Sandberg)
Also for Sandberg, "The Singularity is a wildcard you may or may not believe in.  It amounts to a serious chunk of the positive futures, but is likely to also contribute significant existential risk."
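
As a rough illustration of how those uncertainties compound, the sketch below chains the stages Sandberg lists: revival only happens if every stage succeeds, so (treating them as independent) the probabilities multiply. The stage names follow the talk; the numbers are placeholders, not Sandberg's estimates.

```python
# A minimal sketch of how the uncertainties chain together: revival requires
# every stage to succeed, so (treated as independent) the probabilities
# multiply. Stage names follow the talk; the numbers are illustrative
# placeholders, not Sandberg's estimates.
stages = {
    "good suspension at deanimation":      0.5,
    "storage maintained long enough":      0.6,
    "required medical technology arrives": 0.5,
    "reanimation is actually attempted":   0.7,
    "personal identity is recovered":      0.5,
}

p_revival = 1.0
for stage, p in stages.items():
    p_revival *= p
    print(f"{stage:<40} p = {p:.2f}  cumulative = {p_revival:.3f}")

print(f"\nOverall chance of revival under these guesses: {p_revival:.1%}")
# Whatever the exact numbers, Sandberg's comparison point is not getting
# suspended at all, which fixes the probability of revival at zero.
```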

Sandberg holds a Ph.D. in computational neuroscience from Stockholm University, and is currently a James Martin Research Fellow at the Future of Humanity Institute at Oxford University.

Sandberg's research centres on societal and ethical issues surrounding human enhancement and new technology, as well as on assessing the capabilities and underlying science of future technologies. His recent contributions include work on cognitive enhancement (methods, impacts, and policy analysis), a technical roadmap on whole brain emulation, neuroethics, and global catastrophic risks, particularly the question of how to take into account the subjective uncertainty in risk estimates of low-likelihood, high-consequence risks.

He has worked on this within the EU project ENHANCE, where he also was responsible for public outreach and online presence. Besides scientific publications in neuroscience, ethics, and future studies, he has also participated in the public debate about human enhancement internationally. Anders also holds an AXA Research Fellowship.

Sandberg has a background in computer science, neuroscience and medical engineering. He obtained his Ph.D. in computational neuroscience from Stockholm University, Sweden, for work on neural network modelling of human memory. He was also the scientific producer for the major neuroscience exhibition “Se Hjärnan!” (“Behold the Brain!”), organized by Swedish Travelling Exhibitions, the Swedish Research Council and the Knowledge Foundation, which toured Sweden 2005–2007. He is a co-founder of and writer for the think tank Eudoxa.


SOURCE  Alcor

By 33rd Square
