Speaking recently at Oxford University, philosopher Nick Bostrom outlined the case for studying existential risk — the risk of an event that could wipe out all of humanity. Despite the topic's importance to us all, Bostrom argues, it has attracted remarkably little serious research.
Bostrom is Professor in the Faculty of Philosophy at Oxford University and founding Director of both the Future of Humanity Institute and the Programme on the Impacts of Future Technology within the Oxford Martin School. He is also co-founder and chair of the World Transhumanist Association, which advocates the use of technology to extend human capabilities and lifespans, and of the Institute for Ethics and Emerging Technologies.
According to Bostrom, "Many theories of value imply that even relatively small reductions in net existential risk have enormous expected value. Despite their importance, issues surrounding human-extinction risks and related hazards remain poorly understood."