Top Five Singularity Concerns

Sunday, January 8, 2012


It can't continue forever.
The nature of exponentials is that you push them out
and eventually disaster happens.

 - Gordon Moore

Although we are generally positive about the developments that are leading to the technological singularity, there are a few potentially worrisome factors to consider.  We have broken these out as:

1.  Brain state issues at time of upload



So you've died and passed on.  Your brain has been plastinated or cryogenically preserved for future re-instantiation.  The big problem: your death was excruciatingly painful or terrifying, and this state is reflected in the recreated connectome.  You find yourself reawakened into a body locked in this state, or uploaded into a simulation of yourself trapped in an endless loop of agony or fear.

Furthermore, what if you are brought back in a hallucinogenic dimethyltryptamine (DMT) state?  DMT is hypothesized to be a drug your own brain produces during dreaming and at the time of death - Steve Jobs' reported last words, "Oh wow. Oh wow. Oh wow," being a potential example.  Although it might be enjoyable to talk with the clockwork elves for a bit, wouldn't you want your immortal self to be taking part in something a little more meaningful?

Hopefully this would be just another issue that exponential technology can overcome; however, being brought back only to re-experience your death would not be such a delightful or transcendent experience.

2.  Apathy 



Voter apathy is a well-known phenomenon.  Apathy toward exponential technological change may be a bigger problem: so many people are unprepared for it, and refuse to do anything about it.

If apathy with regards to The Singularity is too great, the choices that are made before it occurs may not be vetted by enough of the world's [human] population to make them democratic decisions.  By 'leaving it up to the experts,' people could be selling themselves into slavery or extinction.  



3. Corruption of power by a Scientific Dictatorship


There are some, like Aaron Franz, who view progress toward The Singularity as a creation of elites, designed as a way to exert power and control over the teeming masses of "useless eaters."  Alex Jones is another popular figure who criticizes much of the Singularity movement as being involved with, or co-opted by, those who would bring about a scientific dictatorship.


The US government's recent moves toward censoring the internet, through SOPA and PDD 51, do little to counteract this view; the anti-liberty, anti-democratic potential of technological progress has been recognized for a long time.

As discussed above, apathy may be a factor in letting a scientific dictatorship come to be.  According to some, with control over media and other possible mind-control tools, it may even be enforced apathy.

Aldous Huxley said that the ultimate revolution would come about through mind-altering drugs or other methods of affecting the human brain, in order to get people to "love their servitude."

Newt Gingrich has written and presented many papers on what he called "The Age of Transitions," a concept he introduced at a National Science Foundation workshop in 2001.  This work has been criticized as part of defining how the Scientific Dictatorship might be established.  Furthermore, many treat The Age of Transitions and The Singularity as synonymous.

In wider circles, so-called conspiracy theorists point to the government, or to elements within the government and supra-national organizations such as the Bilderberg Group, the Council on Foreign Relations, the U.N., and Bohemian Grove, as the agents bringing about a scientific dictatorship.

Would the "99%"enjoy their servitude while only the elites become post-human?  Would the scientific dictatorship be more like Orwell's "1984" or Huxley's "Brave New World?"


4.  Unfriendly Robots / AI 

"The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else." - Eliezer Yudkowsky
A common theme in science fiction is the future battle of robots and AI versus humans.  As the Singularity approaches, this world view does not seem so far fetched.  Nick Bostrom has written about the possibility of artificial intelligence wiping out humanity.


Eliezer Yudkowsky proposed that research be undertaken to produce friendly artificial intelligence in order to address the dangers. He noted that if the first real AI was friendly it would have a head start on self-improvement and thus prevent other unfriendly AIs from developing, as well as providing enormous benefits to mankind. The Singularity Institute for Artificial Intelligence is dedicated to this cause.

A significant problem, however, is that unfriendly artificial intelligence is likely to be much easier to create than friendly AI (FAI): while both require large advances in recursive optimization process design, friendly AI also requires the ability to make goal structures invariant under self-improvement (or the AI will transform itself into something unfriendly) and a goal structure that aligns with human values and does not automatically destroy the human race. An unfriendly AI, on the other hand, can optimize for an arbitrary goal structure, which need not be invariant under self-modification.

The AI agent may itself be an uploaded human brain/AI hybrid.  Suppose uploads come before human-level artificial intelligence. An upload is a mind that has been transferred from a biological brain to a computer that emulates the computational processes of the original brain/connectome.  Uploading a mind will make it much easier to enhance its intelligence: by running it faster, adding computational resources, or streamlining its architecture.

One could imagine that enhancing an upload beyond a certain point will result in a positive feedback loop: the enhanced upload is able to figure out ways of making itself even smarter, the smarter successor version is in turn even better at designing an improved version of itself, and so on. If this runaway process is sudden, it could result in one upload reaching superhuman levels of intelligence while everybody else remains at a roughly human level. As Bostrom suggests, such enormous intellectual superiority may well give it correspondingly great power. It could rapidly invent new technologies or perfect nanotechnological designs, for example. If the transcending upload is bent on preventing others from getting the opportunity to upload, it might well succeed.  (In the film "The Lawnmower Man," the transcended protagonist announces his ascension by making every phone in the world ring at once.)
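The feedback loop described above can be sketched numerically. This is a toy illustration, not a prediction: we simply assume that each design cycle multiplies capability by a factor that itself grows with current capability, and all parameter values (`gain`, `threshold`) are arbitrary choices for the sake of the example.

```python
# Toy model of recursive self-improvement vs. ordinary exponential growth.
# Assumption (illustrative only): each design cycle multiplies capability
# by (1 + gain * current_capability), so smarter systems improve faster.

def cycles_to_threshold_feedback(initial=1.0, gain=0.1,
                                 threshold=1000.0, max_cycles=10_000):
    """Design cycles needed when the improvement rate scales with
    current capability: c_{n+1} = c_n * (1 + gain * c_n)."""
    c, n = initial, 0
    while c < threshold and n < max_cycles:
        c *= 1 + gain * c
        n += 1
    return n

def cycles_to_threshold_fixed(initial=1.0, rate=0.1, threshold=1000.0):
    """Design cycles needed under plain exponential growth at a fixed
    rate per cycle: c_{n+1} = c_n * (1 + rate)."""
    c, n = initial, 0
    while c < threshold:
        c *= 1 + rate
        n += 1
    return n

feedback = cycles_to_threshold_feedback()   # 15 cycles
fixed = cycles_to_threshold_fixed()         # 73 cycles
# The feedback version crosses a 1000x capability gap in a small
# fraction of the cycles that fixed-rate growth needs - the sense in
# which a runaway upload could leave everyone else near human level.
```

The point of the contrast is that even modest coupling between capability and improvement rate turns steady exponential growth into a runaway, which is why the suddenness of the process matters more than the initial advantage.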


5.  Loss of Identity




Willow Garage's Robot Operating System (ROS), and Google's "cloud robotics" initiative built on it, allow robots and machines to be controlled via cloud computing.  Increasingly our machines will refer to the cloud for their object recognition, control, and other processing functions.  As the cloud develops toward the Singularity, will humans find it necessary to plug themselves in too, to survive and compete with the machines?  Will we, by plugging in to the cloud, lose, forget, or have taken from us that which makes us ourselves?  Do we become like the Borg by plugging in to a hive mind?

Nick Bostrom has an excellent article on The Future of Identity, where some of these issues are discussed.  He remarks that identity is central to human activity, and that technologies that affect how our identities function can have important effects on the individual and on society.

Other factors that should not be totally dismissed are: the grey goo scenario, runaway global warming, extraterrestrial attack and other existential threats.  




Join the conversation in the comments, and let us know what you think.



6 comments:

  1. Does "scientific dictatorship" mean biological warfare or other dangerous technology brought about by the singularity? I ask because I think one of the main threats of the singularity is the implications of manufacturing at nanometer or smaller scales. I believe it was a scientific group or college in Europe that proposed having nanobots inside your body to remove dangerous lead found in your bloodstream. That is fantastic, but what about a nanobot designed to destroy your bloodstream, or nerves, or muscles? With the singularity we would have the counter-technology to ward this off, but would everyone have access to it? Just curious, but maybe that is what you meant in your article.

    1. By 'Scientific Dictatorship' I literally meant that an oligarchic group could co-opt the technological tools of The Singularity and use them to control, and potentially destroy, the rest of humanity. Look no further than the 'Georgia Guidestones' for evidence of a stated desire among some to reduce our numbers.

      It is also a question of timing. If you are at the 'early adopter' stage of The Singularity, will your mental faculties expand so rapidly that dealing with those who haven't uploaded/plugged in/upgraded themselves becomes like dealing with a dog, an ant, or worse - a bacterium?

    2. Interesting ideas. I thought by scientific dictatorship you meant that a human, group, etc. would become powerful enough to use technology not to advance society but to destroy it. Imagine if a human had the chance to design any disease and target it at any country. Do you believe dangerous technology could halt the singularity? I do not believe so, but if we did enter World War 3 and nuclear weapons were launched, how would technological progress continue amid such catastrophic events?

  2. From Hacker News:


    This isn't a list of concerns that most AGI researchers would put forth: these are the concerns of a fanboy base, who tend to think of the singularity as some kind of really, really swank PS3.


    vertr 1 hour ago

    It is very interesting to me how the 'singularity culture' talks about the singularity as if it is a given for our future. This seems very un-scientific, as they are betting on something to happen, without proper evidence or even solid ideas on what will happen. Just that it will, and it's going to be great!


    AndrewDucker 42 minutes ago

    The singularity seems, to me, to be based on a few simple ideas:
    1) There is nothing supernatural about thought.
    2) Investigation will lead us to understand it well enough to emulate it.
    3) Once we can do this we will be able to do it better than people.
    4) Once this happens it will be able to improve itself better than we can.
    5) Once this happens, we have no idea what will happen next.
    2/3 are arguable, but don't seem unreasonable to me. The rest seem obviously true.


    vertr 21 minutes ago

    These ideas seem reasonable to me; however, when I talk about 'singularity culture' I refer to those who think the singularity will be the end of the human race, will 'save the world', will be a new species of humans on this planet, will be the rise of machines that enslave us, is the realization of 2012, and so on. These ideas circulate in non-technical circles, and people believe them.


    AndrewDucker 15 minutes ago

    Oh yes, the people that have ideas about what will happen afterwards, or who believe that it will save us all seem very odd to me.

  3. On reddit
    http://www.reddit.com/r/singularity/comments/og8db/top_five_singularity_concerns_although_we_are/

  4. Only one, but a good reason scientists will never be accepted as community leaders: they are amoral; they will experiment regardless of the consequences, just for the sake of knowledge.
    About AI: the fundamental problem is that no scientific definition of intelligence exists (just a bunch of loose estimations of what intelligence can and could do - about as relevant as defining the microscope as an instrument that can be used for nailing). From this lack derives the next: the lack of a mathematical model of intelligence. And another thing about AI: there is a false identity between computing and thinking, and at least two cases argue against it: the idiot savant and animals - both excelling at computing, but lacking in reasoning.
