Ray Kurzweil, inventor and computer scientist, presents a talk on the Singularity at the RSA Conference 2007.

It's a common theme in science fiction -- mankind struggles to survive in a dystopian futuristic society. Scientists discover too late that their machines are too powerful to control. Computers and robots force the human race into servitude. But this popular plot may not belong to the realm of fiction forever. Discussed by philosophers, computer scientists and women named Sarah Connor, the idea seems to gain more credence every year.

Could machines replace humans as the dominant force on the planet? Some might argue that we've already reached that point. After all, computers enable us to communicate with one another, keep track of complex systems like global markets and even control the world's most dangerous weapons. On top of that, robots have made automation a reality for jobs ranging from building cars to building computer chips. For now, though, machines lack the ability to make decisions outside of their programming or to use intuition. Without self-awareness and the ability to extrapolate from available information, machines remain tools.

How long will that last? Are we headed for a future in which machines gain a form of consciousness? If they do, what happens to us? Will we enter a future in which computers and robots do all the work while we enjoy the fruits of their labor? Will we be converted into inefficient batteries a la "The Matrix"? Or will machines wipe the human race from the face of the Earth?

To the average person, these questions may seem outlandish. But some people think we need to take them seriously now. One such person is Vernor Vinge, a former professor of mathematics at San Diego State University.
Vinge proposes that mankind is heading toward an irrevocable fate in which, through the use of technology, we will evolve beyond our own understanding. He calls it the singularity. What is the singularity, and how might it come about? In his essay, Vinge outlines four possible paths: computers could become "awake" and superhumanly intelligent; large computer networks and their users could "wake up" as a superhuman entity; human-computer interfaces could become so intimate that their users count as superhumanly intelligent; or biological science could improve natural human intellect. Of these four possibilities, the first three could lead to machines taking over. While Vinge addresses all of them in his essay, he spends the most time discussing the first one. Let's take a look at his idea.

Computers tend to double in power every two years or so. This trend is related to Moore's Law, which observes that the number of transistors on an integrated circuit doubles roughly every 18 months. Vinge says that at this rate, it's only a matter of time before humans build a machine that can "think" like a human. But hardware is only part of the equation. Before artificial intelligence becomes a reality, someone must develop software that allows a machine to analyze data, make decisions and act autonomously. If that happens, we can expect machines to begin designing and building even better machines. These new machines could build faster, more powerful models. Technological advances would move at a blistering pace. Machines would know how to improve themselves. Humans would become obsolete in the world of computing. We would have created a superhuman intelligence. Advances would come faster than we could recognize them. In short, we would reach the singularity.

What would happen then? Vinge says it's impossible to know. The world would become such a different landscape that we can only make the wildest of guesses. Vinge admits that while speculating about specific scenarios probably isn't fruitful, it's still a lot of fun. Maybe we'll live in a world where each person's consciousness merges with a computer network. Or maybe machines will perform all our tasks for us and let us live in luxury.
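The doubling trend described above is easy to put in concrete terms. The sketch below (with an illustrative starting count, not a real chip's) projects how a transistor count grows if it doubles every two years:

```python
# Back-of-the-envelope sketch of Moore's Law-style exponential growth.
# The starting count and doubling period are illustrative assumptions,
# not figures from any particular chip.
def projected_transistors(initial_count: int, years: float,
                          doubling_period_years: float = 2.0) -> int:
    """Project a transistor count after `years` of steady doubling."""
    return int(initial_count * 2 ** (years / doubling_period_years))

# A hypothetical 100-million-transistor chip, projected 10 years out:
# five doublings, so 32 times as many transistors.
print(projected_transistors(100_000_000, 10))  # 3200000000
```

Five doublings in a decade yields a 32-fold increase, which is why even modest-sounding doubling periods produce the "blistering pace" the article describes.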
But what if the machines see humans as redundant -- or worse? When machines reach the point where they can repair themselves and even create better versions of themselves, could they conclude that humans are not only unnecessary but unwanted? It certainly sounds like a scary scenario. But is Vinge's vision of the future a certainty? Is there any way we can avoid it? Find out in the next section -- before it's too late.

We use robots to perform repetitive tasks automatically, but will we engineer all humans out of a job?

Not everyone thinks we're destined -- or doomed -- to reach the singularity described in Vinge's essay. It may not even be physically possible to achieve the advances necessary to create the singularity effect. To understand why, we need to go back to Moore's Law. In 1965, Gordon E. Moore, a semiconductor engineer, proposed what we now call Moore's Law. He noticed that as time passed, the price of semiconductor components and manufacturing costs fell. Rather than produce integrated circuits with the same amount of power as earlier ones for half the cost, engineers pushed themselves to pack more transistors onto each circuit. The trend became a cycle, which Moore predicted would continue until we hit the physical limits of what we can achieve with integrated circuitry.

Today, we say that the data density of an integrated circuit doubles every 18 months. Manufacturers now build transistors on the nanoscale. Recent microprocessors from Intel and AMD have transistor features that are 45 nanometers wide -- a human hair can have a diameter of up to 180,000 nanometers. Engineers and physicists aren't sure how much longer this can continue. Gordon Moore said in 2005 that we are approaching the fundamental limits of what we can achieve by building smaller transistors.
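To get a feel for the scale comparison in that last paragraph, the arithmetic is simple: divide the hair's diameter by the transistor feature size, using the two figures quoted in the text.

```python
# Scale comparison from the article: 45 nm transistor features versus
# a human hair up to 180,000 nm in diameter.
hair_diameter_nm = 180_000
feature_size_nm = 45
features_across_hair = hair_diameter_nm // feature_size_nm
print(features_across_hair)  # 4000
```

In other words, about 4,000 such features would fit side by side across a single hair.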
Even if we find a way to build transistors on a scale of just a few nanometers, they wouldn't necessarily work. That's because as you approach this tiny scale, you have to take quantum physics into account. It turns out that when you deal with things on a subatomic scale, they behave in ways that seemingly contradict common sense. For example, physicists have shown that electrons can pass through extremely thin material as if it weren't there. They call this phenomenon electron tunneling, or quantum tunneling. The electron doesn't make a physical hole through the material -- it just seemingly approaches from one side and ends up on the other. Since transistors control the flow of electrons like a valve, this becomes a problem.

If we hit this physical limit before we can create machines that think as well as or better than humans, we may never reach the singularity. While there are other avenues we can explore -- such as building chips vertically, using optics and experimenting with nanotechnology -- there's no guarantee we'll be able to keep pace with Moore's Law. That might not stop the singularity from coming, but it could take longer than Vinge predicts.

Another way to prevent the singularity would be to build in safety features before machines become capable of self-awareness. These features might resemble the three laws of robotics proposed by Isaac Asimov. But Vinge counters that argument by pointing out one detail: if the machines are smarter than we are, won't they be able to find ways around those rules? Even Vinge doesn't go so far as to say the singularity is inevitable. There are plenty of other engineers and philosophers who think it's a non-problem. But maybe you should think twice before you mistreat a piece of machinery -- you never know if it'll come after you for revenge down the road.
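The tunneling effect described above can be estimated with a standard textbook approximation: for a rectangular barrier, the transmission probability falls off roughly as exp(-2*kappa*L), where L is the barrier width. The barrier height used here (1 eV above the electron's energy) is an illustrative assumption, not a figure for any real transistor, but the sketch shows why shrinking an insulating layer by a few nanometers makes leakage explode:

```python
import math

# Rough WKB-style estimate of an electron tunneling through a
# rectangular barrier: T ~ exp(-2 * kappa * L). The 1 eV barrier
# height is an illustrative assumption.
M_E = 9.109e-31    # electron mass, kg
HBAR = 1.055e-34   # reduced Planck constant, J*s
EV = 1.602e-19     # joules per electron volt

def tunneling_probability(barrier_width_m: float,
                          barrier_height_ev: float = 1.0) -> float:
    """Approximate transmission probability through the barrier."""
    kappa = math.sqrt(2 * M_E * barrier_height_ev * EV) / HBAR
    return math.exp(-2 * kappa * barrier_width_m)

# A 1 nm barrier leaks many orders of magnitude more than a 3 nm one.
print(tunneling_probability(1e-9))  # on the order of 1e-5
print(tunneling_probability(3e-9))  # on the order of 1e-14
```

The exponential dependence on width is the crux: a valve that is merely "mostly closed" at 3 nanometers becomes a sieve at 1, which is why tunneling sets a floor on how thin a transistor's barriers can get.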
To learn more about our future computer overlords and related topics, take a look at the links on the next page.

Moore's Law isn't a hard-and-fast rule, nor is it governed by nature. It's a self-fulfilling prophecy: semiconductor manufacturers push themselves to keep pace with the law, which in turn makes the law an accurate prediction.

Why are there limits on CPU speed? Are we 10 years away from artificial life?

Chang, Kenneth. "Nanowires May Lead to Superfast Computer Chips." The New York Times. November 9, 2001. (Sept.
Dubash, Manek. "Moore's Law is dead, says Gordon Moore." TechWorld. April 13, 2005. (Sept.
Hanson, Robin. "A Critical Discussion of Vinge's Singularity Concept." (Sept.
Moore, Gordon. "Cramming More Components onto Integrated Circuits." Electronics. April 19, 1965. Vol. 38, No. 8. (Sept.
Vinge, Vernor. "The Coming Technological Singularity: How to Survive in the Post-Human Era." Vision-21 Symposium.
Yudkowsky, Eliezer. "The Low Beyond." 2001. (Sept.