33rd Square Business Tools: IBM Research - All Posts
Showing posts with label IBM Research. Show all posts

Monday, August 18, 2014


 Neuromorphic Computing
Researchers at IBM have created by far the most advanced brain-inspired neuromorphic computer chip to date. Called TrueNorth, the chip consists of 1 million programmable neurons and 256 million programmable synapses across 4096 individual neurosynaptic cores.




Researchers led by IBM's Dharmendra Modha have unveiled their latest neuromorphic chip that is the size of a postage stamp and capable of processing massive amounts of data while handling inputs from many different sources, the company said.

The announcement of the TrueNorth chip and the accompanying article in the journal Science come one month after IBM unveiled a $3 billion investment over the next five years in chip research and development to find a game-changing breakthrough that can help revive its slumping hardware unit.

Unlike most chips, which operate on pre-written paths, IBM's version processes data in real-time and is capable of dealing with ambiguity, the company said. It runs on the energy equivalent of a hearing aid.

Abstract from Science:
Inspired by the brain’s structure, we have developed an efficient, scalable, and flexible non–von Neumann architecture that leverages contemporary silicon technology. To demonstrate, we built a 5.4-billion-transistor chip with 4096 neurosynaptic cores interconnected via an intrachip network that integrates 1 million programmable spiking neurons and 256 million configurable synapses. Chips can be tiled in two dimensions via an interchip communication interface, seamlessly scaling the architecture to a cortexlike sheet of arbitrary size. The architecture is well suited to many applications that use complex neural networks in real time, for example, multiobject detection and classification. With 400-pixel-by-240-pixel video input at 30 frames per second, the chip consumes 63 milliwatts.

TrueNorth is built on Samsung's 28-nanometre process technology. The chip consumes a minuscule 72 milliwatts of energy at maximum load, which equates to around 400 billion synaptic operations per second per watt, or about 176,000 times more efficient than a modern CPU running the same brain-like workload. This is 769 times more efficient than other state-of-the-art neuromorphic approaches.
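As a sanity check on these numbers, a quick back-of-the-envelope calculation (using only the figures quoted in this article, not an official IBM specification) gives the implied throughput and the energy cost of a single synaptic event:

```python
# Back-of-the-envelope check of TrueNorth's cited efficiency figures.
power_w = 0.072                       # 72 mW at maximum load (from the article)
sops_per_watt = 400e9                 # ~400 billion synaptic ops per second per watt

sops_total = power_w * sops_per_watt  # total synaptic operations per second
joules_per_op = 1.0 / sops_per_watt   # energy per synaptic event

print(f"{sops_total:.2e} synaptic ops/s at max load")
print(f"{joules_per_op:.1e} J per synaptic operation")  # a few picojoules
```

A few picojoules per synaptic event is what puts the chip in hearing-aid power territory.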

Indeed, IBM may now be a momentous step closer to building a brain on a chip; TrueNorth's complexity is estimated at around that of the brain of a bee.

"To underscore this divergence between the brain and today’s computers, note that a 'human-scale' simulation with 100 trillion synapses required 96 Blue Gene/Q racks of the Lawrence Livermore National Lab Sequoia supercomputer," writes Modha on the IBM Research site.

The product of nearly a decade of research, mainly funded under the DARPA SyNAPSE program, the chip aims to bridge the divide between existing computers and the brain's high cognitive power and low energy use.


"After years of collaboration with IBM, we are now a step closer to building a computer similar to our brain," said Professor Rajit Manohar at Cornell Tech, where the chip was designed.

Facebook's AI head Yann LeCun has, however, expressed some skepticism over IBM's approach. "My main criticism is that TrueNorth implements networks of integrate-and-fire spiking neurons. This type of neural net has never been shown to yield accuracy anywhere close to state of the art on any task of interest (like, say, recognizing objects from the ImageNet dataset)," he writes on his Google+ account.

According to LeCun, if the brain were the model for TrueNorth, multiplying the number of spiking neurons would not improve performance. "The advantage of spiking neurons is that you don't need multipliers (since the neuron states are binary). But to get good results on a task like ImageNet you need about 8 bit of precision on the neuron states. To get this kind of precision with spiking neurons requires to wait multiple cycles so the spikes 'average out'. This slows down the overall computation."
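LeCun's precision argument can be illustrated with a toy rate-coding sketch (a deliberate simplification for illustration, not a model of TrueNorth's actual coding scheme): if a neuron's state is a binary spike, a graded value can only be recovered by averaging spikes over time, so the best-case resolution is bounded by the number of timesteps, and roughly 256 steps are needed for 8-bit precision.

```python
import random

def rate_encode(value, steps, rng=random.Random(0)):
    """Encode a value in [0, 1] as a binary spike train, then decode it
    back by averaging. Precision grows only with the number of
    timesteps averaged over, which is LeCun's point."""
    spikes = [1 if rng.random() < value else 0 for _ in range(steps)]
    return sum(spikes) / steps

true_value = 0.7
for steps in (16, 256, 4096):
    decoded = rate_encode(true_value, steps)
    print(f"{steps:5d} steps -> decoded {decoded:.4f}, "
          f"best-case resolution {1 / steps:.4f}")
```

At 256 steps the resolution floor is 1/256, i.e. about 8 bits, and every extra bit of precision doubles the number of cycles to wait.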

Terrence J. Sejnowski, director of the Salk Institute’s Computational Neurobiology Laboratory, praises IBM's work. “It will take many generations before it can compete, but when it does, it will be a scalable architecture that can be delivered to cellphones, something that Yann’s G.P.U.s [graphics processing units] will never be able to do.”


TrueNorth is essentially ready for commercial applications. Running parallel with IBM's big data and super-computing solutions, like Watson, the neuromorphic system could really add a new dynamic to computation. 


IBM's brain-inspired architecture consists of a network of neurosynaptic cores. Cores are distributed and operate in parallel. Cores operate without a clock, in an event-driven fashion. Cores integrate memory, computation, and communication. Individual cores can fail and yet, like the brain, the architecture can still function. Cores on the same chip communicate with one another via an on-chip event-driven network. Chips communicate via an inter-chip interface, leading to seamless, cortex-like scalability and enabling the creation of large neuromorphic systems.
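The clockless, event-driven style described above can be sketched in a toy simulation (a loose illustration of the idea only; the neuron model, wiring, and thresholds here are invented and are not IBM's actual TrueNorth core design). Nothing happens on a timer: computation is driven entirely by spike events pulled from a queue.

```python
from collections import deque

class Core:
    """Toy neurosynaptic core: integrate-and-fire neurons driven
    purely by incoming spike events, with no global clock."""
    def __init__(self, n_neurons, threshold=2):
        self.potential = [0] * n_neurons
        self.threshold = threshold
        self.synapses = {}  # neuron -> list of (target_core, target_neuron, weight)

    def connect(self, src, dst_core, dst_neuron, weight=1):
        self.synapses.setdefault(src, []).append((dst_core, dst_neuron, weight))

    def receive(self, neuron, weight, events):
        self.potential[neuron] += weight
        if self.potential[neuron] >= self.threshold:
            self.potential[neuron] = 0            # reset after firing
            for target in self.synapses.get(neuron, []):
                events.append(target)             # emit a spike event

# Two cores wired together; all computation is event-driven.
a, b = Core(2), Core(2)
a.connect(0, b, 1)                     # a's neuron 0 projects to b's neuron 1
events = deque([(a, 0, 1), (a, 0, 1)])  # two input spikes into a's neuron 0
fired = 0
while events:
    core, neuron, weight = events.popleft()
    before = len(events)
    core.receive(neuron, weight, events)
    fired += len(events) - before      # count spikes emitted across cores
```

The second input spike pushes a's neuron 0 over threshold, emitting one event that is routed to core b, mirroring (in miniature) the on-chip and inter-chip event routing the article describes.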

The chip contains one million programmable neurons and could allow a thermometer to scan and smell chemical signals and deliver a diagnosis, or help a search and rescue robot to identify people in need during a disaster, the company said.

Big Blue's long-term goal is to build a neurosynaptic chip system with ten billion neurons and one hundred trillion synapses, all while consuming only one kilowatt of power and occupying less than two liters of volume.

"Over time, our hope is that SyNAPSE will become an integral component of IBM Watson group offerings," writes Modha.

With TrueNorth consuming so much less power than conventional Von Neumann chips, it would make a fantastically efficient processing addition for computer vision systems and sensor input, artificial intelligence and countless other emerging technologies.




SOURCES  IBM Research, New York Times

By 33rd Square

Monday, July 14, 2014

IBM Looks To “Post-Silicon” Era To Continue Moore's Law


 Moore's Law
IBM has announced it is investing $3 billion for R&D in two research programs to push the limits of chip technology and extend Moore’s law. The research programs are aimed at “7 nanometer and beyond” silicon technology and developing alternative technologies for post-silicon-era chips using entirely different approaches.




IBM has announced it is investing $3 billion over the next 5 years in two broad research and early stage development programs to push the limits of chip technology needed to meet the emerging demands of cloud computing and Big Data systems. These investments will push IBM's semiconductor innovations from today’s breakthroughs into the advanced technology leadership required for the future.

The first research program is aimed at so-called “7 nanometer and beyond” silicon technology that will address serious physical challenges that are threatening current semiconductor scaling techniques and will impede the ability to manufacture such chips.

The second is focused on developing alternative technologies for post-silicon era chips using entirely different approaches, which IBM scientists and other experts say are required because of the physical limitations of silicon based semiconductors.

IBM will be investing significantly in emerging areas of research including carbon nanoelectronics, silicon photonics, new memory technologies, and architectures that support quantum and cognitive computing.

"The question is not if we will introduce 7 nanometer technology into manufacturing, but rather how, when, and at what cost?"


These teams will focus on providing orders of magnitude improvement in system level performance and energy efficient computing. In addition, the company will continue to invest in the nanosciences and quantum computing--two areas of fundamental science where IBM has remained a pioneer for over three decades.

IBM Researchers and other semiconductor experts predict that while challenging, semiconductors show promise to scale from today's 22 nanometers down to 14 and then 10 nanometers in the next several years.  However, scaling to 7 nanometers and perhaps below, by the end of the decade will require significant investment and innovation in semiconductor architectures as well as invention of new tools and techniques for manufacturing.
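For a rough sense of what each node transition in that progression buys, assuming an idealized full shrink in which linear dimensions scale by about 0.7x per generation (real nodes deviate from this ideal):

```python
# Illustrative only: an ideal full node shrink scales linear dimensions
# by ~0.7x, roughly doubling the number of transistors per unit area.
nodes = [22, 14, 10, 7]  # nm, the progression named in the article
for prev, nxt in zip(nodes, nodes[1:]):
    linear_scale = nxt / prev
    density_gain = (prev / nxt) ** 2   # transistors per unit area
    print(f"{prev}nm -> {nxt}nm: {linear_scale:.2f}x linear, "
          f"{density_gain:.1f}x density")
```

Each step roughly doubles density in this idealized picture, which is why losing even one generation of scaling is so costly to the economics the article describes.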

"The question is not if we will introduce 7 nanometer technology into manufacturing, but rather how, when, and at what cost?" said John Kelly, senior vice president, IBM Research. "IBM engineers and scientists, along with our partners, are well suited for this challenge and are already working on the materials science and device engineering required to meet the demands of the emerging system requirements for cloud, big data, and cognitive systems. This new investment will ensure that we produce the necessary innovations to meet these challenges."

Silicon transistors, tiny switches that carry information on a chip, have been made smaller year after year, but they are approaching a point of physical limitation. Their increasingly small dimensions, now reaching the nanoscale, will prohibit any gains in performance due to the nature of silicon and the laws of physics. Within a few more generations, classical scaling and shrinkage will no longer yield the sizable benefits of lower power, lower cost and higher speed processors that the industry has become accustomed to.

With virtually all electronic equipment today built on complementary metal–oxide–semiconductor (CMOS) technology, there is an urgent need for new materials and circuit architecture designs compatible with this engineering process as the technology industry nears physical scaling limits of the silicon transistor.

Beyond 7 nanometers, the challenges dramatically increase, requiring a new kind of material to power systems of the future, and new computing platforms to solve problems that are unsolvable or difficult to solve today. Potential alternatives include new materials such as carbon nanotubes, and non-traditional computational approaches such as neuromorphic computing, cognitive computing, machine learning techniques, and the science behind quantum computing.

IBM already holds over 500 patents for technologies that will drive advancements at 7nm and beyond, more than twice as many as its nearest competitor. These continued investments will accelerate invention and its introduction into product development for IBM's highly differentiated computing systems for cloud and big data analytics.

Quantum Computing


The most basic piece of information that a typical computer understands is a bit. Much like a light that can be switched on or off, a bit can have only one of two values: "1" or "0.” A quantum bit, or qubit, can hold a value of 1, 0, or both values at the same time. Described as superposition, this special property of qubits enables quantum computers to weed through millions of solutions all at once, while desktop PCs would have to consider them one at a time.
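A minimal state-vector sketch makes the superposition idea concrete (this is standard textbook quantum mechanics, not IBM-specific code): a qubit is a pair of amplitudes, and the Hadamard gate turns a definite "0" into an equal superposition that measures as 0 or 1 with 50% probability each.

```python
import math

# A single qubit as a 2-component state vector: amplitudes for |0> and |1>.
zero = (1.0, 0.0)  # the definite "0" state

def hadamard(state):
    """Apply the Hadamard gate, which maps |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Born rule: measurement probabilities are squared amplitudes
    (amplitudes are real in this sketch)."""
    a, b = state
    return (a * a, b * b)

plus = hadamard(zero)
p0, p1 = probabilities(plus)
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")
```

With n qubits the state vector has 2^n amplitudes evolving together, which is the source of the "millions of solutions at once" intuition.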

IBM is a world leader in superconducting qubit-based quantum computing science and a pioneer in experimental and theoretical quantum information, a field still in the category of fundamental science, but one that, in the long term, may allow the solution of problems that are today either impossible or impractical to solve using conventional machines. The team recently demonstrated the first experimental realization of parity check with three superconducting qubits, an essential building block for one type of quantum computer.

Neurosynaptic Computing

Bringing together nanoscience, neuroscience, and supercomputing, IBM and university partners have developed an end-to-end ecosystem including a novel non-von Neumann architecture, a new programming language, as well as applications. This novel technology allows for computing systems that emulate the brain's computing efficiency, size and power usage. IBM’s long-term goal is to build a neurosynaptic system with ten billion neurons and a hundred trillion synapses, all while consuming only one kilowatt of power and occupying less than two liters of volume.

Silicon Photonics

IBM has been a pioneer in the area of CMOS integrated silicon photonics for over 12 years, a technology that integrates functions for optical communications on a silicon chip, and the IBM team has recently designed and fabricated the world's first monolithic silicon photonics based transceiver with wavelength division multiplexing.  Such transceivers will use light to transmit data between different components in a computing system at high data rates, low cost, and in an energetically efficient manner.
Silicon nanophotonics takes advantage of pulses of light for communication rather than traditional copper wiring and provides a super highway for large volumes of data to move at rapid speeds between computer chips in servers, large datacenters, and supercomputers, thus alleviating the limitations of congested data traffic and high-cost traditional interconnects.

Businesses are entering a new era of computing that requires systems to process and analyze, in real-time, huge volumes of information known as Big Data. Silicon nanophotonics technology provides answers to Big Data challenges by seamlessly connecting various parts of large systems, whether a few centimeters or a few kilometers apart from each other, and moving terabytes of data via pulses of light through optical fibers.
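The appeal of wavelength division multiplexing is easy to see with some arithmetic (the channel count and per-channel rate below are invented round numbers for illustration, not the specs of IBM's transceiver): each wavelength carries an independent data stream, so aggregate bandwidth per fiber scales with the number of wavelengths.

```python
# Illustrative WDM arithmetic with invented round numbers.
channels = 4             # hypothetical number of wavelengths on one fiber
gbps_per_channel = 25    # hypothetical per-wavelength data rate, Gb/s

aggregate_gbps = channels * gbps_per_channel
print(f"{channels} wavelengths x {gbps_per_channel} Gb/s = "
      f"{aggregate_gbps} Gb/s on a single fiber")
```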
III-V Technologies

IBM researchers have demonstrated the world’s highest transconductance on a self-aligned III-V channel metal-oxide semiconductor (MOS) field-effect transistor (FET) device structure that is compatible with CMOS scaling. These materials and structural innovations are expected to pave the path for technology scaling at 7nm and beyond. With more than an order of magnitude higher electron mobility than silicon, integrating III-V materials into CMOS enables higher performance at lower power density, allowing for an extension of power/performance scaling to meet the demands of cloud computing and big data systems.

Carbon Nanotubes

IBM Researchers are working in the area of carbon nanotube (CNT) electronics and exploring whether CNTs can replace silicon beyond the 7 nm node.  As part of its activities for developing carbon nanotube based CMOS VLSI circuits, IBM recently demonstrated -- for the first time in the world -- 2-way CMOS NAND gates using 50 nm gate length carbon nanotube transistors.

IBM also has demonstrated the capability for purifying carbon nanotubes to 99.99 percent, the highest (verified) purities demonstrated to date, and transistors at 10 nm channel length that show no degradation due to scaling--this is unmatched by any other material system to date.

Carbon nanotubes are single atomic sheets of carbon rolled up into a tube. The carbon nanotubes form the core of a transistor device that will work in a fashion similar to the current silicon transistor, but will be better performing. They could be used to replace the transistors in chips that power data-crunching servers, high performing computers and ultra fast smart phones.

Carbon nanotube transistors can operate as excellent switches at molecular dimensions of less than ten nanometers, about 10,000 times thinner than a strand of human hair and less than half the size of the leading silicon technology. Comprehensive modeling of the electronic circuits suggests that about a five- to ten-fold improvement in performance compared to silicon circuits is possible.

Graphene

Graphene is pure carbon in the form of a one atomic layer thick sheet.  It is an excellent conductor of heat and electricity, and it is also remarkably strong and flexible.  Electrons can move in graphene about ten times faster than in commonly used semiconductor materials such as silicon and silicon germanium. Its characteristics offer the possibility to build faster switching transistors than are possible with conventional semiconductors, particularly for applications in the handheld wireless communications business where it will be a more efficient switch than those currently used.
In 2013, IBM demonstrated the world's first graphene-based integrated circuit receiver front end for wireless communications. The circuit consisted of a 2-stage amplifier and a down converter operating at 4.3 GHz.

Next Generation Low Power Transistors

In addition to new materials like CNTs, new architectures and innovative device concepts are required to boost future system performance. Power dissipation is a fundamental challenge for nanoelectronic circuits.

A potential alternative to today’s power-hungry silicon field-effect transistors are so-called steep-slope devices. They could operate at much lower voltage and thus dissipate significantly less power. IBM scientists are researching tunnel field-effect transistors (TFETs), a special type of transistor in which the quantum-mechanical effect of band-to-band tunneling is used to drive the current flow through the transistor. TFETs could achieve a 100-fold power reduction over CMOS transistors, so integrating TFETs with CMOS technology could improve low-power integrated circuits.
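The "steep slope" refers to subthreshold swing: how many millivolts of gate voltage are needed to change the current by a factor of ten. The sketch below computes the room-temperature limit for a conventional MOSFET, a standard device-physics figure rather than an IBM result; TFETs can beat it because band-to-band tunneling does not rely on thermionic injection of carriers over a barrier.

```python
import math

# Subthreshold-swing limit for a conventional MOSFET: (kT/q) * ln(10),
# i.e. ~60 mV of gate voltage per decade of current at room temperature.
k = 1.380649e-23      # Boltzmann constant, J/K
q = 1.602176634e-19   # elementary charge, C
T = 300.0             # room temperature, K

ss_limit_mv = (k * T / q) * math.log(10) * 1000  # mV per decade
print(f"MOSFET subthreshold-swing limit at 300 K: {ss_limit_mv:.1f} mV/decade")
```

Because this limit is thermodynamic, no amount of conventional transistor engineering gets below it at a given temperature, which is exactly why steep-slope devices like TFETs are interesting for voltage scaling.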

Recently, IBM developed a novel method to integrate III-V nanowires and heterostructures directly on standard silicon substrates and built the first-ever InAs/Si tunnel diodes and TFETs, using InAs as the source and Si as the channel with a wrap-around gate, as steep-slope devices for low-power applications.

"In the next ten years computing hardware systems will be fundamentally different as our scientists and engineers push the limits of semiconductor innovations to explore the post-silicon future," said Tom Rosamilia, senior vice president, IBM Systems and Technology Group. "IBM Research and Development teams are creating breakthrough innovations that will fuel the next era of computing systems."



SOURCE  IBM

By 33rd Square

Wednesday, October 16, 2013


 Artificial Intelligence
According to IBM's Michael Barborak, it’s not human versus machine that will represent how artificial intelligence evolves, but human plus machine taking on challenges together and achieving more than either could do on its own.




When IBM’s Watson defeated two grand champions on the TV quiz show Jeopardy!, the world’s smartest computer was matched up against two really smart humans. The quiz-show win captured people’s attention, but these days, as IBM develops new practical uses for Watson, it’s becoming clear that these technologies will be used primarily to augment human intelligence, not compete with people or replace us.

According to Michael Barborak, Manager of Unstructured Language Engineering at IBM Research, it’s not human versus machine, but human plus machine taking on challenges together and achieving more than either could do on its own.

Barborak manages the Natural Language Engineering and Frameworks department of the Watson project. The group is tasked with improving the DeepQA system through principled and thoughtful software engineering. Its projects include a Matching Framework to represent alignment between texts, a Term Matching Framework to support ecosystems of term matchers, and the TeachWatson service, which supports synchronous and asynchronous opportunities for improving Watson through human interaction.

Nowhere is this powerful new one-two punch clearer than in the world of medicine and healthcare. Cognitive machines have the potential to help physicians diagnose diseases and assess the best treatments for individual patients. But, to make the most of this opportunity, machines will have to be designed and trained to interact with doctors in ways that are most natural to them.

Michael Barborak works with WatsonPaths

Already, IBM is co-developing with physicians at Cleveland Clinic a project aimed at helping medical students learn to think like experienced physicians. Called WatsonPaths, the project is the most advanced example to date of a computer and humans thinking together.

Barborak joined IBM Research from a digital marketing agency a few months before the Jeopardy! contest was aired. He was hired to help develop real-world applications for Watson. Now, he is the Watson team’s manager of natural language engineering.
As Barborak explains:

"It was clear when I joined that the Watson technology would have to be adapted to be useful in healthcare, banking, retailing, education and other spheres of business and life. Watson was designed to form precise answers to precise questions on Jeopardy!, but that’s not the way the world works. To be useful in real life, the system must be able to understand complex, real-world scenarios so it can help people deal with them. So we had to train Watson to use its question and answer capabilities like a pick to chip away at a complex scenario and break it down into comprehensible pieces. The system had to be able to discover salient facts, form hypotheses, test them, and arrive at conclusions. So we developed a technology we call the Watson inference chaining system, or WICS, to achieve this.

"At Cleveland Clinic, we found a perfect match for our inference-chaining technology, which helped us evolve it into the application IBM calls WatsonPaths. The Clinic’s Lerner College of Medicine uses problem-based-learning methods to teach students how to think like doctors. Using medical scenarios, they walk step by step through the process a physician goes through to evaluate a patient’s condition and determine the best treatment."

WatsonPaths
Image Source: IBM Research
Working with WatsonPaths, students will be able to review the evidence the computer offers and the inferences it draws. As they work with the system, the path to the best conclusion will become more pronounced graphically. The students will train the system and, at the same time, the machine will help train the students. The goal is to one day incorporate these kinds of capabilities into future Watson commercial offerings.

"When I look ahead into the era of cognitive computing, I see a revolution unfolding before my eyes. For the first time, computers will adapt to the way we want to do things, rather than vice versa. That will be a remarkable change," says Barborak.


SOURCE  A Smarter Planet

By 33rd Square

Monday, June 17, 2013


 
Artificial Intelligence
According to IBM, cognitive computing systems like Watson learn and interact naturally with people to extend what either man or machine could do on their own.


Cognitive computing systems are systems that learn and interact naturally with people to extend what either man or machine could do on their own, according to IBM.

Cognitive computing systems like Watson, help human experts make better decisions by penetrating the complexity of Big Data.

Big Data is increasing in volume, speed and uncertainty. It comes in unstructured forms such as video, image and text. To deal with this onslaught of information, new types of computing systems are needed to understand, process and make sense of it all.

The first cognitive computer was Watson, which debuted in a televised Jeopardy! challenge where it bested the show’s two greatest champions. The challenge for Watson was to answer questions posed in every nuance of natural language, such as puns, synonyms and homonyms, slang, and jargon.



Watson was not connected to the Internet for the match. It only knew what it had amassed through years of persistent interaction and learning from a large set of unstructured knowledge. Using machine learning, statistical analysis and natural language processing to find and understand the clues in the questions, Watson then compared possible answers by ranking its confidence in their accuracy, and responded, all in about three seconds.
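The generate-score-rank pattern described above can be sketched as a toy (everything here, including the candidates, evidence scorers, and weights, is invented for illustration; this is not the DeepQA implementation): candidate answers are scored by independent evidence sources, the scores are combined into a confidence, and the system answers only if its confidence clears a threshold.

```python
def rank_candidates(candidates, scorers, weights, threshold=0.5):
    """Score each candidate answer with several evidence scorers,
    combine into a weighted confidence, and answer only when the
    top confidence clears the threshold."""
    ranked = []
    for cand in candidates:
        scores = [scorer(cand) for scorer in scorers]
        confidence = sum(w * s for w, s in zip(weights, scores)) / sum(weights)
        ranked.append((confidence, cand))
    ranked.sort(reverse=True)
    best_conf, best = ranked[0]
    return (best, best_conf) if best_conf >= threshold else (None, best_conf)

# Hypothetical evidence scorers returning values in [0, 1].
scorers = [
    lambda c: 1.0 if c == "Toronto" else 0.2,  # e.g. passage-match evidence
    lambda c: 0.9 if c == "Chicago" else 0.4,  # e.g. answer-type evidence
]
answer, conf = rank_candidates(["Toronto", "Chicago"], scorers, [2.0, 1.0])
print(answer, round(conf, 2))
```

The threshold is what lets such a system decline to buzz in when its confidence is low, a behavior Watson exhibited on Jeopardy!.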

Now, newer generations of Watson are currently being trained in oncology diagnosis for healthcare professionals, and in customer service as a support representative.
Cognitive computers are not programmed to perform a function or set of tasks; rather, they use artificial intelligence (AI) and machine learning algorithms to sense, predict and, in some ways, think. This allows these systems to comprehend and draw insight from Big Data. In order to handle this type of processing, cognitive computers require new hardware innovations in which data processing is distributed throughout the system and memory and processing are more tightly integrated.

Cognitive computing isn't about the computer becoming the primary expert as much as assisting the human experts. By having deep domain expertise in fields such as healthcare, banking and commerce, and using data visualization techniques, cognitive computing helps humans to solve complex problems and make sense of big data. Cognitive computing systems get smarter the more they are used.

Dr. John E. Kelly III is senior vice president and director of IBM Research. In this position he directs the worldwide operations of IBM Research, with approximately 3,000 scientists and technical employees at 12 laboratories in 10 countries around the world, and helps guide IBM's overall technical strategy.

Dr. Kelly's top priorities as head of IBM Research are to stimulate innovation in key areas of information technology, and quickly bring those innovations into the marketplace to sustain and grow IBM's existing business; to create the new businesses of IBM's future, and to apply these innovations to help IBM clients succeed.

IBM Research breakthroughs have helped to create and shape the world's computing industry, while more recent milestones, including the Deep Blue computing system, the breaking of the petaflop barrier, and the introduction of Watson, the deep question-answering natural-language computer system, are blazing the computing trails of the future.

In the video below, Computer History Museum CEO John Hollar moderated a fascinating conversation with Kelly on topics ranging from his background and the path that led him to IBM, the history of research there, IBM's Watson and cognitive computing, to the newest lab in Nairobi, Kenya. Africa, IBM says, is destined to become an important growth market for the company. "Africa is a complex place," Dr. Kelly said. "But we feel it is on the cusp, at an inflection point. It's going to take off."



SOURCE  The Computer History Museum

By 33rd Square