33rd Square Business Tools: IBM
Showing posts with label IBM.

Tuesday, June 6, 2017

Moore's Law Has Another Life with Development of 5 Nanometer Chip


Moore's Law

IBM and Samsung have developed a first-of-a-kind process to build silicon nanosheet transistors that will enable 5 nanometer chips. The resulting increase in performance will help accelerate artificial intelligence, the Internet of Things (IoT) and other data-intensive applications delivered in the cloud. The power savings alone might mean that the batteries in smartphones and other mobile products could last two to three times longer than today’s devices, before needing to be charged.


"The economic value that Moore’s Law generates is unquestionable. That’s where innovations such as this one come into play, to extend scaling not by traditional ways but coming up with innovative structures."
IBM and Samsung have announced the development of an industry-first process to build silicon nanosheet transistors that will enable 5 nanometer (nm) chips.

The breakthrough means that silicon technology has yet again extended the potential of Moore's Law.

Less than two years after developing a 7nm test node chip with 20 billion transistors, the researchers involved have paved the way for 30 billion switches on a fingernail-sized chip.

The resulting increase in performance will help accelerate artificial intelligence, the Internet of Things (IoT), and other data-intensive applications delivered in the cloud. The power savings could also mean that the batteries in smartphones and other mobile products could last two to three times longer than today’s devices, before needing to be charged.

“The economic value that Moore’s Law generates is unquestionable. That’s where innovations such as this one come into play, to extend scaling not by traditional ways but coming up with innovative structures,” says Mukesh Khare, vice president of semiconductor research for IBM Research.

Scientists working as part of the IBM-led Research Alliance at the SUNY Polytechnic Institute Colleges of Nanoscale Science and Engineering’s NanoTech Complex in Albany, NY achieved the breakthrough by using stacks of silicon nanosheets as the device structure of the transistor, instead of the standard FinFET architecture, which is the blueprint for the semiconductor industry up through 7nm node technology.

Moore's Law extended again
IBM scientists at the SUNY Polytechnic Institute Colleges of Nanoscale Science and Engineering’s NanoTech Complex in Albany, NY prepare test wafers with 5nm silicon nanosheet transistors, loaded into front opening unified pods, or FOUPs, to test the process of building 5nm transistors using silicon nanosheets. Image Source - Connie Zhou / IBM

The silicon nanosheet transistor demonstration, detailed in the Research Alliance paper "Stacked Nanosheet Gate-All-Around Transistor to Enable Scaling Beyond FinFET" and presented at the Symposia on VLSI Technology and Circuits, proves that 5nm chips are possible, more powerful, and not too far off in the future.

5 Nanometer Chip
Pictured: a scan of IBM Research Alliance’s 5nm transistor, built using an industry-first process to stack silicon nanosheets as the device structure – achieving a scale of 30 billion switches on a fingernail-sized chip that will deliver significant power and performance enhancements over today’s state-of-the-art 10nm chips. Image Source - IBM

Gary Patton, CTO and Head of Worldwide R&D at GLOBALFOUNDRIES, stated: “As we make progress toward commercializing 7nm in 2018 at our Fab 8 manufacturing facility, we are actively pursuing next-generation technologies at 5nm and beyond to maintain technology leadership and enable our customers to produce a smaller, faster, and more cost efficient generation of semiconductors.”

IBM Research has explored nanosheet semiconductor technology for more than 10 years. This work is the first in the industry to demonstrate the feasibility to design and fabricate stacked nanosheet devices with electrical properties better than FinFET architecture.

The scientists used the same Extreme Ultraviolet (EUV) lithography approach used to produce the 7nm test node and its 20 billion transistors to the nanosheet in the new transistor architecture. Using EUV lithography, the width of the nanosheets could be adjusted continuously, all within a single manufacturing process or chip design.

This adjustability allowed for the fine-tuning of performance and power for specific circuits – something not possible with today’s FinFET transistor architecture production.

Dr. Bahgat Sammakia, Interim President of SUNY Polytechnic Institute, said, “We believe that enabling the first 5nm transistor is a significant milestone for the entire semiconductor industry as we continue to push beyond the limitations of our current capabilities.”

Full implementation of this technology will still require 10 to 15 years of further development according to some reports.

The details of the process will be presented at the 2017 Symposia on VLSI Technology and Circuits conference in Kyoto, Japan.




SOURCE  IBM


By 33rd Square





Monday, March 6, 2017

IBM Promises to Unleash Quantum Computing Before 2020


Quantum Computing

Quantum computing promises to be the next major technology with the potential to drive a new era of innovation across industries. Following on the success of competitors like D-Wave, IBM has announced that it too will begin commercializing quantum based computer systems.



Functional quantum computers are an entirely new paradigm of computing, and so far have seen only limited implementation in the marketplace. IBM Q hopes to change that. The company will design a new class of systems built around quantum processors and aims to begin selling them commercially in the near term.

"Our goal is to provide businesses and organizations with access to a new realm of computational power, before unachievable, to solve real-world and societal problems."
IBM intends to build its IBM Q systems with around 50 qubits in the next few years to demonstrate capabilities beyond today’s classical systems, and will collaborate with key industry partners to develop applications that exploit the quantum speedup of the systems. Potential applications include medicine and materials discovery, supply chain and logistics, financial services, artificial intelligence and cloud security.

Due to the exponential power of quantum computers, it is postulated that a universal quantum system with just 50 qubits may be able to perform certain complex calculations at a rate that today’s top multi-petaflop supercomputers cannot match.
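To make that scale concrete, here is a rough back-of-the-envelope sketch (my own arithmetic, not IBM's) of why simulating a 50-qubit machine strains classical hardware: a full classical simulation must track 2^n complex amplitudes, and the memory required grows exponentially.

```python
# Rough estimate of the classical memory needed to hold a full n-qubit state vector.
# Assumes each amplitude is a complex number stored in 16 bytes (two 64-bit floats).

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n} qubits -> {2**n:,} amplitudes, ~{gib:,.0f} GiB")

# 30 qubits fit in a laptop (~16 GiB); 50 qubits need roughly 16 million GiB
# (about 16 pebibytes), which is why 50 qubits is often quoted as the point
# where brute-force classical simulation becomes impractical.
```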

IBM has demonstrated prototype systems that use quantum effects in small-scale demonstrations. They have taken advantage of effects like superposition, where a quantum bit can exist in two states at the same time, and used this behaviour to allow the systems to work in far more complex ways than the 1s and 0s used in today's computers.

IBM Promises to Unleash Quantum Computing Before 2020

The company has also released a new application programming interface (API) for the IBM Quantum Experience that enables developers and programmers to build interfaces between its existing five qubit, IBM Cloud-based quantum computer and classical computers.
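As a sense of what "building interfaces between quantum and classical computers" looks like in practice, here is a minimal sketch of a hybrid workflow. It uses the later open-source Qiskit SDK rather than the original 2017 Quantum Experience REST API, so the exact calls are illustrative rather than the API described in the announcement.

```python
# A classical Python program builds a tiny quantum circuit and inspects the result.
# Sketch only: uses Qiskit, not the original IBM Quantum Experience REST API.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)      # put qubit 0 into an equal superposition
qc.cx(0, 1)  # entangle qubit 1 with qubit 0, forming a Bell state

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # expect roughly {'00': 0.5, '11': 0.5}
```

On the real service, the same circuit would be submitted to the cloud-hosted five-qubit device instead of being evaluated locally.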

One of the first and most promising applications for the new quantum computer system will be in the simulation of chemistry. Even for simple molecules like caffeine, the number of quantum states in the molecule can be astoundingly large – so large that all the conventional computing memory and processing power scientists could ever build could not handle the problem.

"The universal quantum computers IBM has worked toward in more than 35 years of research are the most powerful and general class of quantum computers," writes Dario Gil, Vice President, Science and Solutions, IBM Research. "Their mojo comes from the quantum bit, or qubit, which leverages quantum effects that are not visible in our daily lives, to explore an exponentially more-powerful computational space, to help solve certain problems that we could not otherwise solve."

"Our goal is to provide businesses and organizations with access to a new realm of computational power, before unachievable, to solve real-world and societal problems," Gil writes on the company's blog.

IBM has been collaborating and engaging with developers, programmers and university partners for the development and evolution of IBM’s quantum computing systems. Since its launch less than a year ago, about 40,000 users have run over 275,000 experiments on the IBM Quantum Experience. So far, 15 third-party research papers based on experiments run on the Quantum Experience have been posted to arXiv, with five published in leading journals.

IBM has worked with academic institutions, such as MIT, the Institute for Quantum Computing at the University of Waterloo, and École polytechnique fédérale de Lausanne (EPFL), to leverage the IBM Quantum Experience as an educational tool for students.

Quantum computers are projected to have the capability to tackle problems that are currently seen as too complex and exponential in nature for classical computers. The promise of these systems is that they will deliver solutions to important problems where patterns cannot be seen and where the number of possibilities that must be explored to get to the answer is too enormous ever to be processed by classical computers.




SOURCE  IBM


By 33rd Square



Sunday, November 20, 2016

IBM and NVIDIA Team Up To Deliver AI Hardware


Artificial Intelligence

IBM and NVIDIA have announced collaboration on a new deep learning tool called PowerAI, which is optimized for the latest hardware technologies to help train computers to think and learn in more human-like ways at a faster pace.


IBM and Nvidia have released a new jointly created set of deep learning software tools called PowerAI. These tools are meant to make machine learning projects faster, tighten up neural net performance, and, in part, provide a way to extend Watson's capabilities as our understanding of AI advances.

"PowerAI hardware and software can build learned models from images, speech, or other media in less time than prior generations of hardware and software."
PowerAI consists of a set of binary distributions of several custom-tuned neural network frameworks and associated custom NVIDIA GPU deep learning (GPUDL) libraries suited to machine-learning tasks. As a software suite, PowerAI is designed to run on a single one of IBM’s highest-end Power servers, the Power S822LC for High Performance Computing (HPC), but also to scale from one server up to supercomputing clusters.

"PowerAI democratizes deep learning and other advanced analytic technologies by giving enterprise data scientists and research scientists alike an easy to deploy platform to rapidly advance their journey on AI,” said Ken King, General Manager, OpenPOWER. “Coupled with our high performance computing servers built for AI, IBM provides what we believe is the best platform for enterprises building AI-based software, whether it’s chatbots for customer engagement, or real-time analysis of social media data.”

The first set of hardware is the Power8 server with the Nvidia Tesla GPUs, said Sumit Gupta, IBM’s vice president of high-performance computing and analytics.

PowerAI Servers


This technology can be used for a broad set of purposes. For example, new driver-assist technologies rely on machine and deep learning to recognize objects in a rapidly changing environment, and personal digital assistant technology is learning to categorize information contained in emails and text messages based on context. In the enterprise, machine and deep learning applications can be used to identify high-value sales opportunities, provide assistance in call centers, detect instances of intrusion or fraud, and suggest solutions to technical or business problems.

The PowerAI Deep Learning Frameworks were created to give developers and data scientists a platform on which to develop new machine learning-based applications and to analyze data with immediate productivity, ease of use, and high performance.

The new hardware is the fastest deep-learning system available, according to the companies. The Power8 CPUs and Tesla P100 GPUs are among the fastest chips available, and both are linked via the NVLink interconnect, which outperforms PCI-Express 3.0. Nvidia’s GPUs power many deep-learning systems in companies like Google, Facebook, and Baidu.

Initial client uses for the new IBM Power S822LC for HPC servers include:


  • Human Brain Project – In support of the Human Brain Project, a research project funded by the European Commission to advance understanding of the human brain, IBM and NVIDIA deployed a pilot system at the Juelich Supercomputing Centre as part of the Pre-Commercial Procurement process. Called JURON, the new supercomputer leverages Power S822LC for HPC systems.
  • Cloud provider Nimbix – HPC cloud platform provider Nimbix expanded its cloud supercomputing offerings this month, putting IBM Power S822LC for HPC systems with PowerAI in the hands of developers and data scientists to achieve enhanced performance.
  • City of Yachay, Ecuador – Ecuador’s “City of Knowledge,” Yachay, is a planned city designed to push the nation’s economy away from commodities and towards knowledge-based innovation. Last week the city announced it is using a cluster of Power S822LC servers to build the country’s first supercomputer for the purpose of creating new forms of energy, predicting climates, and pioneering food genomics.
  • SC3 Electronics – A leading cloud supercomputing center in Turkey, SC3 Electronics announced last month at the OpenPOWER Summit Europe that it is creating the largest HPC cluster in the Middle East and North Africa region based on Power S822LC for HPC servers.

"The co-designed PowerAI hardware and software can build learned models from images, speech, or other media in less time than prior generations of hardware and software," states Hillery Hunter, Director of Systems Acceleration and Memory and Memory Strategist IBM Research. "Deep learning training time is a key metric for developer productivity in this domain. It enables innovation at a faster pace, as developers can invent and try out many new models, parameter settings, and data sets."


SOURCE  Inside Big Data


By 33rd Square



Tuesday, July 5, 2016

IBM's Arvind Krishna Explores The Future of Artificial Intelligence


Artificial Intelligence

"The Future of AI: Emerging Topics and Societal Benefit." conference was recently held at Stanford University and brought together visionaries in the field of artificial intelligence from academia, government and industry. IBM's Arvind Krishna gave the keynote address.


Arvind Krishna, Senior Vice President and Director, IBM Research, was the keynote speaker at the recent Stanford University event, "The Future of AI: Emerging Topics and Societal Benefit." The conference brought together visionaries in the field of artificial intelligence from academia, government and industry. See the keynote below.

The participants discussed some of the major ways AI could benefit society in the coming years.

"AI can be bigger than the steam engine if we harness it correctly and do the right work."
Krishna helps guide IBM’s overall technical strategy in core and emerging technologies, including cognitive computing, quantum computing, cloud platform services, data-driven solutions and blockchain.

"AI can be bigger than the steam engine if we harness it correctly and do the right work," says Krishna.

We need another revolution

In his talk, Krishna points to the fact that artificial intelligence is on the cusp of having a major impact on the study and treatment of aging. "Aging is the place where computational neuroscience has the ability to offer tremendous benefits," states Krishna. By analyzing speech patterns, Krishna's colleagues have been able to use AI tools to predict the onset of dementia, Parkinson's and Alzheimer's disease.  While he admits that this work is a long way from a cure, it does provide the opportunity for better care and quality of life for patients.

Krishna oversees an organization of approximately 3,000 scientists and technologists in 12 labs across six continents. Previously, Krishna was general manager of IBM Systems and Technology Group’s development and manufacturing organization, responsible for the advanced engineering and development of a full technology portfolio, ranging from advanced semiconductor materials to leading-edge microprocessors, servers and storage systems. Krishna has an undergraduate degree from the Indian Institute of Technology, Kanpur, and a Ph.D. from the University of Illinois at Urbana-Champaign. He is the recipient of a distinguished alumni award from the University of Illinois, is the co-author of 15 patents, has been the editor of IEEE and ACM journals, and has published extensively in technical conferences and journals.




SOURCE  Stanford University


By 33rd Square


Monday, April 4, 2016

IBM's Dharmendra Modha Discusses the Company's Roadmap for Neuromorphic Computing


Neuromorphic Computing

IBM's Dharmendra Modha recently discussed the progress his team is making with the TrueNorth neuromorphic architecture and how the project may proceed over the coming years, and he had a very ambitious prediction to make.


IBM's Dharmendra Modha spoke recently at the 2016 Neuro-Inspired Computational Elements Workshop at UC Berkeley, talking about the progress his team is making with the TrueNorth neuromorphic architecture and how the project may proceed over the coming years. (The full lecture is embedded below.)

Modha is IBM's Chief Scientist for Brain-inspired Computing. He is a cognitive computing pioneer who envisioned and now leads a highly successful effort to develop brain-inspired computers. The project has received roughly $58 million in research funding from DARPA (under the SyNAPSE program), the US Department of Defense, and the US Department of Energy. The resulting architecture, technology, and ecosystem break with the prevailing von Neumann architecture (circa 1946) and constitute a foundation for energy-efficient, scalable neuromorphic systems.

According to Modha, IBM has developed an end-to-end technology and ecosystem to create and program energy-efficient, brain-inspired machines that mimic the brain’s abilities for perception, action, and cognition.

"Not only do we have chip," says Modha, "but we have an end-to-end programming paradigm from user interface deep learning down to a programming language for firmware, as well as the whole user flow and debugging tools."

"On top of that, we have created various boards and we are making plans by December of next year [2017], to create a hundred and twenty-eight chip system," Modha continues.

Roadmap for Neuromorphic Computing

"Before 2020 ends is that we will be able to produce a brain-in-a-box which was the original vision of the SyNAPSE project."

IBM's research team projects that TrueNorth’s modular, scalable architecture and ultra-low power consumption provide a unique opportunity to create brain-inspired information technology systems with 100 trillion synapses, which is comparable to “human-scale.”

"We conceive that such systems would constitute 96 racks, each rack with 4,096 TrueNorth processors and consuming merely 4kW. The key is to leverage TrueNorth’s seamless tiling for low-power local-connectivity along with recent advances in interconnect technology for long-range connectivity," they wrote in a recent paper.

Modha concludes his talk by stating, "The ultimate vision, which I believe will be possible before 2020 ends, is that we will be able to produce a brain-in-a-box, which was the original vision of the SyNAPSE project: ten billion neurons in two liters, one kilowatt. This is no longer science fiction, it is happening."

Neuromorphic computing ecosystems like those based on TrueNorth could impact areas including machine learning, neural network research, computer vision and voice recognition, neuroscience, robotics, computer architecture, circuit design, simulation methodology, programming languages, visualization, usability, design, supercomputing and more.




SOURCE  UC Berkeley


By 33rd Square


Wednesday, March 30, 2016

 IBM's Brain-like Computing Chip to Work on National Security Issues


Neuromorphic Computing

A new chip-architecture breakthrough accelerates the path to exascale computing and helps computers tackle complex cognitive tasks such as pattern recognition and sensory processing. IBM's TrueNorth neuromorphic chip is about to be put through its paces at a leading national laboratory.



Lawrence Livermore National Laboratory (LLNL) has announced it will receive a first-of-a-kind brain-inspired neuromorphic computing platform for deep learning developed by IBM Research. Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a hearing aid battery – a mere 2.5 watts of power.

The brain-like, neural network design of the IBM Neuromorphic System is able to infer complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips.

TrueNorth Neuromorphic Computer System

The neuromorphic system will be used to explore new computing capabilities important to the National Nuclear Security Administration’s (NNSA) missions in cybersecurity, stewardship of the nation’s nuclear weapons stockpile and nonproliferation. NNSA’s Advanced Simulation and Computing (ASC) program will evaluate machine-learning applications, deep-learning algorithms and architectures and conduct general computing feasibility studies. ASC is a cornerstone of NNSA’s Stockpile Stewardship Program to ensure the safety, security and reliability of the nation’s nuclear deterrent without underground testing.

“Neuromorphic computing opens very exciting new possibilities and is consistent with what we see as the future of the high performance computing and simulation at the heart of our national security missions,” said Jim Brase, LLNL deputy associate director for Data Science. “The potential capabilities neuromorphic computing represents and the machine intelligence that these will enable will change how we do science.”

"The potential capabilities neuromorphic computing represents and the machine intelligence that these will enable will change how we do science."
The technology represents a fundamental departure from the computer design that has been prevalent for the past 70 years, and could be a powerful complement in the development of next-generation supercomputers able to perform at exascale speeds, roughly 50 times faster than today’s most advanced petaflop (quadrillion floating point operations per second) systems. Like the human brain, neurosynaptic systems require significantly less electrical power and volume.

“The low power consumption of these brain-inspired processors reflects industry’s desire and a creative approach to reducing power consumption in all components for future systems as we set our sights on exascale computing,” said Michel McCoy, LLNL program director for Weapon Simulation and Computing.

A single TrueNorth processor consists of 5.4 billion transistors wired together to create an array of 1 million digital neurons that communicate with one another via 256 million electrical synapses. It consumes 70 milliwatts of power running in real time and delivers 46 giga synaptic operations per second – orders of magnitude lower energy than a conventional computer running inference on the same neural network. TrueNorth was originally developed under the auspices of the Defense Advanced Research Projects Agency’s (DARPA) Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program, in collaboration with Cornell University.
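Dividing those two quoted figures gives a rough sense of the energy budget per operation (a simple estimate from the numbers above, not a figure published by IBM):

```python
# Energy per synaptic operation implied by the quoted power and throughput.
power_watts = 0.070        # 70 milliwatts
ops_per_second = 46e9      # 46 giga synaptic operations per second

joules_per_op = power_watts / ops_per_second
print(f"~{joules_per_op * 1e12:.2f} pJ per synaptic operation")  # ~1.52 pJ
```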

According to the contract, LLNL will receive a 16-chip TrueNorth system representing a total of 16 million neurons and 4 billion synapses. LLNL also will receive an end-to-end ecosystem to create and program energy-efficient machines that mimic the brain’s abilities for perception, action and cognition. The ecosystem consists of a simulator; a programming language; an integrated programming environment; a library of algorithms as well as applications; firmware; tools for composing neural networks for deep learning; a teaching curriculum; and cloud enablement. 

Lawrence Livermore computer scientists will also collaborate with IBM Research, partners across the Department of Energy complex and universities to expand the frontiers of neurosynaptic architecture, system design, algorithms and software ecosystem.



SOURCE  Lawrence Livermore National Laboratory


By 33rd Square


Saturday, December 12, 2015

Meet The Next Generation of Watson, CELIA


Artificial Intelligence

IBM is building on Watson's capabilities and 'humanness.' Researchers at the company have built a machine-learning computer that could one day form the foundation of a fully-fledged general artificial intelligence system, and it is called CELIA.


Years after IBM's Watson beat human champions on the quiz show Jeopardy!, company researchers are still developing artificial intelligence, or, as they refer to them, cognitive computing systems. Watson is becoming yesterday's news; the new name the researchers are touting is CELIA.

CELIA stands for Cognitive Environments Laboratory Intelligent Assistant. IBM's Watson spin-off is already being adopted into multiple industries, and now research teams are working to integrate CELIA into the workplace itself.

This new digital employee envisioned by IBM is intended to help businesses with their corporate strategy. The system builds on the Watson foundation and adds more human elements into the system. CELIA is intended to mimic how we would interact with each other.

artificial intelligence CELIA

"The future of expertise will be defined by a partnership between our capabilities and the capabilities of the machines."
"The future of expertise will be defined by a partnership between our capabilities and the capabilities of the machines," says a researcher. "What we bring to the table is the problems, the values and our common sense." Essentially CELIA represents a digitally augmenting human intelligence.

For instance, a user could ask CELIA a question like, “What small-market-cap companies in the semiconductor industry would be a good fit for us to buy?” and CELIA would return a list of companies that make sense, as well as explain her reasoning for choosing those companies.

“We are inspired by how humans can reason through a problem—minus the emotional bias,” Rob High, the Watson Group director, told Quartz.

In the future, doctors could talk through diagnoses with CELIA, receiving answers instantly synthesized from thousands of pieces of medical data. Researchers could work through ideas for new drugs, or how to build a high-rise tower, or even plan an entirely new city, and CELIA would instantly give them suggestions in plain English that would otherwise have taken them days to work out.



SOURCE  Quartz


By 33rd Square


Tuesday, September 29, 2015

How Professionals Are Making the Most of Big Data


Big Data


Big data allows companies to cut costs, create new products and services and make faster, more informed decisions. Here are some of the sectors where Big Data is already having a big impact.
 


While nobody can specify the exact volume of available big data, up to three billion gigabytes of data are generated every day. Big data offers endless potential for businesses and organizations. Big data allows companies to cut costs, create new products and services and make faster, more informed decisions. Continue reading to learn how four professional fields are using big data to their advantage.

Law Practices

Many law firms still manually analyze archived documentation to search for patterns and create reports.

However, big data analytics can help law firms by providing critical case correlations. That is, law firms will be able to reduce their costs and court wait time by understanding relevant case connections. Additionally, law firms can use big data to make predictions based on historical rulings. As a result, a firm will be able to provide its clients with an accurate time frame and save them money. Law firms will also be able to increase financial transparency by showing how their legal fees compare to competitors'. Finally, a firm will be able to statistically analyze which cases are even worth accepting based on past successes and failures.

Health Care

Health care reform in America is a hot topic debated by political pundits and media experts. However, no substantial change can occur without hard data. One of the best ways big data can improve the health care system is by helping to prevent disease. Medical and data technicians can now analyze patient decisions, demographic data, and health and insurance records in order to identify medical trends. For example, big data can analyze radiology information systems to profile high-risk patients and prevent disease. Big data and mobile technology will soon allow health care technology companies to analyze mobile phone users through fitness and health apps. For example, wearable devices, such as the Apple Watch, will soon provide health care providers with unique insight into the private lives of users. This will result in expanded databases of health information that will improve public health and education.

Finance

The financial sector depends on accurate data in order to make better decisions. As markets become more complex and customers more demanding, financial firms need big data to increase efficiency and competitiveness. For example, one of the fundamental aspects of big data is the velocity at which data is stored and evaluated. Million-dollar investment decisions depend on millisecond data updates. As investment firms better analyze and understand structured data, their ability to manage volatile markets will improve. Furthermore, algorithmic trading now allows investment firms to rely entirely on advanced software programs to make extremely fast and accurate trades. Software programs that have access to enormous volumes of historical data and proven strategies will automatically make better real-time decisions.

Telecommunications

According to IBM, communications service providers (CSPs) have access to a gold mine of data that will help them win customers and increase revenue. CSPs, such as nationally recognized mobile phone and Internet providers, are engaged in continual competition to gain and win back customers. Luckily, telecommunication data is already digitally available for analysis in almost all business process management (BPM) software. For example, big data can help CSPs understand local and regional customer behaviors and preferences. As a result of big data analytics, CSPs can strategically plan for the future, improve the customer experience and adjust their business objectives accordingly.

In the end, every industry can benefit from big data analytics, which saves money for companies and customers alike; however, certain industries are leading the way. Law practice, health care, finance and telecommunications are four sectors that are using big data to increase efficiency and competitiveness.

By Dennis Hung



Tuesday, August 18, 2015

IBM Introduces Researchers To Their Artificial Rodent Brain


Neuromorphic Computing


After years of development, IBM has unveiled its neuromorphic computing system to the public. The company claims the system, based on the TrueNorth processor, is computationally equivalent to the brain of a small rodent.

 


A team at IBM, led by Dharmendra Modha, has been working with DARPA's Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program since 2008 to develop neuromorphic computing systems that work more like brains than traditional Turing machines.


Now, after years of development, the company has unveiled the system to the public as part of a three-week "boot camp" training session for academic and government researchers.

The TrueNorth system uses modular chips that act like artificial neurons. By stringing multiple chips together, researchers can essentially build an artificial neural network. The version that IBM just debuted contains about 48 million artificial neurons—roughly equivalent to the computing capacity of a rat's brain.

These systems are ideal for running deep learning algorithms—the computational artificial intelligence approach behind the explosion of image recognition, speech recognition, and many other advances.

Neuromorphic systems also have the benefit of modelling their power requirements on biological brains, using a fraction of the electricity and space required by typical computers. For example, a TrueNorth chip contains 5.4 billion transistors but uses only 70 mW of power. An Intel processor, conversely, contains just 1.4 billion transistors and draws between 35 and 140 watts.

Neuromorphic Computing IBM TrueNorth

According to Wired's Cade Metz, researchers who got their hands on the chip at an engineering workshop in Colorado the previous month have already developed software that can identify images, recognize spoken words, and understand natural language. Basically, they’re using the chip to run “deep learning” algorithms.

"Humans use technology to transform society. These are the humans."


The promise is that IBM’s chip can run these algorithms in smaller spaces with considerably less electrical power, letting us shoehorn more AI onto phones and other tiny devices, including hearing aids and, well, wristwatches.

Future versions of the TrueNorth system could, in theory, be miniaturized so they are small enough to fit inside cell phones or smart watches.

TrueNorth is definitely not a digital brain, but it may be a step toward one. IBM is accelerating the project, as demonstrated by the recent three-week boot camp for potential neuromorphic algorithm developers.

The machine the researchers encountered is really 48 separate machines, each built around its own TrueNorth processor. IBM intends to distribute the processors to the researchers so they can bring them back to their own labs, which span over 30 institutions on five continents. “Humans use technology to transform society,” Modha says, pointing to the room of researchers. “These are the humans.”

SOURCE  Wired


By 33rd Square



Tuesday, August 11, 2015

Watson Will Now Help Read Medical Images


Artificial Intelligence


IBM announced that Watson will gain the ability to “see” by bringing together Watson’s advanced image analytics and cognitive capabilities with data and images obtained from a well-established medical imaging management platform.
 


IBM has announced that Watson will soon have the ability to ‘see’ by bringing together Watson’s advanced image analytics and cognitive computing system with data and images obtained from a medical imaging management platform.

IBM plans to acquire Merge Healthcare Incorporated, a provider of medical image handling and processing, interoperability and clinical systems designed to advance healthcare quality and efficiency, in an effort to unlock the value of medical images to help physicians make better patient care decisions.

Merge’s technology platforms are used at more than 7500 US healthcare sites, as well as most of the world’s leading clinical research institutes and pharmaceutical firms to manage a growing body of medical images.


According to the press release, the vision is that these organisations could use the Watson Health Cloud to surface new insights from a consolidated, patient-centric view of current and historical images, electronic health records, data from wearable devices and other related medical data, in a secure environment.

“As a proven leader in delivering healthcare solutions for over 20 years, Merge is a tremendous addition to the Watson Health platform,” says John Kelly, senior vice president, IBM Research and Solutions Portfolio.

“Healthcare will be one of IBM’s biggest growth areas over the next 10 years, which is why we are making a major investment to drive industry transformation and to facilitate a higher quality of care.”

The acquisition bolsters IBM’s strategy to add rich image analytics with deep learning to the Watson Health platform – in effect, advancing Watson beyond natural language and giving it the ability to “see.”

Medical images are by far the largest and fastest-growing data source in the healthcare industry and perhaps the world – IBM researchers estimate that they account for at least 90% of all medical data today – but they also present challenges that need to be addressed:


  • The volume of medical images can be overwhelming to even the most sophisticated specialists – radiologists in some hospital emergency rooms are presented with as many as 100,000 images a day.
  • Tools to help clinicians extract insights from medical images remain very limited, requiring most analysis to be done manually.
  • At a time when the most powerful insights come at the intersection of diverse data sets (medical records, lab tests, genomics, etc.), medical images remain largely disconnected from mainstream health information. 

"Healthcare will be one of IBM’s biggest growth areas over the next 10 years, which is why we are making a major investment to drive industry transformation."


“Watson’s powerful cognitive and analytic capabilities, coupled with those from Merge and our other major strategic acquisitions, position IBM to partner with healthcare providers, research institutions, biomedical companies, insurers and other organisations committed to changing the very nature of health and healthcare in the 21st century.”

Radiologists have to deal with as many as 100,000 images a day, according to Dr. Elliot Siegel, a physician who has worked with IBM on Watson since its earliest days. They have to process all of this visual information while trying to cross-correlate it with patient history, lab data and more. Faced with that amount of information, it is becoming increasingly evident that artificial intelligence is a real necessity in medical imaging.

Moreover, medical imaging is increasingly becoming a cloud computing commodity, making it even more amenable to cognitive computing systems that can scan images and produce increasingly accurate diagnoses with deep learning.



SOURCE  Merge Healthcare


By 33rd Square



Thursday, May 28, 2015

Are You Ready To Have A Conversation With Your Computer?

 Artificial Intelligence
IBM's Watson artificial intelligence system is now dramatically better at understanding human conversations thanks to a newly developed deep learning algorithm.





Researchers at IBM have created an algorithm that will give Watson the power to follow a conversation. Michael Picheny, the leader of IBM’s speech team, said the algorithm recognizes conversations spoken by two alternating voices.

The breakthrough in performance was aided by the team's advances in applying deep learning to both acoustic modeling and language modeling.

The researchers tested the artificial intelligence software on a database of recorded telephone conversations between strangers. They found an 8% error rate – 36% better than the best previous results on the same test. While not as accurate as a person, the system is very close. “Humans on this particular set of data only get 4 percent of the words wrong,” Picheny said. “A few years ago the number [our systems got] was closer to 20 percent.”
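Working backward from those two figures (and assuming the 36% is a relative reduction in word error rate), the previous best result on this test would have been roughly 12.5%:

```python
# If 8% represents a 36% relative improvement, the prior best error rate was about:
previous_best = 0.08 / (1 - 0.36)
print(f"{previous_best:.1%}")  # ~12.5%
```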

"Extrapolating from historical trends, we believe that human accuracy on this task can be reached within the next decade."


The software will be added to the Watson API, giving developers extended speech recognition capabilities for app development, according to IBM. It could be used, for instance, to help customer-support staff recognize what hard-to-understand callers are saying.

The researchers conclude in their paper that:

Extrapolating from historical trends, we believe that human accuracy on this task can be reached within the next decade. We think that the way to get there will most likely involve an increase of several orders of magnitude in training data and the use of more sophisticated neural network architectures that tightly integrate multiple knowledge sources (acoustics, language, pragmatics, etc.).

Yann LeCun, Facebook's director of artificial intelligence research, said the results represent a “significant advance.”

SOURCE  WSJ Digits

By 33rd Square

Wednesday, April 29, 2015

Critical Steps to Building First Practical Quantum Computer Made

 Quantum Computers
Scientists have made two advances needed to create viable quantum computers. They have shown the ability to detect and measure both kinds of quantum errors simultaneously, as well as created a new, square quantum bit circuit design that is the only physical architecture that could successfully scale to larger dimensions.





Scientists at IBM have unveiled two critical advances towards the realization of practical quantum computers. For the first time, they showed the ability to detect and measure both kinds of quantum errors simultaneously, as well as demonstrated a new, square quantum bit circuit design that is the only physical architecture that could successfully scale to larger dimensions.

With Moore's Law expected to run out of steam, quantum computing will be among the inventions that could usher in a new era of innovation across industries. Quantum computers promise to open up new capabilities in the fields of optimization and simulation simply not possible using today's computers. If a quantum computer could be built with just 50 quantum bits (qubits), no combination of today's supercomputers could successfully outperform it.

The IBM breakthroughs, described in the journal Nature Communications, show for the first time the ability to detect and measure the two types of quantum errors (bit-flip and phase-flip) that will occur in any real quantum computer. Until now, it was only possible to address one type of quantum error or the other, but never both at the same time. This is a necessary step toward quantum error correction, which is a critical requirement for building a practical and reliable large-scale quantum computer.

IBM Square Lattice Quantum Computer Chip

IBM's new and complex quantum bit circuit, based on a square lattice of four superconducting qubits on a chip roughly one-quarter-inch square, enables both types of quantum errors to be detected at the same time. By opting for a square-shaped design versus a linear array – which prevents the detection of both kinds of quantum errors simultaneously – IBM's design shows the best potential to scale by adding more qubits to arrive at a working quantum system.

"Quantum computing could be potentially transformative, enabling us to solve problems that are impossible or impractical to solve today," said Arvind Krishna, senior vice president and director of IBM Research. "While quantum computers have traditionally been explored for cryptography, one area we find very compelling is the potential for practical quantum systems to solve problems in physics and quantum chemistry that are unsolvable today. This could have enormous potential in materials or drug design, opening up a new realm of applications."

For instance, in physics and chemistry, quantum computing could allow scientists to design new materials and drug compounds without expensive trial and error experiments in the lab, potentially speeding up the rate and pace of innovation across many industries.

For a world consumed by Big Data, quantum computers could quickly sort and curate ever larger databases as well as massive stores of diverse, unstructured data. This could transform how people make decisions and how researchers across industries make critical discoveries.

One of the great challenges for scientists seeking to harness the power of quantum computing is controlling or removing quantum decoherence – the creation of errors in calculations caused by interference from factors such as heat, electromagnetic radiation, and material defects. The errors are especially acute in quantum machines, since quantum information is so fragile.

"Up until now, researchers have been able to detect bit-flip or phase-flip quantum errors, but never the two together. Previous work in this area, using linear arrangements, only looked at bit-flip errors offering incomplete information on the quantum state of a system and making them inadequate for a quantum computer," said Jay Gambetta, a manager in the IBM Quantum Computing Group. "Our four qubit results take us past this hurdle by detecting both types of quantum errors and can be scalable to larger systems, as the qubits are arranged in a square lattice as opposed to a linear array."

"Quantum computing could be potentially transformative, enabling us to solve problems that are impossible or impractical to solve today."


The most basic piece of information that a typical computer understands is a bit. Much like a beam of light that can be switched on or off, a bit can have only one of two values: "1" or "0". However, a quantum bit (qubit) can hold a value of 1 or 0 as well as both values at the same time, described as superposition and simply denoted as "0+1". The sign of this superposition is important because both states 0 and 1 have a phase relationship to each other. This superposition property is what allows quantum computers to choose the correct solution among millions of possibilities in a time much faster than a conventional computer.

Two types of errors can occur on such a superposition state. One is called a bit-flip error, which simply flips a 0 to a 1 and vice versa. This is similar to classical bit-flip errors, and previous work has shown how to detect these errors on qubits. However, this is not sufficient for quantum error correction, because phase-flip errors can also be present, which flip the sign of the phase relationship between 0 and 1 in a superposition state. Both types of errors must be detected in order for quantum error correction to function properly.
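In symbols (a standard textbook sketch, not notation taken from the IBM paper), a single-qubit superposition and the action of the two error types can be written:

\[
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1
\]
\[
\text{bit flip } (X):\ \alpha\,|0\rangle + \beta\,|1\rangle \;\mapsto\; \alpha\,|1\rangle + \beta\,|0\rangle,
\qquad
\text{phase flip } (Z):\ \alpha\,|0\rangle + \beta\,|1\rangle \;\mapsto\; \alpha\,|0\rangle - \beta\,|1\rangle
\]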

quantum computers

Quantum information is very fragile because all existing qubit technologies lose their information when interacting with matter and electromagnetic radiation. Theorists have found ways to preserve the information much longer by spreading information across many physical qubits. "Surface code" is the technical name for a specific error correction scheme which spreads quantum information across many qubits. It allows for only nearest neighbor interactions to encode one logical qubit, making it sufficiently stable to perform error-free operations.

The research team used a variety of techniques to measure the states of two independent syndrome (measurement) qubits. Each reveals one aspect of the quantum information stored on two other qubits (called code, or data qubits). Specifically, one syndrome qubit revealed whether a bit-flip error occurred to either of the code qubits, while the other syndrome qubit revealed whether a phase-flip error occurred. Determining the joint quantum information in the code qubits is an essential step for quantum error correction because directly measuring the code qubits destroys the information contained within them.
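A generic illustration of this kind of joint syndrome measurement, written with the Qiskit SDK; this is a textbook-style sketch of the idea, not the circuit from IBM's published paper:

```python
# Sketch: two code qubits (0, 1); ancilla 2 checks bit flips, ancilla 3 checks phase flips.
from qiskit import QuantumCircuit

qc = QuantumCircuit(4, 2)

# ZZ parity check: CNOTs accumulate the code qubits' parity onto ancilla 2,
# so measuring it reveals whether a bit-flip error has occurred.
qc.cx(0, 2)
qc.cx(1, 2)

# XX parity check: ancilla 3 is prepared in |+>, entangled with both code qubits,
# then rotated back so a standard measurement reads out the X parity (phase flips).
qc.h(3)
qc.cx(3, 0)
qc.cx(3, 1)
qc.h(3)

qc.measure(2, 0)  # outcome 1 => bit-flip error detected on a code qubit
qc.measure(3, 1)  # outcome 1 => phase-flip error detected on a code qubit
```

Crucially, only the ancillas are measured, so the joint information is extracted without destroying the encoded state, which is the point the researchers make above.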

Because these qubits can be designed and manufactured using standard silicon fabrication techniques, the company anticipates that once a handful of superconducting qubits can be manufactured reliably and repeatedly, and controlled with low error rates, there will be no fundamental obstacle to demonstrating error correction in larger lattices of qubits.

Next, the Experimental Quantum Computing team is planning on making a similar lattice with eight qubits. “Thirteen or 17 qubits is the next important milestone,” Jerry Chow, Manager of Experimental Quantum Computing at IBM Research, told TechCrunch, because it is at that point that they will have the ability to start encoding logic into the qubits — which is when things start to get really interesting.


SOURCE  Market Watch

By 33rd Square