33rd Square Business Tools: cats
Showing posts with label cats. Show all posts

Thursday, November 13, 2014


 Robotics
New research examines how cats and humans orient themselves in mid-air during activities like parkour, gymnastics and diving, looking for ways to make robots' reactions to falling safer with lower computational requirements.




A cat always lands on its feet. At least, that’s how the adage goes. Karen Liu hopes that in the future, this will be true of robots as well.

To understand the way feline or human behavior during falls might be applied to robot landings, Liu, an associate professor in the School of Interactive Computing (IC) at Georgia Tech, delved into the physics of everything from falling cats to the mid-air orientation of divers and astronauts.

In research presented at the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Liu shared her studies of mid-air orientation and impact behavior in both cats and humans as it applies to reduced impact in falling robots, especially those that may one day be used for search-and-rescue missions in hazardous conditions.

Not only did Liu and her team of Georgia Tech researchers simulate falls, they also studied the impact of landings.

Cats and Athletes Teach Robots to Fall

“It’s not the fall that kills you. It’s the sudden stop at the end,” Liu said. “One of the most important factors that determines the damage of the fall is the landing angle.”

In their experiments with a small robot consisting of a main body and two symmetric legs with paddles, the team compensated for the fact that a robot cannot move fast enough in a laboratory setting by creating a reduced-gravity environment: a tilted surface, similar to an air hockey table, outfitted with a leaf blower. Liu, along with Jeffrey Bingham, Ravi Haksar, Jeongseok Lee and Jun Ueda, simulated the elements of a long fall and explored the possibility of a “soft roll” landing to reduce impact and damage to the robot.
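The tilted-surface trick can be illustrated with a toy calculation (the 10-degree tilt and 1 m drop below are illustrative values, not the team's actual parameters): tilting the plane scales effective gravity down to g·sin(θ), which stretches the fall time and gives slow laboratory actuators more time to reorient.

```python
import math

def fall_time(height_m, g=9.81):
    """Time for a body starting at rest to fall a given height: t = sqrt(2h/g)."""
    return math.sqrt(2.0 * height_m / g)

G = 9.81
tilt_deg = 10.0                                # hypothetical tilt of the air-hockey-style table
g_eff = G * math.sin(math.radians(tilt_deg))   # effective gravity along the incline

t_full = fall_time(1.0, G)                     # ~0.45 s for a 1 m drop
t_reduced = fall_time(1.0, g_eff)              # about 2.4x longer at a 10-degree tilt
print(f"full gravity: {t_full:.2f} s, reduced: {t_reduced:.2f} s")
```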

In their work, the researchers found that a well-designed robot has the “brain” to process the computation necessary to achieve a softer landing, though current motor and servo technology does not allow the hardware to move quickly enough for cat-like impacts. Future research aims at further teaching a robot the skill of orientation and impact, a feat that falling humans cannot achieve but cats perform naturally.

Robot falling simulation



“Most importantly, the human brain cannot compute fast enough to determine the optimal sequence of poses the body needs to reach during a long-distance fall to achieve a safe landing,” the researchers note.

“Theoretically, no matter what initial position and initial speed we have, we can precisely control the landing angle by changing our body poses in the air,” says Ueda, an associate professor in the Woodruff School of Mechanical Engineering. “In practice, however, we have a lot of constraints, like joint limits or muscle strength, that prevent us from changing poses fast enough.”
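Ueda's point about controlling the landing angle follows from conservation of angular momentum. A minimal two-link sketch (with made-up inertia values, not the team's robot model) shows how swinging a limb rotates the trunk even when total angular momentum is zero, which is the essence of the falling-cat maneuver:

```python
import math

def body_rotation(I_body, I_limb, joint_sweep_rad):
    """Zero-angular-momentum reorientation of a two-link faller.

    With total angular momentum zero, I_body*w_body + I_limb*w_limb = 0.
    Writing w_limb = w_body + w_joint and integrating over the joint sweep
    gives the net trunk rotation produced by swinging the limb.
    """
    return -I_limb / (I_body + I_limb) * joint_sweep_rad

# Hypothetical inertias: heavy trunk, light legs-with-paddles.
dtheta = body_rotation(I_body=0.5, I_limb=0.1, joint_sweep_rad=math.pi)
print(round(math.degrees(dtheta)))  # -30: a 180-degree limb sweep counter-rotates the trunk 30 degrees
```

Joint limits and actuator speed cap how large and how fast that sweep can be, which is exactly the constraint Ueda describes.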

“If we believe that one day we will have the capability to build robots that can do this kind of highly dynamic motion, we also have to teach robots how to fall — and how to land, safely, from a jump or a relatively high fall,” Liu said.


SOURCE  Georgia Tech

By 33rd Square

Monday, July 14, 2014


 Artificial Intelligence
Microsoft has just upped the ante for artificial intelligence.  Project Adam is a new deep-learning system modeled after the human brain that has greater image classification accuracy and is 50 times faster than other systems in the industry.




Microsoft Research has developed a new artificial intelligence system using machine learning, called “Project Adam." The software increases the speed and efficiency of computers and their ability to learn.  Project Adam uses inspiration from the human brain to absorb new data and teach itself new skills — such as distinguishing among different breeds of dogs.

Project Adam aims to demonstrate that large-scale, commodity distributed systems can train huge deep neural networks effectively. For proof, the researchers created the world’s best photograph classifier, using 14 million images from ImageNet, an image database divided into 22,000 categories.

The system was demonstrated at Microsoft’s Faculty Summit in Redmond recently (video below), as Microsoft brought out several different breeds of dogs on stage and showed how the technology could automatically distinguish among them in real time, using computer vision and insights from large sets of data.  The system was integrated into Cortana, Microsoft's digital assistant platform.

Microsoft says Project Adam has achieved breakthroughs in machine learning by using distributed networks and an asynchronous technique that improves the overall efficiency and accuracy of the system over time. This is a critical area of technology as Microsoft and other companies race to build intelligent, predictive systems that leverage mobile technologies and the cloud.
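Microsoft has not published Project Adam's protocol at this level of detail, but the flavor of asynchronous distributed training can be sketched with a toy Hogwild-style setup, in which several workers push gradient updates to a shared parameter without any locking (all names and the one-parameter "model" here are illustrative):

```python
import threading, random

w = [0.0]                              # shared parameter; the true value is 3.0
data = [(x, 3.0 * x) for x in range(1, 6)]

def worker(steps=200, lr=0.01):
    """Each worker samples data and applies SGD updates with no coordination."""
    for _ in range(steps):
        x, y = random.choice(data)
        grad = 2 * (w[0] * x - y) * x  # d/dw of the squared error (w*x - y)^2
        w[0] -= lr * grad              # unsynchronized write, Hogwild-style

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(round(w[0], 2))                  # converges near 3.0 despite the races
```

Occasional lost or stale updates only slow convergence slightly; skipping synchronization keeps every worker busy, which is the efficiency argument behind asynchronous training.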

Where Google's Neural Networks Recognized Cats, Microsoft Sees Dogs, and Faster

"Project Adam knows dogs. It can identify dogs in images. It can identify kinds of dogs. It can even identify particular breeds, such as whether a corgi is a Pembroke or a Cardigan."

Now, if this all sounds vaguely familiar, that’s because it is—vaguely. A couple of years ago, Google used a network of 16,000 computers to teach itself to identify images of cats. That is a difficult task for computers, and it was an impressive achievement.


According to Microsoft Research, Project Adam is 50 times faster than the Google system and more than twice as accurate, as outlined in a paper currently under academic review. It is also more efficient, using 30 times fewer machines, and more scalable, areas in which the Google effort fell short.

“We wanted to build a highly efficient, highly scalable distributed system from commodity PCs that has world-class training speed, scalability, and task accuracy for an important large-scale task,” says Trishul Chilimbi, one of the Microsoft researchers who spearheaded the Project Adam effort. “We focused on vision because that was the task for which we had the largest publicly available data set.

“We tend to overestimate the impact of disruptive technologies in the short term and underestimate their long-term impact—the Internet being a good case in point. With deep learning, there’s still a lot more to be done on the theoretical side," Chilimbi says.



SOURCE  Microsoft Research

By 33rd Square

Monday, January 27, 2014

E-whiskers

 
Sensors
Researchers have created sensitive, tactile nanosensors that are similar to a cat's whiskers. The so-called "e-whiskers" may potentially help future robots feel their way around a space, or be used for new human-machine interfaces.




From the world of nanotechnology we’ve gotten electronic skin, or e-skin, and electronic eye implants or e-eyes. Now we’re on the verge of electronic whiskers. DARPA-funded researchers with Berkeley Lab and the University of California (UC) Berkeley have created tactile sensors from composite films of carbon nanotubes and silver nanoparticles similar to the highly sensitive whiskers of cats and rats.

A paper describing this research, “Highly sensitive electronic whiskers based on patterned carbon nanotube and silver nanoparticle composite films,” has been published in the Proceedings of the National Academy of Sciences.

These new e-whiskers respond to pressure as slight as a single pascal, about the pressure exerted on a table surface by a dollar bill. Among their many potential applications is giving robots new abilities to “see” and “feel” their surrounding environment.
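The dollar-bill comparison checks out with a quick back-of-envelope calculation (the bill's mass and dimensions below are approximate):

```python
# A US bill weighs about 1 g and measures roughly 15.6 cm x 6.6 cm.
mass_kg = 0.001
g = 9.81
area_m2 = 0.156 * 0.066              # ~0.0103 m^2 of table contact
pressure_pa = mass_kg * g / area_m2  # P = F / A
print(round(pressure_pa, 2))         # ~0.95 Pa, i.e. about one pascal
```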

“Whiskers are hair-like tactile sensors used by certain mammals and insects to monitor wind and navigate around obstacles in tight spaces,” says the leader of this research Ali Javey, a faculty scientist in Berkeley Lab’s Materials Sciences Division and a UC Berkeley professor of electrical engineering and computer science.

“Our electronic whiskers consist of high-aspect-ratio elastic fibers coated with conductive composite films of nanotubes and nanoparticles. In tests, these whiskers were 10 times more sensitive to pressure than all previously reported capacitive or resistive pressure sensors.”

Javey and his research group have been leaders in the development of e-skin and other flexible electronic devices that can interface with the environment. In this latest effort, they used a carbon nanotube paste to form an electrically conductive network matrix with excellent bendability. To this carbon nanotube matrix they added a thin film of silver nanoparticles that endowed the matrix with high sensitivity to mechanical strain.
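A strain-sensitive resistive film like this is typically read out through its gauge factor, the ratio of relative resistance change to strain. A minimal sketch of that read-out, with hypothetical numbers rather than the paper's calibration:

```python
def strain_from_resistance(R, R0, gauge_factor):
    """Piezoresistive read-out: strain = (dR / R0) / GF,
    where GF (the gauge factor) is set by the film composition."""
    return (R - R0) / R0 / gauge_factor

# Hypothetical example: a composite film with gauge factor 5 whose
# resistance rises from 100 to 101 ohms as the whisker deflects.
strain = strain_from_resistance(R=101.0, R0=100.0, gauge_factor=5.0)
print(f"{strain:.4f}")  # 0.0020, i.e. 0.2% strain
```

Tuning the nanotube-to-nanoparticle ratio, as Javey describes, effectively tunes the gauge factor and the film's baseline resistivity.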


“The strain sensitivity and electrical resistivity of our composite film is readily tuned by changing the composition ratio of the carbon nanotubes and the silver nanoparticles,” Javey says. “The composite can then be painted or printed onto high-aspect-ratio elastic fibers to form e-whiskers that can be integrated with different user-interactive systems.”

Javey notes that the use of elastic fibers with a small spring constant as the structural component of the whiskers provides large deflection and therefore high strain in response to the smallest applied pressures. As proof-of-concept, he and his research group successfully used their e-whiskers to demonstrate highly accurate 2D and 3D mapping of wind flow.

In the future, e-whiskers could be used to mediate tactile sensing for the spatial mapping of nearby objects, and could also lead to wearable sensors for measuring heartbeat and pulse rate.

“Our e-whiskers represent a new type of highly responsive tactile sensor networks for real time monitoring of environmental effects,” Javey says. “The ease of fabrication, light weight and excellent performance of our e-whiskers should have a wide range of applications for advanced robotics, human-machine user interfaces, and biological applications.”



SOURCE  Berkeley Lab

By 33rd Square