33rd Square Business Tools: flying robot
Showing posts with label flying robot. Show all posts

Monday, June 8, 2015

The Technology Used in a Drone

 Drones
With all the publicity and controversy surrounding drones, have you ever wondered what makes them work?





Formally referred to as unmanned aerial vehicles (UAVs), drones are essentially flying robots. Traditionally deployed in situations considered too dangerous for human pilots, drones have since found a variety of applications, both personal and commercial. Because they are also used for surveillance, drones carry a serious degree of controversy; however, steps are being taken to regulate drone air traffic and thereby address the privacy issue.

So how do drones work, i.e. what is the technology involved?

Degree of Human Intelligence

No matter the size of the drone or what it’s used for, some degree of human intelligence is required to control its movement. Drones are steadily getting “smarter,” however, and adaptive control techniques are no doubt in these vehicles’ future.




The Satellite Component

When a drone takes off, it is controlled via direct data link from a ground-control station. This occurs until the drone leaves the line of sight. Once that happens, the ground-control station switches to a satellite link. The satellite subsequently controls the aircraft, though the drone also uses GPS to provide its position.

Should the drone lose its communication link, it will fly in circles or return to base until the link is reestablished. A drone can also crash if it runs out of fuel before the link is restored.
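The link-loss behavior described above amounts to a simple failsafe state machine. Here is a minimal sketch in Python; the function name, fuel threshold, and loiter limit are illustrative assumptions, not taken from any real autopilot:

```python
"""Sketch of a drone link-loss failsafe. The thresholds here are
made up for illustration; real autopilots implement this logic
in firmware with many more inputs."""
import enum


class LinkState(enum.Enum):
    CONNECTED = "connected"
    LOST = "lost"


def failsafe_action(link: LinkState, loiter_seconds: float,
                    fuel_fraction: float,
                    max_loiter: float = 3600.0) -> str:
    """Decide what the drone should do about a dropped control link.

    loiter_seconds: time spent circling since the link was lost.
    fuel_fraction:  remaining fuel, 0.0 to 1.0.
    """
    if link is LinkState.CONNECTED:
        return "resume_mission"
    # Low fuel: head home immediately rather than risk a crash.
    if fuel_fraction < 0.2:
        return "return_to_base"
    # Otherwise circle in place, waiting for the link to come back.
    if loiter_seconds < max_loiter:
        return "loiter"
    return "return_to_base"
```

The key design choice is that fuel state overrides the wait-and-loiter behavior, which matches the failure mode the article mentions: a drone that loiters too long on low fuel simply crashes.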


Cameras and Other Features

Most drones are equipped with special cameras that capture images of the subject or target. These images are sent back to the person or people operating the drone. Cameras provide color or black and white images depending on the model. Drones may also include features such as infrared imaging for low-light conditions, radars, and lasers for targeting purposes.

Learn more about drone technology and purchase drone products and accessories by contacting Dronefly today.

By Kevin Skaggs

Wednesday, November 5, 2014

PIBOT, the Humanoid Robot Pilot That Can Fly Any Aircraft
 Robotics
With the intention of creating a robotic system that can fly any aircraft, a team of researchers has successfully tested the PIBOT pilot robot in a model airplane. More than a drone, PIBOT uses an aircraft's existing controls to pilot the plane.




Researchers at the Korea Advanced Institute of Science and Technology (KAIST) have built a humanoid robot pilot called PIBOT to fly planes and helicopters on dangerous missions.

According to lead researcher Shim Hyung-Chul, what sets PIBOT apart from autonomous drones and autopilot programs is the robot's ability to adapt to any type of aircraft: “Many existing drones have been developed, however, PIBOT is the world first robot which can immediately automate any kind of aircraft.”

The project was presented at IROS 2014 in the paper, "A Robot-Machine Interface for Full-Automation using functions on a Humanoid".

Shim says the one thing that all aircraft have in common is that they were designed to be flown by humans, so he and his team designed a robot that can control a plane the same way a human would.

The team has now created a smaller version of PIBOT that has flown a model airplane, as shown in the video above.

The robot used by the team is actually an off-the-shelf humanoid BIOLOID Premium from Robotis, modified to be able to work the controls of a cockpit simulation, scaled down to mini-robot size.

PIBOT
PiBot in a simulator environment

"Many existing drones have been developed, however, PIBOT is the world first robot which can immediately automate any kind of aircraft."

The robot interfaces with a plane’s sensors and instrumentation to automate their functionality. PIBOT uses real-time computer vision to navigate during take-off and landing.

So far PIBOT has successfully completed a rigorous flight simulation program as well as field tests with the model airplane. The researchers plan to test PIBOT’s flying skills in a full-scale plane in the near future.

“When Japan’s Fukushima nuclear plant got damaged by an earthquake in 2011, there was a helicopter which was trying to spray extinguishing agents, but it couldn’t get close to the site because of the radiation hazard,” explained Shim.

He believes that if PIBOT had been at the controls, radiation would not have been an issue.


SOURCE  Euronews

By 33rd Square

Friday, May 23, 2014

Quadrotor Flying Autonomously

 Robotics
Researchers at the University of Pennsylvania have successfully demonstrated autonomous flight using an off-the-shelf Google Project Tango smartphone attached to a quadrotor.




Earlier this year, Google unveiled its Project Tango smartphone, a mobile device equipped with a depth sensor, a motion tracking camera, and two vision sensors that let the phone track its position in space and create 3D maps in real time, a process known as SLAM (Simultaneous Localization and Mapping).

This set-up has great implications for robots, which have to navigate and locate themselves in the world. Accordingly, a video showed how Google and its partners were putting the smartphone on different kinds of robots, including mobile platforms and manipulator arms.

The Google device is remarkable because it lets you "literally velcro it to a robot and have it be autonomous."


Now researchers at the University of Pennsylvania led by Professor Vijay Kumar, where quadrotor swarms have already demonstrated amazing abilities, are moving on to the next logical phase of the work: attaching a Tango device from Google onto one of their quadrotors.

Kumar says that a big challenge for researchers working with flying robots is not building them but rather developing hardware and software capable of making them autonomous. Many robots use GPS to guide themselves, or, when flying indoors, they rely on motion tracking systems, which offer great accuracy but require that you install sensors on walls and ceilings.

The Tango phone changes that, opening new possibilities for flying robots. Kumar says the Google device is remarkable because it lets you "literally velcro it to a robot and have it be autonomous."

The researchers next plan to study Tango's localization accuracy (comparing it to external motion tracking systems), but from their initial tests they estimate the accuracy to be within a centimeter. If that proves to be the case (and if Tango can be made cheap enough), it will be an impressive capability for the Google device, one that could revolutionize how mobile robots and drones navigate indoor spaces.
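Comparing an on-board estimator to an external motion tracking system typically means matching pose samples and computing a root-mean-square position error. A minimal sketch of that calculation, with made-up trajectory numbers rather than the team's data:

```python
"""Sketch of how localization accuracy might be measured: compare
pose estimates against motion-capture ground truth. The sample
positions below are illustrative, not real experimental data."""
import math


def rms_position_error(estimates, ground_truth):
    """Root-mean-square Euclidean error between matched 3D positions (metres)."""
    assert len(estimates) == len(ground_truth)
    squared = [
        sum((e - g) ** 2 for e, g in zip(est, gt))
        for est, gt in zip(estimates, ground_truth)
    ]
    return math.sqrt(sum(squared) / len(squared))


# Hypothetical matched samples: estimated vs. motion-capture positions.
est = [(0.00, 0.00, 1.00), (0.50, 0.01, 1.00), (1.00, 0.00, 1.01)]
gt  = [(0.00, 0.00, 1.00), (0.50, 0.00, 1.00), (1.00, 0.00, 1.00)]
print(f"RMS error: {rms_position_error(est, gt):.4f} m")
```

An RMS error under 0.01 m would be consistent with the "within a centimeter" estimate quoted above; in practice the two pose streams also have to be time-synchronized and aligned into a common frame first.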

Kumar says that the convergence of computation, communication, and consumers has a huge potential for the robotics industry, and a device like Tango is a key advance because it's "lowering the barrier to entry for autonomous robots."

The team has made a video of their results. In the first part of the video (below), Giuseppe Loianno, a PhD student in Kumar's group, sets the quadrotor to hover at a fixed position and then disturbs it by moving it around, but the drone promptly returns to the starting point. Next he commands the drone to go to different places in the room and, even if disturbed, the drone recovers and stays on its programmed path.
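The return-to-setpoint behavior in that demonstration is the classic position-hold problem: a feedback controller pushes the vehicle back toward its commanded position after a disturbance. A one-axis PD sketch (gains, time step, and the Euler integration are illustrative assumptions, not the Penn group's controller):

```python
"""Minimal sketch of position-hold along one axis: a PD controller
commands an acceleration that cancels position and velocity error.
Gains and the toy dynamics are illustrative only."""


def pd_command(setpoint, position, velocity, kp=2.0, kd=1.5):
    """Desired acceleration from position error and velocity damping."""
    return kp * (setpoint - position) + kd * (0.0 - velocity)


# Simulate a hover at x = 0 after the drone is shoved to x = 0.5 m.
x, v, dt = 0.5, 0.0, 0.02
for _ in range(1000):          # 20 seconds of simulated time
    a = pd_command(0.0, x, v)
    v += a * dt                # integrate acceleration -> velocity
    x += v * dt                # integrate velocity -> position
print(f"final position: {x:.3f} m")  # settles back near the setpoint
```

The proportional term pulls the drone toward the setpoint while the derivative term damps the motion so it does not oscillate indefinitely; a real quadrotor runs three such loops (plus attitude control) fed by the Tango pose estimate instead of simulated state.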




SOURCE  IEEE Spectrum
By 33rd Square

Wednesday, June 5, 2013

Researchers Fly Robot Drone With Their Thoughts

 Brain-Machine Interface
In a jaw-dropping feat of engineering, electronics turn a person's thoughts into commands for a robot. Using a brain-computer interface technology pioneered by University of Minnesota biomedical engineering professor Bin He, several young people have learned to use their thoughts to steer a flying robot around a gym, making it turn, rise, dip, and even sail through a ring.






At the University of Minnesota, a new technology is turning science fiction into reality. In the lab of biomedical engineering professor Bin He, several young people have learned to use their thoughts to steer a flying robot around a gym, making it turn, rise, dip, and even sail through a ring.

The technology, pioneered by He, may someday allow people robbed of speech and mobility by neurodegenerative diseases to regain function by controlling artificial limbs, wheelchairs, or other devices. And it's completely noninvasive: Brain waves (EEG) are picked up by the electrodes of an EEG cap on the scalp, not a chip implanted in the brain.

A report on the technology has been published in the Journal of Neural Engineering.

"My entire career is to push for noninvasive 3D brain-computer interfaces, or BCI," says He, a faculty member in the College of Science and Engineering. "[Researchers elsewhere] have used a chip implanted into the brain's motor cortex to drive movement of a cursor [across a screen] or a robotic arm. But here we have proof that a noninvasive BCI from a scalp EEG can do as well as an invasive chip."


Mapping the brain 


He's BCI system works thanks to the geography of the motor cortex—the area of the cerebrum that governs movement. When we move, or think about a movement, neurons in the motor cortex produce tiny electric currents. Thinking about a different movement activates a new assortment of neurons.

Sorting out these assortments laid the groundwork for the BCI, says He.

"We were the first to use both functional MRI and EEG imaging to map where in the brain neurons are activated when you imagine movements," he says. "So now we know where the signals will come from."

The brain map showed that imagining making fists—with one hand or the other or both—produced the most easily distinguished signals.

"This knowledge about what kinds of signals are generated by what kind of motion imagination helps us optimize the design of the system to control flying objects in real time," He explains.

Image Source: Journal of Neural Engineering / He

Monitoring electrical activity from the brain, the 64 scalp electrodes of the EEG cap report the signals (or lack of signals) they detect to a computer, which translates the pattern into an electronic command. Volunteers first learned to use thoughts to control the 1D movement of a cursor on a screen, then 2D cursor movements and 3D control of a virtual helicopter.
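The decoding step above — electrode signals in, flight command out — can be caricatured as a threshold rule on motor-cortex activity from each hemisphere. This two-channel sketch is a simplifying assumption for illustration (the real system reads 64 electrodes and uses a trained decoder), and the specific command mapping is hypothetical:

```python
"""Sketch of the EEG-to-command translation described above, reduced
to a made-up two-channel threshold rule. The real BCI uses 64 scalp
electrodes and a calibrated decoder; this mapping is illustrative."""


def decode_command(left_power: float, right_power: float,
                   threshold: float = 1.0) -> str:
    """Map per-hemisphere motor-cortex band power to a flight command.

    Imagining a right-fist clench raises contralateral (left-hemisphere)
    activity and vice versa; imagining both fists raises both.
    """
    left_active = left_power > threshold
    right_active = right_power > threshold
    if left_active and right_active:
        return "ascend"        # both fists imagined
    if left_active:
        return "turn_right"    # right fist imagined
    if right_active:
        return "turn_left"     # left fist imagined
    return "hover"             # rest / no imagined movement


# Hypothetical band-power samples from the two hemispheres.
for sample in [(1.4, 0.3), (0.2, 1.6), (1.3, 1.5), (0.1, 0.2)]:
    print(decode_command(*sample))
```

This mirrors the training progression the article describes: once a reliable signal-to-command mapping exists, the same decoder output can drive a 1D cursor, a 3D virtual helicopter, or a real drone.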

Now it's the real deal, controlling an actual flying robot—formally, an AR [augmented reality] drone. He's computers interface with the WiFi controls that come with the robot; after translating EEG brain signals into a command, the computer sends the command to the robot by WiFi.

The journal article describes how five men and women learned to guide the flying robot. The first author is Karl LaFleur, who was a senior biomedical engineering student during the study.

"Working for Dr. He has been a phenomenal experience," says LaFleur, who plans to put his knowledge to use when he enters the U's Medical School next year. "He has so much experience with the scientific process, and he is excellent at helping his students learn this process while allowing them room for independent work. Being first author on a journal article is a huge opportunity that most undergraduates never get."

"I think the potential for BCI is very broad," says He. "Next, we want to apply the flying robot technology to help disabled patients interact with the world.

"It may even help patients with conditions like stroke or Alzheimer's disease. We're now studying some stroke patients to see if it'll help rewire brain circuits to bypass damaged areas."



SOURCE  University of Minnesota 

By 33rd Square