Older blog entries for jwp9447 (starting at number 63)

Robotic Arm Opens Doors For the Wheelchair Bound

For people confined to wheelchairs, the proliferation of ramps has greatly enhanced their mobility. Unfortunately, opening doors remains an omnipresent, and frustrating, challenge. Oddly enough, opening doors also presents a serious impediment for anthropomorphic robots. Now, robotics engineer Erin Rapacki has solved both problems with a single stroke.

Continuing a student project she began at the University of Massachusetts Lowell, Rapacki has created a cheap robot arm that can serve as a door-opening assistant for wheelchair-bound humans, or as the primary arm for mobile robots. The trick was finding the right material for the fingers: something hard enough to grasp the handle, but supple enough to fit a range of shapes.

Rapacki designed the arm to use only one motor, with a slip clutch that allows it to twist and push (or pull) at the same time. Altogether, the arm cost only $2,000 to build.

Now if only she could do something about the height of elevator buttons...

[New Scientist]

Syndicated 2009-11-24 21:04:28 from Popular Science - robots

24 Nov 2009 (updated 26 Nov 2009 at 02:02 UTC) »

NASA Robotic Rocket Plane To Survey Martian Surface

Since budget cuts and the inability to overcome problems like boredom and high radiation doses have ruled out any manned mission to Mars in the foreseeable future, NASA has shifted gears back towards a program of robotic exploration. To that end, NASA now wants a rocket-powered UAV to fly around the Red Planet, photographing the surface.

The plane, repetitively named ARES (not to be confused with NASA's shuttle replacement, also named Ares), would travel to Mars aboard a conventional rocket. Once it reaches the fourth rock from the Sun, it would pop out of the capsule, deploy its wings, and fire its rockets for an hour-long flight through the Martian sky. During that flight, ARES would cover about 373 miles, a little less than 100 times the distance the Spirit rover has covered over the last five years.

Any aircraft flying on Mars would need some serious horsepower. The Martian atmosphere is 169 times thinner than the air here on Earth, so generating lift over ARES's wings may prove tricky. NASA has already devoted five years to initial design, but still has a long, long way to go before this thing takes flight. Of course, when the end product is a Martian rocket plane, the wait is worth it.

[The Register]

Syndicated 2009-11-24 18:29:22 from Popular Science - robots

18 Nov 2009 (updated 21 Nov 2009 at 12:06 UTC) »

iPhone Touchscreen Interface Puts Robot Control At Your Fingertips

Adding a new wrinkle to the 'droid-versus-iPhone debate, researchers at Keio University in Tokyo have created iPhone software specifically designed to control androids. More specifically, they've created an interface that puts control of a humanoid robot right at your fingertips.

"Walky" takes advantage of an iPhone or iPod Touch's touchscreen to create an intuitive interface that requires virtually no learning. Your fingers simulate the robot's legs: a walking motion with the index and middle fingers makes the robot walk, tapping the screen makes it jump up and down, and flicking with one finger elicits a kick.
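
The gesture mapping described above can be pictured as a small dispatch table. The sketch below is purely illustrative: the Keio team's software is not public, so every function and command name here is invented.

```python
# Hypothetical sketch of a Walky-style gesture dispatcher. Touch events
# are assumed to arrive pre-labeled; a real system would classify them
# from raw positions and timestamps.

def classify_gesture(touches):
    """Map a list of touch-event dicts to one of the described gestures."""
    if len(touches) == 2 and all(t["moving"] for t in touches):
        return "walk"   # index and middle finger "walking" motion
    if len(touches) == 1 and touches[0]["type"] == "tap":
        return "jump"   # single tap on the screen
    if len(touches) == 1 and touches[0]["type"] == "flick":
        return "kick"   # one-finger flick
    return None

# Invented command identifiers standing in for whatever the robot accepts.
GESTURE_TO_COMMAND = {
    "walk": "CMD_WALK_FORWARD",
    "jump": "CMD_JUMP",
    "kick": "CMD_KICK",
}

def command_for(touches):
    """Return the robot command for a set of touches, or None."""
    return GESTURE_TO_COMMAND.get(classify_gesture(touches))
```

The point of the dispatch-table design is the one the researchers make: the mapping from gesture to motion is so direct that no learning layer sits between the user's intuition and the robot's behavior.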

The idea is to make controlling the robot as intuitive as possible. Most controller inputs, like joysticks, paddles and buttons, bear no natural relation to the actions they trigger. That's why you keep getting fragged in Modern Warfare: until you become very familiar with the controls, your knee-jerk reactions aren't necessarily the ones programmed into the game. By making the robot respond to commands the user already knows, the team has created a sort of universal remote that anyone can pick up and start using.

On that note, the team at Keio thinks their software could also be employed in controlling digital characters -- think on-screen avatars in video games -- but for now it's best suited to bipedal robots. While it won't be integrated into gadgets this holiday season, it will debut officially in December at SIGGRAPH Asia in Yokohama.

[DesignBoom via Fast Company]

Syndicated 2009-11-18 14:33:47 from Popular Science - robots

Robotic Surrogate Takes Your Place at Work

Having one of those days when even a hearty bowl of Froot Loops and Jack Daniel's can't get you out of bed? A telepresence robot can come into the office for you, elevating telecommuting to a decidedly new level. The somewhat humanoid 'bots, produced by Mountain View, California-based Anybots, are steered via video-game-like controls from your laptop, allowing you to be "present" without actually being in the office.

The robots are equipped with a screen displaying your smiling mug to your co-workers, as well as a camera that beams a feed straight to your computer screen. They're also mobile, so you can still drop by co-workers' offices unannounced, presumably to let them know that a robot will be filling in for you today. There are a few variations that Anybots is toying with, including one with long arms and hands, and another with a laser pointer that, while useful for pointing things out in a physical environment, will prove slightly less cool than Johnny 5's shoulder-mounted laser cannon.

In all seriousness, Anybots plans to bring a telepresence robot to market in the second half of next year. While it may seem little more than an expensive videoconferencing link (the product is expected to retail for between $10,000 and $15,000), it could be a valuable tool in workplaces where a more physical presence is handy, such as for U.S. businesses trying to keep an eye on manufacturing floors overseas. After all, you never know when a surrogate might come in handy.

All right, all right. See QA operating from both sides of the link in the (real) videos below.

[Anybots via Technology Review]

Syndicated 2009-11-17 21:16:26 from Popular Science - robots

Doctors Equip Yorkshire Man With Cyborg Sphincter

Meet Ged Galvin, the Steve Austin of colorectal surgery. After a car crash in which Galvin almost died, surgeons at Royal London Hospital realized they could rebuild his crushed organs. Stronger. Faster. They had the technology to give him a cyborg colon.

"The operation changed my life and gave me back my pride and confidence," Galvin told the Daily Telegraph.

After he spent a short time with a colostomy bag, doctors removed muscle from Galvin's leg, fashioned it into an ersatz sphincter, and implanted electrodes into the new muscle ring. They then installed the device in Galvin.

Now, Galvin is free from the indignity of a colostomy bag. He controls his functions with a small remote control, about the size of a cell phone, that operates the electrodes. He can do his business at his leisure, although the muscles and electrodes will need replacement every five years. "Because of the remote control I can lead a normal life again," Galvin said.

Armed with his new sense of confidence, all Galvin has to do now to steal Steve Austin's crown is fight Sasquatch.

[Daily Telegraph, via Geekologie]

Syndicated 2009-11-17 16:47:08 from Popular Science - robots

Robo-Negotiator Talks Down Armed Lunatic

Hostage situations are often described as ticking time bombs waiting to go off. And just as bomb disposal units have robots to help with their job, police negotiators now have a bot of their own for defusing a different kind of explosive situation.

In Colorado, a heavily armed 61-year-old man had barricaded himself in his home. Afraid that the volatile suspect would shoot anyone sent in to negotiate his surrender, the police found themselves at an impasse. Then, they sent in the robot.

The police equipped a bomb disposal robot with a microphone, a camera, and speakers, and maneuvered it into the house. An operator navigated the droid to the suspect, and then began to talk him down. Eventually, the man surrendered to the robot, and was taken into police custody.

When RoboCop's Murphy said "dead or alive, you're coming with me," the "alive" option rarely panned out. This robocop, however, already seems to be getting better results.

[Glenwood Springs Post Independent, via The Register]

Syndicated 2009-11-12 16:33:14 from Popular Science - robots

11 Nov 2009 (updated 15 Nov 2009 at 11:28 UTC) »

Soccer-Ball-Sized Submersible Robots Will Track Ocean Currents and Disasters at Sea

The National Science Foundation has awarded almost $1 million to develop a swarm of underwater robotic explorers

Hundreds of soccer-ball-sized robot drones could soon ply the friendly waves to help scientists track ocean currents and harmful algae blooms, or even swarm to disaster sites such as oil spills and airplane crashes. That's no mere flight of fancy, now that the National Science Foundation has provided almost $1 million in funding to researchers at the Scripps Institution of Oceanography in San Diego.

The underwater swarm would coordinate with larger mothership drones as they move around and gauge the physics of ocean currents. Such information might allow researchers to scout out critical nursery habitats in protected marine areas, and might likewise lead salvage teams to recover the black boxes from airplane crash sites.


"You put 100 of these AUEs [Autonomous Underwater Explorers] in the ocean and let 'er rip," said Peter Franks, an oceanographer at Scripps. "We'll be able to look at how they spread apart and how they move to get a sense of the physics driving the flow."
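
One standard way oceanographers quantify how a swarm of floats "spreads apart" is relative dispersion, the mean squared separation over all pairs of drifters. The sketch below is illustrative only, not Scripps code, and assumes pre-computed (x, y) positions in meters.

```python
def relative_dispersion(positions):
    """Mean squared pair separation for a set of drifters at one instant.

    positions: list of (x, y) coordinates in meters.
    """
    n = len(positions)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            dx = positions[i][0] - positions[j][0]
            dy = positions[i][1] - positions[j][1]
            total += dx * dx + dy * dy   # squared distance between the pair
            pairs += 1
    return total / pairs
```

Tracking this quantity as it grows over time is what reveals "the physics driving the flow": roughly linear growth suggests diffusive spreading, while much faster growth points to stirring by turbulent currents.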

More data gathered over time could also feed into better models of ocean weather and climate.

Scripps researchers first plan to build five or six prototypes the size of soccer balls, along with 20 smaller versions. They would join a growing fleet of underwater robots ranging from U.S. Navy submarine drones to ring-wing robots designed for oil exploration.

[via PhysOrg]

Syndicated 2009-11-11 19:02:03 from Popular Science - robots

Neuron-Like Computer Chips Could Portably Digitize Human Brain

Simulating the brain with traditional chips would require impractical megawatts of power. One scientist has an alternative

According to Kwabena Boahen, a computer scientist at Stanford University, a robot with a processor as smart as the human brain would require at least 10 megawatts to operate. That's the power output of a small hydroelectric plant. But a small group of computer scientists may have hit on a new neural supercomputer that could someday emulate the human brain's low power requirement of just 20 watts -- barely enough to run a dim light bulb.

Discover Magazine has the story on how the Neurogrid computer could completely overhaul the traditional approach to computing. It trades the extreme precision of digital transistors for the brain's chaos of many neurons firing, with individual neurons misfiring 30 to 90 percent of the time. Yet the brain works with this messy system by relying on crowds of neurons to shout over the noise of misfires and competing signals.
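
That "crowds shouting over the noise" idea can be illustrated with a toy population code. This is not Neurogrid's actual architecture, just a sketch of the statistical principle: each simulated neuron is individually unreliable, but pooling many of them yields a dependable readout.

```python
# Toy population decoder: individually noisy "neurons" become reliable
# in aggregate. All parameters are illustrative, not Neurogrid's.
import random

def population_decode(signal_present, n_neurons=1000, p_hit=0.6, p_false=0.4,
                      rng=None):
    """Return True if the pooled firing rate suggests the signal is present.

    Each neuron fires with probability p_hit when the signal is there and
    p_false when it isn't -- a barely-better-than-chance detector on its own.
    """
    rng = rng or random.Random(0)  # seeded for reproducibility
    p = p_hit if signal_present else p_false
    spikes = sum(1 for _ in range(n_neurons) if rng.random() < p)
    return spikes > n_neurons // 2
```

With 1,000 neurons, the expected spike counts (600 versus 400) sit many standard deviations away from the decision threshold of 500, so the crowd's verdict is effectively always right even though each neuron is wrong almost half the time.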

That willingness to give up precision for chaos could lead to a new era of creative computing that simulates the unpredictable patterns of brain activity. It could also prove far more energy-efficient: the Neurogrid fits in a briefcase and runs on what amounts to a few D batteries, or less than a watt. Rather than transistors, it uses capacitors that operate at voltages comparable to those of real neurons.

Boahen has so far managed to squeeze a million neurons onto his new supercomputer, compared to just 45,000 silicon neurons on previous neural machines. A next-generation Neurogrid may host as many as 64 million silicon neurons by 2011, or approximately the brain of a mouse.

This new type of supercomputer will not replace the precise calculations of current machines. But its energy efficiency could provide the necessary breakthrough to continue upholding Moore's Law, which suggests that the number of transistors on a silicon chip can double about every two years. Perhaps equally exciting, the creative chaos from a chaotic supercomputer system could ultimately lay the foundation for the processing power necessary to raise artificial intelligence to human levels.

[Discover Magazine]

Syndicated 2009-11-06 20:43:33 from Popular Science - robots

Wearable Artificial Intelligence Could Help Astronauts Troll Mars for Signs of Life

Not since RoboCop has being a cyborg seemed so very cool. University of Chicago geoscientists are developing an artificial intelligence system that future Mars explorers could incorporate into their spacesuits to help them recognize signs of life on Mars' barren surface.

The suits would incorporate an AI system known as a Hopfield neural network, which uses processes closely mimicking human thought to weigh evidence and make decisions based on previously known facts and patterns. Using digital eyes incorporated into the astronauts' suits, the AI would collect data from the environment and analyze it in Hopfield networks located on the hips of the suits.

Preloaded data, as well as data collected as the astronauts go about their Martian surface walks, would be turned over in the AI system much the same way a human brain would crunch it. For instance, the Hopfield algorithm can learn a color from a single image, then relate it to previously observed instances of that color, making connections between the two. Recent tests of a complete, wearable prototype suit at the Mars Desert Research Station in Utah found that the AI could tell the difference between lichen and the rock surrounding it.
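
For readers curious about the general technique, here is a minimal textbook Hopfield network. The Chicago team's color-matching system is far more elaborate; treat this only as a sketch of the core idea, which is that patterns are stored in a weight matrix and a corrupted input settles back onto the nearest stored pattern.

```python
# Minimal classic Hopfield network with Hebbian learning, using
# patterns of +1/-1 units. Illustrative only.

def train(patterns):
    """Build the weight matrix: sum of outer products, zero diagonal."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j] / len(patterns)
    return W

def recall(W, state, steps=10):
    """Synchronously update units until the state stops changing."""
    for _ in range(steps):
        new = [1 if sum(w * s for w, s in zip(row, state)) >= 0 else -1
               for row in W]
        if new == state:
            break
        state = new
    return state

# Store one 8-unit pattern and recover it from a corrupted copy.
stored = [1, -1, 1, -1, 1, -1, 1, -1]
W = train([stored])
noisy = stored.copy()
noisy[0] = -1  # flip one unit
print(recall(W, noisy) == stored)  # True: the net settles on the stored pattern
```

The "learn a color once, then recognize it again" behavior described above corresponds to this settling dynamic: a new observation acts as the noisy input, and the network relaxes toward the stored memory it most resembles.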

But that's just scratching the surface; next, researchers plan to teach the Hopfield to differentiate between textures, and ultimately to engineer the system to work at scales ranging from wide landscapes to the minuscule. They have plenty of time to do so, as no one plans to send a manned mission to Mars any time soon. But the data the algorithm is already learning on Earth could ride with robotic missions to Mars in the more immediate future.

[PhysOrg]

Syndicated 2009-11-05 14:39:18 from Popular Science - robots

MIT Introduces a Friendly Robot Companion For Your Dashboard

Kitt? KITT? Is that you?

With all the sensors, computerized gadgetry and even Internet connectivity being built into cars these days, it's a wonder our automobiles aren't more like Optimus Prime. Our cars will now email us when they need to have their oil changed, and recognize our facial expressions to determine whether we're enjoying ourselves, but for all the information available to us when we're driving, it's often not possible to organize it all in real-time and package it in a way that we can digest while behind the wheel. Researchers at MIT and Audi created the Affective Intelligent Driving Agent to address exactly that problem.

AIDA communicates with the driver via a small, sociable robot built into the dashboard. The idea is to develop an informed and friendly passenger, the buddy perpetually riding shotgun who, aside from reading the map and helping with navigation, acts as a companion. As such, AIDA is being developed to read drivers' moods via their facial expressions and other cues (hand gestures?) and respond to them in the proper social context. It communicates back in very human ways as well: with a smile, the blink of an eye, the drooping of its head.

Prompting memories of KITT from the (most excellent) television series Knight Rider, the idea is for AIDA to have personality and establish a relationship with the driver in which both parties learn from one another and help each other out. AIDA analyzes the driver's mobility patterns, common routes and destinations, and driving habits. It then merges its knowledge of the driver with its knowledge of the city around it, mashing up the driver's priorities and needs with real-time information on everything from tourist attractions to environmental conditions to commercial activity to help the driver make better decisions.

If, for instance, there's a parade route between you and the grocery store, AIDA will tell you about it and help you find your way around it. Or it might simply remind you that your gas tank is low, knowing that given the time of day you must be on your way to work several miles away. AIDA will even give you feedback on your driving, helping you increase your fuel efficiency or suggesting that your policy of making rolling illegal lefts through stop signs while in school zones may be ill-advised.

Unlike the sci-fi cars of our childhood fancy -- KITT, the Batmobile, Herbie -- AIDA is still a ways from pulling off daring rescue maneuvers or other heroic acts of derring-do. But it can make the road a safer, more informed place, and if the MIT robotics researchers have their way, one that's not quite so lonely.

[MIT]

Syndicated 2009-10-29 19:11:54 from Popular Science - robots
