Older blog entries for jwp9447 (starting at number 54)

MIT Introduces a Friendly Robot Companion For Your Dashboard

Kitt? KITT? Is that you?

With all the sensors, computerized gadgetry and even Internet connectivity being built into cars these days, it's a wonder our automobiles aren't more like Optimus Prime. Our cars will now email us when they need an oil change and recognize our facial expressions to determine whether we're enjoying ourselves, but for all the information available to us when we're driving, it's often not possible to organize it all in real time and package it in a way we can digest behind the wheel. Researchers at MIT and Audi created the Affective Intelligent Driving Agent to address exactly that problem.

AIDA communicates with the driver via a small, sociable robot built into the dashboard. The idea is to develop an informed and friendly passenger, the buddy perpetually riding shotgun who, aside from reading the map and helping with navigation, acts as a companion. As such, AIDA is being developed to read drivers' moods via their facial expressions and other cues (hand gestures?) and respond to them in the proper social context. It communicates back in very human ways as well: with a smile, the blink of an eye, the drooping of its head.

Prompting memories of KITT from the (most excellent) television series Knight Rider, the idea is for AIDA to have personality and establish a relationship with the driver in which both parties learn from one another and help each other out. AIDA analyzes the driver's mobility patterns, common routes and destinations, and driving habits. It then merges its knowledge of the driver with its knowledge of the city around it, mashing up the driver's priorities and needs with real-time information on everything from tourist attractions to environmental conditions to commercial activity to help the driver make better decisions.

If, for instance, there's a parade route between you and the grocery store, AIDA will tell you about it and help you find your way around it. Or it might simply remind you that your gas tank is low, knowing that given the time of day you must be on your way to work several miles away. AIDA will even give you feedback on your driving, helping you increase your fuel efficiency or suggesting that your policy of making rolling illegal lefts through stop signs while in school zones may be ill-advised.

Unlike the sci-fi cars of our childhood fancy -- KITT, the Batmobile, Herbie -- AIDA is still a ways from pulling off daring rescue maneuvers or other heroic acts of derring-do. But it can make the road a safer, more informed place, and if the MIT robotics researchers have their way, one that's not quite so lonely.

[MIT]

Syndicated 2009-10-29 19:11:54 from Popular Science - robots

MIT researchers and designers are developing the Affective Intelligent Driving Agent (AIDA) - a new in-car personal robot that aims to change the way we interact with our car. The project is a collaboration between the Personal Robots Group at the MIT Media Lab, MIT’s SENSEable City Lab and the Volkswagen Group of America’s Electronics Research Lab.

“With the ubiquity of sensors and mobile computers, information about our surroundings is ever abundant. AIDA embodies a new effort to make sense of these great amounts of data, harnessing our personal electronic devices as tools for behavioral support,” comments professor Carlo Ratti, director of the SENSEable City Lab. “In developing AIDA we asked ourselves how we could design a system that would offer the same kind of guidance as an informed and friendly companion.”

AIDA communicates with the driver through a small robot embedded in the dashboard. "AIDA builds on our long experience in building sociable robots,” explains professor Cynthia Breazeal, director of the Personal Robots Group at the MIT Media Lab. “We are developing AIDA to read the driver's mood from facial expression and other cues and respond in a socially appropriate and informative way."

AIDA communicates in a very immediate way: with the seamlessness of a smile or the blink of an eye. Over time, the project envisions that a kind of symbiotic relationship develops between the driver and AIDA, whereby both parties learn from each other and establish an affective bond.

To identify the set of goals the driver would like to achieve, AIDA analyses the driver’s mobility patterns, keeping track of common routes and destinations. AIDA draws on an understanding of the city beyond what can be seen through the windshield, incorporating real-time event information and knowledge of environmental conditions, as well as commercial activity, tourist attractions, and residential areas.

“When it merges knowledge about the city with an understanding of the driver’s priorities and needs, AIDA can make important inferences,” explains Assaf Biderman, associate director of the SENSEable City Lab. “Within a week AIDA will have figured out your home and work location. Soon afterwards the system will be able to direct you to your preferred grocery store, suggesting a route that avoids a street fair-induced traffic jam. On the way AIDA might recommend a stop to fill up your tank, upon noticing that you are getting low on gas," says Biderman. “AIDA can also give you feedback on your driving, helping you achieve more energy efficiency and safer behavior.”

AIDA was developed in partnership with Audi, a premium brand of the Volkswagen Group, and the Volkswagen Group of America's Electronics Research Lab. The AIDA team is directed by Professor Cynthia Breazeal, Carlo Ratti, and Assaf Biderman. The SENSEable City Lab team is led by Giusy di Lorenzo and includes Francisco Pereira, Fabio Pinelli, Pedro Correia, E Roon Kang, Jennifer Dunnam, and Shaocong Zhou. The Personal Robots Group's technical and aesthetic team includes Mikey Siegel, Fardad Faridi and Ryan Wistort, as well as videographers Paula Aguilera and Jonathan Williams. Chuhee Lee and Charles Lee represent the Volkswagen Group of America's Electronics Research Lab.

Robotic Pathologist Performs Precise, Clean Autopsies on Humans

Autopsies, for all the useful information they provide, have significant downsides. They are often upsetting to the deceased's family, they prevent people from receiving certain kinds of religious burials, and they leave a bit of a mess. To correct for those problems and more, a team at the University of Bern, Switzerland, has developed a robot that can perform virtual autopsies.

The robot uses stereo cameras to record a 3-D image of the body's exterior, and a CT scanner to record the body's internal condition. This results in a complete, 3-D, computerized model of the entire body. Doctors can control the robot to perform micro-biopsies for tissue examination, doing away with any serious deformation of the body. The medical examiner can then analyze the image, perform virtual biopsies, and store the data for future use. The process leaves no pile of used organs, and no jars filled with alcohol and tissue.

In addition to making the whole process much easier, the robot-conducted virtual autopsy also makes it easier for medical examiners to compare the current corpse with previous cases, and build a database for future reference.

Additionally, the robot used for the autopsy is much cheaper than the robots usually used for surgery. Since there's no chance of hurting someone who's already dead, the robot that does the job can be a less precise, industrial model, the type designed to assemble a car, not remove an appendix. For similar reasons, surgical robots need a doctor monitoring them at all times, but the robo-medical examiner can operate autonomously.

Welcome to the future: if you're not killed by a robot, then at the very least, a robot will figure out how you died.

[via New Scientist]

Syndicated 2009-10-27 16:59:11 from Popular Science - robotics

A Look At Japan's Retro-Future

As much as we love the actual future here at Popular Science, we love the past's vision of the future almost as much. So we basically freaked out when our good friends over at Pink Tentacle discovered this spread from a 1969 issue of the Japanese magazine Shonen Sunday.

These pictures show a predicted 1989 where computers have changed how we live. The above photo depicts a classroom full of children learning on computers, watching a video of a teacher, and receiving beatings from enforcement robots. Considering that most students today actually use their computers for sexting, downloading music, and online poker, maybe those robo-thugs weren't such a bad idea.

This pic shows the home of the future. The personal computer and Roomba-like vacuum robot are pretty spot on, but why is Mom's computer using a punch card? And why does the robot need arms to clean the dishes? If they have video phones and flying cars, you'd think they'd at least have a dishwasher.

This third image is probably the most accurate. Sure, it was off by about 20 years, but that machine is the spitting image of the da Vinci medical robot. Listed as one of our top surgical advances of the last 20 years, remote surgery via robots was a bold prediction in 1969, but one that our present technology has vindicated.

This article leaves me with one question, though: Why is everyone in the future always wearing a jumpsuit?

[via Pink Tentacle]

Syndicated 2009-10-23 20:01:54 from Popular Science - robotics

University of Maryland's $500 Maple-Seed UAV Takes To the Skies

Last year, after untold millions of dollars, DARPA failed to renew a Lockheed program to design a UAV based on a maple tree seed. While that program, backed by tons of cash and one of the world's largest aerospace companies, amounted to bupkis, a University of Maryland project to create a maple seed UAV has finally accomplished what DARPA and Lockheed couldn't.

Over the course of about a year, the U of M students constructed a maple-seed-mimicking UAV, camera and all, from $500 worth of parts. The UAV can take off and land safely by itself, but the camera still needs a little work. It uses a battery to power a little propeller and a camera, and is piloted with a radio controller.

I think it's safe to say that the Lockheed version, a video of which can be seen here, cost a great deal more than $500. To see the University of Maryland UAV in action, along with a history of the project from conception, through testing, to completion, check out the video below. But ignore the music; it's a little over the top for a science project (what, no Carmina Burana?)


[via Bot Junkie]

Syndicated 2009-10-22 17:00:56 from Popular Science - robotics

22 Oct 2009 (updated 22 Oct 2009 at 13:58 UTC) »

Sticky Bot and Biobots

Hey guys. I heard about StickyBot today and thought I had to share it with robots.net. StickyBot's feet are covered with a thin polymer, which is the key material that lets it stick to ceilings, windows, and just about anything! If you stuck tape to StickyBot instead, the tape would quickly weaken and fall off. The polymer, however, is much stronger and lasts about 1000 times longer than tape! Let's see some videos! Oh, and if you want to learn more about biobots, check this URL: http://www.discovermagazine.com/web/biobots

Thank you for visiting my blog post!

Robot Skier Kills the Bunny Hills, Not Ready For Black Diamond

While it lacks the subtle charm of Alberto Tomba, this robot is just as much at ease flying down a slalom course. Designed by Bojan Nemec of the Jozef Stefan Institute in Slovenia, the robot utilizes two computers to stay upright and pointed downhill.


The upper part of the robot contains a USB camera, a GPS system and the computer that processes the information from those sources to keep the robot heading in the right direction. The lower portion of the robot contains a computer that controls the legs, and the gyroscope that keeps the robot balanced.
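As a rough illustration of that division of labor, here's a hypothetical Python sketch; every function name, gain, and sensor value below is invented for illustration, since the article doesn't describe Nemec's actual control code:

# Hypothetical sketch of the skier's two-computer control split.
# Upper computer: fuses camera and GPS readings into a steering target.
# Lower computer: drives the legs, using the gyroscope to stay balanced.

def upper_computer(camera_error_deg, gps_error_deg):
    # Average the two estimates of how far off-course the robot is.
    return 0.5 * (camera_error_deg + gps_error_deg)

def lower_computer(steering_target_deg, gyro_tilt_deg):
    # Proportional corrections: steer toward the target, lean against tilt.
    K_STEER, K_BALANCE = 0.8, 1.2
    return K_STEER * steering_target_deg - K_BALANCE * gyro_tilt_deg

# One pass through the loop with invented sensor readings.
target = upper_computer(camera_error_deg=3.0, gps_error_deg=1.0)
leg_command = lower_computer(target, gyro_tilt_deg=0.5)
print(f"leg command: {leg_command:+.2f} degrees")   # prints: leg command: +1.00 degrees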

Nemec created the robot to test ski equipment, and to help model virtual reality skiing simulations. So far, Picabo Street can rest easy, as the robot can't even ski in a straight line. In fact, Nemec doesn't think the robot will beat a human in the next five years.

To further assuage your fears of a metal athlete taking home the gold at the Winter Olympics, here's a blooper reel that proves something falling down on a ski slope is always funny, whether it be man or machine.

[via IEEE Spectrum]

Syndicated 2009-10-20 15:59:47 from Popular Science - robotics

Tiny Fire Spy Recon Bot Lets Firefighters See Inside The Blaze

If knowing is half the battle, then firefighters waging war on a blaze start at a serious disadvantage. A lack of information concerning what’s going on inside a fire means firefighting personnel often must speculate which way the fire is moving, where the hottest spots are, and most importantly, where people might be trapped by the flames. The Fire Spy Robot hopes to tip the scales back in firefighters’ favor by providing valuable intel from inside infernos even while helping to extinguish them.


Developed by South Korean firm Hoya, the remote-controlled Fire Spy can go places firefighters can't safely reach, beaming back images and sounds while relaying temperature, smoke and air quality data to firefighters over 50 yards away. It’s not a simple snapshot either: Fire Spy can roam a burning structure for up to half an hour, pushing through temperatures of up to 320 degrees Fahrenheit at a speed of one foot per second. Measuring just 12.5 cm in diameter and weighing just under four-and-a-half pounds, Fire Spy can search for survivors in tight spaces while surviving falls of up to six feet.

Using its onboard light and camera, Fire Spy’s main function is to precisely locate people trapped in burning structures so live firefighters spend less time searching for them. The data also provides firefighters critical information about safe routes in and out of a building so they can develop the best possible rescue plan. But like all good reconnaissance drones, Fire Spy can also join the fight, towing a hose and spraying water as it explores a blaze.

Of course, some fires are too big for the tiny Spy, so another South Korean firm, DRB Fatec, has developed its own larger firefighting robot. At nearly 3 feet tall, it can withstand temperatures up to 920 degrees Fahrenheit for an hour or more. Between the two of them, firefighters may soon have a one-two punch that delivers them the upper hand when the heat is on.


[PhysOrg]

Syndicated 2009-10-08 17:58:12 from Popular Science - robotics

Physics: The First Step of Robotics-1(Other Kinds of Energy)

Everyone knows that physics is the foundation of robotics. So today let's learn about some other kinds of energy (and forces) that we see in everyday life.

Friction:
Friction is a force that resists an object's motion. But why does it happen? Every object is made of atoms, the smallest particles that make up matter. Atoms combine into molecules, the smallest combinations of atoms that still have the object's properties. When the molecules of two surfaces touch, they attract each other, and that attraction is what creates friction!
F = μN is the formula: μ, the coefficient of friction, measures how rough or smooth the surfaces are, and N is the normal force pressing the surfaces together (for an object resting on level ground, this equals its weight).

(Examples: walking, running, everyday life)
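To make the formula concrete, here's a minimal Python sketch of the F = μN calculation (the coefficient and mass are made-up example values, not from the post):

# Friction force on a box resting on level ground (made-up example values).
mu = 0.4        # coefficient of friction (dimensionless)
mass = 10.0     # mass of the box in kg
g = 9.8         # gravitational acceleration in m/s^2
N = mass * g    # normal force in newtons; equals the weight on level ground
F = mu * N      # friction force in newtons
print(f"Friction force: {F:.1f} N")   # prints: Friction force: 39.2 N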

Elastic Energy: Elastic energy is the energy stored in an object when it is stretched or compressed. F = kx (Hooke's law) gives the restoring force: k, the spring constant, measures how stiff the object is, and x is how far the object has been stretched or compressed. The stored energy itself is E = ½kx².
(Examples: rubber bands, springs)
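And a similar sketch for the spring formulas, again with made-up example values:

# Restoring force and stored elastic energy of a compressed spring.
k = 200.0   # spring constant in N/m (made-up example value)
x = 0.05    # compression in meters (5 cm)
F = k * x            # Hooke's law restoring force: 10.0 N
E = 0.5 * k * x**2   # stored elastic energy: 0.25 J
print(f"Force: {F:.1f} N, stored energy: {E:.2f} J")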

Syndicated 2009-10-07 13:54:55 from Mickey's Blog

Physics: The First Step of Robotics-1(Mechanical Energy)

As you know, physics is the most important thing we have to know to do robotics. So first, let's talk about ‘mechanical energy’.

Kinetic Energy:
This is the energy an object has because it is moving.
Its formula is E = ½mv², where m stands for mass and v stands for velocity (speed). Energy is measured in joules (J), named after the physicist James Prescott Joule.

(Examples: a running person, trains, buses, cars, planes, rockets)

Potential Energy:
This is the energy an object has because of its height. If I am high atop Mt. Everest, I have a lot of potential energy. Its formula is E = mgh, where m stands for mass, g ≈ 9.8 m/s² is the acceleration due to gravity, and h stands for height. It, too, is measured in joules (J).

(Examples: on top of a building, a tower, or a mountain; inside a flying plane)

Mechanical Energy:
Mechanical energy is the sum of kinetic and potential energy:
E = ½mv² + mgh. Mechanical energy is very useful in our lives. Let's solve this question.
Q: A ball is dropped from a height of 100 meters. Calculate the velocity (speed) of the ball when it hits the ground.
A: We aren't told the mass of the ball, but it turns out not to matter: the mass cancels out of the equation, so for simplicity let's take m = 1 kg.
At the moment of release, 100 m up, the kinetic energy is 0 J because the ball isn't moving yet, and the potential energy is 9.8 × 1 × 100 = 980 J. So the mechanical energy at this point is 980 J + 0 J = 980 J.

As the ball falls, its potential energy shrinks because the height decreases, while its kinetic energy grows because the ball picks up speed. At every point along the fall, the mechanical energy is still 980 J!

When the ball reaches the ground, we already know the mechanical energy is 980 J. Since the height is now 0 meters, the potential energy is 0 J. That means ½v² + 9.8h = 980 with the potential term equal to zero, so
½v² = 980.
Velocity ≈ 44 m/s
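Here's the same calculation as a short Python sketch; because the mass cancels, the impact speed is v = √(2gh) no matter how heavy the ball is:

import math

# Impact speed of a ball dropped from rest, by conservation of
# mechanical energy: m*g*h = (1/2)*m*v^2  =>  v = sqrt(2*g*h).
g = 9.8     # gravitational acceleration in m/s^2
h = 100.0   # drop height in meters
v = math.sqrt(2 * g * h)
print(f"Impact speed: {v:.1f} m/s")   # prints: Impact speed: 44.3 m/s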

Thank you!

Syndicated 2009-10-05 12:03:06 from Mickey's Blog

45 older entries...
