The first video in this playlist is a presentation given last year at the announcement of the event. The rest were taken at the event itself, and show the nature of the competition as well as something of the level of sophistication of the competitors.
Professor Mary-Anne Williams came to robotics through RoboCup. Her background began in computer science, from which she moved to AI, with a primary interest in knowledge representation and reasoning, having done work in belief revision (how to update a knowledge base when you receive new or contradictory information). In 2001 she attended the International Joint Conference on Artificial Intelligence (IJCAI), with which RoboCup was co-located, and was captivated by the Sony AIBO (then used as players in RoboCup soccer). She was also impressed by how much progress had been made since the first RoboCup, a few years earlier. Returning to Australia, she wanted to see her belief revision algorithms running on a robot, thinking they might improve performance. However, the only way to get an AIBO in 2001 was to start a soccer team, so that's what she did. That team placed third the following year and won the competition in 2004. Professor Williams is a Research Professor at the University of Technology, Sydney (UTS), Australia. She is also Director of the Innovation and Enterprise Research Laboratory (a.k.a. The Magic Lab), which has come under the umbrella of Quantum Computation & Intelligent Systems at UTS. Her work focuses on cognitive models of decision making and behaviour in complex and dynamic environments, including applications in mobile robotics. In this interview, she talks about her work, her involvement with the International Conference on Social Robotics, and the PR2 robot.
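To make the belief revision idea concrete, here is a minimal sketch of updating a toy knowledge base when new information contradicts it. The literal representation and the `revise` function are illustrative assumptions, not Professor Williams' actual algorithms.

```python
# A toy propositional knowledge base of string literals ("p" and its
# negation "~p"). This illustrates the update-on-contradiction idea only;
# it is not Professor Williams' belief revision work.

def negate(literal: str) -> str:
    """Return the negation of a propositional literal."""
    return literal[1:] if literal.startswith("~") else "~" + literal

def revise(beliefs: set, new_info: str) -> set:
    """Revise a belief set with new information: first retract any belief
    that directly contradicts it, then add it (a crude analogue of
    AGM-style contraction followed by expansion)."""
    revised = {b for b in beliefs if b != negate(new_info)}
    revised.add(new_info)
    return revised

beliefs = {"ball_visible", "goal_ahead"}
beliefs = revise(beliefs, "~ball_visible")  # a sensor update contradicts a belief
print(beliefs)  # e.g. {'goal_ahead', '~ball_visible'}
```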
SoftWear Automation is the latest in a string of ventures founded by Steve Dickerson - a retired Professor of Mechanical Engineering and Professor Emeritus at Georgia Institute of Technology - and nurtured by Georgia Tech's startup accelerator, the Advanced Technology Development Center (ATDC). DARPA has awarded the company a $1.25 million contract to develop automated sewing work-cells, which the company hopes will reinvigorate the domestic garment industry and DARPA hopes will shorten the time from requisition to delivery, lower costs, and reduce reliance on foreign suppliers. (The Department of Defense currently spends about $4 billion per year on uniforms.) Part of that $1.25 million will go to Georgia Tech, which is expected to provide considerable support for the development of the technology. The idea for SoftWear Automation dates to 2007, when Professor Dickerson was asked to participate in a seminar on the future of robotics. Danger Room provides perspective, and Gizmag has both more detail and a small gallery of conceptual hardware prototypes.
Tovbot's Shimi made its first public appearance two days ago at Google I/O, where not just one but three Shimis performed in perfect coordination. Tovbot was formed earlier this year by a group of robotics researchers and entrepreneurs hailing from Georgia Tech, IDC in Israel, and the MIT Media Lab. [Their] goal is to foster a new paradigm of personal robots - robots that don't just clean your floors or your pool, but also interact with you on a personal, almost human level. According to a news item on Georgia Tech's website, Shimi, a musical companion developed by Georgia Tech's Center for Music Technology, recommends songs, dances to the beat, and keeps the music pumping based on listener feedback. Automaton has more detail.
This year's Field Robot Event (FRE 2012) began today and runs through Saturday. I will bring together what reports I am able to find once the dust settles, but meanwhile you can view videos from past years' events by entering "Field Robot Event" in YouTube's search field.
Researchers at the University of Southern California's Viterbi School of Engineering have succeeded in making an artificial fingertip outperform humans in identifying a range of textures. That fingertip, the BioTac® from SynTouch LLC, is a molded elastomeric sleeve with a fingerprint-like pattern on the outside and sensors on the inside, filled with a conductive fluid. What the USC researchers have done is develop algorithms for interpreting the data the fingertip produces, and for optimizing the movement of the robotic arm or hand on which it is mounted so that it produces the most useful data. Their findings have been published in Frontiers in Neurorobotics. SynTouch LLC, founded in 2008, is a start-up technology business that develops and manufactures tactile sensors for mechatronic systems. BioTac® sensors are available as an evaluation kit, and also as kits for the BarrettHand and the Shadow hand.
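To give a rough sense of what "interpreting the data" might involve, here is a minimal, hypothetical sketch of classifying textures from a tactile vibration trace. The sampling rate, frequency bands, and nearest-centroid classifier are all illustrative assumptions, not USC's or SynTouch's published method.

```python
# A minimal sketch of texture classification from tactile vibration data.
# Band energies of the vibration signal and a nearest-centroid classifier
# are assumptions for illustration, not the actual BioTac pipeline.
import numpy as np

def band_energies(signal, fs=2200, bands=((10, 100), (100, 500), (500, 1000))):
    """Summarize a vibration trace by its energy in a few frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in bands])

class NearestCentroidTexture:
    """Label textures by comparing band-energy features to per-class means."""
    def fit(self, traces, labels):
        feats = np.array([band_energies(t) for t in traces])
        self.classes_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[np.array(labels) == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, trace):
        dists = np.linalg.norm(self.centroids_ - band_energies(trace), axis=1)
        return self.classes_[int(np.argmin(dists))]
```

The real work reported by USC also includes choosing the exploratory movements themselves, so that each stroke of the fingertip yields the most informative signal.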
Japanese company DOUBLE Research and Development has developed a three-fingered robotic hand that uses a single pressure sensor and a single actuator. The fingers are attached to their mount through a linkage that automatically equalizes the pressure applied by each.
Researchers at the Exertion Games Lab at RMIT University in Melbourne, Australia have created a robot to support you while exercising. Joggobot is a modified version of the popular AR.Drone quadrocopter platform developed by French company Parrot. The robot tracks a marker pattern printed on your t-shirt and flies ahead of you when you go out for a run. The researchers describe Joggobot as an "exertion game". They believe that jogging is play - we are not jogging to get from A to B, but for the experience of jogging - and point out that jogging with a physical device that reacts to its environment and, like a human jogging companion, has a limited amount of energy for exercise creates a very different interaction experience from pure audio-visual stimuli such as aerobics videos. They hope that the robot can improve the jogging experience and enhance our understanding of why we jog (and hence why we do not jog enough).
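As a rough illustration of what marker-following flight control could look like, here is a minimal, hypothetical sketch. It assumes the drone faces the jogger while flying ahead and that a vision module reports the marker's horizontal offset and apparent size each frame; the gains, setpoints, and command interface are invented for illustration and are not the Exertion Games Lab's implementation.

```python
# A minimal, hypothetical follow-ahead controller for a Joggobot-style
# drone. All gains and setpoints are illustrative assumptions.

TARGET_SIZE = 0.15   # desired apparent marker width (fraction of image width)
KP_YAW = 1.2         # proportional gain for keeping the jogger centered
KP_DIST = 0.8        # proportional gain for holding distance to the jogger

def follow_command(marker_x: float, marker_size: float) -> tuple[float, float]:
    """Return (yaw_rate, retreat_speed) from one marker observation.

    marker_x:    horizontal marker offset in [-1, 1], 0 = image center.
    marker_size: apparent marker width as a fraction of image width;
                 larger means the jogger is closer.
    """
    yaw_rate = -KP_YAW * marker_x  # rotate to re-center the jogger in view
    # If the marker looks bigger than the target, the jogger is closing in:
    # fly ahead faster; if smaller, slow down and let them catch up.
    retreat_speed = KP_DIST * (marker_size - TARGET_SIZE)
    return yaw_rate, retreat_speed
```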
A Cold Spring Harbor Laboratory (CSHL) news release says neuroscientists at the Brain Architecture Project have reached an important milestone. They've released the first installment of the 500 terabytes of data making up a whole-brain wiring diagram of the mouse brain. The data takes the form of gigapixel whole-brain slice images. It's possible to browse through the brain to the desired 20-micron-thick slice, then view the image, zooming in to the level of individual neurons. Most importantly, the image data is being released as an open science initiative, freely available for anyone to view and use in their research. The technical approach used was developed by Partha P. Mitra.
"The pragmatic approach Mitra advocated and which is realized in this first data release, is to image whole mouse brains in a semi-automated, quality-controlled process using light microscopy and injected neural tracers (both viruses and classically used tracer substances). While the basic methodology has been available for some time, systematically applying it to a grid of locations spanning the entire brain, and digitizing and re-assembling the resulting collection of brains, is a new approach made feasible by the rapidly falling costs of computer storage."
There's a cool Robotics Trends article on robotics researchers studying how mosquitoes survive flying through rain when a single raindrop can have 50 times the mass of the mosquito. The idea is to make micro air vehicles that sturdy. The Swirling Brain tells us robot lifeguards are on the way. Nootrix did a post recently in which they speculate about using ROS with the new LEAP gesture sensor. IEEE Spectrum published an interesting piece about educational robotics in Africa. And Slate posted an essay by Dale Dougherty on how we could improve education in the US by replacing standardized testing with a program of teaching kids to do real things, like building robots and rockets. NASA, well known for building robots and rockets, let us know they're ready with their new autonomous robot competition, in which teams have built planetary-rover-style sample return robots. Know any other robot news, gossip, or amazing facts we should report? Send 'em our way, please. And don't forget to follow us on Twitter.
The above video was posted one day prior to a major, much-publicized experiment tracing water movement in California's Sacramento-San Joaquin river delta, which is prone to reversals in the direction of flow. A more polished video, produced on the occasion of the launch of 100 floating sensors into that river system, appears after the break. The Floating Sensor Network is a project of the University of California at Berkeley, involving the Lagrangian Sensor Systems Laboratory (LSSL), Lawrence Berkeley National Laboratory (LBNL), and the California Department of Water Resources.
In early March, Boston Dynamics posted a video (embedded after the break) showing the Cheetah robot they are developing for DARPA running at 18 miles per hour - a new record for a legged robot - without any stabilization straps attached. More recently, the MIT Biomimetic Robotics Lab has posted videos of their version of the Cheetah, first walking (embedded after the break), then trotting with some stabilization (embedded both above and after the break). The MIT version appears to be more complex than the Boston Dynamics version, particularly in the way the legs are jointed, but also in the way the rear legs connect to the rest of the body. From the video alone, however, it's impossible to tell whether what appear to be vertebrae in the MIT version actually function as such.
Also presented recently at ICRA, Takahiro Kizaki and Akio Namiki from the Graduate School of Engineering at Chiba University in Japan demonstrated a system combining a fast (500 fps) vision system with a fast robotic arm and three-fingered hand, capable of juggling two balls by tracking them in the air and adjusting accordingly. Automaton has more detail.
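To make concrete one piece such a system needs, here's a minimal, hypothetical sketch of predicting a ball's catch point by fitting a ballistic trajectory to high-rate tracking data. The interface and numbers are illustrative assumptions, not the Chiba University implementation.

```python
# A minimal sketch of the prediction step in high-speed juggling:
# fit z(t) = z0 + v0*t - 0.5*G*t^2 to recent ball observations, then
# extrapolate when the ball descends to catch height. Illustrative only.
import numpy as np

G = 9.81  # gravitational acceleration (m/s^2)

def predict_catch_time(times, heights, catch_height):
    """Least-squares fit of the ballistic model to tracked ball heights,
    returning the later time at which the ball reaches catch_height
    (or None if it never does)."""
    t = np.asarray(times, dtype=float)
    z = np.asarray(heights, dtype=float)
    # Move the known gravity term to the left side, fit z0 and v0 linearly.
    A = np.stack([np.ones_like(t), t], axis=1)
    z0, v0 = np.linalg.lstsq(A, z + 0.5 * G * t**2, rcond=None)[0]
    # Solve z0 + v0*t - 0.5*G*t^2 = catch_height; keep the later root.
    a, b, c = -0.5 * G, v0, z0 - catch_height
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    return (-b - np.sqrt(disc)) / (2 * a)

# Example: observations 2 ms apart (500 fps) just after a toss at 3 m/s.
ts = np.arange(0, 0.02, 0.002)
zs = 1.0 + 3.0 * ts - 0.5 * G * ts**2
print(predict_catch_time(ts, zs, catch_height=1.0))  # ~0.61 s
```

At 500 fps the tracker refreshes this fit every 2 ms, which is what lets the arm keep adjusting its catch position while the ball is still in flight.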