Older blog entries for jwp9447 (starting at number 74)

MIT Plans to Rebuild Artificial Intelligence from the Ground Up

After 50 years and countless dead ends, incremental progress, and modest breakthroughs, artificial intelligence researchers are asking for a do-over. The $5 million Mind Machine Project (MMP), a patchwork team of two dozen academics, students and researchers, intends to go back to the discipline's beginnings, rebuilding the field from the ground up. With 20/20 hindsight, a few generations' worth of experience, and better, faster technology, this time researchers in AI -- an ambiguous field to begin with -- plan to get things right.

The study of AI is half a century old, beginning with lofty expectations at a 1956 conference but quickly fragmenting into different specializations and sub-fields. The MMP wants to roll back the clock, fixing early assumptions that are now foundations of the field and redefining what the objectives of AI research should be.

The fundamental problem, it seems, is that the mind, memory and body function both together and separately to solve any number of problems, and the way they work together (and alone) varies from problem to problem. The human mind alone applies various systems and functions to any given problem. Many AI solutions have attempted to solve all the problems with one system or function rather than multiple systems working together as in the human mind, a "silver bullet" approach that hinders real progress.

Likewise, when it comes to memory, researchers have created models that work more like computers, where everything is either one or zero. Real memory is filled with gray areas, ambiguities and inconsistencies, but functions in spite of not always being congruent. MMP researchers also intend to bring computer science and physiology together, forcing computers to work within the confines of physical space and time just like the body does.

The team even proposes discarding the Turing Test, the long-recognized standard for determining artificial intelligence. Instead, MMP researchers want to test for a machine's comprehension of a children's book -- rather than a human's comprehension of another human being -- to gain a better understanding of the AI's ability to process and regurgitate thought.

It's a big-picture approach to a big challenge, and while it's perhaps unlikely that the team can re-imagine AI in the ambitious five-year window they've given themselves, it very well could shore up some of the loose underpinnings of a discipline that has boundless potential to shape a better world (or, for you SkyNet junkies, limitless potential to destroy it). If nothing else, it's a responsible admission from the scientific community that they simply don't have it quite right, that we need to rethink what we think we know.

Climatologists, take notes.

[MIT News]

Syndicated 2009-12-07 18:19:30 from Popular Science - robots

Air Force Admits Existence of Outed Skunkworks Stealth Drone

A U.S. Air Force official provides details on the mysterious aircraft spotted in Afghanistan

A mysterious, unidentified drone that bore strong resemblance to previous stealth aircraft has finally been revealed. The U.S. Air Force confirmed its existence and identity to Aviation Week last Friday, after photos circulating online had caused much speculation among defense buffs.

Informally dubbed the "Beast of Kandahar," the tailless flying wing is apparently an RQ-170 Sentinel designed to provide recon and surveillance for warfighters on the frontlines. Online photos had shown the drone at a General Atomics Aeronautical Systems hangar in Kandahar, Afghanistan.

The 30th Reconnaissance Squadron also operates the RQ-170 at Tonopah Test Range -- a location perhaps better known as Area 52 -- under the aegis of Creech Air Force Base in Nevada.

The unmanned aerial system (UAS) comes courtesy of Lockheed Martin's Skunk Works, which has worked on the U-2 and Blackbird spy planes as well as a swimming spy plane for the U.S. Navy. Lockheed had previously displayed a different stealth drone design that bore resemblance to the Reaper drones currently flying missions in Afghanistan.

The Ares technology blog points out that the stealth features suggest a drone meant for tactical operations in support of combat troops, rather than sneaky intelligence-gathering.

Meanwhile, the UK has launched its own call for a laser-toting stealth UAS. We'll let you know when those designs leak out.

[via Ares: Aviation Week]

Syndicated 2009-12-07 16:54:44 from Popular Science - robots

This Week in the Future, November 30-December 4, 2009

Leave a comment to win a TWITF t-shirt!

Pork grown in a lab, butterflies hatched in space, male fish going girly and a brand new rainbow trap? The future sure keeps us on our toes.

(Get the details, and win the t-shirt, after the jump).

We don't exaggerate, people. Check out this week's future news:

What is the future going to roll out next? Whatever it is, you know you'll see it first here on PopSci.com.

In the meantime, wanna win a t-shirt?

Leave a comment (any comment) to put your name in the pile; we'll randomly choose and announce our winner right here next Friday, December 11. And, if you just can't wait that long, you can buy the shirt for yourself here. Good Luck!

Congratulations to last week's winner, PopSci user "yash."

Until next time. Enjoy our past weekly illustrated roundups here.

Syndicated 2009-12-04 21:31:16 from Popular Science - robots

Robot Bartender Pours Your Drink Based on Your Tetris Skill

An engineer showcases interactive drink-mixing video games for the upcoming Roboexotica event

Robots, alcohol and video games make one tantalizing combination to put on distant-future Christmas lists. Now geek boozers are in luck: the one-man Nonpolynomial Labs has developed interactive versions of Mario and Tetris that incorporate a robotic bartender to mix up drinks during real-time play.

The interactive games come courtesy of Kyle Machulis, a self-described "mild-mannered engineer" who tackles some decidedly unorthodox garage projects that have included a "Moaning Lisa" sensor-feedback mannequin and a "LifeCycle" that uses an exercise bike to drive virtual vehicles in Second Life. He created "Adult Mario" and "Bartris" to showcase at the upcoming Roboexotica event in Vienna, Austria, where robots display their cocktail-mixing skills.

"Adult Mario" looks like a typical game of Mario Bros., except with a few interactive twists. Jumping on an enemy causes players to receive a small bit of rum in their cup. Grabbing coins adds a small squirt of Coke to the mix. Reaching the end-level flagpole triggers a shaking motion for as long as Mario slides down the pole, and adds an additional kick of rum during that time for good measure.

There are also less "adult" interactive elements, such as fans blowing into players' faces when they run Mario faster through the game.

"Bartris" plays like a normal game of Tetris with falling puzzle blocks, except that brown pieces represent Coke, gray pieces represent rum, and blue pieces represent water. The robotic bartender mixes accordingly.

"So you actually have the chance of making a drink that absolutely sucks," Machulis says in a video.

We also like the different "Bartris" modes, which include "Booze mode" (no water pieces dropped) and "Designated Driver mode" (only water pieces drop). And if you make a drink that's too stiff or too watery for your tastes, you only have yourself to blame -- not like playing with that straitlaced SOBEaR robot.
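
For the tinkerers, here is a minimal sketch of how a Bartris-style hookup between the falling blocks and a pump rig might look, as described above. The piece-to-ingredient table follows the article; the RobotBartender class, pour size, and callback name are illustrative assumptions, not Machulis's actual code.

```python
# Toy Bartris wiring: map locked Tetris pieces to pump commands.
# Hypothetical sketch only; pour sizes and the pump interface are assumptions.

PIECE_TO_INGREDIENT = {
    "brown": "coke",   # brown pieces pour Coke
    "gray": "rum",     # gray pieces pour rum
    "blue": "water",   # blue pieces pour water
}

POUR_ML_PER_PIECE = 10  # assumed pour per locked piece


class RobotBartender:
    """Stand-in for whatever serial/USB interface drives the real pumps."""

    def pour(self, ingredient: str, millilitres: int) -> None:
        print(f"pouring {millilitres} ml of {ingredient}")


def on_piece_locked(piece_color: str, bartender: RobotBartender) -> None:
    """Called by the Tetris engine whenever a falling piece locks into place."""
    ingredient = PIECE_TO_INGREDIENT.get(piece_color)
    if ingredient is not None:
        bartender.pour(ingredient, POUR_ML_PER_PIECE)


if __name__ == "__main__":
    bot = RobotBartender()
    for color in ("brown", "gray", "gray", "blue"):
        on_piece_locked(color, bot)
```

In this sketch, "Designated Driver mode" would simply restrict the piece generator to blue pieces, while "Booze mode" would drop the water entry from the table.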

[Nonpolynomial Labs Roboexotica via Kotaku]

Syndicated 2009-12-04 21:16:33 from Popular Science - robots

Robotic Sea-Glider Achieves First Unmanned Underwater Transatlantic Crossing

Cold robotic purpose can apparently break records as well as human fortitude

Charles Lindbergh may have shown human fortitude by flying across the Atlantic in his "Spirit of St. Louis," but now he has robotic company when it comes to transatlantic records. An underwater robotic glider built by Rutgers University students and scientists has achieved the first robotic underwater crossing of the Atlantic, after traveling beneath the waves for 221 days.

Rutgers researchers joined some Spanish colleagues today aboard the "Investigador" ship to recover the drone, after launching it on April 27, 2009 off the coast of New Jersey. The submersible bot made its 4,591-mile journey at the slow but steady pace of 4 centimeters per second.

[Image: Scarlet Knight: There and back again (Rutgers University)]

Named "The Scarlet Knight" for Rutgers sports -- despite its fine yellow appearance -- RU27 technically already claimed its transatlantic record on Nov. 14 after 201 days at sea. But the Rutgers team clinched the accomplishment after recovering the scarlet lady, and reportedly gave her a dose of champagne to celebrate.

Rutgers University alone operates a small fleet of up to seven underwater gliders off the coast of New Jersey, with one even cruising around the Antarctic. The U.S. Navy has likewise deployed a number of drone submersibles (not to mention sea mammals), and private companies may also soon send out swarms of underwater explorers for oil prospecting.

Looks like Scarlet won't be too lonely the next time she decides to take a dip.

[Image: Tracking Scarlet: A journey's end is reached (Rutgers/Google)]

Syndicated 2009-12-04 17:01:02 from Popular Science - robots

Optical Sensors in Robots' Skin Give Them A Softer Touch

Whether they are assisting the elderly, or simply popping human skulls like ripe fruit, robots aren't usually known for their light touch. And while this may be fine as long as they stay relegated to cleaning floors and assembling cars, as robots perform more tasks that put them in contact with human flesh, be it surgery or helping the blind, their touch sensitivity becomes increasingly important. Thankfully, researchers at the University of Ghent, Belgium, have solved the problem of delicate robot touch.

Unlike the mechanical sensors currently used to regulate robotic touching, the Belgian researchers used optical sensors to measure the feedback. Under the robot skin, they created a web of optical beams. Even the faintest break in those beams registers in the robot's computer brain, making the skin far more sensitive than mechanical sensors, which are prone to interfering with each other.
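
To make the beam-break idea concrete, here is a toy sketch of how such a web of emitter/receiver pairs might be polled for contact. The grid size, threshold, and read_beam_intensity function are assumptions for illustration, not the Ghent group's implementation.

```python
# Hypothetical sketch of beam-break touch sensing; not the Ghent team's code.
# Assume read_beam_intensity(row, col) returns a normalized 0.0-1.0 reading
# for each emitter/receiver pair woven under the skin.

import random

GRID_ROWS, GRID_COLS = 8, 8
TOUCH_THRESHOLD = 0.9  # any attenuation below this counts as contact


def read_beam_intensity(row: int, col: int) -> float:
    """Placeholder for the real optical readout; returns a fake intensity."""
    return random.uniform(0.85, 1.0)


def scan_skin() -> list:
    """Return every beam whose intensity has dropped, i.e. a likely touch."""
    contacts = []
    for r in range(GRID_ROWS):
        for c in range(GRID_COLS):
            intensity = read_beam_intensity(r, c)
            if intensity < TOUCH_THRESHOLD:
                contacts.append((r, c, intensity))
    return contacts


if __name__ == "__main__":
    for row, col, level in scan_skin():
        print(f"contact near beam ({row}, {col}), intensity {level:.2f}")
```

The appeal of the optical approach is visible even in this toy version: each beam is read passively, so densely packing the grid does not create the cross-talk that plagues arrays of mechanical sensors.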

Robots like the da Vinci surgery station already register feedback from touch, but a coating of this optical sensor-laden skin could vastly enhance the sensitivity of the machine. Additionally, a range of Japanese robots designed to help the elderly could gain a lighter touch with their sensitive charges if equipped with the skin.

Really, any interaction between human flesh and robot surfaces could benefit from the more lifelike touch provided by this sensor array. And to answer the question you're all thinking but won't say: yes. But please, get your mind out of the gutter. This is a family site.

[New Scientist]

Syndicated 2009-12-01 22:54:54 from Popular Science - robots

Sighted: Afghanistan's Mystery UAV

Since April, a steady string of reports has detailed sightings of a mysterious, unidentified UAV prowling the skies above Afghanistan. Grainy, Loch Ness Monster-like photos revealed a flying-wing-type aircraft with stealth features.

Now, the French blog Secret Defense has published the clearest photos yet of the secret plane, and the mystery has only deepened.

The plane pictured above is clearly a next-generation UAV, but the question of which next-generation UAV it is has led to some debate. At first look, Steve Trimble of The DEW Line thought it resembled Lockheed's Polecat. However, PopSci's resident UAV expert Eric Hagerman pegged the mysterious drone as Boeing's X-45. Then again, John Pike of GlobalSecurity.net noted that "for every UAV program we know about, there's one that we don't know about," suggesting the new UAV may be part of some previously unannounced program.

In many ways, the confusion only highlights the uniformity of the next generation of UAVs. Both the X-45 and the Polecat incorporate stealth features, resemble the flying-wing shape first perfected by the B-2, and have just enough development behind them that battlefield testing doesn't seem unreasonable.

Amazing. A mere seven years after the CIA carried out its first drone strike, the Predator's replacements have already arrived in theater.

[Secret Defense, via The Dew Line]

Syndicated 2009-12-01 18:58:07 from Popular Science - robots

At the International Robot Exhibition in Japan, Robots For Your Every Need

Want a nap? Hate walking? Need uncomplaining workers? Robot makers have something for you

The economic recession has hardly slowed the growing swarm of robots designed for almost every task imaginable.

Many of them showcased their skills at Japan's International Robot Exhibition 2009, along with a host of human handlers. Consumers in the market for a pair of robot skates need not hold their breath for much longer.

Launch the gallery for a selection of our favorite 'bots.

Syndicated 2009-11-30 20:30:07 from Popular Science - robots

This Week in the Future, November 23-26, 2009: Thanksgiving Special

Leave a comment to win a TWITF t-shirt!

There's a lot to be thankful for in the future. Gather 'round the table, all you Navy Sea Lions, Jazz Bots, Star Wars enthusiasts and ethical scientists. Today is a day for futuristic feasting.

(Get the details, and win the t-shirt, after the jump).

What a wonderful Thanksgiving week it's been here on PopSci.com, and what a fascinating future lies before us:

Wasn't that just delicious?

Love our graphic? Win this t-shirt!

Leave a comment (any comment) to put your name in the pile; we'll randomly choose and announce our winner right here next Friday, December 4. And, if you just can't wait that long, you can buy the shirt for yourself here. Good Luck!

Congratulations to last week's winner, PopSci user "Dane619."

Until next time. Enjoy our past weekly illustrated roundups here.

Syndicated 2009-11-26 17:08:52 from Popular Science - robots

A.I. Anchors Replace Human Reporters In Newsroom of the Future

In the great media reshuffling ushered in by the Internet Age, print journalists have suffered the most from online journalism’s ascent. Broadcast journalists, however, may be the next group to feel technology’s cruel sting. Engineers at Northwestern University have created virtual newscasts that use artificial intelligence to collect stories, produce graphics and even anchor broadcasts via avatars.

The project, dubbed “News At Seven,” goes beyond simply regurgitating news stories gleaned from the Web. The system can generate opinionated content like movie reviews or pull the most relevant facts from a box score to pen a hometown sports story. The AI is even learning to crack wise, injecting humor into reports. But don’t take our human-generated word for it, check out the NSF video below.
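
To give a flavor of how pulling "the most relevant facts from a box score" can read like a story, here is a toy recap generator. The data layout and sentence template are illustrative assumptions and have nothing to do with the actual News At Seven codebase.

```python
# Toy illustration of turning a box score into a sentence of prose.
# This is not the News At Seven system, just a sketch of the general idea.

def write_recap(box_score: dict) -> str:
    home, away = box_score["home"], box_score["away"]
    if home["score"] > away["score"]:
        winner, loser = home, away
    else:
        winner, loser = away, home
    # Pick the "most relevant fact": the winning side's top scorer.
    top = max(winner["players"], key=lambda p: p["points"])
    return (
        f"{winner['team']} beat {loser['team']} "
        f"{winner['score']}-{loser['score']}, "
        f"led by {top['name']} with {top['points']} points."
    )


if __name__ == "__main__":
    game = {
        "home": {"team": "Wildcats", "score": 88,
                 "players": [{"name": "Jordan Lee", "points": 31}]},
        "away": {"team": "Hawks", "score": 80,
                 "players": [{"name": "Sam Ortiz", "points": 24}]},
    }
    print(write_recap(game))
```

The real system layers story selection, graphics, and avatar narration on top of this kind of fact extraction, but the core move is the same: structured data in, readable sentences out.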

[National Science Foundation]

Syndicated 2009-11-25 18:40:37 from Popular Science - robots

