Google Project Tango mobile phone


After acquiring just about every robotics company worth buying, Google has now revealed one of its internal projects, which aims to create a mobile phone with all the necessary hardware and software to build beautiful 3D models of indoor environments. Project Tango is the first prototype of a highly customized smartphone carrying all the sensors needed for localization and mapping.

In the past, we have introduced the problem of SLAM (Simultaneous Localization and Mapping) and its significance in robotics.

Project Tango adds two computer vision processors and a depth sensor (it is not clear what this depth sensor is, but a smaller, lower-resolution Kinect-like sensor is a plausible candidate), along with a motion tracking camera to accompany a standard 4MP phone camera. With such a rich collection of sensors, and with the myriad of very advanced SLAM algorithms the robotics community has developed over the last 15 years, the reality of Project Tango should not be surprising.
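
Tango's actual software pipeline has not been published, but the two basic ingredients of any SLAM-style system are easy to sketch: integrate the device's ego-motion into a pose estimate, and project depth readings from that pose into a growing map. Here is a toy 2D sketch in Python; every name and number in it is an illustrative assumption, not Tango's API.

import math

def integrate_motion(pose, v, omega, dt):
    """Dead-reckon a 2D pose (x, y, heading) from velocity estimates."""
    x, y, th = pose
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + omega * dt)

def depth_to_map_points(pose, depth_scan):
    """Project (bearing, range) depth readings into the world frame."""
    x, y, th = pose
    return [(x + r * math.cos(th + b), y + r * math.sin(th + b))
            for b, r in depth_scan]

# Drive forward while sweeping the depth sensor: the map grows as the pose moves.
pose, world_map = (0.0, 0.0, 0.0), []
for _ in range(10):
    pose = integrate_motion(pose, v=0.5, omega=0.05, dt=0.1)
    world_map.extend(depth_to_map_points(pose, [(-0.2, 2.0), (0.0, 2.5), (0.2, 2.2)]))
print(len(world_map), "map points")

A real system would, of course, also detect loop closures to correct the drift that pure dead reckoning accumulates, which is where those 15 years of SLAM research come in.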

It will be interesting to see what kinds of applications could be built on such a platform. For the time being, creating 3D models for gaming is an obvious focus, but the team also suggests assisted living, e.g., a guide application for the visually impaired, as another possibility. Google is giving away 200 prototype phones to those interested in developing applications.

The following video was released by the Project Tango team; it explains some of the reasoning behind the project's creation and discusses some future directions.

KUKA robot versus Timo Boll


First, robots defeated man in chess.

Second, robots defeated man in the game of Jeopardy.

Third, robots proved to man that they can be better drivers.

Now, they will get the chance to best man in the game of ping pong. I saw it coming a mile away!

So, industrial robot manufacturer KUKA will be hosting a table tennis match between its KR AGILUS robot arm and puny human champion Timo Boll. The purpose of the exhibition is to celebrate the opening of a new robot factory, or so they say! We all know that the robots are finally coming out of the closet and, after embarrassing humans in tasks of intellect, will now also demonstrate their superior physical abilities. All jokes aside, however, the trailer looks pretty cool (see it below) and the match will be broadcast live on March 11th. You can watch it here.

BigDog dynamic manipulation: It throws a cement block!


Boston Dynamics has amazed us more than once over the years, having created some of the most incredible and, at times, scariest robots (not in the uncanny-valley sense, but more in the "Terminator is real" sense). Its creations range from the jumping Precision Urban Hopper to the incredibly realistic, dynamically balanced humanoid PETMAN and, of course, the very familiar four-legged robot mule BigDog.

So, what are they up to recently?

They have just released a new video of BigDog equipped with a powerful robotic arm, lifting and throwing a cement block with ease and elegance while maintaining perfect balance. This is what much of the work at Boston Dynamics focuses on: real-time control of some of the most advanced robots on the planet. It is also the main reason we are so flabbergasted every time they release a new video.

So, watch BigDog picking up and throwing a cement block across the room, and join me in hoping that Daniel H. Wilson will soon release an updated version of his How To Survive a Robot Uprising: Tips on Defending Yourself Against the Coming Rebellion, including tips on how to avoid getting knocked out by flying cement blocks :)

Multi-user spatial collaboration using augmented reality on mobile devices


Augmented reality for mobile devices has grown in popularity in recent years, partly because of the proliferation of smartphones and tablet computers equipped with exceptional cameras, and partly because of developments in computer vision algorithms that make implementing such technologies on embedded systems possible.

Until now, such augmented reality applications have been limited to a single user receiving additional information about a physical entity or interacting with a virtual agent. Researchers at MIT's Media Lab have taken augmented reality to the next level by developing a multi-user collaboration tool that allows users to augment reality and share those augmentations with other users, essentially turning the real world into a digital canvas for all to share.

The project, known as Second Surface, is described as,

…a novel multi-user Augmented reality system that fosters a real-time interaction for user-generated contents on top of the physical environment. This interaction takes place in the physical surroundings of everyday objects such as trees or houses. The system allows users to place three dimensional drawings, texts, and photos relative to such objects and share this expression with any other person who uses the same software at the same spot.
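
In implementation terms, the core idea is a shared store of content anchored to world coordinates: one user posts a drawing at a location, and anyone who later queries near that location receives it. The following toy Python sketch illustrates the concept; the class and its methods are invented for illustration and have nothing to do with Second Surface's actual codebase.

import math

class SharedCanvas:
    """Toy shared AR space: content is anchored to world coordinates
    and served to anyone who queries near that spot."""
    def __init__(self):
        self.anchors = []  # (x, y, z, author, content) tuples

    def post(self, position, author, content):
        self.anchors.append((*position, author, content))

    def nearby(self, position, radius=10.0):
        return [(a, c) for x, y, z, a, c in self.anchors
                if math.dist((x, y, z), position) <= radius]

canvas = SharedCanvas()
canvas.post((12.0, 4.0, 1.5), "alice", "3D doodle on the old oak tree")
# A second user standing at roughly the same spot sees Alice's drawing.
print(canvas.nearby((11.0, 5.0, 1.5)))

The hard part in practice, of course, is localization: both devices must agree on where they are with enough precision for the overlays to line up.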

If you still have difficulty picturing how this works, and why I believe it will be a game-changing technology for augmented reality and mobile devices once it is made available to the masses, check out the following explanatory video.

Now, imagine combining this technology with Google Glass and free-form gesture recognition. How awesome would that be?

[source]

Quadrocopter inverted pendulum acrobatics


Quadrocopters are all the rage these days, but nobody gets them to perform more impressive acrobatics than the machine learning and control team at ETH, led by Prof. Raffaello D’Andrea.

Whereas just last year the team demonstrated a single quadrocopter balancing an inverted pendulum, they have now demonstrated that two flying machines can not only perform the balancing trick individually but can also throw and catch a pole between them. The quadrocopters use machine learning algorithms to improve their throwing and catching performance over time. The results are rather impressive, as you can tell for yourself from the team’s demonstration video below.
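
The team has not spelled out its learning algorithm here, but the flavor of trial-to-trial improvement is easy to convey: after each throw, measure how far the pole landed from the target and correct the next attempt against that error. The Python sketch below is purely illustrative; the simulated "physics", gain, and numbers are all assumptions.

def simulate_throw(release_speed):
    """Pretend physics: landing error grows with deviation from an ideal speed."""
    IDEAL_SPEED = 3.2  # m/s, assumed
    return 0.8 * (release_speed - IDEAL_SPEED)  # landing error in meters

speed, gain = 2.0, 0.9
for trial in range(8):
    error = simulate_throw(speed)   # observe where the pole actually landed
    speed -= gain * error           # nudge the next throw against the error
    print(f"trial {trial}: landing error {error:+.3f} m")

Each iteration shrinks the error by a constant factor, which is the essence of this style of learning: the controller does not need a perfect model, just a repeatable task and a measurement of its own mistake.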

I wonder what they will achieve next!

[source]

UK’s Oxford robot car is here


I guess English scientists and engineers don’t like to be outdone by their American counterparts: after years of hearing constantly about Google’s rapid development of self-driving cars, a team from Oxford University recently unveiled its very own robot car. And they make some very big claims about the new vehicle’s capabilities compared to the competition.

So, the UK’s robot car is the brainchild of a small team (22 members) of researchers from Oxford led by Prof. Paul Newman. The vehicle is a modified Nissan LEAF, i.e., all electric. The team outfitted the car with camera and laser sensors, as well as an additional on-board computer for number crunching, all (hold on to your hats, folks!) for no more than 5,000 pounds. And they estimate that in just a few years they will bring the cost down to only 100 pounds. Now, if that hasn’t gotten your attention, I don’t know what will.

We all want to spend as little money as possible for access to the newest technology, but does this robot car actually work? The researchers claim it does, and they have published several videos demonstrating the car driving autonomously in an urban environment. It can achieve speeds of up to 40 km/h, which is not particularly impressive but certainly a great start.

[Image: robot car laser sensor data visualization]

And before I forget, the team has developed new vision- and laser-based navigation and localization algorithms that allow the car to drive to a destination without the use of GPS. GPS is not particularly useful when driving in a city’s downtown, since tall buildings tend to block the satellite signals, so I can understand the need for GPS-free navigation. The experience-based approach the researchers have developed uses stereo vision to localize and track the vehicle. That’s great, but having worked with vision systems in outdoor environments, I am not completely convinced that this would work in all cases, especially when the weather turns bad or the vehicle gets stuck behind a truck or SUV. Fusing data from multiple sensors, such as cameras, lasers, accelerometers, and, of course, GPS, would be the more reliable navigation solution, and the one more likely to earn autonomous cars legal permission to drive in our cities.
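
To see why fusion helps, consider the two failure modes: odometry drifts without bound, while GPS is absolute but drops out between tall buildings. Even a crude filter that occasionally pulls the drifting estimate toward a GPS fix keeps the error bounded. The following Python sketch is a deliberately simple illustration of that point, not the Oxford team’s algorithm; all gains and noise figures are assumptions.

import random

ALPHA = 0.2  # trust placed in a GPS fix when one is available
true_pos = odom_pos = fused_pos = 0.0

for step in range(100):
    true_pos += 1.0                              # car moves 1 m per step
    drift = random.gauss(0.0, 0.05)              # odometry error this step
    odom_pos += 1.0 + drift                      # dead reckoning alone
    fused_pos += 1.0 + drift                     # fused estimate uses the same odometry
    if step % 10 == 0:                           # sparse GPS fixes downtown
        gps_fix = true_pos + random.gauss(0.0, 0.5)
        fused_pos += ALPHA * (gps_fix - fused_pos)  # pull toward the absolute fix

print(f"odometry-only error: {abs(odom_pos - true_pos):.2f} m")
print(f"fused error:         {abs(fused_pos - true_pos):.2f} m")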

But enough talk. Let us enjoy the videos the Oxford team has published showcasing their new toy :)

The team’s introductory video of the autonomously driving Nissan LEAF vehicle.

Demonstration of laser-based semantic map used for navigation and dynamic obstacle detection, tracking and avoidance.

Experience-based navigation system using vision and laser sensing.

Sir James Dyson on robot vacuums: Not good enough!


Sir James Dyson, the man whose name is synonymous with vacuum cleaners that work (among other wonderful inventions), recently expressed his opinion on the robot vacuums available on the market today. He is not impressed, considering them just a gimmick!

It was iRobot, just a few years ago, that created the Roomba, essentially the first robot vacuum worth paying money for (and reasonably priced at the time as well!). The company has been extremely successful selling Roombas and other related cleaning robots (as well as the very successful and useful PackBot line of military and law enforcement robots). Since the Roomba was introduced just 10 years ago, it has sold more than 3 million units. That’s good for a domestic robot that costs only a few hundred dollars. This explains why, in the last few years, a plethora of other companies have introduced similar robot vacuum cleaners, often at a higher price but not necessarily doing a better job.

Regardless, Dyson is not shy about declaring that these gimmicks are terrible both as vacuum cleaners (and he knows a lot about vacuuming) and as robots. He thinks the current offerings do a bad job at actually cleaning the floor. And as robots, they are not very intelligent in how they do their job, and tend to be very inefficient at it; this should not be a surprise to those of us familiar with the ideas of Rodney Brooks (co-founder of iRobot) on robot design and programming, which advocate simplistic, insect-like sensing and decision making for robots.
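
For those unfamiliar with the Brooks school of thought, a caricature of it fits in a few lines of Python: no map, no planning, just a handful of prioritized reflexes where the more urgent behavior subsumes the rest. The sketch below illustrates the design philosophy only; it is in no way iRobot’s actual firmware.

import random

def step(bumper_hit, dirt_detected):
    """Return a motor command; more urgent behaviors subsume the rest."""
    if bumper_hit:                     # highest priority: escape the obstacle
        return ("turn", random.uniform(90, 180))
    if dirt_detected:                  # next: spiral over the dirty patch
        return ("spiral", 0)
    return ("forward", 0)              # default: wander the room

print(step(bumper_hit=False, dirt_detected=True))  # ('spiral', 0)
print(step(bumper_hit=True, dirt_detected=True))   # escaping wins

Reflexes like these do cover a room eventually, but not efficiently, which is exactly the gap Dyson seems to see in the market.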

So, other than criticizing others, what is Dyson up to when it comes to robot vacuums? Well, he did not say that he thinks it is a bad product idea; he simply expressed his opinion that there is plenty of room for a better product. And since he did not dismiss the possibility of his company entering this space, I would say that a Dyson robot vacuum is very likely to be introduced within a year or two. The question will then be: is it everything Sir James Dyson thinks it should be, compared to the competition?

Time will tell!

[source]


Robotic prosthetic arm that can feel


The year 2013 has started brilliantly, even more so than 2012 ended. If you recall, it was in late 2012 that it was announced that scientists and engineers had developed a robotic arm that a quadriplegic woman was able to control with her thoughts, manipulating objects and gesturing to other people. That happened in North America, through the incredible efforts of researchers at the University of Pittsburgh and Johns Hopkins University’s Applied Physics Laboratory (JHU/APL) [source].

But wait a few months, cross the Atlantic Ocean, and you can do one better. Until a few days ago, researchers had successfully created brain-controlled prosthetic arms, but none had a link back to the brain providing feedback. The ingenious researchers at the Translational Neural Engineering (TNE) laboratory of the Ecole Polytechnique Federale de Lausanne (EPFL), led by Dr. Micera, have created such a bidirectional link via the body’s nervous system and an interface consisting of a collection of highly sensitive electrodes. The picture below gives an overview of this human-machine interface, which could potentially have a huge impact on the lives of amputees and the elderly.

[Image: robotic arm human-machine interface]

So, for the first time ever, a person outfitted with this new robotic arm will actually regain their sense of touch. And this wonderful moment is fast approaching, as there are already plans for the first person to receive the new prosthetic arm in an operation to take place in Italy.

We will be watching with much anticipation the outcome of this trial!

[source]

Chainsaw robot makes furniture for your home


Yes, you have read the title correctly!

Some interior design students have a pretty wild imagination (and very likely some friends with robotics expertise and access to some cool hardware :) ), as they actually designed and programmed a robot that can make a couple of stools out of a piece of wood. The robot is a standard KUKA industrial robot with a chainsaw end effector. With some clever programming, the high-precision robot can carve a tree trunk with millimeter accuracy into 2-3 usable stools.

Move over 3D printers; this robot takes the cake!

Enjoy the video of the chainsaw wielding robot in action.

[source]

MABEL two-legged robot fastest in the world


It would appear that we have a new champion in the “which robot can run fastest” race. The two-legged robot MABEL, under development for several years at the University of Michigan, was recently shown to reach a top running speed of 6.8 miles per hour, or roughly 11 kilometers per hour. This means that MABEL is significantly faster than the previous record holder, Toyota’s humanoid robot, with a top speed of 7 kilometers per hour; Honda’s ASIMO is now in 3rd place with a top speed of 6 kilometers per hour.

Another technical achievement that sets MABEL above Toyota’s robot is that MABEL’s gait, both walking and running, much more closely resembles that of a human. In addition, MABEL jumps 3-4 inches above the ground (both legs airborne) when running, whereas Toyota’s robot hardly does (it leaves the ground just enough for its fast walking gait to count as running). One disadvantage is that, unlike Toyota’s, Honda’s, and other humanoid robots coming out of Korea and Japan, MABEL is not a complete robot with an upper body, arms, and a head.

The video below shows MABEL running and includes some explanation of the related technical achievements. Additional information can be found at the project’s website here.
