This power amplification robot, called Power Loader, is currently under development by Activelink, a Panasonic subsidiary venture.
The aim is to create a robot that can freely exert power beyond human strength, whether in emergencies or on construction sites. Power Loader’s role is to link people with construction machinery.
- “Power Loader receives the force input by a person through its force sensors, and amplifies it using motors. In this way, it assists the person by producing a large force that the person can’t achieve alone. The concept we’ve used to develop Power Loader is that you get into it, rather than wearing it. This concept makes it safer to operate.”
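The amplification loop described in the quote — read the operator’s force, scale it, and command the motors — can be sketched as follows. This is a minimal illustration only; the gain, torque limit, and function name are assumptions, not Activelink’s actual control code.

```python
# Hypothetical sketch of force amplification: the torque sent to a motor is
# the operator's measured force times a gain, clamped to the actuator limit.
# All numbers here are illustrative, not Power Loader's real parameters.

def assist_torque(operator_force_n: float, gain: float = 5.0,
                  max_torque_nm: float = 80.0) -> float:
    """Amplify the force measured at the human interface into a motor
    torque command, clamped to the actuator's limit."""
    command = gain * operator_force_n
    return max(-max_torque_nm, min(max_torque_nm, command))

print(assist_torque(10.0))   # a 10 N input becomes a 50 Nm command
print(assist_torque(30.0))   # a 30 N input saturates at the 80 Nm limit
```

Clamping the output is the important safety detail: the machine assists in the direction the operator pushes, but never beyond the actuator’s rated torque.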
When Power Loader was first developed, Activelink made a very large version. But following the accident at the Fukushima Daiichi nuclear plant, development has shifted to Power Loader Light, a more compact version.
- “In each sole, there’s a six-axis force sensor. In line with the force vectors detected there, three axes for each leg are used to control motors in the ankle, knee, and hip, exerting a force in the direction of support.”
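The per-leg scheme in the quote — a six-axis wrench measured at the sole, mapped to assist torques at the ankle, knee, and hip — can be sketched with a Jacobian-transpose projection. The 3×6 matrix and the gain below are made-up illustrative values, not Power Loader Light’s actual kinematic model.

```python
import numpy as np

# Hypothetical sketch: project a six-axis sole wrench (Fx, Fy, Fz, Mx, My, Mz)
# onto three leg joints (ankle, knee, hip) via tau = k * J^T w.
# The Jacobian rows below are invented for illustration only.

def leg_assist_torques(sole_wrench: np.ndarray, jacobian_t: np.ndarray,
                       assist_gain: float = 0.5) -> np.ndarray:
    """Return assist torques (N*m) for ankle, knee, and hip."""
    assert sole_wrench.shape == (6,) and jacobian_t.shape == (3, 6)
    return assist_gain * jacobian_t @ sole_wrench

# Example: 200 N vertical load plus a small sagittal moment at the sole.
w = np.array([0.0, 0.0, 200.0, 0.0, 10.0, 0.0])
Jt = np.array([[0.0, 0.0, 0.05, 0.0, 1.0, 0.0],   # ankle row (illustrative)
               [0.0, 0.0, 0.20, 0.0, 0.5, 0.0],   # knee row (illustrative)
               [0.0, 0.0, 0.30, 0.0, 0.2, 0.0]])  # hip row (illustrative)
print(leg_assist_torques(w, Jt))  # torques for ankle, knee, hip
```

The point of the structure: one sensor per sole drives all three joints of that leg at once, so the exoskeleton pushes in the same direction the detected force vector points.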
- “We want to make Power Loader capable of carrying 50-60 kg while moving with agility. The legs could be used to support something very heavy, such as a radiation suit, and we think it could also carry 50-60 kg easily using the robot arms.”
This equipment serves as a platform for research on Power Loader control, which Activelink and the Japan Atomic Power Company are considering. It can carry 30 kg with one arm while requiring minimal effort from the operator.
- “This is a trial harness, for use in designing a connection to the Power Loader Light legs. We’ve made it as compact as possible while producing this much power.”
- “After that, we’re considering a very large version. The big Power Loader, which we were developing before, uses 22 motors. We’d like to achieve an exoskeleton with that kind of all-axis assist. When we do that, we think we’ll have a robot that can carry at least 100 kg easily.”
On the 2nd, Samsung Electronics launched “Smart Tango Corner Clean,” a robotic vacuum cleaner with upgraded dust-removal capability for corners.
Unlike previous robotic vacuum cleaners with fixed side brushes, Smart Tango Corner Clean has the world’s first “pop-out brush,” which improves cleaning efficiency by letting this new little Tango reach corners and other hard-to-access areas.
Researchers at the CNRS-AIST Joint Robotics Laboratory are working on ways to control robots via thought alone.
“Basically, we would like to create devices that allow people to feel embodied in the body of a humanoid robot. To do so, we are trying to develop techniques from brain-computer interfaces (BCI) so that we can read people’s thoughts, and then see how far we can go in interpreting brain-wave signals and transforming them into actions to be performed by the robot.”
The interface uses flashing symbols to control where the robot moves and how it interacts with the environment around it.
“Basically, what you see uses one pattern, called SSVEP, which is the ability to associate flickering stimuli with actions. It’s what we call affordance: we associate actions with objects, bring an object to the attention of the user, and then, by focusing their attention on it, the user can indicate which action they would like the robot to perform, and this is then translated.”
“He is wearing a cap embedded with electrodes, and we read the electrical activity of the brain, which is transferred to this PC. A signal-processing unit then classifies what the user is thinking. As you see here, there are several icons that can be associated with tasks, or the system can recognize an object, which will flicker automatically. The icons flicker at different frequencies, so we can recognize which frequency the user is focusing their attention on and select that object. Since the object is associated with a task, it’s then easy to instruct the robot which task it has to perform.”
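The classification step described above — each icon flickers at a known frequency, and the system detects which frequency dominates the user’s EEG — can be sketched as a simple spectral comparison. The function name, sampling rate, and synthetic signal below are assumptions for illustration; real SSVEP pipelines use more robust methods than a single FFT.

```python
import numpy as np

# Hypothetical sketch of SSVEP detection: find which candidate flicker
# frequency carries the most power in the EEG spectrum. Illustrative only.

def detect_ssvep(eeg: np.ndarray, fs: float, candidates: list) -> float:
    """Return the candidate flicker frequency with the most spectral power."""
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2
    # Pick the FFT bin nearest each candidate frequency and compare powers.
    scores = [power[np.argmin(np.abs(freqs - f))] for f in candidates]
    return candidates[int(np.argmax(scores))]

# Synthetic 2 s recording at 250 Hz: a 10 Hz response buried in noise,
# standing in for the brain's response to a 10 Hz flickering icon.
fs = 250.0
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(t.size)
print(detect_ssvep(eeg, fs, [8.0, 10.0, 12.0]))  # picks out 10.0
```

Once the attended frequency is identified, the object flickering at that rate is selected, and the task bound to that object is sent to the robot.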
“And the applications targeted are for tetraplegics or paraplegics, who could use this technology to navigate via the robot. For instance, a paraplegic patient in Rome would be able to pilot a humanoid robot for sightseeing in Japan.”
Been keeping up to date with the quirky robotic ball named Sphero? We’ve been wondering when its Augmented Reality Engine would mature into a full-fledged app since we first witnessed it at E3 as a simple 2D tech demo. Well, today is the day this Android- and iOS-controlled ball takes its first official-release steps into the world of AR: the engine has grown up, powering Orbotix’s latest free app, Sharky the Beaver. While the game itself is still admittedly silly and demo-like, as it was when we saw an early adaptation in August, there’s no question that the AR engine is now in a polished state.
As a refresher: unlike other implementations that require a stationary marker, Sphero serves as a marker that can move around your area while also relaying information about its position. The 3D character on screen rotates as you spin Sphero, and, as you can see above, you can even pick the ball up while it’s being tracked. The tracking frame-rate in the app looked very smooth, and it does an admirable job of keeping track of the ball, even if it ends up off-screen. At this point, gameplay is limited to flicking cupcakes onto the ground, which Sharky goes after automatically, and there’s no word on if or when we’ll see the features shown off in the early version (namely, the part where the “Sharky” half of the name was actually a key element, as you chased people on-screen to get their cupcakes). All in all, we’re more curious than anything to see what else the folks at Orbotix will come up with in the realm of AR; for more in the meantime, check out our video hands-on after the break.