When designing anything, materials always matter. Never get caught up in fads or personal preferences. There is always an optimal solution, and few ever reach it because they don't appreciate this rule.
The prototype of our Jerry home robot has a wooden gripper. This was done in the interest of cost and speed of prototyping. Jerry as a whole is made almost entirely of wood, because there was no better solution.
But the wooden gripper is not performing well enough, so a new design is necessary. For the gripping mechanism itself, metal would be ideal; its resilience would be welcome. But the gripper is relatively complex, and building it in metal would increase the part count significantly. We are also redesigning the gripper to operate in both the vertical and horizontal planes.
Given the complexity of the design, additive manufacturing has become the go-to option.
Let me be clear: 3D printing is very often overused. It has become a fad rather than a practical tool. Additive manufacturing suits very specific applications, and they are not far-reaching. 3D printing does not scale well, and part quality is not terribly reliable. But in this case, 3D printing is the best solution.
So, I would like to introduce you to Version 0.4 of Jerry's new universal 3D-printed gripper.
If you were not aware of the DARPA Robotics Challenge that concluded in 2015, here is a great chance to catch up on some of the machines that were built.
The challenge produced a huge amount of robotics technology, particularly in legged robots like bipeds. The goal of the competition was to create robots that could respond to a nuclear plant disaster or similar accident. But of course, much of the hardware and software will quickly find its way into commercial applications.
JPL - RoboSimian
RoboSimian was one of the rare entrants to the Robotics Challenge that was not bipedal.
Carnegie Mellon - Chimp
Korea - Hubo
Boston Dynamics - Atlas
Here are a series of demos we did with Jerry to show off his physical capabilities. Jerry is still operated by remote control, but the software for several of these tasks will be completed soon. Either way, it is always cool to see a robot getting you a soda or vacuuming the floor.
Recently Schaft, a robotics company owned by Alphabet, unveiled a new bipedal robot. The robot is, basically, two legs that can waddle around.
Schaft has a strong history in advanced robots, having won the DARPA Robotics Challenge Trials, and the team certainly has a great deal of talent. But what are the actual challenges of building a walking robot?
When you stand, there is really not much going on that a robot can't replicate. Fluid-filled sensors in your ears determine how you are angled, much like an accelerometer in a robot. Once your brain determines that angle, it commands the muscles in your legs to adjust and move your center of gravity back over your feet, which again is what robots do. But there are a few major differences. The first is that our brain is vastly superior to any computer inside a robot, and we have dozens more muscles than most robots today, so we can make small, imperceptible adjustments across our entire body rather than just tilting a bit, as a robot must. Though it should be noted that many robots don't even do that: when they stand, they try to get into a position where they are essentially a solid structure, "propped" upright. While our brain keeps working on standing the whole time we stand, a robot essentially shuts down once it senses that it is stable.
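The tilt-and-correct loop described above can be sketched as a simple proportional controller. This is a minimal illustration, not any specific robot's code; the gain, deadband, and sensor reading are assumed values.

```python
# Minimal sketch of the "stand upright" feedback loop: an
# accelerometer-style tilt estimate drives a proportional correction,
# much like the inner ear drives the leg muscles. All values are
# illustrative assumptions.

def balance_correction(tilt_deg, gain=2.0, deadband_deg=0.5):
    """Return a corrective command from a tilt reading (degrees).

    Inside the deadband the robot does nothing, mirroring the
    "propped" rigid-structure standing strategy described above.
    """
    if abs(tilt_deg) <= deadband_deg:
        return 0.0  # stable enough: behave like a propped structure
    return -gain * tilt_deg  # push the center of gravity back over the feet

# A small forward lean produces a backward correction:
print(balance_correction(3.0))   # -6.0
print(balance_correction(0.2))   # 0.0 (inside the deadband)
```

A human, by contrast, runs something like this loop continuously across dozens of muscles, which is why our corrections are imperceptible rather than a visible tilt.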
Now, with walking, everything becomes very complex from a design and control standpoint. Humans walk using our well-engineered legs. Through the motion of a step, we use gravity to swing our legs forward and use our muscles mostly just to make a controlled fall into the next step. The only time our muscles really activate is when we push off into a step. All of our motions are fluid and smooth, which gives us a phenomenally efficient means of locomotion.
When you look at a robot walking, most of the time you will see it slightly crouched: knees continually bent, torso upright. What the robot is doing is making sure it is always in a stable position. It tries to be a rigid structure, so that if a leg ever goes on the fritz, the robot simply appears to have stopped mid-step. Robots like Honda's Asimo and all hobby humanoids use this approach: instead of a human-like controlled fall, they first try to remain standing in any position.
Robots are beginning to move away from the "moving structure" design, though. Boston Dynamics has really led the charge on this front. Their robots look very human in motion; they are relatively fluid and actually use hydraulic actuators instead of motors to better replicate human musculature. The robot revealed by Schaft also seems very fluid, though it uses electric servo motors. The holy grail of walking robots is fluidity.
When a robot can move without looking like a machine, that is a breakthrough. Fluid motion in a machine denotes efficiency, and walking robots are notoriously inefficient. Even when they move like a rigid structure, their motors usually stay active while standing just to hold the position. And when they walk in a crouch, you can understand how much effort that takes: go ahead and crouch a bit and then walk a ways. You will feel the burn.
So why don't we just simplify walking to make it more fluid? The reason lies in what are called control algorithms: mathematical models that describe the motion of the robot and allow it to predict how it is moving with accuracy. As it turns out, the crouched "moving structure" method is much easier to model than the chaotic "controlled falling" of a human, though simulation and computing are reaching a point where human-like walking is becoming possible to model.
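To make "controlled falling" concrete: walking researchers often approximate the body as a linear inverted pendulum, where the center of mass accelerates away from the support foot in proportion to its offset. The sketch below integrates that model forward in time; it is a generic teaching example under assumed parameters, not Asimo's or Schaft's actual controller.

```python
# Linear inverted pendulum: a standard simplification used in walking
# control research. The center of mass obeys x'' = (g / z) * x, so any
# offset from the foot grows -- the "controlled fall" a step must catch.

G = 9.81   # gravity, m/s^2
Z = 0.8    # assumed constant center-of-mass height, m

def simulate_fall(x0, v0, dt=0.01, steps=50):
    """Integrate the pendulum from an initial offset (m) and velocity (m/s)."""
    x, v = x0, v0
    for _ in range(steps):
        a = (G / Z) * x   # acceleration grows with the offset
        v += a * dt
        x += v * dt
    return x, v

# Starting 5 cm ahead of the foot with no velocity, half a second later
# the body has drifted further forward and picked up speed.
x, v = simulate_fall(0.05, 0.0)
```

The controller's job is to place the next footstep so this runaway fall is repeatedly caught, which is far harder to model than a rigid crouch that never falls at all.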
Let me put the challenge in perspective. When Asimo was being created, engineers were likely designing much of its algorithms on paper. They would lay out the overall design and control variables, like gravity and acceleration, and then perhaps feed them into a computer for some analysis. For a robot that is a "moving structure," that is possible because the dynamics are not highly involved. But when you move to the design of a human-like walking robot, you must consider hundreds if not thousands of changing circumstances. You must detect how the swing of a leg induces a rotation on the body. You must perceive ground conditions. You must remain balanced and in control even while falling forward, and ensure that the leg will contact exactly where you want it.
Let me make another comparison. When a four-legged animal walks, three legs are always on the ground. A horse can stand on three legs and extend the fourth to take a step; the stepping leg is independent of the rest of the body, which stands there rigidly while the leg moves ahead. This is what robots like Asimo try to achieve. Now consider a horse running. Perhaps one leg is on the ground at any given time, while the entire body moves forward and adjusts to stay balanced on changing terrain. Here one leg must move with all the rest of the body in a glorious symphony, or the whole system comes crashing down.
A bipedal robot that walks like a human is a running horse. Everything within the body of the machine must interact and move together so that it does not crash and burn. Imagine trying to develop an equation that describes every component of those motions, down to how quickly a hand must swing forward when a foot slips a quarter of an inch from where it was placed.
Those are the challenges of designing and building an advanced bipedal robot. Boston Dynamics has done an incredible job meeting them with their robots. But they have hit a common problem in the robotics industry: getting customers. Their robots are very expensive. Schaft appears to have created something that moves fluidly but is also less complex, and that greater simplicity may make it much less expensive than a Boston Dynamics robot. If it is affordable enough, Schaft may bring walking robots into industrial use relatively soon, in areas where wheeled robots have been dominant, if robots are used at all.
I would imagine walking robots finding use in areas like construction, where transport over uneven ground is necessary and the tasks are ones humans do not enjoy. As for walking robots in your house? Probably not for a while.
With the launch of the Oculus Rift and the release of gadgets like HoloLens, a whole bunch of virtual worlds are coming our way. Very soon we will be able to kill zombies with more than our thumbs, and never have to buy a TV again as an entire wall becomes a screen. But with all of this coming to the consumer market, how will it affect consumer robotics? That depends on where you are in the space.
As a whole, VR and AR are going to be a technological boon for robotics. For these systems to work, they have to map the world and make their characters interact with it and with you. This is exactly what robots need: robots have to learn about their environment and interact with it and with you. The trouble is that robots have always been a very expensive medium for practicing 3D mapping, and VR is eliminating that obstacle. The relative affordability and widespread adoption of VR will allow hundreds of people to develop algorithms for computer creatures to see and interact with the world. Those algorithms and apps will then port directly over to a home robot that has to do the dishes. VR is going to be a huge stimulus to robotics sensing technology.
VR is also going to lend itself to human-machine interaction. The ability to create a virtual version of your robot, interact with it, and have it react as it would in the real world will allow for more tweaks to be made. Roboticists and interaction designers will be able to prototype a head nod or an eye blink that would normally have required a physical prototype. In this sense, VR is going to speed up the development of personal robots that interact with us: soon the interactions and the hardware will be developed simultaneously, where they used to be developed sequentially.
But there is one significant robot killer in VR. Today, many of the robots coming to market are what are called "social robots." These personal assistants are basically a motorized interface that can interact with you at a deep level through expressive behavior. These robots, like Jibo and Pepper, are characters you use to interact with the Web or your house. But since they do not interact physically with the world, by picking up a shirt for example, they are easily replaced by AR. Microsoft has already demoed a partial version of this with HoloLens, projecting an interactive character onto a simple mobile robot. Social robots that serve no purpose beyond an interface are in great danger of being replaced entirely by software in a headset, software a kid in a garage can build for free. Robots that may be replaced include tour guides, entertainment robots, some educational robots, purely social robots, and robot pets.
The only way robots can remain relevant within the coming VR world will be to perform physical tasks that software never can. They must vacuum floors, do laundry, watch the kids, secure the house, cook a meal, or prepare pills. Basically, they must be laborers, not friends.
There is no reason to go to one extreme or the other, though. Again, Microsoft's demo of an AR robot showed how the social components of a robot could be combined with the physical. Such a hybrid system might be ideal: for games and interaction, a home robot might be virtually masked with a cape or a face, while the rest of the time it is simply a little box roaming around sweeping the floor.
So how are we going to let Jerry get a soda autonomously? Again, he already has the capability; he just needs to be taught. There are several major motions Jerry has to master, so let's break them down.
First, any robot that is going to get something from the fridge has to open its door. (We already have identification of the fridge; that is easy.) Normally, a robot would be designed with a 7-DOF arm able to open a door without the robot itself moving. Since Jerry has such a limited arm, that is not an option. In the video you can see that Jerry has to use his whole body to swing the door open. This motion will not be that difficult to achieve autonomously. Jerry can track the edge of the door very easily, and he has a general knowledge of the location of his arm with respect to that edge. This is where our mechanically adaptive design, or the slop in the arm, is useful. What we will likely do is have Jerry track the edge of the door and its motion, then move his arm along the projected path of that motion. As the door moves to a new location, he can update that projection. This is not a precise method, but it will be effective, and his arm will stretch and bend a bit to allow for it. So, to put it crudely, instead of grasping the handle precisely, he basically lassos it with the gripper, gives a jerk, and keeps jerking.
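The projection step described above has a simple geometric core: a door edge swings on a circle about its hinge, so predicting its path is a rotation about the hinge point. The sketch below is an illustrative assumption of how that projection might look; the function name and coordinates are hypothetical, not Jerry's actual code.

```python
import math

# Sketch of the "track the edge, project its arc" idea: the door edge
# moves on a circle about the hinge, so the predicted position after a
# further swing is a rotation of the current edge point. Hypothetical
# helper, illustrative coordinates.

def project_edge(hinge, edge, extra_angle):
    """Predict where the door edge will be after swinging `extra_angle`
    radians further about the hinge."""
    dx, dy = edge[0] - hinge[0], edge[1] - hinge[1]
    c, s = math.cos(extra_angle), math.sin(extra_angle)
    return (hinge[0] + c * dx - s * dy,
            hinge[1] + s * dx + c * dy)

# Hinge at the origin, edge 0.6 m away; predict a further 10-degree swing.
target = project_edge((0.0, 0.0), (0.6, 0.0), math.radians(10))
# The gripper aims for `target`; each time vision reports the real edge,
# the projection is recomputed, and any residual error is absorbed by
# the mechanical slop in the arm.
```

The point is that the arm never needs a precise handle pose, only a continually refreshed guess that the compliant mechanics can tolerate.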
So we have the door open. Based on what Jerry is retrieving, he will have to do an identification. We plan to rely on a basic database of objects plus the owners actually training Jerry when he arrives. He will know what a Tupperware bowl is, and certainly what a can is. That is all proven, and we have the code running for it already. So he identifies the can and, again, uses his entire body to move his gripper to its location. Again we don't rely on precision: Jerry is going to push right past items he is not after. However, we don't want him to jerk out the items in front when he pulls a can from the back. This will be difficult, but to simplify it a bit we will likely have him blaze a trail on the way in to the item he wants; then he can come back out. Honestly, this is much what a human does. When we want a soda, we are going to get to it, and we will mess up the organization of the fridge to do so.
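The object-database idea can be illustrated with a toy nearest-match lookup. Everything here, the size-signature features, the database entries, and the function name, is an illustrative assumption; a real pipeline would use trained visual models rather than raw dimensions.

```python
# Toy sketch of matching a detection against a database of known
# objects. Items are stored as coarse (width, height) signatures and a
# detection is matched to the nearest entry. Purely illustrative.

KNOWN_OBJECTS = {
    "soda can":        (6.6, 12.2),   # diameter cm, height cm
    "tupperware bowl": (15.0, 8.0),
}

def identify(width_cm, height_cm):
    """Return the known object whose size signature is closest."""
    def dist(sig):
        return (sig[0] - width_cm) ** 2 + (sig[1] - height_cm) ** 2
    return min(KNOWN_OBJECTS, key=lambda name: dist(KNOWN_OBJECTS[name]))

print(identify(6.5, 12.0))  # soda can
```

Owner training would amount to adding new entries (or richer signatures) to the database after delivery.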
As far as bringing the item to the person, Jerry can just retrace his steps back to where he began. Really, the most difficult component of this task is having Jerry open the door. Our solution is sloppiness: we are building a robot that takes some abuse and adjusts mechanically in order to eliminate processing. Unlike an industrial setting, where a chip has to be placed with great accuracy, the human world is designed for uncoordinated humans, so our robots can be uncoordinated too.
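Retracing steps can be as simple as replaying a recorded motion log in reverse with each move inverted. This is a generic sketch of that idea under an assumed (distance, turn) log format; Jerry's actual logging is not described here.

```python
# Sketch of step retracing: replay the motion log backward with each
# command inverted. The (distance_m, turn_deg) log format is an
# illustrative assumption.

def retrace(moves):
    """Invert a list of (distance_m, turn_deg) commands.

    Driving is undone by driving backward, turns by turning the
    opposite way, all in reverse order.
    """
    return [(-d, -turn) for d, turn in reversed(moves)]

# Forward path: ahead 1 m, turn 90 degrees, ahead 0.5 m.
path_home = retrace([(1.0, 0.0), (0.0, 90.0), (0.5, 0.0)])
```

Because the same sloppiness argument applies, small odometry errors on the way back are tolerable: the person, unlike a chip socket, will reach out and meet the gripper halfway.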
Today, if you go looking for a robot to use in your house, you really only have a few options. The first type of product is a single-task system like a vacuum, which is really more of an appliance. Another option is something that is basically a smartphone on a stand: it can move and interface with smart appliances in your house, but can't really do anything your phone can't. The last choice is a robot built solely for entertainment. These machines are meant to be friendly and interactive, thus replacing a dog (or friends). The Jetsons-style home robot that cooks dinner, does the wash, and has an attitude does not exist at a commercial level.
So why isn't Rosie in our kitchen yet? A big part of it is cost. The robot closest to what we imagine as a home robot is the PR2 (Personal Robot 2) developed by Willow Garage. The PR2 has two arms, each with a human-like 7 degrees of freedom, and can roll around. It is outfitted with dozens of sensors and has the capability to do just about anything. But the PR2 costs $400,000.
The cost of the PR2 comes down to a couple of reasons. The first is that there is no economy of scale. Since its introduction there have only been a few dozen, or perhaps a couple hundred, made (no actual numbers have been released). With such a low volume but salaries still to pay, the margins on robots like the PR2 are necessarily huge. But note that the markup is not so great that the robot would be affordable if sold at cost. PR2s for education have been released for as low as $150,000, which is very likely near the cost of the robots. So it is reasonable to assume that, on the low side, the PR2 costs around $100,000 in parts to make; and since there is still the time and effort of assembly, the number is likely higher. Now why is that price so high? The answer: robots are machines, not computers.
Robots are composed of motors, and gears, and magnets, oh my! Physical materials such as these have a price floor set by limited supply. Unlike chips, magnets do not get exponentially cheaper, because we can only extract and process so much neodymium. So even with economies of scale, robots will be expensive due to the cost of raw materials. Look at the PR2: it has two arms, each with seven motors plus a gripper. Then there are the four drive wheels and their articulation, as well as any other actuation for sensors. Those components, which are high performance, along with the gearboxes and other custom mechanics, could easily run up to around $50,000. And that is before the sensing and computing are installed, which, though getting cheaper, are still expensive.
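A back-of-the-envelope count shows how quickly actuators multiply on a PR2-class robot. The unit price below is an assumed round number for illustration, not a published figure; the point is the multiplication, not the exact total.

```python
# Rough actuator tally for a PR2-class robot, following the breakdown
# in the text. The per-unit price is an illustrative assumption.

arm_motors  = 2 * 7      # two arms, seven joints each
grippers    = 2
wheel_units = 4 * 2      # four casters: one drive + one steer motor each
actuators   = arm_motors + grippers + wheel_units

unit_cost = 1500         # assumed $ per high-performance actuator + gearbox
actuation_cost = actuators * unit_cost

# Around two dozen actuators, tens of thousands of dollars -- before
# custom mechanics, sensing, computing, or assembly labor.
print(actuators, actuation_cost)
```

Add custom machined structure on top of that and the ~$50,000 mechanics figure in the text stops looking surprising.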
Essentially, robots are expensive because they are currently custom, high-performance machines. Slant built Jerry to be practically the opposite. Jerry is designed to be manufactured affordably by a guy in a garage.
The first step we took in this direction was to make Jerry out of wood. Wood is an ideal material because not only is it beautiful and affordable, it is also a simple material to train people to work with, so skilled labor can be used affordably until a factory can be built.
Jerry addresses the primary issue, motors, by reducing their number. He uses a basic differential drive system that allows for effective locomotion with only two motors. The arm has only two degrees of freedom plus a gripper, but the design is efficient enough to give Jerry a workspace, from the ground to his gripper, 38 inches high. The arm moves up and down as well as forward and backward; any side-to-side motion is performed by his drive wheels. This system is a bit more complex than a 7-DOF arm as far as software is concerned, but it is simpler mechanically. And since mechanics are what drive the cost to the customer, that is what we focused on.
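The two-motor differential drive works because two wheel speeds combine into both forward motion and turning. Here is the standard kinematic relationship as a sketch; the wheel base value is an assumption for illustration, not Jerry's actual dimension.

```python
# Standard differential-drive kinematics: two wheel speeds yield both
# forward speed and turn rate, so two motors cover all base motion.

WHEEL_BASE = 0.3  # assumed meters between the two drive wheels

def body_velocity(v_left, v_right):
    """Wheel speeds (m/s) -> (forward speed m/s, turn rate rad/s)."""
    forward = (v_left + v_right) / 2.0
    turn = (v_right - v_left) / WHEEL_BASE
    return forward, turn

# Equal speeds drive straight:
print(body_velocity(0.5, 0.5))        # (0.5, 0.0)
# Opposite speeds spin in place, which is how side-to-side gripper
# positioning is handled by the base rather than by extra arm joints.
forward, turn = body_velocity(-0.2, 0.2)
```

The trade is exactly the one described in the text: the software must coordinate base and arm together, but the hardware stays at two motors.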
Additionally, Jerry is minimalist on sensors. He relies almost entirely on his vision system to perceive himself and the world. We chose this because vision can provide all the information that could be needed about the world, as long as a computer is powerful enough to glean that information, and today's computers are.
Jerry has been designed to be affordable by replacing expensive mechanics with cheap computing. The result is a robot that is capable of any number of tasks and will cost around $2,500. So now you have the option of buying a robotic Swiffer for around $400, or getting a true home robot that can grab a Swiffer, walk the dog, monitor the house, and do any number of other things for $2,500.
Prosthetics is one of the most challenging and humanitarian fields of robotics. Creating a replacement body part is an ongoing challenge. The human body is so mechanically refined that we are still far from being able to completely replace parts that have been lost.
Cost adds another challenge. Even the most expensive prostheses don't match a normal human arm in any way technologically, yet the least expensive arms run into the tens of thousands of dollars. But this is getting better.
Here are videos of several different design methodologies and technologies being used in prosthetic design.
Jerry is rolling!
We have the prototype up and running. From now on we are in pure software dev mode, with just a few mechanical tweaks. The educational/hobby version of Jerry will be released later in 2016. The full home version will be out in 2017.
We've made a short demo video. More will be coming. Enjoy!
This video came before the water glasses video. KUKA has done an excellent job showcasing robotic capabilities in these videos.