IEEE Spectrum Automation



A tiny robot version of the scaly mammal known as the pangolin may one day perform medical procedures inside the body, a new study finds.

Scientists are increasingly exploring how miniature robots might find use in medicine—for instance, to carry drugs, genes or cells into the body. Soft robots especially hold the promise of accessing hard-to-reach areas in a safe, minimally invasive way.

Instead of powering these machines using bulky cables or batteries, researchers are investigating untethered versions of these robots that receive their energy remotely, such as via light or ultrasound. Magnetic fields have shown promise for this application due to how they can penetrate human tissues in a safe manner.

Researchers in Germany have developed a small robot that can deliver drugs to specific locations in a patient’s digestive tract and then be unfolded via externally applied magnetic fields. Inspiration for this ’bot came courtesy of the pangolin, a scaly mammal whose armor fiercely protects the animal’s body while still allowing a wide range of motion. Max Planck Institute for Intelligent Systems

The clinical uses for miniature, untethered magnetic robots are still limited because the devices rely mainly on mechanical interactions with the body, instead of also using, say, heat. Magnetic fields can also remotely supply energy to robots for heating. However, magnetic fields are best at heating rigid metallic parts, and such components generally negate the advantage of having a soft body.

Now scientists have developed a tiny magnetic robot that combines hard and soft by drawing inspiration from the pangolin. The animal, which resembles a walking pine cone, is the only mammal completely covered in hard scales. Yet the pangolin is still capable of flexible motion simply because its scales overlap, letting it curl up into a ball in case of danger.

“I’ve always read about how scientists were inspired by nature, but I’ve never imagined that I myself too would be.”
—Ren Hao Soon, Max Planck Institute for Intelligent Systems, Stuttgart

In the beginning, “I focused on fishes and armadillos as I was more familiar with them,” recalls study lead author Ren Hao Soon, a roboticist at the Max Planck Institute for Intelligent Systems in Stuttgart, Germany. “However, they were not ideal. Individual fish scales were interconnected, which reduced their mechanical deformation. On the other hand, while the armadillo design provided the necessary mechanical compliance, they were not able to provide the necessary heating performance—to enhance the heating performance, a larger volume of material, such as metal scales, needed to be incorporated on the robot.”

While watching YouTube videos on fish and armadillos, Soon says he “stumbled serendipitously on the pangolin, which served our purposes ideally. It was a very exciting moment for me when I thought about it. Personally speaking, I’ve always read about how scientists were inspired by nature, but I’ve never imagined that I myself too would be.”

The new robot is a small flat rectangle 1 centimeter by 2 centimeters by 200 microns in size. It consists of aluminum scales on top of a flexible body made of silicone rubber mixed with magnetic microparticles and the liquid alloy eutectic gallium-indium.

In experiments, the scientists used high-frequency magnetic fields to heat the robot to more than 70 degrees C in less than 30 seconds. They applied these magnetic fields remotely, from a distance of more than 5 centimeters.

“We challenge the ‘fully soft’ assumption in soft robots. Up till now, there was an inherent assumption that robots used for biomedical applications had to be fully soft.”
—Ren Hao Soon, Max Planck Institute for Intelligent Systems, Stuttgart

When the researchers used low-frequency magnetic fields, they could roll up the robot and move it back and forth. Once curled up, it could transport cargo such as medicine, releasing its contents when heated. For instance, a 50-micron-thick and an 80-micron-thick scale were dipped in beeswax, to which a blue and a green cargo packet were attached, respectively. When heated magnetically, the thinner scale reached the melting temperature of beeswax 1 second faster than the thicker one, letting the robot release the blue cargo but not the green one.
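This kind of thickness-selective release can be sketched with a toy lumped-capacitance heating model. The power density, material constants, and melting temperature below are illustrative assumptions, not values from the study; the point is only that heating rate scales inversely with scale thickness, so the thinner scale's wax melts first.

```python
# Hedged sketch: why a thinner beeswax-coated scale reaches the wax's
# melting point sooner under remote magnetic heating. All numbers here
# are illustrative assumptions, not values from the study.

def time_to_melt(thickness_um, power_per_area=2.0e4, rho=2700.0,
                 cp=900.0, t_start=20.0, t_melt=62.0):
    """Seconds for an aluminum scale to heat from t_start to t_melt.

    Assumes uniform absorbed power per unit area (W/m^2) and ignores
    losses, so heating rate scales inversely with thickness.
    """
    thickness_m = thickness_um * 1e-6
    heat_capacity_per_area = rho * cp * thickness_m   # J/(m^2 K)
    heating_rate = power_per_area / heat_capacity_per_area  # K/s
    return (t_melt - t_start) / heating_rate

t_thin, t_thick = time_to_melt(50), time_to_melt(80)
print(f"50-micron scale melts its wax after ~{t_thin:.2f} s")
print(f"80-micron scale melts its wax after ~{t_thick:.2f} s")
```

Under this model the melt-time ratio equals the thickness ratio, which is the lever the researchers exploit to stagger cargo release.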

“We challenge the ‘fully soft’ assumption in soft robots,” Soon says. “Up till now, there was an inherent assumption that robots used for biomedical applications had to be fully soft. While the compliance of soft materials might make it safer, it greatly restricted the functionalities of these robots—hard materials might have desirable material properties that a fully soft system can utilize.”

The researchers showed they could roll up this robot, so it can fit into a pill that could get swallowed for deployment in hard-to-reach areas. In experiments with the robot in pig stomachs from a slaughterhouse, they found it could stop bleeding, and they suggest it could also help treat ulcers, polyps, and tumors, as well as target other areas of the gut.

A small robot developed by German researchers offers the prospect of selectively deploying cargos (e.g., medicines) depending on the strength and location of an externally applied magnetic field. Here a simulation is pictured in which a blue-colored cargo is deployed while other portions of the ’bot remain undeployed. Max Planck Institute for Intelligent Systems

“Minimally invasive surgical devices can reach almost all of the gastrointestinal tract with the endoscope and colonoscope,” Soon says. “However, a large part of the gastrointestinal tract, namely the small intestine, still remains out of reach because it is too tortuous for these devices to navigate. Untethered robots hence have the opportunity to complement or even supersede existing therapies for diseases in the small intestine.”

The scientists now hope to work with clinicians “to identify a real medical need for which such robots might be useful,” Soon says.

The scientists detailed their findings 20 June in the journal Nature Communications.



The United States skipped a step in the space race. Before NASA sent the first astronauts to the moon in 1969, it sent robotic scouts to crash into the lunar surface, land on it, and map it from orbit. The last three Apollo missions included lunar rovers driven by the astronauts. But in those rushed years to be first, the U.S. never sent a robotic rover, a good way to look at the lunar landscape close-up. Only the Soviet and Chinese governments have. Other countries have tried and failed.

Now, 50 years later, the United States is making up for lost time—but if plans hold up, its first remote-controlled rover won’t be sent by NASA, or SpaceX, or some other large space agency or company. The rover is called Iris, and it’s largely the work of students at Carnegie Mellon University in Pittsburgh.

Iris is tiny by space-rover standards—it’s about the size of a shoebox, with four wheels each about the size of a small pizza—but one lesson it teaches is that there’s no such thing as a small moon project. Mission managers estimate that 300 students have spent the equivalent of a century in work-hours on the rover since it began in 2017.

“There were no constraints—just come in at 2 kilograms or less!” laughs Raewyn Duvall, the program manager for the Iris rover. She got her Ph.D. at Carnegie Mellon while developing Iris, and she has stayed as a research associate to see the project through.

A Tiny Rover with an Uncertain Timeline

Iris is one of more than a dozen scientific payloads carried by the Peregrine lunar lander, built by Astrobotic Technology, a Pittsburgh-based company spun off from the university. If everything works, Peregrine Mission One will be sent to the Gruithuisen Domes region in the moon’s northern hemisphere by a new rocket called Vulcan Centaur under development by the U.S.-based United Launch Alliance. The rocket is awaiting its first test flight, called Cert-1—but hey, if you’re launching a giant new booster, hoping it will become a workhorse of the space industry, why not shoot for the moon while you’re at it?

Cert-1’s launch date is uncertain; the Vulcan Centaur, once aiming for a first flight in 2019, has been delayed by COVID-19 and myriad technical issues. The Iris team says it took advantage of some of the holdups to check to make sure all was well with the rover, which is still waiting with the lander in a clean room at Astrobotic.

William L. “Red” Whittaker, a professor of robotics at Carnegie Mellon and a leading name in the field, was the driving force behind Iris. “In space, what counts is what flies,” he said at a news briefing in April, before a hoped-for launch date in May. Whittaker is a cofounder of Astrobotic, where Iris and Peregrine are among several privately built lunar projects in the pipeline. Scientists want to find and harvest water ice near the moon’s south pole, potentially useful for future missions, and therefore, in Whittaker’s words, “the most valued resource in the solar system.”

Simple, but (Hopefully) Effective

If you were Whittaker’s student team, how would you squeeze everything into a 2 kg rover? An early design showed a small two-wheeled vehicle trailing its tail, but that was nixed because it couldn’t back up. They wanted to power it with photovoltaics, but that was nixed because articulated solar panels would add mass and complexity—and also, being so low to the ground, they might get coated with lunar dust. Lithium-ion batteries became the final compromise: safe and reliable, but if Iris is working well, it will probably drain them in fewer than 50 hours.

The final version is as simple as possible. It is symmetrical, with a camera at each end, so it makes no difference whether it goes forward or backward. It has no suspension. It has a skid-steer system, able to turn in place by turning the wheels on each side in opposite directions. The chassis and wheels are made of carbon fiber—strong and lightweight, though Duvall says that if there were any microfractures, they’d be hard to detect. That’s one of the necessary tradeoffs to keep Iris small. “Being a university project,” she says, “we were willing to take on more risk than a government project or anything like that.”
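The skid-steer arrangement has simple kinematics: equal wheel speeds drive the rover straight, and opposite speeds spin it in place. A minimal sketch, using a made-up wheel radius and track width rather than Iris’s actual dimensions:

```python
# Hedged sketch of skid-steer kinematics like Iris's: body velocity
# follows from left/right wheel speeds. Wheel radius and track width
# below are illustrative, not the rover's real dimensions.

def skid_steer_twist(omega_left, omega_right, wheel_radius=0.05, track_width=0.2):
    """Return (linear m/s, angular rad/s) for given wheel speeds in rad/s."""
    v = wheel_radius * (omega_right + omega_left) / 2.0   # average drives forward
    w = wheel_radius * (omega_right - omega_left) / track_width  # difference turns
    return v, w

# Equal speeds drive straight; opposite speeds turn in place.
print(skid_steer_twist(2.0, 2.0))   # straight-line motion, no rotation
print(skid_steer_twist(-2.0, 2.0))  # pure rotation, no translation
```

Because turning in place needs no steering linkage, this layout costs essentially zero extra mass—consistent with the 2-kilogram constraint.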

The Peregrine lander is a squat, four-legged ship, designed to accommodate many different sorts of lunar projects. Iris is firmly clamped to the bottom of its payload deck. The flight plan calls for Peregrine to touch down after local sunrise at the landing site. Then, presuming everything checks out, two hold-down mechanisms open—and the little rover falls about a meter, in the low lunar gravity, to the ground below.

Once the Peregrine lander is safely landed on the moon, the Iris rover will drop from its payload deck in the center. Astrobotic Technology

“In order to save mass, we don’t have a ramp, we don’t have a fancy mechanism, we just have to survive the drop down to the lunar surface,” said Nicholas Acuna, a student who served as mechanical lead engineer for the project while getting his master’s degree.

The rover and other payloads relay their signals to Earth via a wireless local area network (WLAN) on the lander—not unlike what you’d find in buildings and homes on Earth. The rover is not autonomous; Duvall says that in the busy couple of Earth-days that it’s active, it will probably travel 30 centimeters at a time on average, send an image of whatever is in front of it, and wait for more instructions. It may only travel a couple of hundred meters before its batteries are depleted, but even that, Duvall says, would prove that a lunar robot is within anyone’s reach.

“It’s going to be a great success when we deploy,” Duvall says. “It’s going to be a great success when we get any sort of telemetry back from it. It’s going to be a success when we take a nice image, or even any image at all. Everything that happens is going to be—wow, we did it!”



During the final fight scene of the 2011 film Real Steel, two giant robots battle it out in a boxing ring—and the underdog is losing. Before the final round, Hugh Jackman’s character, Charlie, takes his robot aside and tells it to watch him and follow his lead from the sidelines.

The final round commences, Charlie puts his guard up, and—at just the right moment—throws a mean uppercut. His robot mimics the same motion in the ring, rocking the bigger robot hard. The pair then unleash a flurry of punches against their opponent in an epic comeback before the round ends, just before the underdog can complete the knockout.

Inspired by this scene, researchers have created a similar teleoperated robot, called QIBBOT, that mimics the real movements of a human fighter. And not only can this robot pack a punch, it does so with unprecedented speed. With a latency of just 12 milliseconds, it is likely the fastest teleoperated robot created to date, its creators claim.

The researchers then put QIBBOT to the test—against an AI-guided opponent that learns and adapts while it fights. You can watch the battle here:

World’s Fastest Tele-Robot [Big Robot for Fighting Game with VR Controller: QIBBOT] Qibo Robot Company

Yining (James) Geng is the CEO of Qibo Robot Co. in Weihai, China, which led the development of QIBBOT. “Taking inspiration from the film [“Real Steel”] and incentivized by the entertainment value, we decided to see if we can realize the concept of real-life fighting robots,” he says. “We thought players’ experience with big and real robots would be very new and different from computer games.”

Several different teleoperated robots already exist, but these have completely different purposes, and most tend to be small and only move at slow or medium speeds. “Fast speed is the first priority of fighting robots,” Geng says.

With teleoperated robots, there are a lot of different aspects that factor into their speed, including communication, interface, actuation, transmission, the controller, and computation.

To help QIBBOT achieve high speeds, Geng’s team focused on addressing issues related to the robot’s mechanics and controller. They first built an accurate kinematics and dynamics model, which they used to optimize QIBBOT’s mass distribution, mechanical structure, actuation, and communications. Then, in addition to using a conventional feedback controller that works reactively, they designed a new feedforward controller, which proactively responds to the motion commands from the VR controller. This approach cancels some of the latency caused by other components in the system, according to Geng.
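The benefit of pairing a reactive feedback loop with a proactive feedforward term can be seen on a toy one-joint system. The gains and plant model below are assumptions chosen for illustration, not Qibo’s actual controller: the feedforward term acts on the commanded velocity directly, so the joint doesn’t wait for tracking error to accumulate.

```python
# Hedged sketch of feedforward-plus-feedback control, the general idea
# described above. Gains and the unit-mass plant are illustrative
# assumptions, not QIBBOT's real dynamics or tuning.

def simulate(kp, kd, kff, steps=200, dt=0.005):
    """Track a ramping setpoint with a unit mass; return final tracking error."""
    pos, vel = 0.0, 0.0
    target_vel = 1.0  # commanded ramp, standing in for VR controller motion
    for i in range(steps):
        target = target_vel * i * dt
        error = target - pos
        # Feedback reacts to accumulated error; feedforward anticipates
        # the commanded velocity before any error builds up.
        accel = kp * error - kd * vel + kff * target_vel
        vel += accel * dt
        pos += vel * dt
    return abs(target_vel * (steps - 1) * dt - pos)

lag_fb_only = simulate(kp=100.0, kd=20.0, kff=0.0)
lag_with_ff = simulate(kp=100.0, kd=20.0, kff=20.0)
print(f"tracking error, feedback only:    {lag_fb_only:.4f}")
print(f"tracking error, with feedforward: {lag_with_ff:.4f}")
```

In this toy setup the feedback-only controller settles into a steady lag behind the moving setpoint, while the feedforward term drives that lag toward zero—the same effect, in miniature, as canceling latency in the teleoperation chain.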

To create a formidable robot opponent that is guided by AI, Geng and his colleagues combined several different existing AI programs. Collectively, these programs help the autonomous AI-guided opponent distinguish between attack and defense, generate fighting strategies, and reference a library of data on the parameters for specific fighting scenarios. As seen in the video above, the AI opponent can learn moves while in the midst of a fight with the robot guided by Geng.

While the battle is entertaining and undoubtedly looks fun, there are still some aspects of QIBBOT that need to be addressed.

“We optimized the mechanical system design of QIBBOT for fast response, and that has a price—the accuracy,” says Geng. He notes that although accuracy may not matter as much in a fight scenario, it would be important if this tech was adapted for other purposes.

He says QIBBOT is just a prototype for testing the design and controller. “It has some problems, like unwanted vibration, unnatural motion styles, et cetera. We are designing a new robot that will have dual arms and more joints in the body and arms, and will solve these problems.”



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

RoboCup 2023: 4–10 July 2023, BORDEAUX, FRANCE
RSS 2023: 10–14 July 2023, DAEGU, SOUTH KOREA
IEEE RO-MAN 2023: 28–31 August 2023, BUSAN, SOUTH KOREA
IROS 2023: 1–5 October 2023, DETROIT
CLAWAR 2023: 2–4 October 2023, FLORIANOPOLIS, BRAZIL
Humanoids 2023: 12–14 December 2023, AUSTIN, TEXAS

Enjoy today’s videos!

Biology has the best grippers, hands-down. Even when it’s not hands.

And look at this cute li’l guy!

[Paper]

RoboCat is a self-improving AI agent for robotics that learns to perform a variety of tasks across different arms, and then self-generates new training data to improve its technique.

[DeepMind]

uMobileLab is a mobile and versatile collaborative lab automation solution that integrates seamlessly into existing lab workflows. It increases efficiency during workload fluctuations and enables the efficient completion of tasks, while navigating safely in various lab settings and environments.

You know it’s a robot that does science, because it wears glasses.

[United Robotics]

The most interesting video is right at the end, where this collision-tolerant quadrotor starts pushing a big box around.

[ASU]

New work from Carnegie Mellon University has enabled robots to learn household chores by watching videos of people performing everyday tasks in their homes. The research could help improve the utility of robots in the home, allowing them to assist people with tasks like cooking and cleaning. Two robots successfully learned 12 tasks including opening a drawer, oven door and lid; taking a pot off the stove; and picking up a telephone, vegetable or can of soup.

[CMU]

The objective of the HomeRobot: Open Vocabulary Mobile Manipulation (OVMM) Challenge is to create a platform that enables researchers to develop agents that can navigate unfamiliar environments, manipulate novel objects, and move away from closed object classes towards open-vocabulary natural language. This challenge aims to facilitate cross-cutting research in embodied AI using recent advances in machine learning, computer vision, natural language, and robotics.

The winner will receive their own Stretch robot!

[HomeRobot]

Thanks, Aaron!

Aerial deployment and retrieval of a twin-arm manipulator that resembles a human being were successfully demonstrated in order to carry out the installation of specialized bird-flight diverters on a real power line.

[GRVC]

Whether intentional or not, this is pretty hilarious.

[Engineered Arts]

Sanctuary’s robot is ready for preschool. Sounds like a slight, but it’s really not, for a robot.

[Sanctuary]

This wheelchair tennis-playing robot from Georgia Tech is named ESTHER (Experimental Sport Tennis wHEelchair Robot). It’s a stretch, but it gets a pass since it was named after arguably the greatest wheelchair tennis player, Esther Vergeer.

[Georgia Tech]

This video is not that impressive, but I still really like the concept behind these squishy robots.

[Squishy Robotics]



This is a guest post. The views expressed here are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE.

Researchers at the Small Artifacts Lab (SMART LAB) at the University of Maryland have been working on a small wearable robot called Calico. Weighing just 18 grams, the robot attaches to a special track sewn on top of your clothing and gets to work by traversing your garments, which allows it to do all kinds of things, from acting as a stethoscope that listens to your heart and lungs to coaching you through a fitness class.

University of Maryland

One of the biggest challenges for Calico is localization: GPS, of course, isn’t precise enough to help you determine where the robot is on your clothing. The researchers solved this problem by embedding neodymium magnets into the clothing track at even intervals, which can be used as markers. With onboard sensors, Calico can detect these magnets and use them to estimate its current location, allowing it to effectively plan a path across your body. It’s a very effective system, as the robot never missed a single marker throughout the development cycle.
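The magnet-marker scheme amounts to counting milestones plus dead reckoning between them: each detection re-anchors the estimate, so odometry drift never accumulates past one marker interval. A minimal sketch, with an assumed marker spacing rather than Calico’s actual track layout:

```python
# Hedged sketch of milestone-based localization like Calico's: magnets
# sewn into the track at even intervals act as markers, and odometry
# fills in the distance since the last one. Spacing and sensor behavior
# here are illustrative assumptions.

MAGNET_SPACING_MM = 30.0  # assumed even spacing between markers

def estimate_position(detections, mm_since_last_magnet):
    """Position along the track: markers passed plus odometry remainder."""
    return detections * MAGNET_SPACING_MM + mm_since_last_magnet

def update(state, sensed_magnet, delta_mm):
    """Advance the estimate; a detection snaps odometry drift back to zero."""
    detections, residual = state
    if sensed_magnet:
        return detections + 1, 0.0   # re-anchor at the known marker position
    return detections, residual + delta_mm

# Robot drives 75 mm along the track, passing magnets at 30 mm and 60 mm.
state = (0, 0.0)
for sensed, moved in [(False, 15.0), (True, 0.0), (False, 15.0),
                      (True, 0.0), (False, 15.0)]:
    state = update(state, sensed, moved)
print(f"estimated position: {estimate_position(*state)} mm")
```

The design choice here is that accuracy depends only on the marker spacing, not on how far the robot has traveled in total.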

The Calico robot can carry a 20-gram payload. Depending on the direction of travel, the robot can achieve speeds between 115 millimeters per second and 227 mm/s. Thanks to the low-power design, the little rover’s 100 milliampere-hour battery will last more than 8 hours in an idle state or 30 minutes with continuous movement. Wireless charging could further extend the operating time of the robot.

When describing their research, the SMART LAB members came up with many applications for their wearable. A microphone and a stethoscope add-on enable the robot to sense vital signs, for example. The robot could travel to a predefined location to listen to your organs, or a doctor could teleoperate it in real time. And because the system already has an accelerometer, it could be used to detect falls.

If you want to learn how to dance or be guided through a workout routine, the system can guide you through some moves, track your form, and provide haptic feedback on your performance. Extending that general idea to medical applications, your little buddy could be used for rehabilitation as a motivator to perform exercises and as a progress monitor.

A wearable assistant with no display makes it challenging to provide meaningful feedback to the user. To overcome this issue, the researchers came up with data physicalization. For example, the device can show your progress on specific tasks by turning your arm into a physical progress bar. As you get closer to hitting your daily goals, the device will move further up your arm.

In the future, your assistant could be personalized through accessories like fur and googly eyes. University of Maryland

And of course, a purely “show-off” application exists for this wearable. With the addition of some fur and googly eyes, having a fluffy friend wandering around you at all times would get you some reactions. If you add sound, LEDs, or even displays, you have a guaranteed conversation starter on yourself.

For more information about Calico, check out the Small Artifacts Lab website and the paper describing this work in more detail published 7 September 2022 in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

RoboCup 2023: 4–10 July 2023, BORDEAUX, FRANCE
RSS 2023: 10–14 July 2023, DAEGU, SOUTH KOREA
IEEE RO-MAN 2023: 28–31 August 2023, BUSAN, SOUTH KOREA
IROS 2023: 1–5 October 2023, DETROIT
CLAWAR 2023: 2–4 October 2023, FLORIANOPOLIS, BRAZIL
Humanoids 2023: 12–14 December 2023, AUSTIN, TEXAS

Enjoy today’s videos!

The Mori3 is a modular robot built by the Reconfigurable Robotics Lab at EPFL. The Mori3 can change its own shape and function through changing the way its modules interconnect. Each module is its own robot; each has its own power, motors, and sensors. By themselves, they can drive around on the ground and change the length of each of their triangular edges. However, working together, they function as a complete system capable of achieving many different types of tasks. The Mori3 is geared towards difficult-to-reach environments where the task isn’t always known ahead of time, such as outer space.

[ EPFL ]

This video details the creation of the Dingo, a low-cost robot quadruped designed and built by Alexander Calvert and Nathan Ferguson as a capstone engineering project for the bachelor of Robotics and Mechatronics Engineering at Monash University. Total cost is about $1,300 AUD. The full bill of materials can be found on GitHub.

[ GitHub ]

Thanks, Michael!

ABB Robotics and Junglekeepers, a nonprofit dedicated to replanting the Amazon rainforest in Peru, have joined forces to work on a special pilot program. Our mission: to find ways of using technology to help with one of the biggest problems of our time, the destruction of the rainforest, while freeing up rangers’ time to patrol and teach locals about the importance of preserving this precious resource.

[ ABB ]

We are pleased to share with you the latest progress in our flying humanoid robot project. Jet-HR2 has been able to track the desired trajectory in the air for hovering. The robot made four 90-degree yaw turns in the air according to a predefined square trajectory and returned to its initial position for landing. The relevant technical details will be specified in an upcoming paper.

My favorite part of this video is the poor guy in the back, desperately covering his ears.

Thanks, Zhifeng!

This paper presents for the first time a shape memory alloy (SMA) wire-reinforced soft bending actuator made out of a castor oil-based self-healing polymer, with the incorporated ability to recover from large incisions via shape memory assisted healing. The integrated SMA wires serve three major purposes: (i) Large incisions are closed by contraction of the current-activated SMA wires that are integrated into the chamber. These pull the fracture surfaces into contact, enabling the healing. (ii) The heat generated during the activation of the SMA wires is synergistically exploited for accelerating the healing. (iii) Lastly, during pneumatic actuation, the wires constrain radial expansion and one-side longitudinal extension of the soft chamber, effectuating the desired actuator bending motion.

[ Nature ]

Thanks, Bram!

Researchers led by the University of California, San Diego, have developed a new model that trains four-legged robots to see more clearly in 3D. The advance enabled a robot to autonomously cross challenging terrain with ease—including stairs, rocky ground, and gap-filled paths—while clearing obstacles in its way.

[ UCSD ]

Thanks, Liezel!

Grabbing an object as thin as a plastic bag is an incredible challenge for robotic hands, but Phoenix robots have the fine manipulation required to do it.

[ Sanctuary ]

Catching high-speed targets in flight is a complex and typically highly dynamic task. However, existing methods require manual setting of catching height or time, resulting in a lack of adaptability and flexibility and an inability to deal with multiple targets. To bridge this gap, we propose a planning-with-decision scheme called Catch Planner. Extensive experiments are carried out in real-world and simulated scenes to verify the robustness and expansibility when facing a variety of high-speed flying targets.

[ HKUST ]

This paper discusses the incorporation of a pair of Supernumerary Robotic Limbs (SuperLimbs) onto the next generation of NASA space suits. The wearable robots attached to the space suit assist an astronaut in performing extravehicular activities (EVAs). A full-scale prototype of Space Suit SuperLimbs was constructed and tested. Results from the experimentation indicated that with the aid of SuperLimbs, metabolic loading during EVAs is reduced significantly.

[ MIT ]

The next frontier of technology is here. Meet Torc Robotics’ self-driving semitrucks, the next great advancement in the vast world of transportation. With innovation, safety, and efficiency in mind, Torc Robotics has engineered autonomous vehicles that are revolutionizing our present and future.

[ Torc Robotics ]

The Bristol Robotics Lab is full of good people.

[ BRL ]

In honor of Raj Reddy’s 60th birthday in 1998, a symposium and celebration was held where many leaders in the field of AI and computer science spoke. Herbert Simon gave this forward-looking talk that is not only historically relevant but also speaks to the issues surrounding artificial intelligence that we are discussing today.

[ CMU ]

On this episode of The Robot Brains Podcast, Pieter interviews Raff D’Andrea.

[ Robot Brains ]

This GRASP on Robotics talk is from Pauline Pounds and the University of Queensland: Drones, Bipeds and Sensors–10 Years of the UQ Robotics Design Lab.

[ UPenn ]
