Zurich, September 4, 2025 — In a striking demonstration of robotic agility and intelligence, Swiss researchers have taught a four-legged robot named ANYmal to play badminton. The robot can now sustain rallies of up to 10 shots with human opponents, a notable advance in AI-driven dynamic physical coordination.
From Simulation to Shuttlecock Return
ANYmal’s prowess stems from an advanced training regime using reinforcement learning, where the robot honed its badminton skills entirely in a virtual environment before taking to the court. Simulations included thousands of shuttlecock serves to refine its timing, racket angle, swing speed, and movement efficiency across the court.
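The training loop described above can be illustrated with a toy sketch: a simulated shuttlecock on a ballistic arc, a single tunable parameter (when to swing), and a search over simulated episodes for the timing that intercepts the shuttle at racket height. The physics model, the parameters, and the random-search optimizer here are all simplifications for illustration; the actual work uses a learned neural network policy and a far richer simulator.

```python
import random

G = 9.81  # gravitational acceleration, m/s^2

def shuttle_height(t, v0=8.0, h0=2.5):
    """Toy ballistic model of shuttlecock height over time (drag ignored)."""
    return h0 + v0 * t - 0.5 * G * t * t

def reward(swing_time, racket_height=1.8):
    """Reward is highest when the shuttle crosses racket height at swing time."""
    return -abs(shuttle_height(swing_time) - racket_height)

def train(episodes=2000, seed=0):
    """Random-search stand-in for reinforcement learning: sample swing
    timings across many simulated serves and keep the best-scoring one."""
    rng = random.Random(seed)
    best_t, best_r = 0.0, reward(0.0)
    for _ in range(episodes):
        t = rng.uniform(0.0, 2.0)
        r = reward(t)
        if r > best_r:
            best_t, best_r = t, r
    return best_t, best_r

best_time, best_score = train()
```

The point of the sketch is the sim-to-real pattern: thousands of cheap simulated serves refine the timing parameter before anything touches real hardware.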
Once the trained controller was transferred to the physical robot, ANYmal could track a fast-moving shuttlecock using stereo vision, navigate the court, and strike accurately, with swing speeds reaching around 39 feet per second (roughly 12 meters per second). While that is only about half the pace of an average amateur human player, the precision and coordination are remarkable.
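Stereo vision recovers distance from the horizontal offset (disparity) of the same point seen by two side-by-side cameras: depth = focal length × baseline / disparity. A minimal sketch of that calculation, with illustrative camera parameters that are not ANYmal's actual rig:

```python
def stereo_depth(x_left, x_right, focal_px=600.0, baseline_m=0.12):
    """Estimate distance to a point from its pixel disparity between
    a left and right camera. focal_px (focal length in pixels) and
    baseline_m (camera separation) are hypothetical example values."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_px * baseline_m / disparity

# A shuttle seen at x=340 px in the left image and x=322 px in the right
# image (disparity 18 px) would be about 4 meters away with these parameters.
distance_m = stereo_depth(340, 322)
```

Running this estimate on every camera frame gives the controller a stream of shuttlecock positions to predict the interception point from.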
Coordination Between Limbs and Vision
Balancing coordinated leg movements and racket swings posed a significant challenge. ANYmal needed to keep its cameras on the shuttlecock to track it, but holding the target in view often limited how fast it could move. Its neural controller had to manage this trade-off, shifting between standing, scuttling, and galloping depending on the shuttlecock's distance.
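The distance-dependent gait switching can be caricatured as a simple rule. The thresholds below are hypothetical, and the real system is a learned neural policy rather than hand-coded rules; the sketch only shows the shape of the trade-off between camera stability and speed:

```python
def choose_gait(shuttle_distance_m):
    """Pick a movement mode from the shuttlecock's distance.
    Thresholds are invented for illustration; ANYmal's learned
    controller makes this trade-off implicitly, not via if-rules."""
    if shuttle_distance_m < 1.0:
        return "stand"    # shot reachable in place; keep the camera steady
    if shuttle_distance_m < 3.0:
        return "scuttle"  # moderate repositioning while preserving tracking
    return "gallop"       # distant shot; trade visual stability for speed
```

Nearby shots favor stillness (better tracking), while distant shots force the robot to accept a shakier camera view in exchange for covering ground quickly.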
Notably, after striking, the robot instinctively returned to a central position—much like a human preparing for the next shot—despite not explicitly being taught to do so. This behavior emerged organically from its training model, underscoring the sophistication of the control system linking perception, locomotion, and action.
Beyond the Court: Real-World Applications Ahead
While the badminton demonstration is playful, the implications are far-reaching. Researchers behind the project envision applications in high-stakes environments—such as disaster relief, industrial tasks, or search-and-rescue operations—where robots must move precisely in unpredictable and dynamic settings. The ability to coordinate complex limb movements based on real-time visual input can elevate robotic utility in many safety-critical domains.