MIT and NVIDIA researchers have developed a framework that lets users correct a robot's behavior during deployment through intuitive interactions: pointing at an object, tracing the desired trajectory on a screen, or physically nudging the robot's arm. Because corrections steer the existing policy rather than requiring new data collection and retraining, they can be applied in real time and improve task success rates. Future work aims to speed up the sampling procedure and to evaluate the approach in new environments.
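The article does not detail the underlying mechanism, but a minimal sketch can convey the general idea of steering a sampling-based policy with a user correction instead of retraining it. Everything below is an illustrative assumption, not the authors' implementation: the policy is assumed to propose several candidate trajectories, and a user-traced path simply re-weights them.

```python
# Hypothetical sketch: deployment-time correction of a sampling-based policy.
# All names and the candidate-sampling scheme are illustrative assumptions.
import numpy as np

def propose_candidates(policy_mean, n_candidates=32, noise=0.05, rng=None):
    """Sample candidate trajectories around the policy's nominal plan
    (a stand-in for whatever learned sampler the real policy uses)."""
    rng = np.random.default_rng() if rng is None else rng
    # policy_mean: (horizon, 3) nominal xyz plan for the end effector
    return policy_mean[None] + noise * rng.standard_normal((n_candidates, *policy_mean.shape))

def score_against_user_trace(candidates, user_trace):
    """Score each candidate by (negative) mean distance to the user's traced
    path, resampled to the planning horizon."""
    idx = np.linspace(0, len(user_trace) - 1, candidates.shape[1]).round().astype(int)
    target = user_trace[idx]                                   # (horizon, 3)
    dists = np.linalg.norm(candidates - target, axis=-1).mean(axis=1)
    return -dists

def select_corrected_plan(policy_mean, user_trace):
    """Pick the sampled plan closest to the user's correction.
    Note: no gradient updates, no new data collection, no retraining."""
    candidates = propose_candidates(policy_mean)
    scores = score_against_user_trace(candidates, user_trace)
    return candidates[np.argmax(scores)]

if __name__ == "__main__":
    nominal = np.linspace([0.0, 0.0, 0.2], [0.5, 0.0, 0.2], 20)  # robot's original plan
    trace = np.linspace([0.0, 0.0, 0.2], [0.5, 0.2, 0.2], 15)    # user traces a path veering left
    plan = select_corrected_plan(nominal, trace)
    print("corrected endpoint:", plan[-1])
```

The same scoring hook could accept a pointed-at goal (score by final-state distance) or a physical nudge (score by agreement with the displaced arm pose), which is one way a single framework could support all three interaction modes described above.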