Demos like this illustrate the robots' dexterity and their ability to maintain balance while performing quick actions. Even though the routine is programmed ahead of time, the robot still has to adjust on the fly.
This is the foundation for making robots that can navigate and interact with their environment effectively. It's basically the robot's nervous system and sense of balance.
Higher-level AI models can then focus on setting goals and accomplishing them, delegating high-level commands to the underlying platform to execute.
It's akin to the way humans operate as well. When you go to grab a cup from the table, you think about your actions at a high level. You're not consciously aware of all the muscle contractions and movement adjustments that are constantly happening underneath. You're just thinking of the high-level goal: move my hand in this direction until I reach the cup, then grasp it, then retract the hand.
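As a rough illustration of that split, here's a minimal sketch in Python. Everything in it is hypothetical (the `Planner`, `Platform`, and `Command` names don't correspond to any real robot API); it just shows the shape of the idea, with a goal-level layer issuing coarse commands and a low-level platform owning the details:

```python
from dataclasses import dataclass

@dataclass
class Command:
    name: str            # e.g. "reach", "grasp", "retract"
    target: tuple        # target position in workspace coordinates

class Platform:
    """Low-level platform: owns balance, gait, and joint control."""

    def execute(self, cmd: Command) -> bool:
        # On a real robot this would run a high-rate feedback loop,
        # constantly correcting for balance and disturbances.
        # The high-level layer never touches joint angles directly.
        print(f"executing {cmd.name} toward {cmd.target}")
        return True  # report success/failure upward

class Planner:
    """High-level layer: thinks in goals, not muscle contractions."""

    def __init__(self, platform: Platform):
        self.platform = platform

    def grab_cup(self, cup_pos: tuple) -> bool:
        # The "grab a cup" goal decomposes into a few coarse commands;
        # everything below this level is the platform's problem.
        steps = [
            Command("reach", cup_pos),
            Command("grasp", cup_pos),
            Command("retract", (0.0, 0.0, 0.0)),  # hypothetical home pose
        ]
        return all(self.platform.execute(s) for s in steps)

if __name__ == "__main__":
    Planner(Platform()).grab_cup((0.4, 0.1, 0.9))
```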
I expect that we're basically going to get real-life Star Wars droids in a few years. You'll have affordable robots that you can communicate with using natural language, and that will be able to accomplish a lot of common tasks in human environments.