
Code for controlling Spot in simulation #27

Open
AayushManShrestha opened this issue Nov 14, 2024 · 2 comments

Comments

@AayushManShrestha

In the YouTube demo video, a Spot robot is controlled via ROSA, but I couldn't find the code used in the demo. It would be helpful if the code for controlling a Spot robot in simulation were made available. Thank you.

@RobRoyce
Collaborator

Setting up ROSA for physical robot control is implementation-specific; that's one of the reasons we decided to stay unopinionated when it comes to uplink command and control.

For the demo, we basically gave ROSA an Xbox controller. ROSA publishes joystick messages (axes and button presses) to the /joy topic, which is interpreted by the teleop_twist_joy node, which in turn publishes twists to cmd_vel. We also have a node that multiplexes all /joy publishers so that ROSA commands can always be overridden by human input (JoyMux overrides ROSA commands whenever it detects the ESTOP button on the Xbox controller).

```mermaid
graph TD;
    ROSA --> |/joy| JoyMux{JoyMux};
    XboxCtrl --> |/joy| JoyMux{JoyMux};
    JoyMux --> |/joy_cmd| TeleopJoy;
    TeleopJoy --> |/cmd_vel| Spot
```
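For illustration, a stripped-down JoyMux-like node might look roughly like this (a sketch assuming ROS 1 / rospy and the topic names from the diagram; the ESTOP button index and the override logic are placeholders, not the actual demo code):

```python
#!/usr/bin/env python
# Sketch of a JoyMux-style priority node (assumptions: ROS 1 / rospy,
# topics /joy and /joy_cmd as in the diagram above; the ESTOP button
# index is a hypothetical placeholder).
import rospy
from sensor_msgs.msg import Joy

ESTOP_BUTTON = 1  # placeholder index for the Xbox ESTOP button


class JoyMux:
    def __init__(self):
        self.human_override = False
        self.pub = rospy.Publisher("/joy_cmd", Joy, queue_size=10)
        rospy.Subscriber("/joy", Joy, self.joy_cb)

    def joy_cb(self, msg):
        # Latch the human override as soon as the ESTOP button is pressed.
        if len(msg.buttons) > ESTOP_BUTTON and msg.buttons[ESTOP_BUTTON]:
            self.human_override = True
        # Forward messages downstream; a real mux would also track which
        # source (ROSA vs. the physical controller) sent each message and
        # drop only the ROSA messages while the override is active.
        if not self.human_override:
            self.pub.publish(msg)


if __name__ == "__main__":
    rospy.init_node("joy_mux")
    JoyMux()
    rospy.spin()
```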

Can you tell me how you're currently controlling the robot (e.g. publishing to /cmd_vel)? And which capabilities you're interested in from the demo? We will consider creating a default Spot embodiment if it can be generalized to cover most implementations.

Alternatively, you can check out the publish_twist_to_cmd_vel tool implemented for the turtle_agent, which can easily be adapted as a custom tool for your use case.
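For illustration, a custom tool along those lines might look roughly like the following (a sketch assuming ROS 1 and a LangChain-style @tool decorator, with a node already initialized by the agent; the tool name, topic, and parameters are placeholders rather than the actual turtle_agent code):

```python
# Sketch of a custom twist-publishing tool (assumptions: ROS 1 / rospy,
# a LangChain-style @tool decorator, and a ROS node already initialized
# by the agent; names and defaults here are placeholders).
import rospy
from geometry_msgs.msg import Twist
from langchain.agents import tool


@tool
def publish_twist_to_spot(linear_x: float, angular_z: float, duration: float = 1.0) -> str:
    """Publish a velocity command to Spot's cmd_vel topic for a fixed duration."""
    pub = rospy.Publisher("/spot/cmd_vel", Twist, queue_size=10)
    rospy.sleep(0.5)  # give the publisher time to register with subscribers
    twist = Twist()
    twist.linear.x = linear_x
    twist.angular.z = angular_z
    rate = rospy.Rate(10)  # re-publish at 10 Hz for the requested duration
    end_time = rospy.Time.now() + rospy.Duration(duration)
    while rospy.Time.now() < end_time and not rospy.is_shutdown():
        pub.publish(twist)
        rate.sleep()
    pub.publish(Twist())  # send a zero twist to stop the robot
    return f"Published twist (vx={linear_x}, wz={angular_z}) for {duration}s"
```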

@AayushManShrestha
Author

I am currently controlling the Spot robot by publishing velocity commands to the /spot/cmd_vel topic, which are then processed by a ROS node that executes the desired movements through the Spot SDK. If you are considering creating a general implementation, supporting Spot with the arm would be better.
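In outline, the bridge follows this pattern (a simplified sketch rather than the actual node; send_velocity_to_spot stands in for the Spot SDK call that issues the command):

```python
# Simplified sketch of the /spot/cmd_vel -> Spot SDK bridge pattern
# (send_velocity_to_spot is a placeholder for the real SDK call).
import rospy
from geometry_msgs.msg import Twist


def send_velocity_to_spot(v_x, v_y, v_rot):
    # Placeholder: the real node issues a velocity command through the
    # Spot SDK (bosdyn-client) here.
    pass


def cmd_vel_cb(msg):
    # Map the incoming Twist onto Spot's planar velocity command.
    send_velocity_to_spot(msg.linear.x, msg.linear.y, msg.angular.z)


if __name__ == "__main__":
    rospy.init_node("spot_cmd_vel_bridge")
    rospy.Subscriber("/spot/cmd_vel", Twist, cmd_vel_cb)
    rospy.spin()
```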


I am interested in the following capabilities:

  1. Controlling Spot's movement using natural language instructions
  2. Controlling Spot's arm to manipulate objects
  3. Capturing visual data from the simulation environment
  4. Integrating VLMs (I don't know if this goes beyond the scope of the project, but it would be nice)
