Commit aa4c23b: Update README.md
1 parent c28ab63

README.md

Lines changed: 6 additions & 6 deletions
Note that the current demo has only been tested on the JACO 2.

## Running the Demo in Simulation

Run the following commands from your ROS workspace (each in a new terminal unless it terminates immediately). A consolidated sketch of these steps follows the list.

1. `catkin build`
1. `source devel/setup.bash` (in *every* new terminal)
1. `roscore`
1. `rviz`
1. `roslaunch libada simulation.launch` (will put 2 simulated pieces of *cantaloupe* on the plate)
1. `roslaunch ada_feeding feeding.launch` (will quit after writing ROS parameters)
1. `cd my_catkin_workspace/devel/bin/` and `./feeding`
1. In RViz, subscribe to the topic `feeding/update/InteractiveMarkers` ("Add" -> "By Topic") to actually see the robot.
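
The same sequence gathered in one place as a minimal sketch (the `~/my_catkin_workspace` location is an assumption; substitute your own workspace path):

```bash
# Sketch of the simulation bring-up; run each block in its own terminal.
# Assumption: the workspace lives at ~/my_catkin_workspace.

# Terminal 1: build, source, and start the ROS master.
cd ~/my_catkin_workspace
catkin build
source devel/setup.bash
roscore

# Terminal 2: visualization.
source ~/my_catkin_workspace/devel/setup.bash
rviz

# Terminal 3: simulated ADA with cantaloupe on the plate.
source ~/my_catkin_workspace/devel/setup.bash
roslaunch libada simulation.launch

# Terminal 4: write the feeding ROS parameters (this launch quits on its own),
# then run the demo executable.
source ~/my_catkin_workspace/devel/setup.bash
roslaunch ada_feeding feeding.launch
cd ~/my_catkin_workspace/devel/bin
./feeding
```
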
## Running the Demo on the JACO 2

### Additional Workspace Setup

1) Build your workspace with `catkin build`
2) Download the PRL checkpoint with `. src/pytorch_retinanet/load_checkpoint.sh` (or train your own checkpoint)
3) Do the same for the bite selection package: `. src/bite_selection_package/load_checkpoint.sh` (or train your own checkpoint)
4) Make sure you source `devel/setup.bash` in *every* terminal you use (see the combined sketch below).
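
A rough one-shot version of this setup, run from the workspace root (sketch only; the commands are exactly those listed in the steps above):

```bash
# One-time workspace setup, run from the root of your catkin workspace.
catkin build
. src/pytorch_retinanet/load_checkpoint.sh        # PRL RetinaNet checkpoint (or train your own)
. src/bite_selection_package/load_checkpoint.sh   # bite selection checkpoint (or train your own)
source devel/setup.bash                           # repeat this in every new terminal you open
```
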

### Running the Demo

1) Start `roscore`, `rviz`
2) Turn on ADA
3) Once the lights on the joystick go solid, home ADA by holding the orange button until the robot stops moving.
4) `ssh nano` (you may need to add `nano` to your `.ssh/config`; this is the NVIDIA Jetson Nano on the robot). Once there, set your ROS Master using `uselovelace`, `useweebo`, or `useweebowired` (or set your `ROS_MASTER_URI` manually), then execute `./run_camera.sh` to start streaming RGBD data.
   * You may have to adjust the camera exposure, depending on the lighting conditions. Run either `run_adjust_camera_daylight.sh` or `run_adjust_camera_all.sh` after `run_camera.sh`. Check the image stream in RViz by adding the image topic `/camera/color/image_raw/color`. If some area looks too bright, burnt out, or saturated, reduce the exposure.
5) `roslaunch forque_sensor_hardware forque.launch` (optionally add `forque_ip:=<IPv4>` if your Net-FT is on a non-default IP)
6) `rosrun face_detection face_detection`
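
The command-line portion of the steps above, gathered as a minimal sketch; the split between workstation and Jetson terminals is only indicative, and the exposure script choice depends on your lighting:

```bash
# Sketch of the bring-up for the real robot; run each block in its own terminal.
# Assumes the workspace is built and devel/setup.bash is sourced in every terminal.

# Workstation, terminals 1 and 2: ROS master and visualization.
roscore
rviz

# On the Jetson Nano (after `ssh nano`): pick the ROS master, then start the camera.
useweebo                          # or uselovelace / useweebowired, or set ROS_MASTER_URI manually
./run_camera.sh
./run_adjust_camera_daylight.sh   # optional; use run_adjust_camera_all.sh for other lighting

# Workstation: force/torque sensor and face detection.
roslaunch forque_sensor_hardware forque.launch   # add forque_ip:=<IPv4> if your Net-FT is on a non-default IP
rosrun face_detection face_detection
```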
