├── images
│   ├── 1.png
│   ├── 2.png
│   ├── 3.png
│   ├── w3_o1.png
│   ├── w3_or2.png
│   ├── w3_or3.png
│   ├── w3_or4.png
│   └── w3_or5.png
├── LICENSE
├── irobot_vision_tutorial
│   ├── scripts
│   │   ├── y_coordinate.py
│   │   ├── rotate_robot.py
│   │   ├── detect_face.py
│   │   └── ball_tracking1.py
│   ├── package.xml
│   └── CMakeLists.txt
├── README.md
└── README_full.md
/images/1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/HEAD/images/1.png
--------------------------------------------------------------------------------
/images/2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/HEAD/images/2.png
--------------------------------------------------------------------------------
/images/3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/HEAD/images/3.png
--------------------------------------------------------------------------------
/images/w3_o1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/HEAD/images/w3_o1.png
--------------------------------------------------------------------------------
/images/w3_or2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/HEAD/images/w3_or2.png
--------------------------------------------------------------------------------
/images/w3_or3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/HEAD/images/w3_or3.png
--------------------------------------------------------------------------------
/images/w3_or4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/HEAD/images/w3_or4.png
--------------------------------------------------------------------------------
/images/w3_or5.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/HEAD/images/w3_or5.png
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2018 SMART Lab at Purdue University
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/irobot_vision_tutorial/scripts/y_coordinate.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import rospy
3 | from std_msgs.msg import Float32
4 | from geometry_msgs.msg import Twist
5 | msg=Twist()
6 | def callback(data):
7 |
8 |     rospy.loginfo(rospy.get_caller_id() + " I heard %f", data.data)
9 |     pub=rospy.Publisher('cmd_vel',Twist,queue_size=10)
10 |     if(data.data>300):
11 |         msg.linear.x=0.8
12 |     elif((data.data>100) and (data.data<300)):
13 |         msg.linear.x=0.5
14 |     elif((data.data > 0) and (data.data<100)):
15 |         msg.linear.x=0.1
16 |     else:
17 |         msg.angular.z=-0.174533
18 |     pub.publish(msg)
19 |
20 | def listener():
21 |
22 |     # In ROS, nodes are uniquely named. If two nodes with the same
23 |     # name are launched, the previous one is kicked off. The
24 |     # anonymous=True flag means that rospy will choose a unique
25 |     # name for our 'listener' node so that multiple listeners can
26 |     # run simultaneously.
27 |     rospy.init_node('y_coordinate', anonymous=True)
28 |
29 |     rospy.Subscriber('coordinates', Float32, callback)
30 |
31 |     # spin() simply keeps python from exiting until this node is stopped
32 |     rospy.spin()
33 |
34 | if __name__ == '__main__':
35 |     listener()
36 |
--------------------------------------------------------------------------------
/irobot_vision_tutorial/package.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0"?>
2 | <package format="2">
3 |   <name>irobot_vision_tutorial</name>
4 |   <version>0.1.0</version>
5 |   <description>The irobot_vision_tutorial package for the Purdue SMART lab ROS tutorial on controlling iRobot Create2 based on keyboard and vision-based approaches such as face detection and colored object detection</description>
6 |
7 |   <maintainer email="ramviyas@purdue.edu">Ramviyas Parasuraman</maintainer>
8 |   <license>MIT</license>
9 |   <author>Shaocheng Luo</author>
10 |   <author>Arabinda Samantaray</author>
11 |
12 |   <buildtool_depend>catkin</buildtool_depend>
13 |   <build_depend>roscpp</build_depend>
14 |   <build_depend>rospy</build_depend>
15 |   <build_depend>std_msgs</build_depend>
16 |   <build_export_depend>roscpp</build_export_depend>
17 |   <build_export_depend>rospy</build_export_depend>
18 |   <build_export_depend>std_msgs</build_export_depend>
19 |   <exec_depend>roscpp</exec_depend>
20 |   <exec_depend>rospy</exec_depend>
21 |   <exec_depend>std_msgs</exec_depend>
22 |
23 |   <export>
24 |   </export>
25 | </package>
26 |
--------------------------------------------------------------------------------
/irobot_vision_tutorial/scripts/rotate_robot.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import rospy
3 | from std_msgs.msg import Float32
4 | from geometry_msgs.msg import Twist
5 | msg=Twist()
6 | def callback(data):
7 |
8 |     rospy.loginfo(rospy.get_caller_id() + " I heard %f", data.data)
9 |     #pub=rospy.Publisher('cmd_vel',Twist,queue_size=10)
10 |     if(data.data>30):
11 |         msg.angular.z=30*0.0174533
12 |     elif(data.data<-30):
13 |         msg.angular.z=-30*0.0174533
14 |     else:
15 |         msg.angular.z=(data.data)*0.0174533
16 |     pub.publish(msg)
17 |
18 | def listener():
19 |
20 |     # In ROS, nodes are uniquely named. If two nodes with the same
21 |     # name are launched, the previous one is kicked off. The
22 |     # anonymous=True flag means that rospy will choose a unique
23 |     # name for our 'listener' node so that multiple listeners can
24 |     # run simultaneously.
25 |     rospy.init_node('rotate_robot', anonymous=True)
26 |
27 |     rospy.Subscriber('rotation_angle', Float32, callback)
28 |
29 |     # spin() simply keeps python from exiting until this node is stopped
30 |     rospy.spin()
31 |
32 | if __name__ == '__main__':
33 |     rospy.init_node('rotate_robot', anonymous=True)
34 |     pub=rospy.Publisher('cmd_vel',Twist,queue_size=10)
35 |     rospy.Subscriber('rotation_angle', Float32, callback)
36 |     rospy.spin()
37 |
38 |
--------------------------------------------------------------------------------
/irobot_vision_tutorial/scripts/detect_face.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import cv2
3 | import numpy as np
4 | import rospy
5 | from std_msgs.msg import Float32
6 | from sensor_msgs.msg import CompressedImage
7 | import sys
8 | sys.path.append('/opt/ros/indigo/lib/python2.7/dist-packages')
9 |
10 | def image_callback(ros_data):
11 |
12 |     # Decode the compressed ROS image into an OpenCV BGR image and convert it to grayscale
13 |     np_arr = np.fromstring(ros_data.data, np.uint8)
14 |     image_np = cv2.imdecode(np_arr, cv2.IMREAD_COLOR)
15 |     gray=cv2.cvtColor(image_np,cv2.COLOR_BGR2GRAY)
16 |     faces=face_cascade.detectMultiScale(gray,scaleFactor=1.1,minNeighbors=5,minSize=(30, 30),flags=cv2.CASCADE_SCALE_IMAGE)
17 |     for (x,y,w,h) in faces:
18 |         cv2.rectangle(gray,(x,y),(x+w,y+h),(255,0,0),2)
19 |         # Horizontal offset of the face center from the image center (320 px for a 640-px-wide frame),
20 |         # converted to a rotation angle in degrees and published on the 'rotation_angle' topic
21 |         distance_from_center=((2*x+w)/2)-320
22 |         distance_per_degree=4
23 |         rotation_angle=distance_from_center/distance_per_degree
24 |         pub.publish(rotation_angle)
25 |         rospy.loginfo(rotation_angle)
26 |     cv2.imshow('Video',gray)
27 |     cv2.waitKey(2)
28 |
29 |
30 | if __name__ == '__main__':
31 |     try:
32 |         rospy.init_node('detect_face', anonymous=True)
33 |         pub=rospy.Publisher('rotation_angle',Float32,queue_size=1)
34 |         face_cascade=cv2.CascadeClassifier('/home/arabinda/catkin_ws/src/ros_seminar/scripts/haarcascade_frontalface_default.xml')
35 |         sub = rospy.Subscriber("/usb_cam/image_raw/compressed",CompressedImage, image_callback, queue_size = 1)
36 |         rospy.spin()
37 |     except rospy.ROSInterruptException:
38 |         pass
39 |
--------------------------------------------------------------------------------
/irobot_vision_tutorial/scripts/ball_tracking1.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import cv2
3 | import numpy as np
4 | import rospy
5 | from std_msgs.msg import Float32
6 | from collections import deque
7 | import argparse
8 | import imutils
9 | import sys
10 |
11 |
12 | def talker():
13 |     rospy.init_node('ball_tracking1', anonymous=True)
14 |     pub=rospy.Publisher('coordinates',Float32,queue_size=1)
15 |     rate = rospy.Rate(20) # 20 Hz
16 |     ap = argparse.ArgumentParser()
17 |     ap.add_argument("-v", "--video", dest="/home/arabinda/catkin_ws/src/ros_seminar/scripts/ball_tracking_example.mp4",help="path")
18 |     ap.add_argument("-b", "--buffer", type=int, default=64,help="max buffer size")
19 |     args = vars(ap.parse_args())
20 |
21 |     # HSV range of the green object to track
22 |     greenLower = (20, 100, 100)
23 |     greenUpper = (64, 255, 255)
24 |     pts = deque(maxlen=64)
25 |     if not args.get("video", False):
26 |         camera = cv2.VideoCapture(0)
27 |     else:
28 |         camera = cv2.VideoCapture(args["video"])
29 |
30 |     while not rospy.is_shutdown():
31 |         (grabbed, frame) = camera.read()
32 |         if args.get("video") and not grabbed:
33 |             break
34 |         frame = imutils.resize(frame, width=600)
35 |         hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
36 |         mask = cv2.inRange(hsv, greenLower, greenUpper)
37 |         mask = cv2.erode(mask, None, iterations=2)
38 |         mask = cv2.dilate(mask, None, iterations=2)
39 |         cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL,cv2.CHAIN_APPROX_SIMPLE)[-2]
40 |         center = None
41 |         if len(cnts) > 0:
42 |             # Largest green contour: fit a circle around it and compute its centroid
43 |             c = max(cnts, key=cv2.contourArea)
44 |             ((x, y), radius) = cv2.minEnclosingCircle(c)
45 |             M = cv2.moments(c)
46 |             center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
47 |             #print(center)
48 |             pub.publish(int(M["m01"] / M["m00"]))
49 |             rospy.loginfo(int(M["m01"] / M["m00"]))
50 |             rate.sleep()
51 |             if radius > 10:
52 |                 cv2.circle(frame, (int(x), int(y)), int(radius),(0, 255, 255), 2)
53 |                 cv2.circle(frame, center, 5, (0, 0, 255), -1)
54 |         pts.appendleft(center)
55 |         for i in xrange(1, len(pts)):
56 |             if pts[i - 1] is None or pts[i] is None:
57 |                 continue
58 |             thickness = int(np.sqrt(64 / float(i + 1)) * 2.5)
59 |             cv2.line(frame, pts[i - 1], pts[i], (0, 0, 255), thickness)
60 |
61 |         cv2.imshow("Frame", frame)
62 |         if cv2.waitKey(1) & 0xFF==ord('q'):
63 |             break
64 |
65 |     rospy.spin()
66 |
67 | if __name__ == '__main__':
68 |     try:
69 |         talker()
70 |     except rospy.ROSInterruptException:
71 |         # clean up any OpenCV display windows on shutdown
72 |         cv2.destroyAllWindows()
73 |         pass
74 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # ROS Tutorial on Teleoperation and Vision based Robot Control
2 | ROS tutorial by Purdue SMART lab: iRobot Create2 teleoperation and Computer Vision based object detection for mobile robot control.
3 |
4 | # Objectives
5 | This ROS tutorial provides an overview of teleoperating (controlling) an iRobot Create2 (Roomba) mobile robot. Then, we look at ways to make the robot track a ball or a person using OpenCV algorithms. For those who do not have a Create2 robot, we also show how to control a simulated turtle robot (turtlesim) instead of the iRobot in this tutorial.
6 | Additionally, this tutorial will help you learn some basics of computer vision and robot control methods in ROS.
7 |
8 | ## Demonstration video:
9 | Please watch the videos below to get an idea of what is expected to be achieved by this tutorial.
10 |
11 |
12 |
13 | [Demonstration video 1](https://www.youtube.com/watch?v=ZyD-bbF6ts4)
14 |
15 |
16 |
17 | [Demonstration video 2](https://www.youtube.com/watch?v=-P3i7L_g1OM)
18 |
19 |
20 |
21 | # Credits
22 | This tutorial is prepared by Arabinda Samantaray (samantar@purdue.edu) and Shaocheng Luo (scluo@purdue.edu).
23 |
24 | The git repository is maintained by Ramviyas Parasuraman (ramviyas@purdue.edu)
25 |
26 | We acknowledge the following sources which were used to prepare this tutorial:
27 |
28 | - http://wiki.ros.org/ROS/Tutorials
29 |
30 | - http://milq.github.io/install-opencv-ubuntu-debian/
31 |
32 | - https://docs.opencv.org/3.3.0/d7/d8b/tutorial_py_face_detection.html
33 |
34 | - https://www.pyimagesearch.com/2015/09/14/ball-tracking-with-opencv/
35 |
36 | - ROS create_autonomy package summary
37 |
38 | - ROS driver for iRobot Create 1 and 2
39 |
40 | - ROS joy
41 |
42 | # Prerequisites
43 | A basic background of using Linux-based OS, ROS and OpenCV will be required to understand the following tutorial. If you need some help, please visit:
44 |
45 | - http://wiki.ros.org/ROS/Tutorials
46 |
47 | - http://files.ubuntu-manual.org/manuals/getting-started-with-ubuntu/14.04e2/en_US/screen/Getting%20Started%20with%20Ubuntu%2014.04%20-%20Second%20edition.pdf
48 |
49 | - https://docs.opencv.org/3.0-beta/doc/py_tutorials/py_tutorials.html
50 |
51 | - http://wiki.ros.org/rospy_tutorials/Tutorials/WritingImagePublisherSubscriber
52 |
53 | This tutorial was tested successfully on Ubuntu 14.04 with ROS Indigo and OpenCV 3.0.0.
54 |
55 | # Tutorial
56 |
57 | * Section 1: [Installation of relevant dependencies/packages](https://github.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/wiki/Installation)
58 |
59 | * Section 2: [iRobot Create2 robot teleoperation control with Keyboard](https://github.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/wiki/iRobot-control-with-a-keyboard)
60 |
61 | * Section 3: [Rotate the robot to follow a person's face (using HAAR cascade filter and openCV)](https://github.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/wiki/Robot-control-with-face-tracking)
62 |
63 | * Section 4: [Move the robot to follow a green color object (using openCV)](https://github.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/wiki/Robot-control-with-colored-object-detection)
64 |
65 | * Section 5: [Optional extension - iRobot Remote control with a Gamepad/Joystick](https://github.com/SMARTlab-Purdue/ros-tutorial-robot-control-vision/wiki/Robot-control-with-Joystick)
66 |
67 |
68 | # Summary
69 |
70 | After this tutorial, one should be able to teleoperate (control) an iRobot Create2 robot. This tutorial uses ROS Indigo, and we deployed packages including teleop-twist-keyboard, ros-indigo-joy, and OpenCV.
71 | We also showed how to control robot motion using vision-based object detection and demonstrated this idea with an iRobot Create2 robot. Finally, we exhibited the use of the robot to track a ball or a person using OpenCV algorithms, so one can learn how to integrate OpenCV into robot control. We encourage readers to learn more about our other ROS tutorials.
72 |
73 | If you have any feedback please feel free to reach the authors through email: Arabinda Samantaray (samantar@purdue.edu) and Shaocheng Luo (scluo@purdue.edu).
74 |
--------------------------------------------------------------------------------
/irobot_vision_tutorial/CMakeLists.txt:
--------------------------------------------------------------------------------
1 | cmake_minimum_required(VERSION 2.8.3)
2 | project(irobot_vision_tutorial)
3 |
4 | ## Compile as C++11, supported in ROS Kinetic and newer
5 | # add_compile_options(-std=c++11)
6 |
7 | ## Find catkin macros and libraries
8 | ## if COMPONENTS list like find_package(catkin REQUIRED COMPONENTS xyz)
9 | ## is used, also find other catkin packages
10 | find_package(catkin REQUIRED COMPONENTS
11 | roscpp
12 | rospy
13 | std_msgs
14 | )
15 |
16 | ## System dependencies are found with CMake's conventions
17 | # find_package(Boost REQUIRED COMPONENTS system)
18 |
19 |
20 | ## Uncomment this if the package has a setup.py. This macro ensures
21 | ## modules and global scripts declared therein get installed
22 | ## See http://ros.org/doc/api/catkin/html/user_guide/setup_dot_py.html
23 | # catkin_python_setup()
24 |
25 | ################################################
26 | ## Declare ROS messages, services and actions ##
27 | ################################################
28 |
29 | ## To declare and build messages, services or actions from within this
30 | ## package, follow these steps:
31 | ## * Let MSG_DEP_SET be the set of packages whose message types you use in
32 | ## your messages/services/actions (e.g. std_msgs, actionlib_msgs, ...).
33 | ## * In the file package.xml:
34 | ## * add a build_depend tag for "message_generation"
35 | ## * add a build_depend and a run_depend tag for each package in MSG_DEP_SET
36 | ## * If MSG_DEP_SET isn't empty the following dependency has been pulled in
37 | ## but can be declared for certainty nonetheless:
38 | ## * add a run_depend tag for "message_runtime"
39 | ## * In this file (CMakeLists.txt):
40 | ## * add "message_generation" and every package in MSG_DEP_SET to
41 | ## find_package(catkin REQUIRED COMPONENTS ...)
42 | ## * add "message_runtime" and every package in MSG_DEP_SET to
43 | ## catkin_package(CATKIN_DEPENDS ...)
44 | ## * uncomment the add_*_files sections below as needed
45 | ## and list every .msg/.srv/.action file to be processed
46 | ## * uncomment the generate_messages entry below
47 | ## * add every package in MSG_DEP_SET to generate_messages(DEPENDENCIES ...)
48 |
49 | ## Generate messages in the 'msg' folder
50 | # add_message_files(
51 | # FILES
52 | # Message1.msg
53 | # Message2.msg
54 | # )
55 |
56 | ## Generate services in the 'srv' folder
57 | # add_service_files(
58 | # FILES
59 | # Service1.srv
60 | # Service2.srv
61 | # )
62 |
63 | ## Generate actions in the 'action' folder
64 | # add_action_files(
65 | # FILES
66 | # Action1.action
67 | # Action2.action
68 | # )
69 |
70 | ## Generate added messages and services with any dependencies listed here
71 | # generate_messages(
72 | # DEPENDENCIES
73 | # std_msgs
74 | # )
75 |
76 | ################################################
77 | ## Declare ROS dynamic reconfigure parameters ##
78 | ################################################
79 |
80 | ## To declare and build dynamic reconfigure parameters within this
81 | ## package, follow these steps:
82 | ## * In the file package.xml:
83 | ## * add a build_depend and a run_depend tag for "dynamic_reconfigure"
84 | ## * In this file (CMakeLists.txt):
85 | ## * add "dynamic_reconfigure" to
86 | ## find_package(catkin REQUIRED COMPONENTS ...)
87 | ## * uncomment the "generate_dynamic_reconfigure_options" section below
88 | ## and list every .cfg file to be processed
89 |
90 | ## Generate dynamic reconfigure parameters in the 'cfg' folder
91 | # generate_dynamic_reconfigure_options(
92 | # cfg/DynReconf1.cfg
93 | # cfg/DynReconf2.cfg
94 | # )
95 |
96 | ###################################
97 | ## catkin specific configuration ##
98 | ###################################
99 | ## The catkin_package macro generates cmake config files for your package
100 | ## Declare things to be passed to dependent projects
101 | ## INCLUDE_DIRS: uncomment this if your package contains header files
102 | ## LIBRARIES: libraries you create in this project that dependent projects also need
103 | ## CATKIN_DEPENDS: catkin_packages dependent projects also need
104 | ## DEPENDS: system dependencies of this project that dependent projects also need
105 | catkin_package(
106 | # INCLUDE_DIRS include
107 | # LIBRARIES irobot_vision_tutorial
108 | # CATKIN_DEPENDS roscpp rospy std_msgs
109 | # DEPENDS system_lib
110 | )
111 |
112 | ###########
113 | ## Build ##
114 | ###########
115 |
116 | ## Specify additional locations of header files
117 | ## Your package locations should be listed before other locations
118 | include_directories(
119 | # include
120 | ${catkin_INCLUDE_DIRS}
121 | )
122 |
123 | ## Declare a C++ library
124 | # add_library(${PROJECT_NAME}
125 | # src/${PROJECT_NAME}/irobot_vision_tutorial.cpp
126 | # )
127 |
128 | ## Add cmake target dependencies of the library
129 | ## as an example, code may need to be generated before libraries
130 | ## either from message generation or dynamic reconfigure
131 | # add_dependencies(${PROJECT_NAME} ${${PROJECT_NAME}_EXPORTED_TARGETS} ${catkin_EXPORTED_TARGETS})
132 |
133 | ## Declare a C++ executable
134 | ## With catkin_make all packages are built within a single CMake context
135 | ## The recommended prefix ensures that target names across packages don't collide
136 | # add_executable(${PROJECT_NAME}_node src/irobot_vision_tutorial_node.cpp)
137 |
138 | ## Rename C++ executable without prefix
139 | ## The above recommended prefix causes long target names, the following renames the
140 | ## target back to the shorter version for ease of user use
141 | ## e.g. "rosrun someones_pkg node" instead of "rosrun someones_pkg someones_pkg_node"
142 | # set_target_properties(${PROJECT_NAME}_node PROPERTIES OUTPUT_NAME node PREFIX "")
143 |
144 | ## Add cmake target dependencies of the executable
145 | ## same as for the library above
146 | # add_dependencies(${PROJECT_NAME}_node ${${PROJECT_NAME}_EXPORTED_TARGETS} ${catkin_EXPORTED_TARGETS})
147 |
148 | ## Specify libraries to link a library or executable target against
149 | # target_link_libraries(${PROJECT_NAME}_node
150 | # ${catkin_LIBRARIES}
151 | # )
152 |
153 | #############
154 | ## Install ##
155 | #############
156 |
157 | # all install targets should use catkin DESTINATION variables
158 | # See http://ros.org/doc/api/catkin/html/adv_user_guide/variables.html
159 |
160 | ## Mark executable scripts (Python etc.) for installation
161 | ## in contrast to setup.py, you can choose the destination
162 | # install(PROGRAMS
163 | # scripts/my_python_script
164 | # DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}
165 | # )
166 |
167 | ## Mark executables and/or libraries for installation
168 | # install(TARGETS ${PROJECT_NAME} ${PROJECT_NAME}_node
169 | # ARCHIVE DESTINATION ${CATKIN_PACKAGE_LIB_DESTINATION}
170 | # LIBRARY DESTINATION ${CATKIN_PACKAGE_LIB_DESTINATION}
171 | # RUNTIME DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}
172 | # )
173 |
174 | ## Mark cpp header files for installation
175 | # install(DIRECTORY include/${PROJECT_NAME}/
176 | # DESTINATION ${CATKIN_PACKAGE_INCLUDE_DESTINATION}
177 | # FILES_MATCHING PATTERN "*.h"
178 | # PATTERN ".svn" EXCLUDE
179 | # )
180 |
181 | ## Mark other files for installation (e.g. launch and bag files, etc.)
182 | # install(FILES
183 | # # myfile1
184 | # # myfile2
185 | # DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION}
186 | # )
187 |
188 | #############
189 | ## Testing ##
190 | #############
191 |
192 | ## Add gtest based cpp test target and link libraries
193 | # catkin_add_gtest(${PROJECT_NAME}-test test/test_irobot_vision_tutorial.cpp)
194 | # if(TARGET ${PROJECT_NAME}-test)
195 | # target_link_libraries(${PROJECT_NAME}-test ${PROJECT_NAME})
196 | # endif()
197 |
198 | ## Add folders to be run by python nosetests
199 | # catkin_add_nosetests(test)
200 |
--------------------------------------------------------------------------------
/README_full.md:
--------------------------------------------------------------------------------
1 | # ROS Tutorial on Teleoperation and Vision based Robot Control
2 | ROS tutorial by Purdue SMART lab: iRobot Create2 teleoperation and Computer Vision based object detection for mobile robot control.
3 |
4 | The wiki page of this repository also has the same tutorial instructions:
5 |
6 | # 1. Objectives
7 | This ROS tutorial provides an overview of teleoperating (controlling) an iRobot Create2 (Roomba) mobile robot. Then, we look at ways to make the robot track a ball or a person using OpenCV algorithms. For those who do not have a Create2 robot, we also show how to control a simulated turtle robot (turtlesim) instead of the iRobot in this tutorial.
8 | Additionally, this tutorial will help you learn some basics of computer vision and robot control methods in ROS.
9 |
10 | ## Demonstration video:
11 | Please watch the videos below to get an idea of what you can expect to achieve from this tutorial.
12 |
13 |
14 |
15 | [Demonstration video 1](https://www.youtube.com/watch?v=ZyD-bbF6ts4)
16 |
17 |
18 |
19 | [Demonstration video 2](https://www.youtube.com/watch?v=-P3i7L_g1OM)
20 |
21 |
22 |
23 | # 2. Credits
24 | This tutorial is prepared by Arabinda Samantaray (samantar@purdue.edu) and Shaocheng Luo (scluo@purdue.edu).
25 |
26 | The repository is maintained by Ramviyas Parasuraman (ramviyas@purdue.edu).
27 |
28 | We acknowledge the following sources that were used to prepare this tutorial:
29 |
30 | - http://wiki.ros.org/ROS/Tutorials
31 |
32 | - http://milq.github.io/install-opencv-ubuntu-debian/
33 |
34 | - https://docs.opencv.org/3.3.0/d7/d8b/tutorial_py_face_detection.html
35 |
36 | - https://www.pyimagesearch.com/2015/09/14/ball-tracking-with-opencv/
37 |
38 | - ROS create_autonomy package summary
39 |
40 | - ROS driver for iRobot Create 1 and 2
41 |
42 | - ROS joy
43 |
44 | # 3. Prerequisites
45 | A basic background of using Linux-based OS, ROS and OpenCV will be required to understand the following tutorial. If you need some help, please visit:
46 |
47 | - http://wiki.ros.org/ROS/Tutorials
48 |
49 | - http://files.ubuntu-manual.org/manuals/getting-started-with-ubuntu/14.04e2/en_US/screen/Getting%20Started%20with%20Ubuntu%2014.04%20-%20Second%20edition.pdf
50 |
51 | - https://docs.opencv.org/3.0-beta/doc/py_tutorials/py_tutorials.html
52 |
53 | - http://wiki.ros.org/rospy_tutorials/Tutorials/WritingImagePublisherSubscriber
54 |
55 | This tutorial was tested successfully on Ubuntu 14.04 with ROS Indigo and OpenCV 3.0.0.
56 |
57 | # 4. Installation of relevant dependencies/packages
58 |
59 | ### Install ROS
60 | We used ROS Indigo in this tutorial. The tutorial should also work in ROS Kinetic/Lunar, although we have not validated it there.
61 |
62 | We recommend you install _ros-indigo-desktop-full_ so that you have all the necessary packages. The full package comes with Gazebo 2.2 as default. We recommend using a desktop PC or a laptop with Ubuntu 14.04+.
63 |
64 | ``` sudo apt-get install ros-indigo-desktop-full ```
65 |
66 | For more help with ROS installation, follow the instructions here:
67 |
68 | ### Installing iRobot Create2 (Roomba 600) Driver
69 | The ROS driver for Create2 robot is provided by the Autonomy lab at SFU, Canada.
70 | Install the ROS driver for the robot using the following instructions. For detailed driver installation instructions, please look at .
71 |
72 | ### Workspace Setup
73 |
74 | If you want to create a new catkin workspace for the robot driver, then
75 | ``` bash
76 | $ cd ~
77 | $ mkdir -p create_ws/src
78 | $ cd ~/create_ws
79 | $ catkin_make
80 | ```
81 | Else, use an existing ROS workspace for the next step.
82 |
83 | Clone the repository of the robot driver
84 | ``` bash
85 | $ cd ~/create_ws/src
86 | $ git clone https://github.com/AutonomyLab/create_autonomy.git
87 | ```
88 |
89 | Build/Compile
90 | ``` bash
91 | $ cd ~/create_ws
92 | $ catkin_make
93 | ```
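After building, source the workspace so that ROS can find the new driver package (a typical step; adjust the path if you used a different workspace):

``` bash
$ source ~/create_ws/devel/setup.bash
```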
94 |
95 | ### Check USB Permissions
96 |
97 | 1. In order to connect to Create over USB, ensure your user is in the dialout group
98 | ``` bash
99 | $ sudo usermod -a -G dialout $USER
100 | ```
101 |
102 | 2. Log out of the OS and log in again for the permission to take effect
103 |
104 | ### Teleoperation Twist Keyboard Package Installation
105 |
106 | Install the Teleoperation Twist Keyboard package
107 |
108 | ``` bash
109 | $ sudo apt-get install ros-indigo-teleop-twist-keyboard
110 | ```
111 |
112 | ### Install OpenCV
113 | - The easiest way to install openCV (if you do not have it installed already) is to do:
114 | ``` bash
115 | sudo apt-get install libopencv-dev python-opencv
116 | ```
117 | - If the above resulted in errors, then perhaps the following resource might be useful:
118 |
119 | ### Install USB camera drivers
120 |
121 | - In your home workspace/src
122 |
123 | ``` sudo apt-get install ros-indigo-usb-cam ```
124 |
125 | ### Download HaarCascade file for image processing
126 | - Download ``` haarcascade_frontalface_default.xml``` file from and save it in your designated ```ros_package/scripts``` folder, which contains the script that uses the haar cascade.
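If the original download link is unavailable, the file can typically be fetched from the OpenCV source repository; the destination below is only an example, use your own scripts folder:

``` bash
$ roscd irobot_vision_tutorial/scripts    # or cd into your own ros_package/scripts folder
$ wget https://raw.githubusercontent.com/opencv/opencv/master/data/haarcascades/haarcascade_frontalface_default.xml
```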
127 |
128 | ### Install turtle simulator (if no physical robot is available)
129 | ``` sudo apt-get install ros-indigo-turtlesim ```
130 |
131 |
132 | # 5. Tutorial
133 |
134 |
135 | ## 5.1 iRobot control with a keyboard
136 |
137 | 
138 |
139 | ### ROS Launch Files
140 |
141 | First, run a roscore, which acts as the ROS master and establishes connections between all ROS nodes.
142 |
143 | ``` roscore ```
144 |
145 | For the Create2 robot (Roomba 600/700 series), launch the ROS driver using the following command:
146 |
147 | ``` bash
148 |
149 | $ roslaunch ca_driver create_2.launch
150 |
151 | ```
152 | 
153 |
154 | ### Check the ROS topic list to see if the robot driver is publishing its state, such as the odometry output on /odom, bumper sensor readings, etc.
155 |
156 | ``` bash
157 |
158 | $ rostopic
159 | $ rostopic list
160 |
161 | ```
162 |
163 | You can use ``` $ rostopic pub /cmd_vel geometry_msgs/Twist ``` as shown in the image below to see if your robot is responding to the velocity commands.
164 |
165 | 
166 |
167 | If you change `z: 0.0` to `z: 5.0` and append `-r 10` to repeat the message at 10 Hz, the robot rotates. The mechanism behind this is that you send `geometry_msgs/Twist` messages to the `cmd_vel` topic, which the robot listens to for its forward and angular velocity inputs.
168 | 
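For reference, a single command along these lines (the numbers are only an example) keeps publishing a rotation-only Twist at 10 Hz:

``` bash
$ rostopic pub /cmd_vel geometry_msgs/Twist -r 10 -- '[0.0, 0.0, 0.0]' '[0.0, 0.0, 0.5]'
```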
169 |
170 | ### iRobot Teleoperation with Keyboard
171 |
172 | Assuming you have installed the Teleoperation Twist Keyboard ROS package, run the ROS node for the Teleoperation Twist Keyboard
173 |
174 | ``` bash
175 | $ rosrun teleop_twist_keyboard teleop_twist_keyboard.py
176 | ```
177 |
178 | Now you can use the keys listed on the terminal interface to control the robot.
179 | 
180 |
181 | `teleop_twist_keyboard` enables keyboard control of the iRobot by reading keyboard input and publishing it to the /cmd_vel topic with the message type geometry_msgs/Twist. For more details on this package, visit .
182 |
183 | ### Turtlesim simulator
184 |
185 | If you do not have a physical robot like the Create2, you can still test the tutorial with a simulated turtle robot using
186 |
187 | ``` bash
188 | rosrun turtlesim turtlesim_node
189 | ```
190 |
191 | Note that you will have to remap the /cmd_vel topic to /turtle1/cmd_vel if you use the turtlesim, for example as shown below.
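A remapping sketch using standard ROS topic remapping syntax:

``` bash
$ rosrun teleop_twist_keyboard teleop_twist_keyboard.py cmd_vel:=turtle1/cmd_vel
```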
192 |
193 |
194 | ## 5.2 Get iRobot Roomba to rotate and track an individual's face
195 |
196 | Follow the steps below:
197 |
198 | #### Step 1: Create a node that would use HAAR cascade detector to detect an individual's face and publish those values ####
199 |
200 | * Create a python file in your scripts folder and name it detect_face.py.
201 | * Save the script in your ```ros_package/scripts``` folder. Alternatively, download the ROS package called irobot_vision_tutorial supplied with this tutorial.
202 | * Add the following code in this script. We detail each segment of the code below.
203 |
204 | ``` python
205 | #!/usr/bin/env python
206 | import cv2
207 | import numpy as np
208 | import rospy
209 | from std_msgs.msg import Float32
210 | from sensor_msgs.msg import CompressedImage
211 | import sys
212 | sys.path.append('/opt/ros/indigo/lib/python2.7/dist-packages')
213 | ```
214 |
215 | * This code will import all the necessary packages into your script. Note, we used Python 2.7.
216 |
217 | * The following code will initiate the ``` detect_face ``` node and set up a publisher on the ``` rotation_angle``` topic. Since we use a USB camera, we obtain the video frames from a topic called ``` /usb_cam/image_raw/compressed ```, which we use for detecting a person's face. The subscriber will invoke the callback function ``` image_callback ``` every time it receives a new message.
218 |
219 | ``` python
220 | if __name__ == '__main__':
221 |     try:
222 |         rospy.init_node('detect_face', anonymous=True)
223 |         pub=rospy.Publisher('rotation_angle',Float32,queue_size=1)
224 |         #rate = rospy.Rate(1) # 10hz
225 |         face_cascade=cv2.CascadeClassifier('/home/arabinda/catkin_ws/src/ros_seminar/scripts/haarcascade_frontalface_default.xml')
226 |         sub = rospy.Subscriber("/usb_cam/image_raw/compressed",CompressedImage, image_callback, queue_size = 1)
227 |         rospy.spin()
228 |         #talker()
229 |     except rospy.ROSInterruptException:
230 |         pass
231 | ```
232 |
233 | * The ``` image_callback``` function will utilise the ``` haarcascade_frontalface_default.xml ``` file to detect the individual's face in the compressed image it obtains as ``` ros_data ```. Subsequently, we calculate the x-coordinate of the center of the bounding box and evaluate its distance from the center x-coordinate of the window frame (in our case 320, as the x-coordinate of the window frame ranges from 0 to 640).
234 | Then we convert this distance into the degree value the robot should rotate, using roughly 4 pixels per degree; for example, a face box starting at x = 400 with width 80 is centered at 440 px, which is 120 px to the right of center and gives a rotation of 30 degrees. This degree value is published over the ``` rotation_angle ``` topic.
235 |
236 | ``` python
237 | def image_callback(ros_data):
238 |
239 |     np_arr = np.fromstring(ros_data.data, np.uint8)
240 |     image_np = cv2.imdecode(np_arr, cv2.IMREAD_COLOR)
241 |     gray=cv2.cvtColor(image_np,cv2.COLOR_BGR2GRAY)
242 |     faces=face_cascade.detectMultiScale(gray,scaleFactor=1.1,minNeighbors=5,minSize=(30, 30),flags=cv2.CASCADE_SCALE_IMAGE)
243 |     for (x,y,w,h) in faces:
244 |         cv2.rectangle(gray,(x,y),(x+w,y+h),(255,0,0),2)
245 |         distance_from_center=((2*x+w)/2)-320
246 |         distance_per_degree=4
247 |         rotation_angle=distance_from_center/distance_per_degree
248 |         pub.publish(rotation_angle)
249 |         rospy.loginfo(rotation_angle)
250 |         #rate.sleep()
251 |     cv2.imshow('Video',gray)
252 |     cv2.waitKey(2)
253 |
254 | ```
255 | 
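Once the node is running, you can quickly verify that camera frames are arriving and angles are being published, using the topic names from the code above:

``` bash
$ rostopic hz /usb_cam/image_raw/compressed
$ rostopic echo /rotation_angle
```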
256 |
257 | #### Step 2: Create a new node that subscribes to ```rotation_angle```, converts the angle value into radians, and publishes it as the angular velocity about the z-axis over the /cmd_vel topic ####
258 |
259 | * Create a new python file in your ``` ros_package/scripts``` folder and name it rotate_robot.py.
260 |
261 | * Import the necessary packages, add the Python interpreter (shebang) line, and create an object of the ```Twist``` class.
262 |
263 | ``` python
264 | #!/usr/bin/env python
265 | import rospy
266 | from std_msgs.msg import Float32
267 | from geometry_msgs.msg import Twist
268 | msg=Twist()
269 | ```
270 |
271 | * Start the script from the ```__main__``` block, which calls the ``` listener ``` function
272 | ``` python
273 | if __name__ == '__main__':
274 |     listener()
275 | ```
276 |
277 | * The ``` listener() ``` is responsible for initializing the ``` rotate_robot ``` node and subscribing to the ``` rotation_angle ``` topic. It also passes the rotation angle data obtained from the topic to the ``` callback ``` function
278 |
279 | ``` python
280 |
281 | def listener():
282 |
283 |     # In ROS, nodes are uniquely named. If two nodes with the same
284 |     # name are launched, the previous one is kicked off. The
285 |     # anonymous=True flag means that rospy will choose a unique
286 |     # name for our 'listener' node so that multiple listeners can
287 |     # run simultaneously.
288 |     rospy.init_node('rotate_robot', anonymous=True)
289 |
290 |     rospy.Subscriber('rotation_angle', Float32, callback)
291 |
292 |     # spin() simply keeps python from exiting until this node is stopped
293 |     rospy.spin()
294 | ```
295 |
296 | * The callback function takes the angle value passed in as ``` data ``` and converts it from degrees into radians (1 degree = 0.0174533 rad). This value is then published over the ``` turtle1/cmd_vel ``` topic.
297 |
298 | ``` python
299 | def callback(data):
300 |
301 |     #rospy.loginfo(rospy.get_caller_id() + "I heard %f", data.data)
302 |     pub=rospy.Publisher('turtle1/cmd_vel',Twist,queue_size=10)
303 |     if(data.data>30):
304 |         msg.angular.z=30*0.0174533
305 |     elif(data.data<-30):
306 |         msg.angular.z=-30*0.0174533
307 |     else:
308 |         msg.angular.z=(data.data)*0.0174533
309 |     pub.publish(msg)
310 |
311 | ```
312 | 
313 |
314 | #### Step 3: Make the iRobot rotate based on the location of the person's face in the camera image.
315 |
316 | - If your robot has a camera installed, run the above nodes, launch the Create2 driver with ``` roslaunch ca_driver create_2.launch```, and start the usb_cam node with ``` rosrun usb_cam usb_cam_node ``` (see the command sketch below).
317 |
318 | - A demonstration video is available here:
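A possible command sequence for this step, each in its own terminal, assuming the scripts above live in the irobot_vision_tutorial package and are marked executable:

``` bash
$ roscore
$ roslaunch ca_driver create_2.launch
$ rosrun usb_cam usb_cam_node
$ rosrun irobot_vision_tutorial detect_face.py
$ rosrun irobot_vision_tutorial rotate_robot.py
```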
319 |
320 | #### Step 4: Make turtlesim rotate
321 |
322 | - If you do not have an iRobot Roomba, launch the turtlesim by using the command ``` $ rosrun turtlesim turtlesim_node ```
323 |
324 | - When you move your face in front of the webcam, the turtlesim/iRobot will rotate
325 |
326 | 
327 |
328 |
329 | ## 5.3 Get iRobot Roomba (or turtlesim) to move towards a green colored object
330 |
331 | This section is about detecting a green colored object and enabling any robot, such as the Roomba or the turtlesim, to move towards the object.
332 |
333 | This tutorial will involve the following steps:
334 |
335 | #### Step 1: Create a node to detect the ball ####
336 |
337 | * Create a new python file in your ``` ros_package/scripts``` folder and name it ball_tracking1.py
338 |
339 | * Import the necessary packages
340 |
341 | ``` python
342 | #!/usr/bin/env python
343 | import cv2
344 | import numpy as np
345 | import rospy
346 | from std_msgs.msg import Float32
347 | from collections import deque
348 | import argparse
349 | import imutils
350 | import sys
351 | ```
352 |
353 | * The following code will initiate the script and call a ```talker()``` function that is responsible for detecting the green object and publishing its centroid coordinates. When the node is stopped using ``` CTRL+C ```, OpenCV destroys all the window frames (and the camera device is released when the process exits); until then, the talker function runs in a loop.
354 |
355 | ``` python
356 | if __name__ == '__main__':
357 |     try:
358 |         talker()
359 |     except rospy.ROSInterruptException:
360 |         # clean up any OpenCV display windows on shutdown
361 |         cv2.destroyAllWindows()
362 |         pass
363 | ```
364 |
365 | * In case you do not have any green object around you, we have provided a video ``` ball_tracking_example.mp4 ``` of a person playing with a green colored ball in the code section of ``` Week 12```. Download the video and place it in your ``` ros_package/scripts``` folder.
366 |
367 | * The talker function will read from the video file or the webcam depending on your preference. ``` talker()``` detects any object whose color value lies between ``` greenLower ``` and ``` greenUpper ```, draws a circular contour around it in each frame, and publishes the centroid coordinates of that contour over the ``` coordinates ``` topic.
368 |
369 | ``` python
370 | def talker():
371 |     rospy.init_node('ball_tracking1', anonymous=True)
372 |     pub=rospy.Publisher('coordinates',Float32,queue_size=1)
373 |     rate = rospy.Rate(20) # 20 Hz
374 |     ap = argparse.ArgumentParser()
375 |     ap.add_argument("-v", "--video", dest="/home/arabinda/catkin_ws/src/ros_seminar/scripts/ball_tracking_example.mp4",help="path")
376 |     ap.add_argument("-b", "--buffer", type=int, default=64,help="max buffer size")
377 |     args = vars(ap.parse_args())
378 |
379 |     greenLower = (29, 86, 6)
380 |     greenUpper = (64, 255, 255)
381 |     pts = deque(maxlen=64)
382 |     if not args.get("video", False):
383 |         camera = cv2.VideoCapture(0)
384 |     else:
385 |         camera = cv2.VideoCapture(args["video"])
386 |
387 |     while not rospy.is_shutdown():
388 |         (grabbed, frame) = camera.read()
389 |         if args.get("video") and not grabbed:
390 |             break
391 |         frame = imutils.resize(frame, width=600)
392 |         hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
393 |         mask = cv2.inRange(hsv, greenLower, greenUpper)
394 |         mask = cv2.erode(mask, None, iterations=2)
395 |         mask = cv2.dilate(mask, None, iterations=2)
396 |         cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL,cv2.CHAIN_APPROX_SIMPLE)[-2]
397 |         center = None
398 |         if len(cnts) > 0:
399 |             c = max(cnts, key=cv2.contourArea)
400 |             ((x, y), radius) = cv2.minEnclosingCircle(c)
401 |             M = cv2.moments(c)
402 |             center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
403 |             #print(center)
404 |             pub.publish(int(M["m01"] / M["m00"]))
405 |             rospy.loginfo(int(M["m01"] / M["m00"]))
406 |             rate.sleep()
407 |             if radius > 10:
408 |                 cv2.circle(frame, (int(x), int(y)), int(radius),(0, 255, 255), 2)
409 |                 cv2.circle(frame, center, 5, (0, 0, 255), -1)
410 |         pts.appendleft(center)
411 |         for i in xrange(1, len(pts)):
412 |             if pts[i - 1] is None or pts[i] is None:
413 |                 continue
414 |             thickness = int(np.sqrt(64 / float(i + 1)) * 2.5)
415 |             cv2.line(frame, pts[i - 1], pts[i], (0, 0, 255), thickness)
416 |
417 |         cv2.imshow("Frame", frame)
418 |         if cv2.waitKey(1) & 0xFF==ord('q'):
419 |             break
420 |
421 |     rospy.spin()
422 | ```
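Once this node is running, you can check the centroid values being published, using the topic name from the code above:

``` bash
$ rostopic echo /coordinates
```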
423 | #### Step 2: Create a node that reads the y-coordinate value and, based on it, publishes a linear speed over the /cmd_vel topic ####
424 |
425 | * Create a new python file in your ``` ros_package/scripts``` folder and name it y_coordinate.py
426 |
427 | * Import the necessary packages and create an object of ``` Twist ``` class
428 | ``` python
429 | #!/usr/bin/env python
430 | import rospy
431 | from std_msgs.msg import Float32
432 | from geometry_msgs.msg import Twist
433 | msg=Twist()
434 | ```
435 |
436 | * Start the script from the ```__main__``` block, which calls the ``` listener ``` function
437 | ``` python
438 | if __name__ == '__main__':
439 |     listener()
440 | ```
441 |
442 | * The ``` listener() ``` is responsible for initializing the ``` y_coordinate ``` node and subscribing to the ``` coordinates ``` topic. It also passes the y_coordinate data obtained from the topic to the ``` callback ``` function
443 |
444 | ``` python
445 |
446 | def listener():
447 |
448 |     # In ROS, nodes are uniquely named. If two nodes with the same
449 |     # name are launched, the previous one is kicked off. The
450 |     # anonymous=True flag means that rospy will choose a unique
451 |     # name for our 'listener' node so that multiple listeners can
452 |     # run simultaneously.
453 |     rospy.init_node('y_coordinate', anonymous=True)
454 |
455 |     rospy.Subscriber('coordinates', Float32, callback)
456 |
457 |     # spin() simply keeps python from exiting until this node is stopped
458 |     rospy.spin()
459 | ```
460 |
461 | * The callback function determines the linear velocity of the iRobot/turtlesim based on the y-coordinate value and publishes it over ``` turtle1/cmd_vel ```.
462 |
463 | ``` python
464 |
465 | def callback(data):
466 |
467 |     rospy.loginfo(rospy.get_caller_id() + "I heard %f", data.data)
468 |     pub=rospy.Publisher('turtle1/cmd_vel',Twist,queue_size=10)
469 |     if(data.data>50):
470 |         msg.linear.x=0.1
471 |     else:
472 |         msg.linear.x=0
473 |     pub.publish(msg)
474 |
475 |
476 | ```
477 |
478 | * Make irobot/turtlesim move linearly
479 |
480 | - If your robot has a camera installed, run the above nodes and launch the Create2 and usb_cam drivers (see the command sketch below)
481 |
482 | - If you do not have an iRobot Roomba, launch the turtlesim by using the command ``` $ rosrun turtlesim turtlesim_node ```
483 |
484 | - When you place a green colored object in front of the webcam, the turtlesim/iRobot will move towards it
485 |
486 | - A demonstration video is available here:
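A possible command sequence for the color-tracking pipeline, each in its own terminal, assuming the scripts are in the irobot_vision_tutorial package and marked executable (note that ball_tracking1.py opens the webcam directly through OpenCV, so the usb_cam node is not strictly required here):

``` bash
$ roscore
$ rosrun irobot_vision_tutorial ball_tracking1.py
$ rosrun irobot_vision_tutorial y_coordinate.py cmd_vel:=turtle1/cmd_vel   # drop the remap when driving the real Create2
$ rosrun turtlesim turtlesim_node    # or: roslaunch ca_driver create_2.launch
```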
487 |
488 |
489 | # 6. Optional extensions - iRobot Remote control using Joystick (Gamepad)
490 | One can also use a joystick (e.g. Logitech Wireless Gamepad F710) to control the robot remotely. We provide a brief tutorial to inspire our readers.
491 |
492 |
493 | ## Brief Introduction to Joystick
494 | In this project, we can use a Microsoft Xbox 360 Wireless Controller on Linux.
495 | Table of index numbers of `/joy.buttons` (index: button name on the actual controller):
496 |
497 | - 0: A
498 | - 1: B
499 | - 2: X
500 | - 3: Y
501 | - 4: LB
502 | - 5: RB
503 | - 6: back
504 | - 7: start
505 | - 8: power
506 | - 9: Button stick left
507 | - 10: Button stick right
508 |
509 | Table of index numbers of `/joy.axes` (index: axis name on the actual controller):
510 |
511 | - 0: Left/Right Axis stick left
512 | - 1: Up/Down Axis stick left
513 | - 2: Left/Right Axis stick right
514 | - 3: Up/Down Axis stick right
515 | - 4: RT
516 | - 5: LT
517 | - 6: cross key left/right
518 | - 7: cross key up/down
519 |
520 |
521 | Alternatively, we can use a Logitech Gamepad F710 joystick.
522 |
523 |
524 | ## Joystick Driver Installation
525 |
526 | Installing the joystick driver
527 |
528 | 0. Start by installing the package
529 | ``` bash
530 | $ sudo apt-get install ros-indigo-joy ros-indigo-teleop-twist-joy ros-indigo-teleop-tools-msgs
531 | ```
532 |
533 | 1. Configure the joystick and make sure that it is recognized by Linux and working.
534 | ``` bash
535 | $ ls /dev/input/
536 | $ sudo jstest /dev/input/jsX
537 | ```
538 | If the joystick device is not accessible, you may need to change its permissions:
539 | ``` bash
540 | $ sudo chmod a+rw /dev/input/jsX
541 | ```
542 |
543 |
544 | ## Starting the Joy Node
545 | 0. In this section we use the ros-indigo-joy package. Assuming the joystick driver has been installed, we need to start the joy node to get the joystick data published over ROS. First, let's tell the joy node which joystick device to use; the default is js0.
546 | ``` bash
547 | $ roscore
548 | $ rosparam set joy_node/dev "/dev/input/jsX"
549 | ```
550 |
551 | 1. Configure the joystick and make sure that it is working.
552 | ``` bash
553 | $ ls /dev/input/
554 | $ sudo jstest /dev/input/jsX
555 | ```
556 | Now we can start the joy node
557 | ``` bash
558 | $ rosrun joy joy_node
559 | ```
560 |
561 | 2. In a new terminal you can rostopic echo the joy topic to see the data from the joystick
562 | ``` bash
563 | $ rostopic echo joy
564 | ```
565 |
566 | 3. Then run the teleop_twist_joy node from the teleop_twist_joy package with the proper configuration. We recommend the following launch file, which starts both the joy node and the teleop node with the right configuration.
567 | ``` bash
568 | $ roslaunch teleop_twist_joy teleop.launch
569 | ```
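If your joystick is not js0, the device can typically be passed to the launch file as an argument (this parameter name is our assumption about the teleop_twist_joy launch interface; check the launch file if it differs):

``` bash
$ roslaunch teleop_twist_joy teleop.launch joy_dev:=/dev/input/jsX
```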
570 |
571 |
572 | # Summary
573 | After this tutorial, one should be able to teleoperate (control) an iRobot Create2 robot. This tutorial used ROS Indigo, and we deployed packages including teleop-twist-keyboard, ros-indigo-joy, and OpenCV.
574 | We also showed how to control robot motion using vision-based object detection and demonstrated this idea with an iRobot Create2 robot. Finally, we exhibited the use of the robot to track a ball or a person using OpenCV algorithms, so one can learn how to integrate OpenCV into robot control. We encourage readers to explore our other ROS tutorials.
575 |
576 | If you have any feedback please feel free to reach us through email: Arabinda Samantaray (samantar@purdue.edu) and Shaocheng Luo (scluo@purdue.edu).
577 |
--------------------------------------------------------------------------------