Channel: ROS Answers: Open Source Q&A Forum - RSS feed

How do you add new frames to the tf tree of a Jackal robot?

I have a Jackal robot running Ubuntu 14.04 with ROS Indigo installed. I have attached a Hokuyo UTM-30LX laser to the Jackal, so I am trying to modify the `jackal.urdf.xacro` file to include the Hokuyo laser and use the laser scans for gmapping. I have the system working in simulation: I was able to add the Hokuyo laser to the Jackal in Gazebo and use the scans for autonomous navigation.

However, after copying the modified `jackal.urdf.xacro` onto the real robot, I cannot see the `hokuyo_base_link` and `front_laser` (Hokuyo laser) links in the tf tree, so when I visualize the robot in RViz the LaserScan display reports:

    Transform [sender=unknown_publisher] for frame [front_laser]: frame [front_laser] does not exist.

I can attach my `jackal.urdf.xacro` file if needed. Could someone explain how I need to modify my files to get the Hokuyo laser working on the real robot? Thank you very much.
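A minimal Python debugging sketch (assuming the standard `tf` Python API and the frame names from the question) for confirming whether `front_laser` is actually being broadcast on the real robot:

```python
#!/usr/bin/env python
# Sketch: check whether the front_laser frame exists in the tf tree.
# Frame names ("base_link", "front_laser") are taken from the question above.
import rospy
import tf

rospy.init_node('check_laser_frame')
listener = tf.TransformListener()
rospy.sleep(2.0)  # give the listener time to fill its buffer

if listener.frameExists('front_laser'):
    try:
        listener.waitForTransform('base_link', 'front_laser',
                                  rospy.Time(0), rospy.Duration(4.0))
        trans, rot = listener.lookupTransform('base_link', 'front_laser',
                                              rospy.Time(0))
        print('base_link -> front_laser translation:', trans)
    except tf.Exception as e:
        print('front_laser exists but no transform to base_link yet:', e)
else:
    print('front_laser is not in the tf tree; the robot_state_publisher on '
          'the real robot is probably not loading the modified xacro.')
```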

rospy.Time(nsecs) bug?

Hi, I'm using ROS Indigo on Ubuntu 14.04. I have noticed that `rospy.Time()` hangs when called with a large `nsecs` argument:

    >>> rospy.Time(secs=1507324334622140084)
    rospy.Time[1507324334622140084000000000]
    >>> rospy.Time(nsecs=1507324334622140084000000000)
    ... (hangs)

Is this a bug in Indigo? Regards
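As a hedged illustration of a workaround, splitting a nanosecond timestamp into `secs` and `nsecs` before constructing the `Time` avoids passing an enormous `nsecs` value (this is only a sketch and says nothing about what genpy does internally):

```python
import rospy  # rospy.Time is genpy.Time

# Example value from the question, interpreted as nanoseconds since the epoch.
stamp_ns = 1507324334622140084

# Keep both fields in their normal ranges instead of handing genpy a huge nsecs.
t = rospy.Time(secs=stamp_ns // 10**9, nsecs=stamp_ns % 10**9)
print(t, t.to_sec())
```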

ROS installation issue on Ubuntu 14.04

Hi, I am trying to install the following packages on a fresh Ubuntu 14.04 OS: `python-rosdep`, `python-rosdistro`, `python-rosdistro-modules`, `python-catkin-pkg`, `python-catkin-pkg-modules`, `python-netifaces`, `python-qt4`, `python-qt4-dev`, `python-pyside`, `libapr1`, `libaprutil1`, `libzzip-0-13`, and I am getting the following error:

    '/usr/bin/apt-get -y -o "Dpkg::Options::=--force-confdef" -o "Dpkg::Options::=--force-confold" install 'python-rosdep' 'python-rosdistro' 'python-rosdistro-modules' 'python-catkin-pkg' 'python-catkin-pkg-modules' 'python-netifaces' 'python-qt4-dev' 'python-pyside' 'libapr1' 'libaprutil1' 'libzzip-0-13' -o APT::Install-Recommends=no' failed:
    dpkg-deb: error: archive '/var/cache/apt/archives/python-catkin-pkg-modules_0.4.7-1_all.deb' has premature member 'control.tar.xz' before 'control.tar.gz', giving up
    dpkg: error processing archive /var/cache/apt/archives/python-catkin-pkg-modules_0.4.7-1_all.deb (--unpack):
     subprocess dpkg-deb --control returned error exit status 2
    dpkg-deb: error: archive '/var/cache/apt/archives/python-catkin-pkg_0.4.7-100_all.deb' has premature member 'control.tar.xz' before 'control.tar.gz', giving up
    dpkg: error processing archive /var/cache/apt/archives/python-catkin-pkg_0.4.7-100_all.deb (--unpack):
     subprocess dpkg-deb --control returned error exit status 2
    Errors were encountered while processing:
     /var/cache/apt/archives/python-catkin-pkg-modules_0.4.7-1_all.deb
     /var/cache/apt/archives/python-catkin-pkg_0.4.7-100_all.deb
    E: Sub-process /usr/bin/dpkg returned an error code (1)

gazebo velocity controller

I tried to simulate my car model in Gazebo. I built the front and rear wheels of the car and tried to use the velocity controllers from ros_control to drive the front and rear wheels respectively. The `wheel1_velocity_controller` and `wheel2_velocity_controller` have been loaded. Test commands:

    rostopic pub -1 /rrbot/wheel1_velocity_controller/command std_msgs/Float64 "data: 1.5"
    rostopic pub -1 /rrbot/wheel1_velocity_controller/command std_msgs/Float64 "data: 1.0"

The car does move, and theoretically it should drive in a straight line, but in the simulation it turns instead. Why?
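For reference, a rough Python equivalent of that test (a sketch; it assumes the second command was meant for `wheel2_velocity_controller`, and that the topic names and `std_msgs/Float64` message type from the question are correct). Sending the same value to both controllers is what should, in theory, produce straight-line motion:

```python
#!/usr/bin/env python
# Sketch: publish the same velocity command to both wheel controllers.
import rospy
from std_msgs.msg import Float64

rospy.init_node('wheel_velocity_test')
pub1 = rospy.Publisher('/rrbot/wheel1_velocity_controller/command',
                       Float64, queue_size=1)
pub2 = rospy.Publisher('/rrbot/wheel2_velocity_controller/command',
                       Float64, queue_size=1)
rospy.sleep(1.0)  # let the publishers connect to the controllers

rate = rospy.Rate(10)
while not rospy.is_shutdown():
    pub1.publish(Float64(1.5))
    pub2.publish(Float64(1.5))
    rate.sleep()
```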

ROS on Manjaro

Is it straightforward to install and run ROS 1 Indigo on [Manjaro](https://manjaro.org/) (based on [Arch Linux](https://wiki.archlinux.org/))? Does someone know what needs to be considered in addition to, or differently from, Arch? Does someone know of a Docker image that would ease the setup?

How to interpret robot's quaternion?

Hello again! (Using Indigo/rospy.) Is there a way, using the robot's quaternion (from the slam_out_pose orientation), to determine whether it is moving toward or away from the center of the global map (the 2D origin)? The origin is, I assume, where the robot initially started its mapping procedure. This is not the same as determining whether the robot is moving forwards or backwards. My difficulty is probably due to my poor understanding of quaternions, especially the robot's reference frame in relation to the static reference frame of the global map. Thank you in advance.
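One possible approach, sketched below under the assumptions that `slam_out_pose` is a `geometry_msgs/PoseStamped` expressed in the map frame and that `tf.transformations` is available: convert the quaternion to a yaw angle, form the robot's heading vector, and take its dot product with the vector from the robot to the origin; a positive value means the robot is heading toward the origin.

```python
#!/usr/bin/env python
# Sketch: decide whether the robot is heading toward or away from the map origin.
import math
import rospy
from geometry_msgs.msg import PoseStamped
from tf.transformations import euler_from_quaternion

def pose_cb(msg):
    q = msg.pose.orientation
    _, _, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])

    # Unit vector in the direction the robot is facing (map frame).
    heading = (math.cos(yaw), math.sin(yaw))
    # Vector from the robot's position to the origin of the map.
    to_origin = (-msg.pose.position.x, -msg.pose.position.y)

    dot = heading[0] * to_origin[0] + heading[1] * to_origin[1]
    rospy.loginfo('moving %s the origin', 'toward' if dot > 0 else 'away from')

rospy.init_node('heading_check')
rospy.Subscriber('slam_out_pose', PoseStamped, pose_cb)
rospy.spin()
```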

Can't rosmake or catkin_make pi_tracker

Hello, while attempting to rosmake `pi_tracker`, it came out with an error:

    No package or stack specified. And current directory 'pi_tracker-indigo-devel' is not a package name or stack name

Setup: Ubuntu 12.04 x64, Indigo, OpenNI drivers (openni-bin-dev-linux-x64-v1.5.4.0), NiTE (nite-bin-dev-linux-x64-v1.5.2.21).

Problem installing ROS Indigo on Ubuntu 14.04: python-catkin-pkg is not getting installed

Straight dump of the error:

    gyan@L-7874:~$ sudo apt-get install ros-indigo-desktop-full
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    You might want to run 'apt-get -f install' to correct these:
    The following packages have unmet dependencies:
     python-catkin-tools : Depends: python-catkin-pkg (>= 0.2.9) but it is not going to be installed
     python-rosdep : Depends: python-catkin-pkg but it is not going to be installed
     python-rosdistro-modules : Depends: python-catkin-pkg-modules but it is not going to be installed
     ros-indigo-catkin : Depends: python-catkin-pkg (> 0.2.9) but it is not going to be installed
     ros-indigo-desktop-full : Depends: ros-indigo-desktop but it is not going to be installed
                               Depends: ros-indigo-perception but it is not going to be installed
                               Depends: ros-indigo-simulators but it is not going to be installed
                               Depends: ros-indigo-urdf-tutorial but it is not going to be installed
    E: Unmet dependencies. Try 'apt-get -f install' with no packages (or specify a solution).

I have tried most of the commonly known tricks found on the internet. Has anyone faced a similar issue with regards to the catkin package?

catkin_make not working

OS: Ubuntu 14.04, ROS: Indigo. catkin_make is not working: there is no error, but nothing happens. I upgraded CMake to version 3.12 (it was 2.x before). I created the package with the `catkin_create_pkg` command and sourced `devel/setup.bash`, and roscore runs normally. I think there should be no problem, so why does catkin_make not work? Please help me.

EKF localization with unknown correspondence

I am working on a simulation with the Husky A200. I want to implement EKF localization with unknown correspondences (CH7.5) from the [Probabilistic Robotics book by Thrun](https://github.com/liulinbo/slam/blob/master/Probabilistic%20Robotics%20_Sebastian%20Thrun%20et%20al..pdf) to understand SLAM better. Here is a picture of the algorithm in the book: ![image description](/upfiles/15331609323141694.jpg) ![image description](/upfiles/1533160941554923.jpg)

I am a little confused about where to place this algorithm in my Python code. The following is my simulation setup for this algorithm. I have created a world with 6 landmarks, shown below: ![1](/upfiles/15331603438645169.png). Then I generated a map using gmapping, shown in RViz below: ![image description](/upfiles/15331603956514511.png). Then I wrote down the coordinates of the 6 landmarks using *Publish Point* in the RViz GUI. Now, in my code, I plan to call the `ekf_localization()` algorithm in my `laserCallback` function, as shown below:

    #!/usr/bin/env python
    from std_msgs.msg import String
    from sensor_msgs.msg import LaserScan
    from nav_msgs.msg import Odometry
    import rospy
    import numpy as np
    import math

    # parameters
    alp1 = 1
    alp2 = 1
    alp3 = 1
    alp4 = 1
    sigmaR = 1
    sigmaT = 1
    sigmaS = 1
    NUMBER_OF_LANDMARKS = 6
    start_angle = -1.57079637051
    end_angle = 1.56643295288
    dA = 0.00436332309619
    inc_angle = 0
    z = 0
    u = np.matrix(np.zeros((3, 1)))

    # landmark coordinates in the map
    m = np.matrix('0.372 1.85; \
                   3.95 1.95; \
                   5.33 5.73; \
                   2.23 -2.77; \
                   7 -1.95; \
                   8.36 1.08')

    mu = np.matrix(np.zeros((3, 1)))
    cov = np.eye(3)
    dt = 0.5
    flag = 1

    def ekf_localization(hMu, hCov, u, z, m):
        # EKF algorithm code from CH7.5
        # ...
        pass

    def odomCallback(msg):
        global x, y, theta, u, mu, cov, m
        # retrieve the pose from odometry
        # x = msg.pose.pose.position.x
        # y = msg.pose.pose.position.y
        # theta = msg.pose.pose.orientation.z
        # retrieve linear and angular velocity
        u = np.matrix([[msg.twist.twist.linear.x],
                       [msg.twist.twist.angular.z]])

    def laserCallback(msg):
        global z
        # retrieve the laser range data
        z = msg.ranges
        # this is where I call the EKF algorithm
        ekf_localization(mu, cov, u, z, m)

    def main():
        print("Husky_ekf_localization start!!!")
        rospy.init_node('Husky_ekf_localization', anonymous=True)
        rospy.Subscriber("/husky_velocity_controller/odom", Odometry, odomCallback)
        rospy.Subscriber("scan", LaserScan, laserCallback)
        rospy.spin()

    if __name__ == '__main__':
        main()

My question: is this the correct approach to implementing this algorithm?
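For orientation, here is a rough sketch of just the prediction step (velocity motion model) that `ekf_localization()` would start with, under the assumption that `u = (v, w)` and `dt` are as defined above; the landmark correction loop from CH7.5 would follow. The simplified additive noise term is an assumption of this sketch, not what the book uses (the book propagates control noise through V_t M_t V_t^T):

```python
import numpy as np

def ekf_predict(mu, cov, u, dt, alphas=(1, 1, 1, 1)):
    """EKF prediction step with the velocity motion model (sketch only).

    mu  : 3x1 state [x, y, theta],  cov : 3x3 covariance
    u   : (v, w), as built in odomCallback above.
    """
    v, w = float(u[0]), float(u[1])
    theta = float(mu[2])
    if abs(w) < 1e-6:          # avoid division by zero for straight motion
        w = 1e-6

    # Predicted mean (velocity motion model)
    mu_bar = np.array(mu, dtype=float).reshape(3, 1) + np.array([
        [-v / w * np.sin(theta) + v / w * np.sin(theta + w * dt)],
        [ v / w * np.cos(theta) - v / w * np.cos(theta + w * dt)],
        [ w * dt]])

    # Jacobian of the motion model with respect to the state
    G = np.array([
        [1, 0, v / w * (-np.cos(theta) + np.cos(theta + w * dt))],
        [0, 1, v / w * (-np.sin(theta) + np.sin(theta + w * dt))],
        [0, 0, 1]])

    # Simple additive motion noise (assumption of this sketch)
    a1, a2, a3, a4 = alphas
    R = np.diag([a1 * v**2 + a2 * w**2,
                 a3 * v**2 + a4 * w**2,
                 a3 * v**2 + a4 * w**2])

    cov_bar = G.dot(cov).dot(G.T) + R
    return mu_bar, cov_bar
```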

openni_tracker isn't found

I tried to use this clone: https://github.com/ros-drivers/openni_tracker (wiki: http://wiki.ros.org/openni_tracker), but I can't run the program. Can anyone give any advice to this beginner? Thanks! Here is my result:

    mob17@Seguro:~$ cd ros_ws
    mob17@Seguro:~/ros_ws$ catkin_make
    Base path: /home/mob17/ros_ws
    Source space: /home/mob17/ros_ws/src
    Build space: /home/mob17/ros_ws/build
    Devel space: /home/mob17/ros_ws/devel
    Install space: /home/mob17/ros_ws/install
    ####
    #### Running command: "make cmake_check_build_system" in "/home/mob17/ros_ws/build"
    ####
    ####
    #### Running command: "make -j4 -l4" in "/home/mob17/ros_ws/build"
    ####
    [100%] Built target openni_tracker
    mob17@Seguro:~/ros_ws$ source devel/setup.bash
    mob17@Seguro:~/ros_ws$ rosrun openni_tracker openi_tracker
    [rosrun] Couldn't find executable named openi_tracker below /home/mob17/ros_ws/src/openni_tracker

Upload or download file into YRC1000

Hi, I am currently developing an application using ROS, a GP7 robot and a YRC1000 controller from Yaskawa, and I want to know whether it is possible to upload or download a program file from the controller to the PC and vice versa. In addition, is it possible to launch this program file from the computer? Are there any existing libraries, and if not, do you think it is possible for a ROS newbie like me to develop this? Thank you, Baptiste.

How to record data from rostopics using MongoDB

I have installed the mongodb_store package on my Clearpath Jackal robot (Ubuntu 14.04) running ros-indigo, and I have started the mongodb server by running the commands:

    rosparam set mongodb_port 62345
    rosparam set mongodb_host car-j100-0101
    rosrun mongodb_store mongodb_server.py

In the terminal I get a message saying:

    [INFO] [WallTime:...] ... Aug 2 08:40:27.317 [initandlisten] waiting for connections on port 62345

and nothing else happens after this. Ideally I want to use mongodb_store to store the data that is being published on all of the rostopics on my Jackal.
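For storing individual messages, here is a sketch of what using mongodb_store's message store could look like (the `MessageStoreProxy` usage follows the mongodb_store documentation; the `/odometry/filtered` topic and the collection name are example assumptions, not taken from the question):

```python
#!/usr/bin/env python
# Sketch: store incoming Odometry messages in mongodb_store's message store.
# Assumes mongodb_server.py is already running, as described above.
import rospy
from nav_msgs.msg import Odometry
from mongodb_store.message_store import MessageStoreProxy

rospy.init_node('odom_logger')
msg_store = MessageStoreProxy(collection='jackal_odom')

def odom_cb(msg):
    # Each callback inserts one document into the 'jackal_odom' collection.
    msg_store.insert(msg)

rospy.Subscriber('/odometry/filtered', Odometry, odom_cb)
rospy.spin()
```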

Question about joint_state_publisher and navigation?

Hi, I have a real robot and I'm trying to work with the navigation stack. I don't have a URDF model, and I'm wondering whether navigation and localization depend on it. I also want to know whether I have to use joint_state_publisher and robot_state_publisher even though I don't have a 3D model; are they essential for the navigation stack? Thank you for answering.
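For context, the transforms the navigation stack needs can also be published directly with a tf broadcaster instead of a URDF plus robot_state_publisher. A minimal sketch (the frame names and the 0.2 m offset are made-up example values):

```python
#!/usr/bin/env python
# Sketch: publish a fixed base_link -> laser transform without any URDF.
import rospy
import tf

rospy.init_node('laser_tf_broadcaster')
br = tf.TransformBroadcaster()
rate = rospy.Rate(20)
while not rospy.is_shutdown():
    br.sendTransform((0.2, 0.0, 0.0),        # laser mounted 0.2 m ahead of base
                     (0.0, 0.0, 0.0, 1.0),   # no rotation (x, y, z, w)
                     rospy.Time.now(),
                     'laser',                # child frame
                     'base_link')            # parent frame
    rate.sleep()
```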

Invalid argument passed to lookupTransform argument source_frame in tf2 frame_ids cannot be empty

I'm using the [surface_perception package](http://wiki.ros.org/surface_perception) to get some table-top data, but I've been getting this error message:

    Invalid argument passed to lookupTransform argument source_frame in tf2 frame_ids cannot be empty

after running the surface segmentation multiple times, and I can't figure out what the problem is. The error message appears when I launch this file: I tried adding `launch-prefix="xterm -e gdb --args"` to the tf2_web_republisher node, but it just shows the same error messages without indicating where they occur. An odd thing that happens is that the visualisation shows one of the axes enlarged, as shown [here](https://imgur.com/a/nVas9R2). I know this is related to some empty frame_ids, but I can't seem to figure out where they are. When I run `rostopic echo /tf2_web_republisher/result`, all frame ids are empty even when there is no error message. Any advice would be highly appreciated!
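A small sketch that might help narrow this down, assuming the empty frame ids actually appear on /tf or /tf_static (it only flags the offending transform, not the node publishing it):

```python
#!/usr/bin/env python
# Sketch: warn whenever a transform arrives with an empty frame_id.
import rospy
from tf2_msgs.msg import TFMessage

def tf_cb(msg):
    for t in msg.transforms:
        if not t.header.frame_id or not t.child_frame_id:
            rospy.logwarn('empty frame id in transform: "%s" -> "%s"',
                          t.header.frame_id, t.child_frame_id)

rospy.init_node('tf_frame_id_checker')
rospy.Subscriber('/tf', TFMessage, tf_cb)
rospy.Subscriber('/tf_static', TFMessage, tf_cb)
rospy.spin()
```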

Question about IMU placement

Hi, I have an IMU sensor (razor_imu_9dof) and I'm using the EKF package to combine it with the odometry data coming from the encoders. My question is: where should I put the IMU on the robot? I have a differential-drive robot, like a big version of the TurtleBot, but I don't know whether I have to put the IMU between the wheels or whether it can go anywhere on the robot, given that I use tf to transform the data to base_link. Can you give any advice? Thank you.

Question about odometry and radian units

Hi, I have a question about odometry. Below is the code I'm using to compute odometry from the wheel encoders. There is something I don't understand: why is the unit of `vth` m/s? If `th` is an angle it should be in radians, but since `vth` is in m/s, `th` would not be in radians. I don't understand why it works, because when I convert `vth` to rad/s so that `th` is in radians, the odometry stops working, so I'm really confused.

    dt = (current_time - last_time).toSec();
    tick_Left = tick_l - _PreviousLeftEncoderCounts;
    tick_Right = tick_f - _PreviousRightEncoderCounts;

    speed_left = (tick_Left * DistancePerCount) / dt;   // speed of left wheel in m/s
    speed_right = (tick_Right * DistancePerCount) / dt; // speed of right wheel in m/s

    vx = (speed_right + speed_left) / 2;  // vx in m/s
    vy = 0;
    vth = (speed_right - speed_left) / lengthBetweenTwoWheels;

    // position of the robot
    double delta_x = (vx * cos(th)) * dt;
    double delta_y = (vx * sin(th)) * dt;
    double delta_th = vth * dt;  // what is the unit of delta_th?

    x += delta_x;
    y += delta_y;
    th += delta_th;  // th should be in radians?

    geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(th);

    geometry_msgs::TransformStamped odom_trans;
    odom_trans.header.stamp = current_time;
    odom_trans.header.frame_id = "odom";
    odom_trans.child_frame_id = "base_link";
    odom_trans.transform.translation.x = x;
    odom_trans.transform.translation.y = y;
    odom_trans.transform.translation.z = 0.0;
    odom_trans.transform.rotation = odom_quat;

    // send the transform
    odom_broadcaster.sendTransform(odom_trans);

    // Odometry message
    nav_msgs::Odometry odom;
    odom.header.stamp = current_time;
    odom.header.frame_id = "odom";

    // set the position
    odom.pose.pose.position.x = x;
    odom.pose.pose.position.y = y;
    odom.pose.pose.position.z = 0.0;
    odom.pose.pose.orientation = odom_quat;

    // set the velocity
    odom.child_frame_id = "base_link";
    odom.twist.twist.linear.x = vx;
    odom.twist.twist.linear.y = vy;
    odom.twist.twist.angular.z = vth;

    // publish the message
    odom_pub.publish(odom);

    _PreviousLeftEncoderCounts = tick_x;
    _PreviousRightEncoderCounts = tick_y;
    last_time = current_time;
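As a quick sanity check of the units (plain Python with made-up numbers, not part of the node above): dividing a difference of wheel speeds in m/s by the wheel separation in metres yields 1/s, which for a differential drive is usually read as rad/s, so `delta_th` would accumulate in radians:

```python
# Worked unit check with made-up example values.
speed_left = 0.40          # m/s
speed_right = 0.60         # m/s
wheel_separation = 0.50    # m (lengthBetweenTwoWheels)

vth = (speed_right - speed_left) / wheel_separation  # (m/s) / m = 1/s = rad/s
dt = 0.1                                             # s
delta_th = vth * dt                                  # rad
print(vth, delta_th)       # 0.4 rad/s, 0.04 rad
```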

Sourcing of containerized ROS via docker-py

I want to run a Docker container wrapping ROS Indigo using `docker-py` without running a command:

    >>> import docker
    >>> docker_client = docker.DockerClient(base_url='unix://var/run/docker.sock')
    >>> container = docker_client.containers.run(image="osrf/ros:indigo-desktop-trusty", detach=True, tty=True)

According to `docker container ls` the container is running, and seeing `/ros_entrypoint.sh` as `COMMAND` made me think the entrypoint script is executed when the container is run:

    CONTAINER ID  IMAGE                           COMMAND                 CREATED        STATUS        PORTS  NAMES
    69b97aa978f1  osrf/ros:indigo-desktop-trusty  "/ros_entrypoint.s..."  2 minutes ago  Up 2 minutes         festive_lichterman

However, when I call a ROS command-line tool in the container:

    >>> logs = container.exec_run(cmd='rosparam list')
    >>> print(logs)

I get this error:

    ExecResult(exit_code=126, output='rpc error: code = 2 desc = oci runtime error: exec failed: container_linux.go:247: starting container process caused "exec: \\"rosparam\\": executable file not found in $PATH"\n\r\n')

with the possible root cause that (a) the entrypoint script is not executed when the container is run, or (b) `exec_run()` runs in another shell in which the entrypoint script has not been executed. Does someone have an idea about how to reliably source the ROS environment when using `docker-py`?
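One common workaround is to source the setup file explicitly inside the `exec_run` command. A sketch, assuming the setup file lives at the usual `/opt/ros/indigo/setup.bash` location inside that image:

```python
import docker

client = docker.DockerClient(base_url='unix://var/run/docker.sock')
container = client.containers.run(image='osrf/ros:indigo-desktop-trusty',
                                  detach=True, tty=True)

# exec_run starts a fresh process that does not go through the entrypoint,
# so source the ROS setup file explicitly inside a bash -c invocation.
result = container.exec_run(
    cmd=['bash', '-c', 'source /opt/ros/indigo/setup.bash && rosparam list'])
print(result)
```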

Can't launch/locate nodes - but nodes exist!

Hi all, I'm a ROS neophyte trying to get my Neato Botvac D80 working with ROS Indigo, following Julian Tatsch's instructions ([part 1](http://www.tatsch.it/neato-botvac-80-wifi-mod/), [part 2](http://www.tatsch.it/use-the-neato-botvac-with-ros/)). Here's what I've done so far:

- Set up an RPi Zero W to behave like the Hame mini-router he references. I can telnet in on port 3000 and send serial commands to make the robot play sounds, etc.
- Installed ROS Indigo in an Ubuntu 14.04 LTS VM running in VirtualBox.
- Followed Tatsch's instructions to install the other required packages, with a few modifications (had to remove the version of ros-planning/navigation that he specified; fixes for the missing ["bullet"](https://answers.ros.org/question/220676/how-to-install-bullet-on-indigo-in-ubuntu/) and ["sdl"](https://github.com/ros-planning/navigation/issues/579) dependencies; cloning [openslam_gmapping](https://github.com/ros-perception/openslam_gmapping)).
- Successfully ran catkin_make with no errors after the above steps (hooray!).
- Sourced my catkin workspace onto the ROS_PACKAGE_PATH. Initially I did this with echo, but I've subsequently figured out how to do it the right way (`source devel/setup.bash`).

I'm now trying to launch the Neato ROS node. Things are mostly working: a bunch of nodes (including the Neato robot node) load successfully, RViz pops up, etc. However, I get the following two errors:

    ERROR: cannot launch node of type [amcl/amcl]: can't locate node [amcl] in package [amcl]
    ERROR: cannot launch node of type [move_base/move_base]: can't locate node [move_base] in package [move_base]

What's confusing to me is that I can find both of these packages (in the ROS core directories, /opt/ros/indigo/share, I think). In the case of amcl, I can actually launch the examples in its /examples directory, so the packages are clearly there! Anyway, I've done some looking around, and the answers seem to be bimodally distributed between "source your workspace" and "super complicated answer to a much more involved question". I'm guessing I'm missing something simple, either about version compatibility or about build parameters. If anyone can give me guidance on the troubleshooting steps for this, I'd appreciate it very much. Thanks!

Error installing ROS on Ubuntu 14.04 i386

I'm trying to install ROS with `sudo apt-get install ros-indigo-desktop`, but I get the following errors:

    The following packages have unmet dependencies:
     python-rosdep : Depends: python-catkin-pkg but it is not going to be installed
     ros-indigo-catkin : Depends: python-catkin-pkg (> 0.2.9) but it is not going to be installed
     ros-indigo-rospack : Depends: python-catkin-pkg but it is not going to be installed
    E: Unmet dependencies. Try 'apt-get -f install' with no packages (or specify a solution).

I tried `apt-get -f install` and the following errors occurred:

    subprocess dpkg-deb --control returned error exit status 2
    Errors were encountered while processing:
     /var/cache/apt/archives/python-catkin-pkg-modules_0.4.7-1_all.deb
     /var/cache/apt/archives/python-catkin-pkg_0.4.7-100_all.deb
    E: Sub-process /usr/bin/dpkg returned an error code (1)

But it didn't work!