Robotics StackExchange | Archived questions

Getting position of an object to be shown as a marker

I am having trouble determining the exact position of a simple blue square using the camera and laser scan. I am using ROS Noetic and a TurtleBot in simulation. I have tried a lot of solutions in my vision module, but I still can't get it right. The best outcome so far is that the position is only calculated correctly when the camera faces the object at angle 0, i.e. when the object is in the middle of the image; otherwise the marker rotates with the robot or follows the wall line instead.

I am not sure whether the problem is a miscalculation in the x and y values that I compute, or in how I set the position of blueSquareMap.

This is how I do the transformation: transformStamped = tfBuffer.lookupTransform("map", "base_scan", ros::Time(0));
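
(For context, the tf2 setup around that call looks roughly like the sketch below; it is simplified and the exact names may differ from my real code.)

#include <ros/ros.h>
#include <tf2_ros/transform_listener.h>
#include <geometry_msgs/TransformStamped.h>

tf2_ros::Buffer tfBuffer;                         // global so the callbacks can read it
geometry_msgs::TransformStamped transformStamped;

// The listener itself is constructed in main() after ros::init(), e.g.
//   tf2_ros::TransformListener tfListener(tfBuffer);
// and the lookup below is then called from main().
void updateTransform()
{
    try
    {
        // Pose of base_scan expressed in the map frame, latest available
        transformStamped = tfBuffer.lookupTransform("map", "base_scan", ros::Time(0));
    }
    catch (tf2::TransformException &ex)
    {
        ROS_WARN("Could not look up map -> base_scan: %s", ex.what());
    }
}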

and this is the code I have now:

geometry_msgs::PointStamped get_point()
{
    // Current pose of base_scan in the map frame: the translation is the
    // robot's position, used below as the origin for the marker offset.
    geometry_msgs::PointStamped p;
    p.header.frame_id = "map";
    p.header.stamp = ros::Time::now();

    p.point.x = transformStamped.transform.translation.x;
    p.point.y = transformStamped.transform.translation.y;
    p.point.z = transformStamped.transform.translation.z;

    return p;
}


// Extract blue square marker positions
for (const auto &contour : blueContours)
{
    double area = cv::contourArea(contour); // Contour area, used to reject small detections

    if (area > 100)
    {
        // Calculate coords of the middle
        cv::Rect boundingRect = cv::boundingRect(contour);
        geometry_msgs::Point markerPoint;
        markerPoint.x = boundingRect.x + boundingRect.width / 2;
        markerPoint.y = boundingRect.y + boundingRect.height / 2;
        markerPoint.z = 0.0;

        // Normalised horizontal offset of the detection from the image centre
        // (image width assumed to be 640 px), in the range [-1, 1]
        double angle_offset = ((markerPoint.x - 320) / 320);
        // Scale by the assumed camera field of view to get the bearing
        double adapted_camera_angle = estimated_camera_angle * angle_offset;

        // Calculate the angle in which the blue square is
        double final_angle = fmod((2 * M_PI + adapted_camera_angle), (2 * M_PI));
        // Set the laser index to read from
        int laserscan_index = static_cast<int>(final_angle / laserScan.angle_increment);
        laserscan_index = fmod((360 - laserscan_index), (360));
        // Read distance from the laserscanner
        double laserscan_range = laserScan.ranges.at(laserscan_index);

        ROS_INFO("[BLUE] Angle: %.2f - Index: %i", final_angle * 180 / M_PI, laserscan_index);
        ROS_INFO("[BLUE] Distance: %.2f   |   Contour area: %.2f", laserscan_range, area);

        // Calculate the position of the blue square
        double x = cos(final_angle) * laserscan_range;
        double y = sin(final_angle) * laserscan_range;

        // Point with the position to transform
        geometry_msgs::PointStamped blueSquareMap = get_point();

        tf2::Quaternion q;
        tf2::convert(transformStamped.transform.rotation, q);
        double rotation = tf2::getYaw(q);

        ROS_INFO("[BLUE] ROTATION1: %.2f", rotation * 180 / M_PI);
        // laserScan.ranges.at(final_angle)
        blueSquareMap.point.x += laserScan.ranges.at(final_angle) * cos(final_angle * laserScan.angle_increment + laserScan.angle_min + rotation);
        blueSquareMap.point.y += laserScan.ranges.at(final_angle) * sin(final_angle * laserScan.angle_increment + laserScan.angle_min + rotation);

        // Set the color of the marker
        std_msgs::ColorRGBA blueColor;
        blueColor.r = 0.0;
        blueColor.g = 0.0;
        blueColor.b = 1.0;
        blueColor.a = 1.0;

        publishMarker(blueSquareMap.point, std::to_string(laserscan_range), blueColor, "map", "blue_markers", 0, visualization_msgs::Marker::CUBE, blueSquarePublisher);
    }
}

Asked by turtlebot_fan on 2023-07-26 12:52:02 UTC

Comments

Hi!

Please tell me if I have correctly understood what you did:

  • Get pixel coordinates of the object from the image
  • Compute the angle of the object wrt the "forward" direction of the robot, using the horizontal pixel coordinate
  • Use the angle value to get the corresponding laser value, i.e. the distance from the robot
  • Transform to the map RF and publish the marker

A couple of things that look odd to me are:

  • you do not take into account (as far as I can see) any sensor_msgs/CameraInfo message when converting from pixel to world coordinates. I would have expected, once you got your distance from the laser, to see something like the first sketch after this list.
  • it looks like you are getting the base_scan to map transform, and then manually transforming the marker position. Why not use tf::TransformListener::transformPoint() (or its tf2 equivalent) instead? See the second sketch after this list.
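
For the first point, this is the kind of thing I meant, assuming a plain pinhole model and the image_geometry package (untested sketch; the function name is just an example):

#include <cmath>
#include <sensor_msgs/CameraInfo.h>
#include <image_geometry/pinhole_camera_model.h>

// Horizontal bearing of a pixel column u with respect to the optical axis,
// using the intrinsics from the camera's CameraInfo message.
double pixelToBearing(const sensor_msgs::CameraInfo &info, double u)
{
    image_geometry::PinholeCameraModel model;
    model.fromCameraInfo(info);

    // Ray through the pixel (the row does not matter for the horizontal bearing)
    cv::Point3d ray = model.projectPixelTo3dRay(cv::Point2d(u, model.cy()));

    // Optical frame convention: x right, z forward, so a positive angle
    // means the object lies to the right of the optical axis.
    // Equivalent closed form: atan((u - cx) / fx)
    return std::atan2(ray.x, ray.z);
}

Using the real intrinsics instead of a hard-coded image centre and 45° field of view should already remove part of the error you see away from the centre of the image.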
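
For the second point, since you already use a tf2 buffer for lookupTransform, the tf2 counterpart of transformPoint() is roughly this (sketch; the helper name and timeout are just examples):

#include <cmath>
#include <ros/ros.h>
#include <tf2_ros/buffer.h>
#include <tf2_geometry_msgs/tf2_geometry_msgs.h>   // lets Buffer::transform() handle PointStamped
#include <geometry_msgs/PointStamped.h>

// Convert a (range, bearing) measurement taken in the base_scan frame
// into the map frame, letting tf2 apply the rotation and translation.
geometry_msgs::PointStamped laserHitToMap(tf2_ros::Buffer &tfBuffer,
                                          double range, double bearing)
{
    geometry_msgs::PointStamped pointInScan;
    pointInScan.header.frame_id = "base_scan";
    pointInScan.header.stamp = ros::Time(0);        // latest available transform
    pointInScan.point.x = range * std::cos(bearing);
    pointInScan.point.y = range * std::sin(bearing);
    pointInScan.point.z = 0.0;

    return tfBuffer.transform(pointInScan, "map", ros::Duration(0.1));
}

That removes the manual yaw and translation bookkeeping from your marker loop.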

Asked by bluegiraffe-sc on 2023-07-27 03:20:12 UTC

Thank you for taking the time. Yes, you understood the code correctly. As you said, I don't rely on CameraInfo messages; I don't know how they would be helpful for me. I do my lookupTransform in main and then have a synced callback between the laser and camera topics that looks like this.

void synchronizedCallback(const sensor_msgs::ImageConstPtr &imageMsg, const sensor_msgs::LaserScanConstPtr &scanMsg)
{
    laserScan = *scanMsg;
    try
    {
        cv::Mat frame = cv_bridge::toCvShare(imageMsg, "bgr8")->image;
        cv::Mat hsvImage;
        cv::cvtColor(frame, hsvImage, cv::COLOR_BGR2HSV);

        detect_blue_markers(hsvImage);
        // ! detect_red_markers(hsvImage);
    }
    catch (cv_bridge::Exception &e)
    {
        ROS_ERROR("cv_bridge exception: %s", e.what());
    }
}
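
(The two topics are synchronised with message_filters; below is a simplified sketch of that wiring, with placeholder topic names and an assumed ApproximateTime policy.)

#include <ros/ros.h>
#include <tf2_ros/transform_listener.h>
#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/LaserScan.h>
#include <boost/bind.hpp>

typedef message_filters::sync_policies::ApproximateTime<sensor_msgs::Image,
                                                        sensor_msgs::LaserScan> SyncPolicy;

int main(int argc, char **argv)
{
    ros::init(argc, argv, "blue_square_detector");   // node name is just an example
    ros::NodeHandle nh;

    // Keeps the tf2 buffer (shown in the question) filled in the background
    tf2_ros::TransformListener tfListener(tfBuffer);

    // Placeholder topic names; the real ones come from the simulation launch files
    message_filters::Subscriber<sensor_msgs::Image> imageSub(nh, "camera/rgb/image_raw", 1);
    message_filters::Subscriber<sensor_msgs::LaserScan> scanSub(nh, "scan", 1);

    message_filters::Synchronizer<SyncPolicy> sync(SyncPolicy(10), imageSub, scanSub);
    sync.registerCallback(boost::bind(&synchronizedCallback, _1, _2));

    ros::spin();
    return 0;
}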

I had never heard or read about transformPoint(), so I will have a look at how to use it and see if it helps.

Asked by turtlebot_fan on 2023-07-27 06:46:55 UTC

I am going through your code again. How did you get the estimated_camera_angle, which I guess is the camera's angle of view in the horizontal direction? And what is the meaning of M_PI?

Asked by bluegiraffe-sc on 2023-07-28 09:28:36 UTC

The estimated camera angle is set to PI / 4 (45 degrees) as a constant for the camera's field of view. M_PI is just the pi value from the math library.

Asked by turtlebot_fan on 2023-07-28 10:51:11 UTC

Ok, the M_PI meaning was indeed pretty clear, sorry for the super-dumb question.

Regarding the estimated camera angle, how did you find it? From the device datasheet? From calibration? You can see here how to calculate it from the focal length (which you can in turn get from CameraInfo messages).
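
In short, with the intrinsics from a CameraInfo message it boils down to something like this (sketch):

#include <cmath>
#include <sensor_msgs/CameraInfo.h>

// Horizontal field of view from the camera intrinsics:
// fx is K[0] (focal length in pixels), width is the image width in pixels.
double horizontalFov(const sensor_msgs::CameraInfo &info)
{
    const double fx = info.K[0];
    return 2.0 * std::atan2(info.width / 2.0, fx);
}

If the real field of view is not exactly the 45° you hard-coded, every off-centre detection gets mapped to the wrong bearing, which would match the behaviour you describe.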

Finally (for this round), what is the meaning of

// Calculate the angle in which the blue square is
double final_angle = fmod((2 * M_PI + adapted_camera_angle), (2 * M_PI));

You are adding the adapted_camera_angle to a full 360° (2π) angle, and the adapted_camera_angle will always be smaller in magnitude than that (it can reach up to 45°, right?). For non-negative angles the quotient is always 1 and the remainder (which is what you are storing into final_angle) is simply adapted_camera_angle; for negative angles the fmod just wraps them into [0, 2π).

What am I not understanding about your pipeline?

Asked by bluegiraffe-sc on 2023-07-31 05:45:12 UTC

Answers