Continuous-Time LiDAR Odometry

Overview

CT-ICP: Elastic SLAM for LiDAR sensors

(Demo GIFs of the ParisLuco and NCLT sequences)

This repository implements the CT-ICP SLAM (see our article), a lightweight, precise, and versatile pure LiDAR odometry.

It is integrated with the python project pyLiDAR-SLAM which gives access to more datasets. pyLiDAR-SLAM requires the installation of the python binding for CT-ICP (see below).

Installation

Ubuntu
./ct_icp_build.sh Release "Unix Makefiles" ON ON  # Builds the project in "Release" mode, with the "Unix Makefiles" cmake generator, with the python binding and the visualization activated
source env.sh                                     # Set up the environment (.so locations)
./slam -c default_config.yaml                     # Launches the SLAM
Windows 10 (PowerShell)
.\ct_icp_build.bat                  # Builds the project
.\env.bat                           # Setup the environment (.so locations) 
.\slam.exe -c default_config.yaml   # Launches the SLAM

To modify the options (viz3d support, python binding) of the Windows script, you can directly edit the ct_icp_build.bat file.

Python binding

The steps below will install a python package named pyct_icp:

  • Generate the cmake project with the following arguments (modify ct_icp_build.sh):

    • -DWITH_PYTHON_BINDING=ON: Activate the option to build the python binding
    • -DPYTHON_EXECUTABLE=<path-to-target-python-exe>: Path to the target python executable
  • Go into the build folder (e.g. cd ./cmake-Release)

  • Build the target pyct_icp with make pyct_icp -j6

  • Install the python project with pip install ./src/binding

Note: This step is required to use CT-ICP with pyLiDAR-SLAM.
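
A quick way to verify the installation is to import the package (a minimal sanity check; it only assumes the pip install step above succeeded):

    # Minimal sanity check that the pyct_icp binding is importable
    import pyct_icp
    print(pyct_icp.__file__)  # prints the location of the installed module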

Install the Datasets

The Datasets are publicly available at: https://cloud.mines-paristech.fr/index.php/s/UwgVFtiTOmrgKp5

Each dataset is a .zip archive containing the PLY scan files, with relative timestamps for each point in the frame, and, if available, the ground truth poses.

To install each dataset, simply download and extract the archives on disk. The datasets are redistributions of existing, copyrighted datasets; we only offer a convenient repackaging.
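
If you want to prepare your own frames in the same layout, the sketch below (in Python, using the third-party plyfile package) writes a PLY file with float32 x, y, z, and per-point relative timestamp fields. The exact field names, in particular timestamp, are an assumption inferred from the dataset frames; check them against an extracted frame (e.g. in CloudCompare) before relying on them.

    # Sketch: write a LiDAR frame as a PLY with per-point relative timestamps.
    # The 'timestamp' field name is an assumption; verify it against a dataset frame.
    import numpy as np
    from plyfile import PlyData, PlyElement

    xyz = np.random.rand(1000, 3).astype(np.float32)   # placeholder x, y, z points
    t = np.linspace(0.0, 1.0, 1000, dtype=np.float32)  # relative timestamps in [0, 1]

    vertex = np.empty(len(xyz), dtype=[('x', 'f4'), ('y', 'f4'),
                                       ('z', 'f4'), ('timestamp', 'f4')])
    vertex['x'], vertex['y'], vertex['z'] = xyz[:, 0], xyz[:, 1], xyz[:, 2]
    vertex['timestamp'] = t

    # plyfile writes a binary little-endian PLY by default
    PlyData([PlyElement.describe(vertex, 'vertex')]).write('frame_0000.ply')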

The available datasets are the following:

Under Creative Commons Attribution-NonCommercial-ShareAlike LICENSE

  • KITTI (see eval_odometry.php):
    • The most popular benchmark for odometry evaluation.
    • The sensor is a Velodyne HDL-64
    • The frames are motion-compensated (no relative timestamps), so the Continuous-Time aspect of CT-ICP will not work on this dataset.
    • Contains 21 sequences for ~40k frames (11 with ground truth)
  • KITTI_raw (see eval_odometry.php):
    • The same dataset as KITTI without the motion-compensation, thus with meaningful timestamps.
    • The raw data for sequence 03 is not available
  • KITTI_360 (see KITTI-360):
    • The successor of KITTI, contains longer sequences with timestamped point clouds.
    • The sensor is also a Velodyne HDL-64

Permissive LICENSE

  • NCLT: (see nclt)
    • Velodyne HDL-32 mounted on a Segway
    • 27 long sequences acquired on the campus of the University of Michigan over a long time span
    • Challenging motions (abrupt orientation changes)
    • NOTE: For this dataset, directly download the Velodyne links (e.g. 2012-01-08_vel.tar). Our code directly reads the velodyne_hits.bin file.
  • KITTI-CARLA (see and cite KITTI-CARLA):
    • 7 sequences of 5000 frames generated using the CARLA simulator
    • Imitates the KITTI sensor configuration (64 channel rotating LiDAR)
    • Simulated motion with very abrupt rotations
  • ParisLuco (published with our work CT-ICP; cf. below to cite us):
    • A single sequence taken around the Luxembourg Garden
    • HDL-32, with numerous dynamic objects

Running the SLAM

Usage

> chmod +x ./env.sh   # Set permission on unix to run env.sh
> ./env.sh            # Setup environment variables 
> ./slam -h           # Display help for the executable 

USAGE:

slam  [-h] [--version] [-c <string>] [-d <string>] [-j <int>] [-o <string>] [-p <bool>] [-r <string>]


Where:

-c <string>,  --config <string>
Path to the yaml configuration file on disk

-o <string>,  --output_dir <string>
The Output Directory

-p <bool>,  --debug <bool>
Whether to display debug information (true by default)

--,  --ignore_rest
Ignores the rest of the labeled arguments following this flag.

--version
Displays version information and exits.

-h,  --help
Displays usage information and exits.
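
For example, combining the flags documented above to run on a config while writing the results to a custom directory with debug output disabled (a hypothetical invocation; adapt the paths):

./slam -c default_config.yaml -o ./outputs -p false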

Selecting the config / setting the options

To run the SLAM (on Unix; adapt for Windows), follow these steps:

  1. Copy and modify one of the default configs (default_config.yaml, robust_high_frequency_config.yaml or robust_driving_config.yaml) to suit your needs. Notably, set the dataset and its root path via dataset_options.dataset and dataset_options.root_path (see the fragment below these steps).

  2. Launch the SLAM with the command ./slam -c <config file path, e.g. default_config.yaml>

  3. Find the trajectory (and optionally metrics if the dataset has a ground truth) in the output directory
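
As an illustration, the relevant fragment of the yaml config might look like the following (a sketch only: the dataset name and path are placeholders, and the accepted values should be taken from the provided default configs):

dataset_options:
  dataset: kitti_raw                       # assumed name; check the default configs for accepted values
  root_path: /path/to/datasets/KITTI_raw   # directory where the archives were extracted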

Citation

If you use our work in your research project, please consider citing:

@misc{dellenbach2021cticp,
  title={CT-ICP: Real-time Elastic LiDAR Odometry with Loop Closure},
  author={Pierre Dellenbach and Jean-Emmanuel Deschaud and Bastien Jacquet and François Goulette},
  year={2021},
  eprint={2109.12979},
  archivePrefix={arXiv},
  primaryClass={cs.RO}
}

TODO

  • Make a first version of the documentation
  • Save both poses for each TrajectoryFrame
  • Fix bugs / Improve code quality (doc/comments/etc...)
  • Add a wiki (documentation on the code)
  • Add point-to-distribution cost
  • Improve the robust regime (go faster and find parameters for robust and fast driving profile)
  • Increase speed
  • Add Unit Tests
  • Github CI
  • Improve visualization / Interaction for the OpenGL Window
  • Improve the python binding (reduce the overhead)
  • Write ROS packaging
Comments
  • How to reproduce results for NCLT

    Hello @jedeschaud, sorry to disturb you.

    We are trying to reproduce the results of CT-ICP on the NCLT dataset, and we haven't succeeded so far. How can I get this number? [screenshot]

    Checking the implementation, and using the original velodyne_hits.bin file, I also see that you are processing 42764 scans in total, whereas the velodyne_sync folder contains only 28127 scans. How did you evaluate the results of the system?

    Moreover, which ground truth poses were used to carry out the evaluation? According to the implementation, https://github.com/jedeschaud/ct_icp/blob/1ba7ce704e9994d39076089ea3fc0dc4d856fe84/src/ct_icp/dataset.cpp#L151

    // Returns the Path to the Ground Truth file for the given sequence
    // Note: The name of the sequence is not checked
    inline std::string ground_truth_path(const DatasetOptions &options,
                                         const std::string &sequence_name) {
        std::string ground_truth_path = options.root_path;
        if (ground_truth_path.size() > 0 && ground_truth_path[ground_truth_path.size() - 1] != '/')
            ground_truth_path += '/';
    
        switch (options.dataset) {
            case NCLT:
                throw std::runtime_error("Not Implemented!");
        }
        return ground_truth_path;
    }
    

    Thanks a lot in advance

    opened by nachovizzo 15
  • libgflags link error

    Hi, thanks for your work. When I run ./slam -c robust_high_frequency_config.yaml, the output is "error while loading shared libraries: libgflags.so.2.2: cannot open shared object file: No such file or directory". I have installed gflags in /usr/local/, and I can't find where gflags is linked in the CMakeLists.txt. Hoping for your reply, thanks a lot.

    opened by HeXu1 12
  • Running ct_icp using another dataset

    Hi, I have recently built and run ct_icp successfully!
    But I have difficulty running another dataset.
    I wonder how ct_icp exactly knows the timestamps between frames.
    While studying your paper and code, I found something new.

    Like this default_config.yaml (permalink highlighted), I found that the ct_icp code uses the KITTI dataset in the form of frame_####.ply files. But it cannot work on my own data (*.ply).


    When I visualize them in CloudCompare, the **red** one is **frame_0001.ply** and the **black** one is **frame_0005.ply**; we can see the ego vehicle moving forward. [screenshot]


    Here, I found something new.

    In CloudCompare, I got to know that the KITTI PLY files have x, y, z, and timestamp fields: [screenshot]

    On the other hand, my own data looks like this: [screenshot]

    I think the difference is whether the file has a timestamp field (PLY_FLOAT32).


    Here is my real question!
    Could you briefly explain the PLY format and how ct_icp uses it?
    Also, I want to convert my .bag/.pcd files to .ply files that have only the x, y, z and timestamp fields, and I couldn't find any solution for that.

    Best regards.

    opened by bigbigpark 8
  • Cmake Error in Ubuntu20.04

    $ cmake .. -DCMAKE_BUILD_TYPE=Release -DSUPERBUILD_INSTALL_DIR=/home/lingbo/open-source-project/ct_icp/install -DSLAMCORE_INSTALL_DIR=/home/lingbo/open-source-project/ct_icp/install/SlamCore -DCT_ICP_INSTALL_DIR=/home/lingbo/open-source-project/ct_icp/install/CT_ICP -DEIGEN_DIR=/home/lingbo/open-source-project/ct_icp/install/Eigen3/share/eigen3/cmake
    INFO [Superbuild] -- Successfully found target glog::glog
    INFO [Superbuild] -- Successfully Found Target Eigen3::Eigen
    -- Found required Ceres dependency: Eigen version 3.3.7 in /home/lingbo/open-source-project/ct_icp/install/Eigen3/include/eigen3
    -- Found required Ceres dependency: glog
    -- Found required Ceres dependency: gflags
    -- Found Ceres version: 2.0.0 installed in: /home/lingbo/open-source-project/ct_icp/install/Ceres with components: [EigenSparse, SparseLinearAlgebraLibrary, LAPACK, SuiteSparse, CXSparse, SchurSpecializations, Multithreading]
    INFO [Superbuild] -- Successfully found target Ceres::ceres
    INFO [Superbuild] -- Successfully found target yaml-cpp
    INFO [Superbuild] -- Successfully found target GTest::gtest
    INFO [Superbuild] -- Successfully found target cereal
    INFO [Superbuild] -- Successfully found target tclap::tclap
    INFO [Superbuild] -- Successfully found target tsl::robin_map
    INFO [Superbuild] -- Successfully found target nanoflann::nanoflann
    INFO [Superbuild] -- Successfully found target colormap::colormap
    INFO [Superbuild] -- Successfully found target tinyply::tinyply
    CMake Error at cmake/external.cmake:13 (get_target_property):
      INTERFACE_LIBRARY targets may only have whitelisted properties.
      The property "IMPORTED_RELEASE_LOCATION" is not allowed.
    Call Stack (most recent call first):
      CMakeLists.txt:51 (include)

    INFO Eigen3::Eigen NOTFOUND
    INFO /home/lingbo/open-source-project/ct_icp/install/Ceres/lib/libceres.so.2.0.0
    INFO /home/lingbo/open-source-project/ct_icp/install/glog/lib/libglog.so.0.5.0
    CMake Error at cmake/external.cmake:13 (get_target_property): [same error as above]

    INFO tsl::robin_map NOTFOUND
    INFO /home/lingbo/open-source-project/ct_icp/install/yaml-cpp/lib/libyaml-cpp.so.0.6.3
    CMake Error at cmake/external.cmake:13 (get_target_property): [same error as above]

    INFO colormap::colormap NOTFOUND
    INFO /home/lingbo/open-source-project/ct_icp/install/tinyply/lib/libtinyply.so
    INFO -- [CT-ICP] -- Appending to the INSTALL RPATH the RPATH to the external libraries: [:/home/lingbo/open-source-project/ct_icp/install/Ceres/lib:/home/lingbo/open-source-project/ct_icp/install/glog/lib:/home/lingbo/open-source-project/ct_icp/install/yaml-cpp/lib:/home/lingbo/open-source-project/ct_icp/install/tinyply/lib]
    INFO [CT_ICP] -- "WITH_GTSAM=OFF gtsam dependent targets will not be built"
    -- Configuring incomplete, errors occurred!
    See also "/home/lingbo/open-source-project/ct_icp/cmake-build-release/CMakeFiles/CMakeOutput.log".
    See also "/home/lingbo/open-source-project/ct_icp/cmake-build-release/CMakeFiles/CMakeError.log".

    opened by lingbo-yu 7
  • Superbuild issue

    When I ran the superbuild, I met this issue:

    cmake --build . --config Release
    Scanning dependencies of target MappingResearchKEU_superbuild
    [ 12%] Creating directories for 'MappingResearchKEU_superbuild'
    [ 25%] Performing download step (git clone) for 'MappingResearchKEU_superbuild'
    Cloning into 'MappingResearchKEU_superbuild'...
    Already on 'master'
    Your branch is up to date with 'origin/master'.
    [ 37%] No patch step for 'MappingResearchKEU_superbuild'
    [ 50%] Performing update step for 'MappingResearchKEU_superbuild'
    Current branch master is up to date.
    [ 62%] Performing configure step for 'MappingResearchKEU_superbuild'
    -- The C compiler identification is GNU 9.4.0
    -- The CXX compiler identification is GNU 9.4.0
    -- Check for working C compiler: /usr/bin/cc -- works
    -- Detecting C compiler ABI info - done
    -- Detecting C compile features - done
    -- Check for working CXX compiler: /usr/bin/c++ -- works
    -- Detecting CXX compiler ABI info - done
    -- Detecting CXX compile features - done
    INFO [Superbuild] -- [Generation] -- Generating GTest dependency
    INFO [Superbuild] -- [Generation] -- Generating GLOG dependency
    INFO [Superbuild] -- [Generation] -- Generating Eigen3 dependency
    INFO [Superbuild] -- [Generation] -- Generating Ceres dependency
    INFO [Superbuild] -- [Generation] -- Generating yaml-cpp dependency
    INFO [Superbuild] -- [Generation] -- Generating cereal dependency
    INFO [Superbuild] -- [Generation] -- Generating tessil dependency
    INFO [Superbuild] -- [Generation] -- Generating nanoflann dependency
    INFO [Superbuild] -- [Generation] -- Generating tclap dependency
    CMake Error at CMakeLists.txt:527 (add_library):
      add_library INTERFACE library requires no source arguments.

    CMake Error at CMakeLists.txt:528 (target_include_directories):
      Cannot specify include directories for target "tclap" which is not built by this project.

    CMake Error at CMakeLists.txt:468 (install):
      install TARGETS given target "tclap" which does not exist.
    Call Stack (most recent call first):
      CMakeLists.txt:531 (SUPERBUILD_INSTALL_TARGET)

    INFO [Superbuild] -- [Generation] -- Generating colormap dependency
    INFO [Superbuild] -- [Generation] -- Generating tinyply dependency
    -- Configuring incomplete, errors occurred!
    See also "/home/user/rosProject/ct_icp/.cmake-build-superbuild/MappingResearchKEU_superbuild/src/MappingResearchKEU_superbuild-build/CMakeFiles/CMakeOutput.log".
    make[2]: *** [CMakeFiles/MappingResearchKEU_superbuild.dir/build.make:107: MappingResearchKEU_superbuild/src/MappingResearchKEU_superbuild-stamp/MappingResearchKEU_superbuild-configure] Error 1
    make[1]: *** [CMakeFiles/Makefile2:76: CMakeFiles/MappingResearchKEU_superbuild.dir/all] Error 2
    make: *** [Makefile:84: all] Error 2

    My Ubuntu is 20.04.

    opened by liangyongshi 7
  • kitti timestamps

    Great appreciation for opening your great work! First, can you provide the dataset password to me? No one replied to my mail. Second, when I look at the KITTI website, the raw point cloud data does not have time information except the scan begin and end times. How do you get the time information?

    thanks a lot best wishes!

    good first issue 
    opened by dongfangzhou1108 7
  • how can i use kitti odometry data?

    Hi, I appreciate your great work. But when I use the KITTI data from http://www.cvlibs.net/datasets/kitti/eval_odometry.php, which has no distortion, that means there is no use in updating the keypoints when running the CT_ICP_GN() function, and I also think we do not need the pose and velocity constraints from the paper. What should I do to modify the code?

    thanks a lot

    opened by dongfangzhou1108 6
  • fail to clone imgui.git

    Hi, thanks for your work. It seems the download of imgui.git failed when I ran ./ct_icp_build.sh Release "Unix Makefiles" ON ON.

    The terminal outputs something like this:

    [ 11%] Performing download step (git clone) for 'imgui-populate'
    Cloning into 'imgui-src'...
    Permission denied (publickey).
    fatal: Could not read from remote repository.
    Please make sure you have the correct access rights and the repository exists.
    [the clone is retried and fails twice more]
    -- Had to git clone more than once: 3 times.
    CMake Error at imgui-subbuild/imgui-populate-prefix/tmp/imgui-populate-gitclone.cmake:31 (message):
      Failed to clone repository: 'git@github.com:pierdell/imgui.git'

    hope for your reply. thanks.

    opened by HeXu1 6
  • why my Elapsed Search Neighbors cost too much? can you answer for me,thanks

    Thanks for opening your code.

    Number of points in sub-sampled frame: 35679 / 126902
    Initial ego-motion distance: 0
    Elapsed Normals: 35.0737
    Elapsed Search Neighbors: 567.692
    Elapsed A Construction: 0
    Elapsed Select closest: 2.18221
    Elapsed Solve: 0.222377
    Elapsed Solve: 8.04642
    Number iterations CT-ICP : 5
    Elapsed Elastic_ICP: 681.094
    Number of Keypoints extracted: 4640 / Actual number of residuals: 4180
    Trajectory correction [begin(t) - end(t-1)]: 0
    Final ego-motion distance: 0.29746
    Average Load Factor (Map): 0.324829
    Number of Buckets (Map): 16384
    Number of points (Map): 39514
    Elapsed Time: 683.08 (ms)

    Here is my log info for one frame; why does my Elapsed Search Neighbors step cost so much?

    best wishes

    good first issue 
    opened by dongfangzhou1108 6
  • GN vs Ceres Optimization

    Hello, thanks for this great work! I saw that the default optimization (in default_config.yaml) uses Gauss-Newton, and for the robust configs Ceres is used. The Jacobians for the rotation look to me like an approximation. Is this true, and if yes, do you think the approximation error is relevant?

    Is Ceres mainly used for the robust loss functions, or also to get better Jacobians via autodiff?

    Thanks and best regards, Louis

    opened by louis-wiesmann 4
  • How to run this on MulRan dataset

    Hello again, thanks for your contribution. I'm trying to run this code on the MulRan dataset without success.

    If you ever ran this code on the dataset, can you please provide some pointers?

    I also tried to run with the "PLY_DIRECTORY" option, with no success:

    WARNING: Logging before InitGoogleLogging() is written to STDERR
    I20220725 11:23:17.709126 2555704 slam.cpp:351] Creating directory .outputs/
    I20220725 11:23:17.713934 2555704 dataset.cpp:278] Found Sequence PLY_DIR
    terminate called after throwing an instance of 'std::runtime_error'
      what():  PLY Directory is not supported by read_pointcloud. See the DirectoryIterator.
    [1]    2555704 IOT instruction (core dumped)  ./slam -c default_config.yaml
    

    Inspecting the code, I see there is no method to read the PLY directory. https://github.com/jedeschaud/ct_icp/blob/1ba7ce704e9994d39076089ea3fc0dc4d856fe84/src/ct_icp/dataset.cpp#L306

    Related to #27, #26, #21

    opened by nachovizzo 3
  • Tune slam output velocity

    Hello, I am able to run the SLAM and it works really well, but especially at the beginning of the motion I am facing a possible computation-time problem. I am using an Ouster OS0-64. Concerning the motion on Y and Z I did not notice any problem, but when I walk with my sensor for 10 m, X remains around the initial value until I reach ~7 m. After that, the output value starts to change really fast and reaches the correct value (and maintains it for almost the whole test). To be more clear:

    Real distance (m) | 0 | 0.5 |  1.0 |   1.5 |   2.0 | 2.5 |  3.0 |  3.5 |  4.0 |  4.5 |  5.0 | 5.5 |  6.0 |  6.5 | 7.0 | 8.0 | 9.0 | 10.0 | 10.0 | 10.0
    /ct_icp/pose (m)  | 0 | 0.1 | -0.2 | 0.015 | -0.03 | 0.3 | 0.40 | -0.2 | 0.10 | 0.15 | 0.16 | 0.2 | 0.10 | 0.15 | 1.0 | 3.0 | 5.0 |  7.0 |  9.0 | 10.0

    My PC's CPUs are not saturated and their speed is 4.8 GHz, so I do not think it is a hardware problem; the RAM occupation is also low (7 GB / 16 GB).

    Is there a portion of code to modify in order to speed up the computation of the output pose? I have tried to change some parameters in ct_icp/include/ct_icp/odometry.h but nothing changed.

    opened by Cristian-wp 0
  • Python bindings

    Update the Python bindings, in order to be able to run ct_icp within pyLiDAR-SLAM.

    TODOs:

    SlamCore

    • [x] Point cloud API
    • [x] Basic types
    • [x] IO methods
    • [x] Algorithms

    CT-ICP

    • [x] Datasets
    • [x] Map & Neighborhood
    • [x] CT_ICP
    • [x] Odometry

    TESTS, CI, Doc

    • [ ] Add test_binding.py to the Github CI
    • [x] Documentation on the Readme / Wiki for the python bindings
    • [ ] Regression tests on several (shorten) datasets
    opened by pdell-kitware 0
  • How to run with pyLiDAR-SLAM on KITTI-corrected?

    Hello authors. Thank you all for your excellent work. It's amazing!

    I can reproduce the results on KITTI-corrected from your paper (Table I), and I wish to run your ct_icp with pyLiDAR-SLAM (as it provides a loop-closure function, if I understand your code correctly). However, in your repo and in pyLiDAR-SLAM, I cannot find any script to execute pyLiDAR-SLAM with your ct_icp module.

    Could you please let me know how to run it or if you could directly upload the pose results of KITTI-corrected (with loop closure)?

    I sincerely appreciate your work and consideration! Thanks again!

    opened by samsdolphin 5
  • Run ct_icp on Jetson Nano

    Hello, I would like to run this SLAM on my Jetson Nano 4 GB. I have managed to install and build it, but even if I use Ceres as the solver, I cannot manage to run the solver on the board's GPU. I know that only some types of Ceres options are currently supported by CUDA:

    "CUDA If you have an NVIDIA GPU then Ceres Solver can use it to accelerate the solution of the Gauss-Newton linear systems using the CMake flag USE_CUDA. Currently this support is limited to using the dense linear solvers that ship with CUDA. As a result GPU acceleration can be used to speed up DENSE_QR, DENSE_NORMAL_CHOLESKY and DENSE_SCHUR. This also enables CUDA mixed precision solves for DENSE_NORMAL_CHOLESKY and DENSE_SCHUR. Optional."

    So, I would like to know which dense linear solver you have used.

    opened by Cristian-wp 3
  • Parameters meaning

    Hi, I need to change the configuration for my dataset. Where can I find the meaning of the parameters? Is there an explanation of how they influence the output?

    opened by Cristian-wp 2
  • Run ct_icp on Darpa tunnel circuit datasets

    Hi, I have managed to make the SLAM work on the DARPA urban datasets, and now I am trying to test it in the tunnel circuits. As your links say, they are all compressed, so I decompressed all of them with rosbag decompress file.bag. I have correctly remapped the point cloud topic, the lidar model and frequency are the same, and at the moment I have not changed anything inside the param yaml file. When I launch the simulation, the simulation starts, but the odometry topic /ct_icp/pose/odom is not published. As you can see from the following images, the tf from odom to base_link is not generated either.

    These are the TFs from the urban datasets: [screenshot]

    These are the TFs from the tunnel: [screenshot]

    This is a screenshot from RViz after some seconds: as you can see, the position frame is not published even though the robot is already inside the tunnel (the robot starts outside the tunnel). [screenshot]

    These are the topics from rqt: [screenshot]

    What am I doing wrong? Can someone please help me?

    opened by Cristian-wp 2
Releases (icra_2022)