All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
- using ruff instead of flake8 and pylint
- added support for reading ROS2 mcap files
- added a CLI command to list the topic names of PointCloud2 messages inside ROS files: "$ pointcloudset topics test.bag"
- changed the name of the CLI from pointcloudset-convert to pointcloudset, used like "$ pointcloudset convert -t /os1_cloud_node/points test.bag"
- pointcloudset convert now defaults to writing to a directory named after the bag file, with _pointcloudset appended to the directory name
- using nbformat==5.7.0 to avoid an error with open3d 0.17
- removed blackcellmagic since it caused errors and was not used
- fixed the CLI documentation, where the examples were wrong
- tested with open3d 0.17
- tested with dask 2023.3.1
- tested with python 3.10.2 and new versions of pandas and numpy
- updated the open3d and dask versions for the docker image
- support for ROS2 files (with SQLite backend); read them the same way as ROS1 bag files (see the sketch after this list)
- support for ROS2 conversion in pointcloudset-convert
- added a version number to the file metadata, in case the native file format changes in the future
- added tests for large ROS bagfiles
- using rosbags as the ROS library. This avoids conflicts with the test explorer and the dependency on some poorly maintained libraries.
- renamed the CLI from rosbagconvert to pointcloudset-convert, since it is specific to pointcloudset and not to rosbag; complete rewrite of the CLI
- added the pycryptodomex dependency, since the ROS packages need it but do not install it
- distributed package installation
- bounding_box property for datasets
- animate method for datasets as an experimental feature
- limit_less and limit_greater methods to PointCloud
- time format to include milliseconds
- better handling of agg with dict queries
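A rough Python sketch of the ROS reading workflow behind the items in this list; this is not taken from the documentation verbatim. The file name and topic are placeholders from the CLI example above, and the limit_greater signature shown is an assumption:

```python
from pathlib import Path

from pointcloudset import Dataset

# Placeholder file and topic; ROS2 files (mcap or SQLite backend) are read the same way.
dataset = Dataset.from_file(
    Path("test.bag"), topic="/os1_cloud_node/points", keep_zeros=False
)

print(dataset.bounding_box)  # bounding box of the whole dataset
pointcloud = dataset[0]      # first PointCloud in the dataset

# limit_greater / limit_less: the (column, value) signature is an assumption here.
filtered = pointcloud.limit_greater("z", 1.0)
```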
Wrong version due to CI
- laspy in the docker image updated to > 2.0
- dask distributed library in docker image
- better support for data from terrestrial laser scanners
- has_original_id for datasets; returns True if all point clouds have an original_id
- PointCloud.from_file now supports a timestamp argument or "from_file"
- diff with "nearest" to calculate the distance to the nearest point of another point cloud (see the sketch after this list)
- time format changed to 24h (PR #45)
- fixed type hints after the open3d API change
- fixed plot overlays larger than the length of px.colors.qualitative.Plotly (PR #45)
- removed the tqdm dependency (now covered by rich)
- missing packages in the base image
- better entry point for docker images
- using pointcloudset docker images for GitHub Actions testing
- streamlined docker images with new base image
- fixed a bug with dask 2022.5.0 where meta.json was also read, not just the parquet files
- now raw tag for pypi in rst files
- rosbagconvert CLI to export individual frames to a pointcloudset dataset or to files like csv or las
- rosbagconvert has new options and structure
- bag2dataset has more functionality and a new name: rosbagconvert
- using rich instead of tqdm
- using rich as a nice UI for the rosbagconvert CLI
- the docker containers now also run on arm64
- use open3d version 0.14 as the default, which comes with arm wheels
- use dask version 2022.02 as the minimum, since 2021.10 had a bug with reading files
- using Python 3.9 as minimum
- point_size option had no effect when using overlays
- writing of a dataset with an empty point cloud at the start
- conda environment name was still "base"; it is now "pointcloudset"
- automatic start of pointcloudset conda environment now working
- use a fixed version number of the pointcloudset_base image
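A rough sketch of the PointCloud.from_file and diff("nearest") items in the list above; the file name is a placeholder, and the exact keyword names (timestamp, target) are assumptions based on the surrounding API rather than confirmed signatures:

```python
from pathlib import Path

from pointcloudset import PointCloud

# Placeholder file; timestamp="from_file" is meant to take the timestamp from the file itself.
pointcloud = PointCloud.from_file(Path("scan.las"), timestamp="from_file")

# Distance from every point to the nearest point of another point cloud;
# the target keyword is assumed to match the other diff variants.
reference = pointcloud.limit("x", -5.0, 5.0)
diffed = pointcloud.diff("nearest", target=reference)
print(diffed.data.columns)
```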
Wrong release due to testing of GitHub Actions and bump2version.
- random_down_sample method for point clouds (see the sketch after this list)
- Better handling of plotting large point clouds: warn when number of points is above 300k (issue#18)
- set conda environment name to "pointcloudset" not "base"
- better CD of docker images
- sticking to semantic versioning
- empty PointCloud object (issue#6)
- columns option to generate empty PointClouds with a specific schema (issue#6)
- support for reading and writing Datasets with empty frames (issue#6)
- check if all required files are written when saving a dataset
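A short sketch of the empty-PointCloud and random_down_sample items above; the column names, the direct construction from a pandas DataFrame, and the number_of_points keyword are assumptions for illustration:

```python
import pandas as pd

from pointcloudset import PointCloud

# Empty point cloud with an explicit schema (issue #6); the column names are placeholders.
empty = PointCloud(columns=["x", "y", "z", "intensity"])

# Random down sample of a small hand-made point cloud;
# the number_of_points keyword name is an assumption.
data = pd.DataFrame({"x": [0.0, 1.0, 2.0], "y": [0.0, 1.0, 2.0], "z": [0.0, 0.1, 0.2]})
pointcloud = PointCloud(data=data)
smaller = pointcloud.random_down_sample(number_of_points=2)
```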