segments-ai/segments-voxel51-plugin
🚀 New: experimental support for multisensor sequences 🚀

A plugin for Voxel51, integrating it with the annotation platform Segments.ai.
plugin_demo.mp4
Install the plugin and its requirements:

```
fiftyone plugins download https://github.com/segments-ai/segments-voxel51-plugin
fiftyone plugins requirements @segmentsai/segmentsai --install
```
To use this plugin, you need to provide your Segments.ai API key. This is done by setting it as an environment variable before launching the app:
```
export SEGMENTS_API_KEY=<YOUR API KEY HERE>
fiftyone app launch
```
The following table shows which fiftyone media types are compatible with which Segments.ai label types.
| Fiftyone media type | Compatible Segments.ai datasets |
|---|---|
| image | segmentation-bitmap, vector, bbox, keypoints |
| pointcloud | pointcloud-cuboid, pointcloud-vector |
| 3d[^1] | pointcloud-cuboid, pointcloud-vector |
| video | Not supported |
| group | multisensor-sequence |
When using `multisensor-sequence` datasets, this plugin expects you to use the dynamic grouping feature of fiftyone, which allows you to group fiftyone samples into sequences.
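As a minimal sketch of dynamic grouping (the `sequence_id` and `frame_number` field names below are placeholders; use whichever fields identify sequences in your own data):

```python
import fiftyone as fo

dataset = fo.load_dataset("your_dataset")

# Dynamically group samples into sequences; `sequence_id` and `frame_number`
# are placeholder field names for however your own data is organized
sequences = dataset.group_by("sequence_id", order_by="frame_number")

session = fo.launch_app(sequences)
```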
You can bring up the operators by pressing "`" and searching for their name. Below you can find the available operators.
This operator selects the corresponding Segments.ai dataset for the current fiftyone dataset, which is needed for the other operators to interact with existing Segments.ai datasets. It shows a dropdown list of all your Segments.ai datasets with their corresponding data type. The selected Segments.ai dataset is stored internally in the fiftyone dataset object.
With this operator you can create a new Segments.ai annotation dataset from the fiftyone app. You can either upload the whole fiftyone dataset, upload only the current view, or upload all of the selected samples. If a sample's `filepath` is a local path, this operator uploads your data to the Segments.ai AWS bucket; if it refers to a cloud storage location (`s3://` or `gs://`), it only sends that reference to Segments.ai. Alternatively, you can specify the `segments_filepath` metadata field in the fiftyone sample (see this section).
Current limitations:
- Fiftyone datasets with 3D scenes are not yet supported. If you want to upload 3D point clouds, please use point cloud datasets.
- You can't create a `multisensor-sequence` Segments.ai dataset using this operator. Instead, you can create the dataset using the Segments.ai web interface and use the plugin to set up your annotation task.
You can fetch annotations from a Segments.ai dataset using this operator. When you call this operator, you can select one of your datasets and one of its releases. It will then pull the annotations and display them within the fiftyone app. You can read more on how samples are matched to each other across the two platforms here.
Current limitations:
- It's currently not possible to fetch annotations for Segments.ai sequences (except `multisensor-sequences`).
- For `multisensor-sequences`: you can only fetch annotations for the pointcloud annotation task. Importing image annotations from these datasets is not yet implemented.
Segments.ai issues are useful mechanisms for communicating labelling problems with your annotation team. With this operator, you can file an issue on a specific sample from within the fiftyone app. Simply select one sample and run this operator. You will be able to describe your issue in the text box, which will then be uploaded to Segments.ai!
When pulling labels from Segments.ai, the operator will match fiftyone samples with Segments.ai samples in one of two ways, depending on whether or not you are using a sequence Segments.ai dataset.
In case you are using a non-sequence Segments.ai dataset, the matching mechanism simply looks up the corresponding fiftyone sample based on the Segments.ai sample UUID. These UUIDs are automatically stored in the fiftyone sample object as an attribute, under the key "segments_uuid". If you have uploaded your dataset to Segments.ai using the `request_annotations` operator, this is done for you automatically! If not, you will have to provide this information yourself. This can be done as follows:
```python
import fiftyone as fo

dataset = fo.load_dataset("your_dataset")
for sample in dataset.iter_samples(autosave=True, progress=True):
    # Somehow match the fo.Sample with the segments.ai sample
    sample["segments_uuid"] = "<UUID OF YOUR SAMPLE HERE>"
```
Alternatively, you can use FiftyOne's `set_values()` method to perform a bulk update:
```python
import fiftyone as fo

dataset = fo.load_dataset("your_dataset")

# `your_values_map` maps values of the `your_key` field on your samples
# to the corresponding Segments.ai UUIDs to set
dataset.set_values("segments_uuid", your_values_map, key_field=your_key)
```
For sequence datasets, matching a fiftyone sample to a Segments.ai sample is a bit more complex, as a Segments.ai sample contains multiple frames and data types, while fiftyone samples always contain one frame. Instead of matching on UUID alone, samples are matched on a tuple of 3 values:

- `segments_uuid` - `str`: the UUID of the sample on Segments.ai
- `segments_frame_idx` - `int`: the frame number of the data point on Segments.ai (zero indexed)
- `segments_sensor_name` - `str`: the sensor name of this data point on Segments.ai
These 3 are all stored as fields in the fiftyone sample.
If you have created your samples using the `request_annotations` operator, then the plugin will have populated these fields in fiftyone for you.
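If you did not use the operator, a minimal sketch of populating these fields yourself could look like this (the values, and how you derive them from your own bookkeeping, are placeholders):

```python
import fiftyone as fo

dataset = fo.load_dataset("your_dataset")
for sample in dataset.iter_samples(autosave=True, progress=True):
    # Match the fo.Sample to its Segments.ai counterpart yourself
    sample["segments_uuid"] = "<UUID OF YOUR SAMPLE HERE>"
    sample["segments_frame_idx"] = 0  # zero-indexed frame number
    sample["segments_sensor_name"] = "<SENSOR NAME HERE>"
```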
To optimally use the Segments.ai platform, you can provide information such as calibration parameters and egomotion transformations to your fiftyone samples before creating a sample on Segments.ai. Below you can find an overview of the different metadata fields this plugin interacts with.
Here is an example showing how to specify the intrinsic matrix of an image sample:
```python
import fiftyone as fo

dataset = fo.load_dataset("your_dataset")
for sample in dataset.iter_samples(autosave=True, progress=True):
    # Set the camera intrinsic matrix for this sample
    sample.metadata.intrinsic_matrix = [
        [721.5377, 0.0, 609.5593],
        [0.0, 721.5377, 172.854],
        [0.0, 0.0, 1.0],
    ]
```
General fields:

| Field | Description |
|---|---|
| Sample.segments_filepath | The cloud storage location where this data sample is stored. If defined, the plugin will send this URL to Segments.ai when using the `request_annotations` operator, instead of using the default `filepath` field. |
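For example, a sketch of pointing the plugin at assets that already live in cloud storage (the bucket layout here is hypothetical):

```python
import fiftyone as fo

dataset = fo.load_dataset("your_dataset")
for sample in dataset.iter_samples(autosave=True, progress=True):
    # Hypothetical bucket layout; replace with your own cloud URLs
    sample["segments_filepath"] = f"s3://your-bucket/frames/{sample.id}.png"
```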
Camera calibration fields (for image samples):

| Field | Description |
|---|---|
| Sample.metadata.intrinsic_matrix | A nested 3x3 list representing the camera intrinsic matrix. |
| Sample.metadata.extrinsics_position | Dictionary containing the translation component of the camera extrinsics. Expects keys `x`, `y`, `z`. |
| Sample.metadata.extrinsics_rotation | Dictionary containing the rotation component of the camera extrinsics as a quaternion. Expects keys `qx`, `qy`, `qz`, `qw`. |
| Sample.metadata.distortion_model | String indicating which distortion model the camera uses. |
| Sample.metadata.distortion_coefficients | The coefficients of the camera distortion model. |
| Sample.metadata.camera_convention | The camera convention used for the calibration. Should be either "OpenGL" or "OpenCV". |
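A sketch of filling in the remaining calibration fields, with placeholder values rather than a real calibration (the distortion model name is a placeholder; check the Segments.ai documentation for the supported names):

```python
import fiftyone as fo

dataset = fo.load_dataset("your_dataset")
dataset.compute_metadata()  # ensure sample.metadata exists

for sample in dataset.iter_samples(autosave=True, progress=True):
    # Placeholder calibration; use your own camera's calibration values
    sample.metadata.extrinsics_position = {"x": 0.0, "y": 0.0, "z": 1.5}
    sample.metadata.extrinsics_rotation = {"qx": 0.0, "qy": 0.0, "qz": 0.0, "qw": 1.0}
    sample.metadata.distortion_model = "fisheye"  # placeholder model name
    sample.metadata.distortion_coefficients = [0.0, 0.0, 0.0, 0.0]
    sample.metadata.camera_convention = "OpenCV"
```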
Lidar pose fields (for point cloud samples):

| Field | Description |
|---|---|
| Sample.metadata.position | The position of the lidar sensor. Should be a dictionary containing `x`, `y`, and `z`. |
| Sample.metadata.heading | The rotation of the lidar sensor as a quaternion. Should be a dictionary containing `qx`, `qy`, `qz`, and `qw`. |
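Analogously, a sketch for a point cloud sample's lidar pose (an identity pose as a placeholder):

```python
import fiftyone as fo

dataset = fo.load_dataset("your_pointcloud_dataset")
dataset.compute_metadata()  # ensure sample.metadata exists

for sample in dataset.iter_samples(autosave=True, progress=True):
    # Placeholder ego pose; use the actual lidar position and heading
    sample.metadata.position = {"x": 0.0, "y": 0.0, "z": 0.0}
    sample.metadata.heading = {"qx": 0.0, "qy": 0.0, "qz": 0.0, "qw": 1.0}
```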
Footnotes

[^1]: The `request_annotations` operator is not yet supported for the 3d media type. Please use the pointcloud media type if you need this feature.