
Instance Segmentation Masks

caution

This importer works for segmentations/polygons only.

caution

Every time you run any of these importers, previously imported predictions will be overwritten! Make sure to version your projects if you want to be able to go back to previous model iterations.

If you have your predictions stored as PNG masks of shape [height, width], where each pixel value corresponds to a class, then you can use the import_mask_predictions function from encord_active.lib.model_predictions.importers. It requires that you provide a mapping between file names and data hashes.
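
Such a mask is just a single-channel image whose pixel values are class indices. Below is a minimal sketch of producing one with numpy and Pillow; the dimensions, regions, and data hash are placeholders:

import numpy as np
from PIL import Image

# Placeholder dimensions and data hash; replace with your own values.
height, width = 512, 512
data_hash = "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"

# 0 is background; other pixel values correspond to classes (see class_map below).
mask = np.zeros((height, width), dtype=np.uint8)
mask[100:200, 150:300] = 1  # e.g. a "pedestrian" region
mask[350:450, 300:420] = 3  # e.g. a "car" region

# Name the file so that the data hash can be recovered from it later.
Image.fromarray(mask).save(f"{data_hash}.png")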

Assuming you have predictions stored in a directory like this:

predictions
├── aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee.png
├── ...
└── aaaaaaaa-bbbb-cccc-dddd-ffffffffffff.png

or in a nested structure like this:

predictions
├── dir1
│   ├── aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee.png
│   ├── ...
│   └── aaaaaaaa-bbbb-cccc-dddd-ffffffffffff.png
└── dir2
    ├── bbbbbbbb-bbbb-cccc-dddd-eeeeeeeeeeee.png
    ├── ...
    └── bbbbbbbb-bbbb-cccc-dddd-ffffffffffff.png

You can use the following template; adjust the data directory, the class_map, the predictions_root, and the du_hash_name_lookup to match your setup:

from pathlib import Path

import yaml
from encord import EncordUserClient

from encord_active.lib.model_predictions.importers import import_mask_predictions
from encord_active.lib.model_predictions.writer import PredictionWriter

# Load the project metadata and authenticate against Encord.
data_dir = Path("/path/to/the/data")
meta = yaml.safe_load((data_dir / "project_meta.yaml").read_text())
private_key = Path(meta["ssh_key_path"]).read_text()

client = EncordUserClient.create_with_ssh_private_key(private_key)
project = client.get_project(project_hash=meta["project_hash"])

class_map = {
    # featureNodeHash: pixel_value
    "OTk2MzM3": 1,  # "pedestrian"
    "NzYyMjcx": 2,  # "cyclist"
    "Nzg2ODEx": 3,  # "car"
    # Note: pixel value 0 is reserved for "background"
}
predictions_root = Path("/path/to/predictions")

with PredictionWriter(cache_dir=data_dir, project=project) as writer:
    import_mask_predictions(
        project,
        data_root=predictions_root,
        cache_dir=data_dir,
        prediction_writer=writer,
        # this is what provides the mapping between file names and data hashes:
        du_hash_name_lookup=lambda file_pth: (file_pth.stem, 0),
    )
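
The lambda above returns the file stem as the data hash, together with 0, which presumably is the frame index for single-image data units. If your file names encode both a data hash and a frame number, for example the hypothetical <data_hash>_<frame>.png scheme, a lookup along these lines could be used instead (a sketch under that naming assumption, not part of the library API):

from pathlib import Path

def du_hash_name_lookup(file_pth: Path) -> tuple[str, int]:
    # Hypothetical naming scheme: "<data_hash>_<frame>.png"
    data_hash, frame = file_pth.stem.rsplit("_", 1)
    return data_hash, int(frame)
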
caution
  1. The script will look recursively for files with a .png extension and import them.
  2. For each file, every "self-contained" contour will be interpreted as an individual prediction. For example, the mask below will be treated as three objects: two from class 1 and one from class 2 (see the sketch after this list).
┌───────────────────┐
│0000000000000000000│
│0011100000000000000│
│0011100000002222000│
│0000000000002222000│
│0000111000002200000│
│0000111000002200000│
│0000111000000000000│
│0000000000000000000│
└───────────────────┘
  3. NB: model confidence scores will be set to 1.
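
To see what counts as a "self-contained" contour, here is a sketch (not the importer's actual implementation) that counts external contours per class value with OpenCV; for a mask like the one drawn above it would report two objects for class 1 and one for class 2:

import cv2
import numpy as np
from PIL import Image

mask = np.array(Image.open("predictions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee.png"))

for pixel_value in np.unique(mask):
    if pixel_value == 0:  # background is ignored
        continue
    # Each external contour of this class becomes one individual prediction.
    binary = (mask == pixel_value).astype(np.uint8)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    print(f"class {pixel_value}: {len(contours)} object(s)")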