Overview

To upload predictions to Encord Active, you need to create a prediction branch. This guide explains everything you need to know about importing predictions.

Predictions Workflow

  1. Import Predictions to Annotate Project: Everything starts in Annotate. Your labels and predictions must exist in your Annotate Project before they can appear in Active.
  2. Import/Sync Project in Active: After importing your predictions, import the Project into Active, or sync an existing Active Project.
  3. Analyse the Predictions in Active: Once the Project import/sync completes, specify the prediction set for Active to analyse.
  4. Select the Predictions in Active: Once analysis completes, select the prediction set you want to view in Active.

Supported Prediction Formats

Encord Format (Recommended)

  • Supports multi-level nested classifications (radio, checklist, or free-form text) under objects or classifications.
  • Handles all object types and classifications.
  • Only top-level objects and classifications are considered when calculating model metrics.

  • Metrics are not yet available for keypoints and polylines. If you are interested in these, please contact the Encord team.

COCO Format

Does not support multiple levels of nested classifications (radio, checklist, or free-form text) under objects or classifications.

Confidence Score

You can include confidence scores when uploading predictions. Encord automatically calculates model metrics based on your prediction set and assigned confidence scores.
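
For example, the following is a minimal sketch of setting a confidence score on a single prediction. It assumes the same SSH key, Project ID, and prediction branch placeholders used in the examples later in this guide, and the 0.87 value is purely illustrative.

Confidence Score Sketch
from encord import EncordUserClient
from encord.objects.coordinates import BoundingBoxCoordinates

# Authenticate and open the Project (placeholders as used elsewhere in this guide)
user_client = EncordUserClient.create_with_ssh_private_key(
    ssh_private_key_path="file-path-to-your-ssh-key"
)
project = user_client.get_project("unique-id-for-project")

# Read the first label row on the prediction branch
row = project.list_label_rows_v2(branch_name="name-of-your-prediction-branch")[0]
row.initialise_labels()

# Create a bounding box prediction and attach a confidence score to it
ontology_object = project.ontology_structure.objects[0]
inst = ontology_object.create_instance()
inst.set_for_frames(
    coordinates=BoundingBoxCoordinates(height=0.5, width=0.5, top_left_x=0.2, top_left_y=0.2),
    frames=0,
    confidence=0.87,  # illustrative confidence score for this prediction
    manual_annotation=False,
)
row.add_object_instance(inst)
row.save()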

Prediction Branches

When you import prediction sets into Encord Active, they are added as branches to the individual label rows of your data units (images, videos, audio). Each data unit has the following (see the sketch after this list):

  • A MAIN branch for ground truth annotations or pre-labels.
  • Optional Consensus branches and Prediction branches for different prediction sets.
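
The following minimal sketch shows how branches are addressed in code: listing label rows without a branch name returns the MAIN branch, while passing branch_name returns that prediction branch. It assumes an authenticated user_client as shown in the examples below, and the branch name "my-model-v1" is illustrative.

# Minimal sketch: reading the MAIN branch and a prediction branch of the same Project
project = user_client.get_project("unique-id-for-project")

main_rows = project.list_label_rows_v2()  # MAIN branch: ground truth annotations or pre-labels
prediction_rows = project.list_label_rows_v2(branch_name="my-model-v1")  # a prediction branch

for row in prediction_rows:
    print(f"{row.data_title}: branch {row.branch_name}")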

STEP 1: Import Predictions

Import your predictions to a Project in Annotate. Encord currently supports importing predictions from the Encord format and from COCO.

TLDR;

Already know what you are doing and just want a Jupyter Notebook example for importing your predictions? We provide one here.

Import Encord-Format Predictions

Use branch_name to create a prediction branch in label_rows_v2 for a data unit.

  • branch_name supports alphanumeric characters (a-z, A-Z, 0-9) and is case-sensitive.
  • branch_name supports the following special characters: hyphens (-), underscores (_), and periods (.)

This simple example imports a bounding box prediction to every data unit on the prediction branch.

Store Predictions Boilerplate
# Import dependencies
from encord import EncordUserClient

# Only BoundingBoxCoordinates is used below; the other coordinate types cover the remaining object shapes
from encord.objects.coordinates import (
    BoundingBoxCoordinates,
    RotatableBoundingBoxCoordinates,
    PolygonCoordinates,
    PolylineCoordinates,
    PointCoordinate,
    BitmaskCoordinates,
)


# Configuration
SSH_PATH = "file-path-to-your-ssh-key"
PROJECT_HASH = "unique-id-for-project"

# Specify a label_rows_v2 branch name for your predictions.
PREDICTION_BRANCH_NAME = "name-of-your-prediction-branch"

assert SSH_PATH is not None, "SSH path cannot be None"
assert PROJECT_HASH is not None, "Project hash cannot be None"

# Authenticate with Encord
user_client = EncordUserClient.create_with_ssh_private_key(
    ssh_private_key_path=SSH_PATH
)

# Access the project and prepare the branch for predictions
project = user_client.get_project(PROJECT_HASH)
prediction_branch_rows = project.list_label_rows_v2(branch_name=PREDICTION_BRANCH_NAME)

if len(prediction_branch_rows) > 0:
    print("Branch is:", prediction_branch_rows[0].branch_name)

ontology_object = project.ontology_structure.objects[0]
bundle = project.create_bundle()
for row in prediction_branch_rows:
    row.initialise_labels(bundle=bundle)
bundle.execute()

for row in prediction_branch_rows:
    inst = ontology_object.create_instance()
    inst.set_for_frames(
        coordinates=BoundingBoxCoordinates(
            height=0.8,
            width=0.8,
            top_left_x=0.1,
            top_left_y=0.1,
        ),
        # Add the bounding box to the first frame
        frames=0,
        # There are multiple additional fields that can be set optionally:
        manual_annotation=False,
    )
    row.add_object_instance(inst)
    row.save(bundle=bundle)
bundle.execute()
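
The Encord format also supports classification predictions on the same branch. The following is a minimal sketch rather than an official recipe: it assumes the first classification in your Ontology is a radio classification, and it reuses the project and prediction_branch_rows variables from the boilerplate above. Adjust the indices to match your own Ontology.

Classification Prediction Sketch
# Minimal sketch: add a classification prediction to every row on the prediction branch.
# Assumes the first classification in the Ontology is a radio classification and that
# `project` and `prediction_branch_rows` come from the boilerplate above.
ontology_classification = project.ontology_structure.classifications[0]
radio_attribute = ontology_classification.attributes[0]
predicted_option = radio_attribute.options[0]  # illustrative: the predicted answer

bundle = project.create_bundle()
for row in prediction_branch_rows:
    cls_inst = ontology_classification.create_instance()
    cls_inst.set_answer(predicted_option)
    cls_inst.set_for_frames(frames=0, manual_annotation=False)
    row.add_classification_instance(cls_inst)
    row.save(bundle=bundle)
bundle.execute()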

Import COCO Labels as Predictions

The following code imports COCO labels as predictions for Active.

For more information on importing COCO labels into Encord, refer to our documentation.

Replace the following:

  • <private_key_path> with the file path to your SSH private key.

  • <my-prediction-branch-name> with the name of your prediction branch.

  • <project_hash> with the Project ID for your Project.

  • COCOimportfile.json with the full path of the COCO file containing the predictions you want to import.

COCO Label Import as Predictions

import json
from pathlib import Path
from encord.utilities.coco.datastructure import FrameIndex
from encord import EncordUserClient
from encord.exceptions import OntologyError

# Authenticate client
SSH_PATH = "file-path-to-your-ssh-key"

# Specify a Project to import your predictions to. This Project must already exist in Encord.
PROJECT_HASH = "unique-id-for-project"

# Specify a label_rows_v2 branch name for your predictions.
PREDICTION_BRANCH_NAME = "name-of-your-prediction-branch"

# Authenticate with Encord using the path to your private key
user_client: EncordUserClient = EncordUserClient.create_with_ssh_private_key(
    ssh_private_key_path=SSH_PATH
)

# Replace with your project hash
project = user_client.get_project(PROJECT_HASH)

# Load the COCO annotations JSON file
# Replace 'COCOimportfile.json' with the full path to your COCO file
coco_file = Path("COCOimportfile.json")
labels_dict = json.loads(coco_file.read_text())

# Build a mapping from COCO category IDs to the feature hashes in your Encord Ontology. 
category_id_to_feature_hash = {}
ont_struct = project.ontology_structure
for coco_category in labels_dict["categories"]:
    try:
        ont_obj = ont_struct.get_child_by_title(coco_category["name"])
        category_id_to_feature_hash[coco_category["id"]] = ont_obj.feature_node_hash
    except OntologyError:
        print(f"Could not match {coco_category['name']} in the Ontology. The import fails if annotations with this category are present.")

# Build a mapping from COCO image IDs to Encord frame indices
# This is only applicable for images, image groups, image sequences, and DICOM series
image_id_to_frame_index = {}
data_title_to_label_row = {lr.data_title: lr for lr in project.list_label_rows_v2()}
for img in labels_dict["images"]:
    lr = data_title_to_label_row[img["file_name"]]

    # Creates a mapping between the COCO image IDs and the corresponding frame indices in Encord
    # In this example, the target frame is 0 because the files in the sample project are single images
    image_id_to_frame_index[img["id"]] = FrameIndex(lr.data_hash, frame=0)

# Import the COCO labels into Encord
project.import_coco_labels(
    labels_dict,
    category_id_to_feature_hash,
    image_id_to_frame_index,
    branch_name=PREDICTION_BRANCH_NAME,
)

Verify Prediction Import

After importing your predictions, verify that they imported successfully.

The following code returns all labels and predictions on all branches.


# Import dependencies
from encord import EncordUserClient
import json

SSH_PATH = "file-path-of-your-ssh-key"
PROJECT_HASH = "unique-id-for-your-project"

# Instantiate client using the path to your SSH private key
user_client = EncordUserClient.create_with_ssh_private_key(
    ssh_private_key_path=SSH_PATH
)

# Specify the Project you want to export labels for
project = user_client.get_project(PROJECT_HASH)

# List the label rows on all branches (the labels themselves are downloaded later with initialise_labels)
# Without the include_all_label_branches flag, only MAIN branch label rows are returned
label_rows = project.list_label_rows_v2(include_all_label_branches=True)

for label_row in label_rows:
    # At this point we have the label row for the branch, but the labels themselves are not yet downloaded
    print(f"Title: {label_row.data_title}, branch: {label_row.branch_name}")

    # Download the label content itself (bounding boxes, classifications, and so on)
    label_row.initialise_labels()

    # Print essential label information for all objects
    for object_instance in label_row.get_object_instances():
        print(f"objectHash: {object_instance.object_hash}")
        print(f"Object name: {object_instance.object_name}")
        print(f"featureHash: {object_instance.feature_hash}")
        print(f"uid: {object_instance.ontology_item.uid}")
        print(f"Object color: {object_instance.ontology_item.color}")
        print(f"Ontology shape: {object_instance.ontology_item.shape}")

        # Print the frame number and the location of the object on the frame
        for annotation in object_instance.get_annotations():
            print(f"Frame {annotation.frame} -> {annotation.coordinates}")

        # Print all attributes
        for attribute in object_instance.ontology_item.attributes:
            print(attribute, object_instance)

    # Print all essential classification information
    for classification_instance in label_row.get_classification_instances():
        print(f"classificationHash: {classification_instance.classification_hash}")
        print(f"Classification name: {classification_instance.classification_name}")
        print(f"featureHash: {classification_instance.feature_hash}")
        print(f"Classification answer: {classification_instance.get_answer().value}")
        print(f"Classification answer hash: {classification_instance.get_answer().feature_node_hash}")


        # Print the frame number(s) that a classification appears on
        for annotation in classification_instance.get_annotations():
            print(f"Classification appears on frame: {annotation.frame}")
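
If you only want to confirm the prediction branch itself, you can filter by branch name instead of exporting every branch. The following is a minimal sketch that assumes the project handle and the PREDICTION_BRANCH_NAME placeholder used earlier in this guide.

# Minimal sketch: count predictions on the prediction branch only
prediction_rows = project.list_label_rows_v2(branch_name=PREDICTION_BRANCH_NAME)
for row in prediction_rows:
    row.initialise_labels()
    print(
        f"{row.data_title}: {len(row.get_object_instances())} object predictions, "
        f"{len(row.get_classification_instances())} classification predictions"
    )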

End-to-End Prediction Import Example

We provide an end-to-end example using a Jupyter Notebook here.

STEP 2: Import/Sync Project to Active

Import or sync the Annotate Project in Active.

STEP 3: Analyse the Predictions

Active MUST analyse the predictions before you can view them in Active.

STEP 4: Select the Predictions

Once analysis completes, select the prediction set to view in Active.

Next Steps

Model and Prediction Validation