The Encord SDK allows you to interact with Encord’s automated labeling features. Our library includes sampling, augmentation, transformation and labeling algorithms.

Object interpolation

The interpolator on the encord.project.Project object allows you to run interpolation algorithms on project labels (a project ontology is required).

Interpolation is supported for the following annotation types:

  1. Bounding box

  2. Rotatable bounding box

  3. Polygon

  4. Polyline

  5. Object primitive

  6. Keypoint

Use the encord.project.Project.object_interpolation() method to run object interpolation.

Key frames, between which interpolation is run, can be obtained from label rows containing videos. The objects to interpolate between key frames are specified as a list of <object_hash> values obtained from the label_row["labels"]["<frame_number>"]["objects"] entry in the label row. An object (identified by its <object_hash>) is interpolated between the key frames where it is present.
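As an illustration, the object hashes can be collected from the key-frame labels with an ordinary dictionary traversal. The label fragment and hash values below are hypothetical; real label rows come from the SDK:

```python
# Hypothetical fragment of label_row["labels"]: two key frames,
# each containing the same object (identified by its objectHash).
labels = {
    "0": {"objects": [{"objectHash": "e413f0f8", "featureHash": "d4e24ca5"}]},
    "10": {"objects": [{"objectHash": "e413f0f8", "featureHash": "d4e24ca5"}]},
}

# Collect every unique <object_hash> present across the key frames.
objects_to_interpolate = sorted(
    {obj["objectHash"] for frame in labels.values() for obj in frame["objects"]}
)
```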

The interpolation algorithm can be run on multiple objects of different ontology types at the same time (for example, you can run interpolation on a bounding box, a polygon, and a keypoint in a single function call) across any number of key frames.

# Fetch label row (assumes `project` is a connected encord.project.Project instance)
sample_label = project.get_label_row("sample_label_uid")

# Prepare interpolation
key_frames = sample_label["data_units"]["sample_data_hash"]["labels"]
objects_to_interpolate = ["sample_object_uid"]

# Run interpolation
interpolation_result = project.object_interpolation(key_frames, objects_to_interpolate)
    "frame": {
        "objects": [
                "objectHash": object_uid,
                "featureHash": feature_uid (from editor ontology),
                "polygon": {
                    "0": {
                        "x": x1,
                        "y": y1,
                    "1": {
                        "x": x2,
                        "y": y2,
                    "2" {
                        "x": x3,
                        "y": y3,
    "frame": {

The interpolation algorithm can also be run on sample frames kept locally, with key_frames passed as a simple JSON structure (see the docstrings for more information).

All that is required is a <feature_hash> and an <object_hash> for each object in your set of key frames.
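A minimal sketch of building such a local key-frame structure. The hashes, frame numbers, and bounding-box coordinates are hypothetical, and the key layout mirrors the label structure shown earlier; the final call assumes a connected Project instance as in the example above:

```python
object_hash = "e413f0f8"    # hypothetical <object_hash>
feature_hash = "d4e24ca5"   # hypothetical <feature_hash> from the editor ontology

# The object is annotated at frames 0 and 10; interpolation fills the frames between.
key_frames = {
    "0": {
        "objects": [
            {
                "objectHash": object_hash,
                "featureHash": feature_hash,
                "boundingBox": {"x": 0.10, "y": 0.10, "w": 0.20, "h": 0.20},
            }
        ]
    },
    "10": {
        "objects": [
            {
                "objectHash": object_hash,
                "featureHash": feature_hash,
                "boundingBox": {"x": 0.50, "y": 0.40, "w": 0.20, "h": 0.20},
            }
        ]
    },
}

# With a connected project, the call would then be:
# interpolation_result = project.object_interpolation(key_frames, [object_hash])
```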