SAM 2 and SAM 2 Video Tracking are available now (beta)! Simply go to Encord Labs and turn on SAM 2 and SAM 2 Video Tracking. For more information about SAM 2 and Encord, check out our Blog.

SAM 2 Video Tracking (beta)

SAM 2 brings state-of-the-art segmentation and tracking capabilities for both video and images into a single model. This unification eliminates the need for combining SAM with other video object segmentation models, streamlining the process of video segmentation into a single, efficient tool. It maintains a simple design and fast inference speed, making it accessible and efficient for users.

The model can track objects consistently across video frames in real-time, which opens up numerous possibilities for applications in video editing and mixed reality experiences. This new model builds upon the success of the original Segment Anything Model, offering improved performance and efficiency.

SAM 2 can also be used to annotate visual data for training computer vision systems. It opens up creative ways to select and interact with objects in real-time or live videos.

SAM 2 Video Tracking Key Features

  • Improved Accuracy and Speed: Demonstrated superior accuracy in video segmentation with three times fewer interactions compared to previous models and an 8x speedup for video annotations. For image segmentation, it is not only more accurate but also six times faster than its predecessor, SAM.

  • Object Selection and Adjustment: SAM 2 extends the prompt-based object segmentation abilities of SAM to also work for object tracks across video frames.

  • Robust Segmentation of Unfamiliar Videos: The model is capable of zero-shot generalization. This means it can segment objects, images, and videos from domains not seen during training, making it versatile for real-world applications.

  • Real-Time Interactivity: SAM 2 utilizes a streaming memory architecture that processes video frames one at a time, allowing for real-time, interactive applications.
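The streaming design can be pictured as a loop that segments one frame at a time while keeping a bounded memory of recent results. The sketch below is illustrative only; the function names and memory size are assumptions for the example, not Encord or SAM 2 APIs:

```python
# Illustrative sketch of a streaming-memory loop: each frame is segmented
# using a memory of results from recently processed frames, so a video is
# handled one frame at a time instead of all at once.
from collections import deque

def track_stream(frames, segment_frame, memory_size=8):
    """Segment frames sequentially, conditioning each on recent results."""
    memory = deque(maxlen=memory_size)  # bounded memory bank
    masks = []
    for frame in frames:
        mask = segment_frame(frame, list(memory))  # use past context
        memory.append(mask)                        # remember this result
        masks.append(mask)
    return masks
```

Because the memory is bounded, the cost of processing each frame stays roughly constant however long the video is, which is what makes real-time interaction feasible.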

Turn On SAM 2

SAM 2 Video Tracking is currently in beta. SAM 2 Video Tracking REQUIRES SAM 2. Because both are currently in beta, you must turn the features on from Encord Labs on the Settings screen.

Contact support@encord.com to gain access to Encord Labs.

To turn on SAM 2:

  1. Click your profile icon. A menu appears.

  2. Click Settings. The Settings screen appears.

  3. Toggle the SAM 2 and SAM 2 Video Tracking switches on under Encord Labs.


How do I know SAM 2 is on?

After turning on SAM 2 from Encord Labs on the Settings screen, you can verify that the feature is on by annotating using a Bitmask, Polygon, or Bounding Box on a data unit. This means your Ontology must have a Bitmask, Polygon, or Bounding Box for labeling.

To verify SAM 2 is on:

  1. Go to Annotate > Projects. The Annotate projects list appears.

  2. Select a Project from the list that has a Bitmask, Polygon, or Bounding Box available to label an object.

  3. Go to the Queue for the Project.

  4. Click Annotate. A list of all the annotation tasks for the Project appears.

  5. Click a task from the list. The Label Editor appears with the data unit in the editor.

  6. Select the wand beside a Bitmask, Polygon, or Bounding Box for labeling. A dialog appears with SAM 2 in the heading.

Auto-segmentation Tracking

Auto-segmentation tracking follows instance labels through a series of frames in a video, allowing you to create near pixel-perfect annotations with little effort. Auto-segmentation tracking is designed to maintain continuity in object tracking, even if the objects momentarily vanish from view for several frames.

Because tracking algorithms propagate labels forward through frames, auto-segmentation tracking works best when labels are created on a lower-numbered frame (for example, near the start of the video).
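As a rough illustration of why the starting frame matters, the sketch below models forward-only propagation. It is a simplified stand-in, not Encord's actual tracker:

```python
def propagate_forward(start_frame, total_frames, track_range=30):
    """Frame indices a forward-only tracker covers from a label on start_frame.

    Propagation never reaches frames before start_frame, so a label placed
    late in the video leaves all earlier frames unlabeled.
    """
    end = min(start_frame + track_range, total_frames)
    return list(range(start_frame, end))
```

For example, a label on frame 0 of a 100-frame video covers frames 0-29, while a label on frame 95 covers only frames 95-99.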


Ontologies

The following table shows the Ontology shapes for which auto-segmentation can be used. Shapes not supported by the auto-segmentation tracking use the standard object tracking algorithm instead.

Auto-segmentation tracking is only available for videos.
| Ontology shape          | Auto-segmentation tracking |
| ----------------------- | -------------------------- |
| Bounding box            | ✓                          |
| Rotatable bounding box  | ✓                          |
| Polygon                 | ✓                          |
| Polyline                | ✗                          |
| Primitive               | ✗                          |
| Keypoint                | ✗                          |
| Bitmask                 | ✓                          |

Modalities

The following table shows the modalities that support auto-segmentation tracking.

| Modality        | Auto-segmentation tracking |
| --------------- | -------------------------- |
| Images          | ✗                          |
| Videos          | ✓                          |
| Image Groups    | ✗                          |
| Image Sequences | ✗                          |
| DICOM           | ✗                          |

Using auto-segmentation tracking

Auto-segmentation tracking is computationally intensive and performance might be slower than other labeling actions.

Bounding box

  1. Create at least one bounding box instance label on the frame you want to start tracking the instance from.

  2. Right-click the shape.

  3. Click Track object and select the Advanced (slow) option to run auto-segmentation tracking for 30 consecutive frames.

Use the keyboard shortcut Shift + T to run auto-segmentation tracking for 30 consecutive frames.

Rotatable bounding box

  1. Create at least one rotatable bounding box instance label on the frame you want to start tracking the instance from.

  2. Right-click the shape.

  3. Click Track object and select the Advanced (slow) option to run auto-segmentation tracking for 30 consecutive frames.

Use the keyboard shortcut Shift + T to run auto-segmentation tracking for 30 consecutive frames.

Bitmask

  1. Create at least one bitmask instance label on the frame you want to start tracking the instance from.

  2. Right-click the shape.

  3. Click Track object to run auto-segmentation tracking for 30 consecutive frames.

Use the keyboard shortcut Shift + T to run auto-segmentation tracking for 30 consecutive frames.

Polygon

  1. Create at least one polygon instance label on the frame you want to start tracking the instance from.

  2. Right-click the shape.

  3. Click Track object to run auto-segmentation tracking for 30 consecutive frames.

Use the keyboard shortcut Shift + T to run auto-segmentation tracking for 30 consecutive frames.

Tracking multiple objects

To track multiple objects of any shape:

  1. Click one of the objects you want to track.

  2. Hold Shift on your keyboard and select all the other objects you want to track.

  3. Right-click on one of the selected objects and click Track X objects, where X is the number of objects selected.


Settings

The Object tracking section of the editor settings allows you to adjust the following.

Change tracking range

The range over which auto-segmentation tracking runs can be adjusted in the Object tracking section of the editor settings.

The default tracking range is set to 30 frames. This range includes the frame that auto-segmentation tracking starts on.
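Assuming the range counts the starting frame itself, a quick sanity check of which frame is tracked last:

```python
# Assumed interpretation: the tracking range includes the starting frame,
# so a range of 30 started on frame 10 ends on frame 39, not frame 40.
start_frame = 10
tracking_range = 30
last_tracked_frame = start_frame + tracking_range - 1
print(last_tracked_frame)  # 39
```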

Increasing the tracking range to more than 500 frames can slow down performance considerably. We recommend running auto-segmentation tracking in intervals of 500 frames at a time while this feature is in the beta phase. This feature’s performance is continuously being improved, and these limits will be raised in the near future.

Change tracking algorithm

You can change the tracking algorithm in the Object tracking section of the editor settings.

  • Select Advanced to run auto-segmentation tracking.
  • Select Standard to run standard object tracking.

The choice of setting determines which algorithm is run when using the Shift + T keyboard shortcut.