Task management
The task manager is a system built to optimize labeling and quality control for all annotation and review tasks, allowing thousands of annotators, reviewers, team managers, and administrators to work concurrently on the same project.
The task manager is enabled by default but can be switched on and off under the 'Options' tab in project settings.
Annotation and review tasks are generated and distributed automatically, with the oldest tasks in the queue served first; see 'Task generation' and 'Task distribution' below for details.

Team managers and administrators can also assign tasks explicitly to individual annotators and reviewers. Once an annotation or review task is distributed to an annotator or reviewer, it is reserved by that individual, prohibiting other team members from accessing that task. Both annotation and review tasks are accessible in the 'Queue' pane in the 'Labels' tab.

Task generation
Annotation tasks are generated and added to the label queue when a dataset, or a set of datasets, is attached to a project and when a new data asset is added to attached datasets. Review tasks are generated and added to the review queue once an annotator submits an annotation task. Conversely, detaching a dataset removes any associated annotation and review tasks, so proceed with caution.
By default, each data asset is labeled once, and each label submitted for review is reviewed once. You can create additional review tasks by clicking the + Add reviews button and following the steps in the window. To send a data asset back into the queue for further labeling, reopen its submitted annotation task by selecting the relevant assets and clicking the Reopen button.
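As a mental model, you can think of task generation as queues being populated from attached datasets. The sketch below is purely illustrative; the names and data structures are hypothetical and are not part of Encord's API.

```python
# Illustrative sketch of how tasks are generated; the names here are
# hypothetical and are not part of Encord's API.
from collections import deque
from dataclasses import dataclass

@dataclass
class Task:
    data_asset: str
    kind: str  # "annotation" or "review"

label_queue: deque = deque()   # annotation tasks waiting to be labeled
review_queue: deque = deque()  # review tasks waiting to be reviewed

def attach_dataset(data_assets: list) -> None:
    """Attaching a dataset creates one annotation task per data asset (labeled once by default)."""
    for asset in data_assets:
        label_queue.append(Task(asset, "annotation"))

def submit_annotation_task(task: Task) -> None:
    """Submitting an annotation task generates a review task (reviewed once by default)."""
    review_queue.append(Task(task.data_asset, "review"))

attach_dataset(["image_001.jpg", "image_002.jpg"])
submit_annotation_task(label_queue.popleft())
```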

Task distribution
Annotation and review tasks are distributed automatically using the first in, first out method (sketched below): tasks that have been in the queue the longest are served first. Once an annotator or reviewer clicks the Start labeling or Start reviewing button, the next available task in the queue is reserved by that individual, prohibiting other team members from accessing it. Once the task is fetched, the annotator or reviewer is taken to the label editor to complete the task.
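As a rough mental model, assuming nothing beyond the behavior described above, the distribution logic behaves like a FIFO queue with a reservation step. This is a simplification, not Encord's implementation.

```python
# Simplified illustration of FIFO distribution with reservation;
# this is a mental model, not Encord's implementation.
from collections import deque
from typing import Optional

queue = deque(["task-1", "task-2", "task-3"])  # oldest task at the front
reserved = {}                                  # task -> annotator/reviewer holding it

def start_labeling(user: str) -> Optional[str]:
    """Serve the task that has waited longest and reserve it for the requesting user."""
    if not queue:
        return None
    task = queue.popleft()   # first in, first out
    reserved[task] = user    # other team members can no longer access this task
    return task

def release(task: str) -> None:
    """Releasing a task removes the reservation; here it is simply re-queued
    (the exact re-queue position is an assumption)."""
    reserved.pop(task, None)
    queue.append(task)

current = start_labeling("annotator@example.com")  # serves "task-1"
release(current)                                   # "task-1" becomes available again
```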

Project administrators and team managers can override the automated distribution of tasks by explicitly assigning tasks to individuals in the 'Queue' pane in the 'Labels' tab. Assignments can be done on a task-by-task basis or in bulk by selecting the relevant tasks and clicking the Assign button.

Tasks can be released by pressing the icon next to the task and clicking the Release task button. Reserved tasks do not expire and remain assigned to an individual until they are submitted, released, or skipped.
Tip
Annotation tasks can be submitted programmatically via our API or using Encord's Python SDK.
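For example, a submission flow with the Python SDK might look like the sketch below. The authentication and project calls follow the SDK's documented pattern, but the final submission method shown is an assumption; consult the SDK reference for the exact call available in your SDK version.

```python
# Minimal sketch of programmatic submission with Encord's Python SDK.
# The final submission call is an assumption; check the SDK reference for
# the exact method available in your SDK version.
from pathlib import Path
from encord import EncordUserClient

# Authenticate with the SSH private key registered on your Encord account.
user_client = EncordUserClient.create_with_ssh_private_key(
    Path("/path/to/ssh-private-key").read_text()
)

# Open the project containing the annotation task.
project = user_client.get_project("<project_hash>")

# Pick the label row (data asset) whose annotation task should be submitted.
label_row = project.list_label_rows_v2()[0]

# Assumed call: submit the annotation task so that a review task is generated.
project.submit_label_row_for_review(label_row.label_hash)
```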
Task completion
An annotation task is completed once all outstanding labels subject to review have been reviewed. Completed annotation tasks and annotation tasks currently in the review stage are visible in the 'Activity' pane in the 'Labels' tab.

Task status
As tasks move through the Task Management System, their status evolves from 'Queued' for annotation to 'In review' and then 'Complete'. If labels are rejected, or the task is otherwise judged to need further annotation work, it is marked as 'Returned'. The most comprehensive view of task statuses is available to project administrators and team managers in a project's labels dashboard under the 'Data' tab.
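The status flow can be pictured as a small state machine. The sketch below is illustrative only; the transition structure is inferred from the description above rather than taken from an API.

```python
# Illustrative sketch of the task status flow described above. These are
# display statuses; the transition structure is inferred from this page,
# not taken from an Encord API.
ALLOWED_TRANSITIONS = {
    "Queued": {"In review"},               # annotator submits the task
    "In review": {"Complete", "Returned"}, # reviewer approves or rejects labels
    "Returned": {"In review"},             # annotator resolves issues and resubmits
    "Complete": set(),                     # terminal state
}

def move(status: str, new_status: str) -> str:
    """Return the new status if the transition is valid, otherwise raise."""
    if new_status not in ALLOWED_TRANSITIONS[status]:
        raise ValueError(f"Cannot move a task from {status!r} to {new_status!r}")
    return new_status

status = move("Queued", "In review")  # annotation submitted
status = move(status, "Returned")     # a label was rejected
```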
Annotation
Annotation tasks are completed in the label editor. Click Submit when labeling is complete. Being "complete" means the annotator has met the accuracy threshold and labeling coverage required for the project. The project's Quality configuration setting specifies which submitted labels are selected for review.
Annotators can skip tasks by clicking the Skip button. If a task is skipped, the next available task is automatically distributed and assigned.

Caution
To prevent annotator work from being overwritten, it is critical that all annotations are made via the Task Management System's 'Queue' pane and that only the person assigned to a task annotates it at any given time.
Review
Review tasks are completed in a purpose-built user interface in the label editor.

Review mode components:
- A. Single label review toggle
- B. Edit labels
- C. Pending reviews pane
- D. Completed reviews pane
- E. Reject and Approve buttons
- F. Approve and Reject all in frame buttons
Note
All labels are reviewed on an instance level. This means that if an instance is rejected on one frame, it is rejected across all frames. This includes reviews made using the Approve all in frame and Reject all in frame buttons.
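As a simple mental model, the review decision is recorded against the instance rather than against each per-frame label, so one decision covers every frame the instance appears in. The sketch below is illustrative only.

```python
# Illustrative only: a review decision attaches to the instance, so every
# frame containing that instance inherits the decision.
instance_frames = {
    "car_01": [0, 1, 2, 3],  # one object instance labeled across four frames
    "person_02": [2, 3],
}
review_decisions = {}

def reject_instance(instance_id: str) -> None:
    """Rejecting an instance rejects it on every frame it appears in."""
    review_decisions[instance_id] = "rejected"

# Rejecting "car_01" from frame 2 (or via Reject all in frame on frame 2)
# marks it rejected on frames 0-3 as well.
reject_instance("car_01")
```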
'Pending' and 'Completed' review panes
All labels assigned to the reviewer for a particular data asset are automatically loaded into the 'Pending reviews' pane. Completed reviews are displayed in the 'Completed reviews' pane. You can click on specific objects to highlight them. Labels can be selected and then approved or rejected for a given instance, or in bulk, using the Approve and Reject buttons or the matching hotkeys: n to approve and b to reject.
'Single label review' toggle
You can enter the 'Single label review' mode by toggling the switch at the top. The single label review mode automatically highlights and hides other objects, allowing you to review and approve or reject a single label at a time and quickly browse through individual labels using the Up and Down keys on your keyboard.
Note
The reviewer is automatically taken to the next set of requested label reviews once all labels in a particular review task have been reviewed.
Edit labels
A convenient feature allows reviewers to edit labels and make small adjustments without needing to return the entire set of labels to the annotator.
Press the Edit labels button and make any necessary changes before switching back to review mode.
Currently, only a subset of label edit operations is supported:
- objects: moving the object or individual vertices
- classifications: changing the classification value
- objects and classifications: changing any attributes
Approve/Reject all in frame buttons
In addition to being able to review all labels for a given instance, you can review labels grouped by frame as well.
For review workflows that focus on progressing through video by frame rather than by instance, use the Approve all in frame and Reject all in frame buttons.
Be sure you want to apply that judgment to all labels in a given frame before using this feature.
Rejected labels
If a reviewer rejects a label during the review stage, it will be marked as Returned in the 'Queue' pane in the 'Labels' tab. By default, rejected annotation tasks are returned and assigned to the queue of the person who submitted the task.
Returned tasks are resolved in a purpose-built user interface in the label editor. Click the icon on the right-hand side of the screen to open the drawer containing rejected labels. Once the reviewer comments have been addressed, click the icon to mark the label as resolved.
Annotation tasks cannot be resubmitted until all issues have been marked as resolved. Once a task is resubmitted, the labels marked as resolved are sent back for an additional review. There is no limit on how many times a label can be rejected and sent back for correction.

Missing labels
If a reviewer determines that a label is missing entirely, they can use the report missing labels feature to indicate labels are missing in a given frame or image. Missing label reports will be sent back to the annotator via the same queue as rejected labels.
