1. In the Encord platform, select Projects under Annotate.
2. Select the Project you want to manage.

The dashboard is split into the following tabs:

  • Project Overview: High-level view of labeling and productivity statistics.
  • Explore: Explore the distribution of instances and labels across data assets in the project.
  • Queue: Shows all tasks in the Project by Workflow stage.
  • Workflow: Graphically displays the path tasks follow through the Project Workflow.
  • Labels & Export: For managing all the Project’s labels, including exporting labels.
  • Analytics: Detailed Project analytics.
  • Settings: Manage your Project Settings, including copying Projects, managing Project tags, customizing editor layouts, and deleting Projects.

Tab Visibility by Role

The Analytics and Queue tabs display different information based on the user’s role. For more details, refer to the Analytics dashboard documentation.
The Labels & Export tab for admins is called the Labels tab for Team Managers.

| Tab | Annotator | Reviewer | Annotator + Reviewer | Team Manager | Admin |
| --- | --- | --- | --- | --- | --- |
| Summary | | | | ✅ | ✅ |
| Explore | | | | ✅ | ✅ |
| Queue | Annotate tab only | Review tab only | Annotate + Review tabs only | ✅ | ✅ |
| Workflow | | | | View only | ✅ |
| Labels & Export | | | | View labels only | ✅ |
| Performance | Personal performance | Personal performance | Personal performance | Team performance | Team performance |
| Models | | | | View only | ✅ |
| Settings | | | | Add users (except Admins) | ✅ All settings and user management |

Update Project Status

Contact support for access to this feature.

Admins and Team Managers can change the status of a Project to any of the following:

  • Not started
  • In progress
  • Paused
  • Completed
  • Cancelled

Updating the Project status changes the Project icon accordingly.

When the Project status is changed to Paused, Completed, or Cancelled, Annotators, Reviewers, and Annotator + Reviewers are prevented from opening tasks, and the Initiate button is greyed out. Admins and Team Managers can annotate and review tasks as usual, regardless of the Project status.

Tasks can still be opened through back-door access: for example, if an Admin shares the URL of a specific task with an Annotator, Reviewer, or Annotator + Reviewer.


Project Overview

The Project Overview dashboard has the following components that provide an overview of the Project:

Project task status overview:

Displays the number of tasks in each state of your Workflow Project. The number of states and their names reflect the choices made during Workflow Project creation.

Click stages in the chart key to remove them from, or add them back to, the donut chart.

Explore tab

The Explore tab is a legacy feature and will be discontinued.
The Explore tab is only visible to project Admins and Team Managers.

The Explore tab helps you understand how project annotations are distributed among data assets, at both an instance and label level. It allows a deeper exploration through attributes on objects, as well as frame-level classifications.

Instance statistics

This section provides the total count of all instances across the datasets in your project.

  • Project total: Shows total instances (both objects and classifications) across the project by default. To get instance statistics for individual data files, click the drop-down to select a data file.
  • Select class: Shows the total instances for a particular class. This is a summary of how a given class is distributed across your project’s data assets. The pie chart segments show a breakdown of how that class is split across the data assets.
  • Display timestamps: Flip the toggle to switch between frame numbers and timestamps for the labels.

Label statistics

This is a summary of how your labels are distributed across the project. The pie chart shows a breakdown of how many labels there are for a given class.

  • Project total: Shows the total number of labels across different datasets in the project. To get label stats for individual data files, click the drop-down to select a data file.
  • Objects: Click a class’s segment of the pie chart to see the total number of labels for that class and, if available, its attributes (sometimes called nested attributes).
  • Classifications: Shows the global classification at project or individual video level. For example, location, time of day, etc.

Task Queue & Workflow

Use the Queue tab to assign, prioritize, and manage tasks, as well as to start labeling and reviewing for all users associated with a Project.

The Queue tab’s layout adapts based on user permissions.

Workflow tab

The Workflow tab is only visible to project Admins and Team Managers.

The Workflow tab lets you view and edit the Project’s Workflow. To begin editing the Workflow, click the Edit button on the canvas.


Labels & Export

The Labels tab is only visible to project Admins and Team Managers.

The Labels tab is your gateway to auditing and exporting labels created in your project.

Access to each pane depends on the user’s role.

| Role | Activity | Queue | Data | Instances |
| --- | --- | --- | --- | --- |
| Annotator | | | | |
| Reviewer | | | | |
| Annotator + Reviewer | | | | |
| Team Manager | ✅ | ✅ | ✅ | ✅ |
| Admin | ✅ | ✅ | ✅ | ✅ |

The labels dashboard features the following tabs:

  • Data: A complete overview of all tasks in the project, with the option to export your labels on a per-task basis.
  • Label Instances: Use a unique instance identifier to search the project for a specific instance, and jump directly into the editor to visually confirm the status of an annotation.

Data tab

The Data tab provides a complete overview of all tasks in the project, and lets you see which workflow stage each task is in.

Export labels

In the Data tab, select the data units you want to export labels for, then click the Export and save button. See our documentation on exporting labels for more information.

Save Label Version

Select the data units you want to save a label version for, then click the Save new version button. The new version is listed in the Saved versions tab.

A new label version is saved each time you export labels.

Label versioning allows you to keep track of your label history over time by providing a snapshot of labels at a point in time. Label versions can be exported and analyzed to track annotation performance over time.

Label Instances

The Label Instances tab allows Administrators and Team Managers to search for specific instances within the data. An annotation instance refers to a unique occurrence of an ontology class in a data asset (e.g., ‘Person (0)’ for the first instance of a ‘Person’). Instances span multiple frames of a data asset, representing the same object. Use this tab to locate specific objects or classifications by their Identifier.

Instance identifiers are unique within a project and can be found in several ways:

  • In the Label Editor: Click on an instance, then select Copy identifier from the instance action menu.
  • In Exported Labels: Look for objectHash or classificationHash in the exported data (see the sketch after this list).
  • Using the SDK: Specify your own objectHash or classificationHash during label uploads.
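
If you are searching an export for a given identifier, a small script can help. Below is a minimal sketch, assuming a local exported_labels.json file and a made-up target hash; it walks the exported JSON generically rather than relying on any particular export layout:

```python
import json

TARGET_HASH = "aB3dE5fG"  # hypothetical objectHash copied from the Label Editor

def find_instances(node, path=""):
    """Yield JSON paths of dicts whose objectHash or classificationHash
    matches TARGET_HASH, walking the export recursively."""
    if isinstance(node, dict):
        if TARGET_HASH in (node.get("objectHash"), node.get("classificationHash")):
            yield path
        for key, value in node.items():
            yield from find_instances(value, f"{path}/{key}")
    elif isinstance(node, list):
        for index, value in enumerate(node):
            yield from find_instances(value, f"{path}/{index}")

with open("exported_labels.json") as f:
    exported = json.load(f)

for match in find_instances(exported):
    print(match)  # prints the JSON path to each matching instance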

Once you have an identifier, use the Search instance interface to filter and locate the specific instance. This is especially useful for visually confirming annotations linked to an identifier.

After finding your instance, click View in the Actions column to jump directly to the first annotation of that instance in the dataset.

Saved Versions

The Saved versions tab displays information for versions of your labels. The Actions column lets you:

  • Export label versions by clicking the Download version icon in the Actions column. Exported label versions have the same structure as outlined in the export documentation.

  • Delete label versions by clicking the Delete version icon in the Actions column.


Performance (Legacy)

This dashboard shows legacy analytics. For detailed analytics relating to Workflow Projects, see the Analytics section.

Performance - Summary

The Summary tab of the performance dashboard provides an overview of your team’s manual labeling and productivity.

The Summary tab only displays actions taken in the Label Editor. Actions taken in the SDK will not be displayed.

Task actions over time

View the number of tasks in a project that have been approved, rejected, and submitted for review over a given period of time.

  • The height of a bar represents the total number of tasks.
  • The height of each color within a bar represents the number of approved, rejected, and submitted tasks.
  • A: Set the time period you would like to see displayed by selecting a range of dates.
  • B: The Hide days without any actions toggle removes from the view all days on which no actions were taken.
  • C: Download a CSV file of the data.
  • D: Display the data as a bar chart, or a table. While the chart provides a clear visual representation, the table provides exact figures for a more detailed picture of your team’s performance.

Instance Label actions over time

View the number of instance label actions in a project that have been approved, rejected, and submitted for review over a given period of time.

  • A: Set the time period you would like to see displayed by selecting a range of dates.
  • B: Download a CSV file of the data.
  • C: Display the data as a bar chart, or a table. While the chart provides a clear visual representation, the table provides exact figures for a more detailed picture of your team’s performance.

Within your specified time period, you can choose which dates to display by using the slider located beneath the graph.

Team collaborators

The Team collaborators section shows how long each project collaborator spent working on a given file.

A. Data file displays session time collaborators spent working on individual files, while Project displays session time collaborators spent working on the whole project.

B. Filter table entries by date by clicking the range of dates and selecting the start and end dates of the period you want displayed.

C. Download table entries in CSV format by clicking the Download CSV button.

D. When many entries are present, they are split across multiple pages. The number of entries per page can be adjusted.


Performance - Details

The Details tab of the performance dashboard gives a more detailed view of your team’s labeling and productivity. The details below are displayed for Manual QA Projects.

The Details tab of the performance dashboard only shows information for labels created in the Label Editor. Labels submitted via the SDK are not shown on the Details tab, including labels that were submitted using the SDK and then edited in the Label Editor.

You can specify a range of dates, as well as whether statistics are displayed for labels or instances. More information on instances and labels can be found here.

Submissions chart

The submissions chart displays the number of submitted labels or instances over the specified time period. The chart can be filtered to show submissions for specific annotators or classes.

If you filter on both Annotators and Classes, the resulting chart shows submission statistics for the selected annotators on the selected classes only.

Reviews chart

The reviews chart displays the cumulative number of accepted and rejected labels or instances over the specified time period.

Annotators table

The Annotators table displays all relevant statistics for all annotators in a Project. It can be filtered on classes to show annotator statistics for the selected classes only.

  • User: The annotator’s email.
  • Rejection rate: Percentage of their labels or instances that have been rejected in the review process.
  • Submitted labels / instances: Number of labels or instances that the annotator has submitted for review.
    • Repeated submissions are not counted.
  • Accepted labels / instances: Number of labels or instances that the annotator created that passed the review process.
  • Rejected labels / instances: Number of labels or instances that the annotator created that were rejected during the review process. This can be higher than the number of submitted labels / instances, since a label or instance can be rejected multiple times during the review process while the submission is only logged once.
  • Total session time: Time spent labeling.

Reviewers table

  • User: The reviewer’s email.
  • Rejection rate: Percentage of labels or instances that they rejected in the review process.
  • Accepted labels / instances: Number of labels or instances that the reviewer accepted.
  • Rejected labels / instances: Number of labels or instances that the reviewer rejected.
  • Total session time: Time spent reviewing.

Objects and classifications table

Each row in the objects and classifications table can be expanded to show statistics on attributes.

  • Class: The class name.
  • Rejection rate: Percentage of labels or instances rejected in the review process.
  • Reviewed labels / instances: Number of labels or instances of the class that have gone through the review process.
  • Accepted labels / instances: Number of labels or instances of the class that have passed the review process.
  • Rejected labels / instances: Number of labels or instances of the class that failed the review process.
  • Avg. time to annotate: Average time spent annotating this class.

Analytics

The Analytics tab of your Project shows event-based analytics for your Project’s tasks, labels, and users. The Analytics tab has the following views:

  • Tasks: View analytics on specific tasks in your Project.
  • Labels: View analytics of labels in your Project.
  • Collaborators: View collaborator performance in your Project.

The analytics available on the Analytics dashboard vary based on user roles:

  • Admins and Team Managers have access to the Tasks, Labels, and Collaborators views, offering a comprehensive overview of team performance.
  • Annotators, Reviewers, and Annotator + Reviewer roles can only view the Task Actions, Time Spent, and Label Actions tables, limited to their individual contributions.

Actions performed in the Label Editor or using the SDK are only tracked on the Analytics dashboard once the tasks are actioned (submitted / rejected / approved).

Project Analytics have the following filters:

| Filter | Description |
| --- | --- |
| Collaborators | Filter by specific Project collaborators. |
| Datasets | Filter tasks based on the Dataset they belong to. |
| File Name | Filter by file name. Regex is supported, allowing filtering by prefix or infix in the file title (illustrated below). |
| Event source | Filter by the source of the event: either SDK or UI. |
| Workflow Stage | Filter tasks based on their current stage in the Workflow. |
| Class | Filter by Ontology class. |
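
The exact regex dialect the File Name filter accepts is not documented here, so treat the patterns below as illustrative only. This minimal Python sketch, with made-up file names, shows the difference between a prefix and an infix pattern:

```python
import re

file_names = ["cam1_frame_0001.jpg", "cam2_clip.mp4", "drone_cam1_pass2.mp4"]

prefix = re.compile(r"^cam1_")  # prefix: titles starting with "cam1_"
infix = re.compile(r"cam1")     # infix: "cam1" anywhere in the title

print([n for n in file_names if prefix.search(n)])
# ['cam1_frame_0001.jpg']
print([n for n in file_names if infix.search(n)])
# ['cam1_frame_0001.jpg', 'drone_cam1_pass2.mp4']
```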

Tasks View

The Tasks tab of the Analytics dashboard provides a detailed overview of task throughput, stage efficiency, and task timers. It can help answer questions such as:

  1. How productive was each collaborator in terms of labeling and reviewing tasks over the last week?

    • By filtering the data by Collaborators and Date time range, you can see how many tasks each team member worked on and how much time they spent on labeling and reviewing.
  2. Which Dataset has the most labels added, edited, or deleted in a given workflow stage?

    • You can filter by Datasets and Workflow stage to see which Dataset is being worked on the most and how many labels are being modified at each stage of the process.

The following charts are visible in the Tasks view:

| Chart | Description |
| --- | --- |
| Task actions | Displays the number of annotation tasks submitted or skipped, and review tasks approved or rejected, over a selected date range. |
| Time spent | Shows the total time spent actively working in the Label Editor, providing insights into productivity. |

Labels View

Label analytics are not supported for Text and Audio modalities.

The Labels tab in the Analytics dashboard provides a detailed overview of your team’s labeling, reviewing, and task productivity, including time spent. It can help answer questions such as:

  1. How many labels were submitted, approved, or rejected by the team over a given period?

    • Use the Label actions chart and apply the Date time range filter to view the total number of labels submitted, approved, or rejected within the selected time frame.
  2. What actions have been taken for specific objects and classifications in the Ontology?

    • Refer to the Objects and classifications actions chart and expand classifications within the Ontology column to see detailed statistics for each classification answer.
  3. How does the team’s productivity in labeling compare across different objects or classifications? Do certain objects take more time to label than others?

    • Analyze the Created, Approved, and Rejected columns in the Objects and classifications actions table to identify objects or classifications that might require additional review or clarification using their Rejection rate.
    • Compare the average time spent per object or classification by utilizing time-tracking metrics alongside these productivity statistics.

All charts allow you to view instance labels or frame labels by selecting the appropriate option from the dropdown next to the filter button.

| Chart | Description |
| --- | --- |
| Label actions | Displays the number of labels submitted, approved, or rejected. |
| Objects and classifications actions | Shows the actions taken for all objects and classifications in the Ontology. |

The Objects and classifications actions table includes the following columns:

  • Ontology: Represents the Ontology class, encompassing both objects and classifications. For classifications, you can expand to view statistics for each classification answer.
  • Created: Displays the total number of instances created for this Ontology class. Each instance is counted only once, ensuring that resubmissions of the same label are not double-counted.
  • Approved: Displays the total number of instances of this Ontology class that have been approved. Approvals are counted multiple times if a label is approved in multiple review stages or if the task is reopened and reviewed again. Use stage-specific filters to see approvals per review stage.
  • Rejected: Displays the number of instances of this Ontology class that have been rejected. Rejections are double-counted if a label is rejected in multiple review stages, rejected again within the same stage, or if the task is reopened and rejected again. Use stage-specific filters to see rejections per review stage.
  • Rejection Rate: The rejection rate percentage for the given Ontology class, calculated by dividing the number of rejected labels by the total number of reviewed labels (see the worked example below).
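
As a quick sanity check on that definition, here is a minimal sketch with made-up counts, taking reviewed labels to be the approved plus rejected totals:

```python
approved = 48  # hypothetical approved labels for a class
rejected = 12  # hypothetical rejected labels for the same class

reviewed = approved + rejected              # total reviewed labels
rejection_rate = 100 * rejected / reviewed  # rejected / reviewed, as a percentage
print(f"{rejection_rate:.1f}%")             # 20.0%
```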

Collaborators View

The Collaborators view in the Analytics dashboard provides a detailed view of time spent annotating and reviewing by Project collaborators. It can help answer questions such as:

  1. How much time did each collaborator spend on annotation / review tasks?
    • Use the Time spent chart to see the time distribution for each collaborator across annotation and review tasks; the Annotators and Reviewers tables provide total and average times for each collaborator.
  2. Which collaborator spent the most time annotating or reviewing tasks in the Project?
    • Analyze the Time spent chart to identify the collaborator with the highest time allocation.

| Chart | Description |
| --- | --- |
| Time spent | Displays the distribution of time spent per collaborator per day. |
| Annotators | Table view of all relevant task/label actions and timers for each collaborator in the Annotation stages. |
| Reviewers | Table view of all relevant task/label actions and timers for each collaborator in the Review stages. |

Both tables and all CSV exports are filter-sensitive; they only display information within the selected filter conditions.

  • Labels refer to objects and classifications.
  • Approve actions are double-counted if there are multiple review stages.
  • Review actions are double-counted if multiple review stages are present or tasks are rejected again in the same review stage.

The Annotators table includes the following columns:

  • Submitted tasks: Total tasks submitted by the annotator.
  • Skipped tasks: Total tasks skipped by the annotator.
  • Approved tasks: Tasks submitted by the annotator that were approved in subsequent review stages.
  • Rejected tasks: Tasks submitted by the annotator that were rejected during review.
  • Task rejection rate: Percentage of the annotator’s submitted tasks that were rejected. If multiple review stages are present, use workflow filters to view stage-specific rejections.
  • Created labels: Total new labels submitted by the annotator, including any pre-labels imported by Admins using the SDK.
  • Edited labels: Total existing labels edited by the annotator. This includes pre-labels from an Agent stage, or labels from a previous Annotate stage. Vertex / coordinate changes are not tracked.
  • Deleted labels: Total existing labels deleted by the annotator. This includes pre-labels from an Agent stage, or labels from a previous Annotate stage.
  • Approved labels: Labels submitted by the annotator that were approved during review.
  • Rejected labels: Labels submitted by the annotator that were rejected during review.
  • Total annotation time: Total active time spent annotating in the Label Editor, rounded up to the nearest 15 seconds.
  • Avg time per task: Average time spent on each submitted annotation task, calculated as the total active time spent in the Annotate stage divided by the number of submitted tasks (sketched below).
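
To make the two timer columns concrete, here is a minimal Python sketch with made-up numbers; the same logic applies to the timers in the Reviewers table below:

```python
import math

def round_up_to_15s(seconds: float) -> int:
    """Round active time up to the nearest 15 seconds, as the timer columns report."""
    return math.ceil(seconds / 15) * 15

def avg_time_per_task(total_stage_seconds: float, actioned_tasks: int) -> float:
    """Average time per task: total active stage time divided by tasks actioned."""
    return total_stage_seconds / actioned_tasks if actioned_tasks else 0.0

print(round_up_to_15s(92))          # 105
print(avg_time_per_task(3600, 48))  # 75.0
```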

The Reviewers table includes the following columns:

  • Approved tasks: Number of tasks approved by the reviewer.
  • Rejected tasks: Number of tasks rejected by the reviewer.
  • Task rejection rate: Percentage of reviewed tasks that were rejected by the reviewer.
  • Created labels: Total labels created by the reviewer using Edit Review.
  • Edited labels: Total labels edited by the reviewer using Edit Review.
  • Deleted labels: Total labels deleted by the reviewer using Edit Review.
  • Approved labels: Number of labels approved by the reviewer.
  • Rejected labels: Number of labels rejected by the reviewer.
  • Total review time: Total active time spent reviewing in the Label Editor, rounded up to the nearest 15 seconds.
  • Avg time per task: Average time spent on each actioned review task. Calculated using the total active time spent in the Review stage divided by the number of actioned reviews.

Join Projects in your Org

Organization Admins can search for and join any Projects that exist within the Organization.

  1. Navigate to Projects under the Annotate heading in the Encord platform.
  2. Select the All Encord projects tab.
  3. Find the Project you want to join.
  4. Click Join project to join the Project.

When an Organization Admin joins a Project, they are automatically assigned the Admin user role for that Project.

Projects can be filtered by Project owner or Project tags. To see all Projects you belong to, click the Filter by project owner search bar and select My projects only.
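
If you prefer to look Projects up programmatically, the encord Python SDK can list the Projects your user has access to. Below is a minimal sketch, assuming an SSH key registered with Encord at a placeholder path; check the SDK reference for the exact methods available in your version:

```python
from pathlib import Path

from encord import EncordUserClient  # pip install encord

# Authenticate with an SSH private key registered with Encord
# (the path below is a placeholder).
private_key = Path("~/.ssh/encord-private-key").expanduser().read_text()
user_client = EncordUserClient.create_with_ssh_private_key(private_key)

# Print metadata for every Project this user can access.
for project in user_client.get_projects():
    print(project)
```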