Manage Annotation Projects
- In the Encord platform, select Projects under Annotate.
- Select the Project you want to manage.
Project dashboard
The first thing you see when you click on your Project is the Project dashboard.
This is where you monitor and manage your Project: view your Project’s summary statistics, manage labeling tasks, view your team’s productivity, train models, and invite collaborators.
The dashboard is split into the following tabs:
- Summary: A high-level view of labeling and productivity statistics.
- Explore: A quick way to explore the distribution of instances and labels across data assets in the Project.
- Queue: Displays all tasks for the Project, separated by stage.
- Workflow: Graphically displays the path tasks follow through the Project Workflow.
- Labels & Export: Manage all the Project’s labeling activity and tasks.
- Performance: A more detailed view of your team’s manual labeling and productivity, split into Summary and Details tabs.
- Models: Create, train, and use pre-trained computer vision models with your labeled training data.
- Settings: Edit Project options, Ontology, team collaborators, and other general Project settings.
Tab visibility by role
Legend:
- ✅ Fully visible and all actions allowed
- ❌ Not visible
- View only: Visible but no actions allowed
- Specific actions detailed where applicable
Tab | Annotator | Reviewer | Annotator + Reviewer | Team Manager | Admin |
---|---|---|---|---|---|
Summary | ❌ | ❌ | ❌ | ✅ | ✅ |
Explore | ❌ | ❌ | ❌ | ✅ | ✅ |
Queue | Annotate tab only | Review tab only | Annotate + Review tabs only | ✅ | ✅ |
Workflow | ❌ | ❌ | ❌ | View only | ✅ |
Labels & Export | ❌ | ❌ | ❌ | View labels only | ✅ |
Performance | Personal performance | Personal performance | Personal performance | Team performance | Team performance |
Models | ❌ | ❌ | ❌ | View only | ✅ |
Settings | ❌ | ❌ | ❌ | Add users (except Admins) | ✅ All settings and user management |
Update Project status
Admins and Team Managers can change the status of a Project to any of the following:
- Not started
- In progress
- Paused
- Completed
- Cancelled
When the Project status is changed to Paused, Completed, or Cancelled, Annotators, Reviewers, and Annotator + Reviewers are prevented from opening tasks, and the Initiate button is greyed out. Admins and Team Managers can annotate and review tasks as usual, regardless of the Project status.
Tasks can still be opened using back-door access, for example if an Admin shares the URL of a specific task with an Annotator, Reviewer, or Annotator + Reviewer.
Summary
Clicking a Workflow Project takes you to its Summary dashboard. This dashboard has two components and gives you a rich visual display of your Project’s progress at a high level.
Project task status overview
Displays the number of tasks that are in each state of your workflow project. The number of states and their names reflect the choices made during workflow project creation.
Explore
The Explore tab helps you understand how project annotations are distributed among data assets, at both an instance and label level. It allows a deeper exploration through attributes on objects, as well as frame-level classifications.
- Instance statistics: Class distribution across data assets in the given project.
- Label statistics: Label distributions within data assets, objects and classifications.
Instance statistics
This section provides the total count of all instances across the datasets in your project.
- Project total: Shows total instances (both objects and classifications) across the project by default. To get instance statistics for individual data files, click the drop-down to select a data file.
- Select class: Shows the total instances for a particular class. This is a summary of how a given class is distributed across your project’s data assets. The pie chart segments show a breakdown of how that class is split across the data assets.
- Display timestamps: Flip the toggle to switch between frame numbers and timestamps for the labels.
Label statistics
This is a summary of how your labels are distributed across the project. The pie chart shows a breakdown of how many labels there are for a given class.
- Project total: Shows the total number of labels across different datasets in the project. To get label stats for individual data files, click the drop-down to select a data file.
- Objects: Click a pie chart segment for a class to see the total number of labels for that class and its attributes (sometimes called nested attributes), if available.
- Classifications: Shows the global classification at project or individual video level. For example, location, time of day, etc.
Quick definitions of classes, instances and labels
- Class: The fundamental unit of the Project’s Ontology. For example, the Ontology of a Project annotating traffic videos could have classes such as Car, Truck, Bicycle, and so on. For more information on objects and classifications, see Ontologies Overview.
- Instance: A specific occurrence of a class. Car(0) is an instance of the Car class; for example, it could be a specific black sedan. The single Car(0) instance can appear in a single frame or a range of frames, so an instance may contain multiple labels across frames.
- Label: A frame-specific annotation of an instance. For example, the annotation of Car(0) on frame 201 is a label.
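To make the relationship concrete, here is a small, purely illustrative sketch (not the Encord export format) of one class, one instance of that class, and the frame-level labels belonging to that instance:

```python
# Purely illustrative: one Ontology class, one instance of it, and the
# frame-level labels that belong to that instance. This is NOT the Encord
# export schema; it only mirrors the class/instance/label hierarchy.
car_class = "Car"                      # class from the Ontology

car_0 = {                              # instance: a specific black sedan
    "class": car_class,
    "instance": "Car(0)",
    "labels": [                        # one label per annotated frame
        {"frame": 200, "bounding_box": {"x": 0.31, "y": 0.42, "w": 0.10, "h": 0.08}},
        {"frame": 201, "bounding_box": {"x": 0.32, "y": 0.42, "w": 0.10, "h": 0.08}},
    ],
}

print(f"{car_0['instance']} has {len(car_0['labels'])} labels across frames")
```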
Queue
Use the Queue tab to assign and prioritize tasks and to initiate labeling and reviewing for all users attached to a Project. The Queue tab displays differently depending on the user’s permissions.
Queue (Annotator & Reviewer, Admin, Team Manager)
- A - Use the search bar to filter the list of data units being displayed, or to look for a particular data unit.
- B - Select a task to assign it to a user, release the task, or adjust its priority number.
- C - Filter the list of data units being displayed by Dataset, User, Data Type, or Status.
- D - Sort the task queue.
- E - Use the Start labeling (annotation task) and Start reviewing (review task) buttons to begin labeling or reviewing a task.
- F - The list of all Workflow stages shows how many data units each stage currently contains. Each stage can be selected.
- G - Shows the task’s priority number. Tasks are listed in descending order of priority by default.
- H - Shows the task’s Status.
- I - Shows the email address of the user the task is assigned to.
- J - Clicking the Initiate button initiates a task. If an annotation stage is selected, an annotation task is initiated. If a review stage is selected, a review task is initiated.
The Queue tab lists tasks in the same order they appear in the Label Editor.
Filter
Tasks can be filtered by either Dataset, User, Data Type, or Status.
- Dataset - Filtering by Dataset only displays data units belonging to the selected Dataset(s).
- User - Filtering by User only displays tasks assigned to a particular user.
- Data Type - Filtering by Data Type only displays data units of a specific type.
- Status - Filtering by Status only displays tasks with a specific status.
Sort by
You can sort the task queue by clicking Sort by next to the filter button. Select whether you want to sort the task queue by:
- Task priority, or alphabetically by the name of the data unit.
- In ascending or descending order.
The default sorting is in descending order of task priority.
Status
The task Status indicates which actions have previously been taken on a task.
- New - The task has not been worked on since being added to the Project.
- Reopened - The task was rejected during the Review stage, and has been returned for re-labeling.
- Skipped - The task was skipped by one or more annotators.
Assigning and releasing tasks
Tasks can be assigned to specific users by selecting them from the list and clicking the Assign button, as shown below. Once a task is assigned, only the assigned user is able to open the task. Alternatively, click the small arrow button in the Assigned to column to assign an individual data unit.
Releasing a task is the opposite of assigning a task, and removes any user the task was assigned to. To release any number of tasks, select them from the list and click the Release button located next to the Assign button shown above.
Task priority
All annotation and review tasks can be assigned a priority level to manage workflow efficiency. Each task is assigned a priority value ranging from 0 to 100, with a default value of 50. A value of 100 indicates a high-priority task requiring immediate attention, while a value of 0 signifies a low-priority task. Annotation and review tasks are presented in the Label Editor in descending order of priority.
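For illustration only, the following minimal sketch mirrors how the queue orders tasks: descending by priority, with 50 as the default value.

```python
# Illustrative only: how tasks ordered by priority appear in the queue.
tasks = [
    {"name": "video_a.mp4", "priority": 50},   # default priority
    {"name": "video_b.mp4", "priority": 100},  # urgent, surfaced first
    {"name": "video_c.mp4", "priority": 0},    # low priority, surfaced last
]

# Higher priority first, matching the Label Editor's descending order.
for task in sorted(tasks, key=lambda t: t["priority"], reverse=True):
    print(task["name"], task["priority"])
```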
To update task priority:
- Click the number representing the task’s current priority. This opens the priority setting interface for that task. Alternatively, select the task and click the Adjust priority button.
- Adjust the task’s priority by using the slider for quick selection, or by entering a specific number between 0 and 100 in the input field. This allows precise control over the task’s priority level.
- Once the desired priority is set, click the Update button. This saves the new priority setting for the task and updates its position in the task queue.
Queue (Annotators)
Annotators are presented with the following Queue tab, from which they can manage their annotations.
- A - The list of annotation stages shows how many data units each stage currently contains. If more than one stage is listed, clicking a stage lets you view the tasks it contains.
- B - Shows the task’s priority number. Tasks are listed in descending order of priority by default.
- C - The list of tasks / data units in your queue. Unassigned tasks are also visible and they can be initiated by all Annotators.
- D - The task status.
- E - The user a task is assigned to. A blank field indicates an unassigned task.
- F - Click the Initiate button next to a task to start annotating.
- G - The Start labeling button opens the Label Editor, starting with the highest priority task.
The Queue tab lists tasks in the same order they appear in the Label Editor.
Queue (Reviewers)
- A - The list of review stages shows how many data units each stage currently contains. If more than one stage is listed, clicking a stage lets you view the tasks it contains.
- B - Shows the task’s priority number. Tasks are listed in descending order of priority by default.
- C - The list of tasks / data units in your queue. Unassigned tasks are also visible and they can be initiated by all Reviewers.
- D - The task status.
- E - The user a task is assigned to. A blank field indicates an unassigned task.
- F - Click the Initiate button next to a task to start reviewing it.
- G - The Start reviewing button opens the Label Editor, starting with the highest priority task.
The Queue tab lists tasks in the same order they appear in the Label Editor.
Labels
The Labels tab is your gateway to auditing and exporting labels created in your project.
Access to each pane depends on the user’s Project role. The purpose of each tab, and the roles that can access it, are summarized below.
Role | Activity | Queue | Data | Instances |
---|---|---|---|---|
Annotator | ❌ | ✅ | ❌ | ❌ |
Reviewer | ❌ | ✅ | ❌ | ❌ |
Annotator + Reviewer | ❌ | ✅ | ❌ | ❌ |
Team Manager | ✅ | ✅ | ✅ | ✅ |
Admin | ✅ | ✅ | ✅ | ✅ |
The labels dashboard features the following tabs:
- Data: A complete overview of all tasks in the project, with the option to export your labels on a per-task basis.
- Label Instances: Use a unique instance identifier to search the Project for a specific instance, and jump directly into the editor to visually confirm the status of an annotation.
Data
The Data tab provides a complete overview of all tasks in the project, and lets you see which workflow stage each task is in.
Export labels
Select the data units you want to export labels for, and click the Export and save button highlighted in the screenshot below. See our documentation on exporting labels for more information.
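Labels can also be pulled programmatically. Below is a minimal sketch using the Encord Python SDK, assuming a recent SDK version that exposes `list_label_rows_v2`, `initialise_labels`, and `to_encord_dict`; the SSH key path and Project hash are placeholders. Verify the method names against the SDK reference for your version.

```python
# Hedged sketch: export labels with the Encord Python SDK.
import json
from pathlib import Path

from encord import EncordUserClient

# Placeholders: substitute your own SSH key path and Project hash.
user_client = EncordUserClient.create_with_ssh_private_key(
    Path("~/.ssh/encord_key").expanduser().read_text()
)
project = user_client.get_project("<project_hash>")

exported = []
for label_row in project.list_label_rows_v2():
    label_row.initialise_labels()          # fetch the label content for this task
    exported.append(label_row.to_encord_dict())

Path("labels_export.json").write_text(json.dumps(exported, indent=2))
print(f"Exported labels for {len(exported)} data units")
```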
Save label version
Select the data units you’d like to save a label version for, and click the Save new version button highlighted in the screenshot below. The new version is listed in the Saved versions tab.
Label versioning allows you to keep track of your label history over time by providing a snapshot of labels at a point in time. Label versions can be exported and analyzed to track annotation performance over time.
Label Instances
The Instances tab allows Administrators and Team Managers to search within the data to directly find specific instances. Recall that an annotation instance corresponds to a unique instantiation of a specific Ontology class in a data asset. For example, if you have the ‘Person’ class in your Ontology, the first instance of a ‘Person’ in a given data asset is indicated in the interface as ‘Person (0)’, the second as ‘Person (1)’, and so on. Instances can therefore exist in multiple frames of a data asset and indicate the same object. Use the Instances tab to search for specific instances of objects or classifications using their identifier.
Instance identifiers are unique at the project scope, and can be found in any of the following ways:
- From inside the label editor, by clicking on a particular instance, and then selecting ‘Copy identifier’ from the instance action menu.
- From inside exported labels, where they are known as the `objectHash` or `classificationHash`, as appropriate.
- When uploading labels using the SDK, you may specify your own `objectHash` or `classificationHash`.
Once you have an identifier of interest, use the ‘Search instance’ interface to filter the instances by identifier to quickly find the instance you’re interested in. This can be particularly handy when you want to visually confirm an annotation you may not have seen before, but for which you have the identifier.
After locating your instance of interest, click View in the ‘Actions’ column to jump straight to where the instance is first annotated in the data asset.
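If you are working from an exported label file rather than the UI, the short, schema-agnostic sketch below locates every place a given identifier (an `objectHash` or `classificationHash`) appears; the file name and identifier are placeholders, and no particular export structure is assumed.

```python
# Hedged sketch: find every location of an instance identifier inside an
# exported labels JSON file without assuming the exact export schema.
import json
from pathlib import Path

TARGET_HASH = "<objectHash or classificationHash>"   # placeholder identifier

def find_hash(node, path="$"):
    """Recursively yield JSON paths whose value equals TARGET_HASH."""
    if isinstance(node, dict):
        for key, value in node.items():
            yield from find_hash(value, f"{path}.{key}")
    elif isinstance(node, list):
        for i, value in enumerate(node):
            yield from find_hash(value, f"{path}[{i}]")
    elif node == TARGET_HASH:
        yield path

labels = json.loads(Path("labels_export.json").read_text())  # placeholder file
for location in find_hash(labels):
    print(location)
```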
Saved versions
The Saved versions tab displays information for versions of your labels. The Actions column lets you:
- Export a label version by clicking the Download version icon. The format of exported labels has the same structure as outlined in the export documentation.
- Delete a label version by clicking the Delete version icon.
Performance
The Performance tab displays different information depending on the user role.
- Annotator, Reviewer, and Annotator + Reviewer roles see a single tab showcasing their individual performance. The information is categorized by the actions taken by the user, with graphs displaying the number of labels added, edited, and deleted. An Annotator + Reviewer sees two separate graphs: one for annotation tasks and another for review tasks.
- Team Manager and Admin roles see the Summary and Details tabs.
Performance - Summary
The Summary tab of the performance dashboard provides an overview of your team’s labeling and productivity.
Task actions over time
View the number of tasks in a Project that have been approved, rejected, and submitted for review over a given period of time.
- The height of a bar represents the total number of tasks.
- The height of each color within a bar represents the number of approved, rejected, and submitted tasks.
- A: Set the time period you would like to see displayed by selecting a range of dates.
- B: The Hide days without any actions toggle removes from the view all days on which no actions were taken.
- C: Download a CSV file of the data.
- D: Display the data as a bar chart, or a table. While the chart provides a clear visual representation, the table provides exact figures for a more detailed picture of your team’s performance.
Instance Label actions over time
View the number of instance label actions in a project that have been approved, rejected, and submitted for review over a given period of time.
- A: Set the time period you would like to see displayed by selecting a range of dates.
- B: Download a CSV file of the data.
- C: Display the data as a bar chart, or a table. While the chart provides a clear visual representation, the table provides exact figures for a more detailed picture of your team’s performance.
Within your specified time period, you can choose which dates to display by using the slider located beneath the graph.
Team collaborators
The Team collaborators section shows how long each Project collaborator spends working on a given file.
A. Table entries can be filtered according to dates by clicking the range of dates, and selecting the start and end date of the period you would like to see table entries displayed for.
B. Table entries can be downloaded in CSV format by clicking the Download CSV button.
C. When many entries are present, they are split across multiple pages. The number of table entries per page can be adjusted.
Performance - Details
The Details tab of the performance dashboard gives a more detailed view of your team’s manual labeling and productivity.
Submissions chart
The submissions chart displays the number of submitted labels or instances over the specified time period. The chart can be filtered to show submissions for specific annotators or classes.
If you filter on both Annotators and Classes, the resulting chart shows the submission statistics for the selected annotators and the selected classes.
Reviews chart
The reviews chart displays the cumulative number of accepted and rejected labels or instances over the specified time period.
Annotator’s table
The annotator’s table displays all the relevant statistics for all annotators in a Project. It can be filtered on classes to show annotator statistics only for the selected classes.
- User: The annotator’s email.
- Rejection rate: Percentage of their labels or instances that have been rejected in the review process.
- Submitted labels / instances: Number of labels or instances that the annotator has submitted for review. Repeated submissions are not counted.
- Accepted labels / instances: Number of labels or instances that the annotator created that passed the review process.
- Rejected labels / instances: Number of labels or instances that the annotator created that were rejected during the review process. Note that this can be higher than the number of submitted labels / instances, since a label or instance can be rejected multiple times during the review process, but the submission is only logged once.
- Total session time: Time spent labeling.
Reviewer’s table
- User: The reviewer’s email.
- Rejection rate: Percentage of labels or instances that they rejected in the review process.
- Accepted labels / instances: Number of labels or instances that the reviewer accepted.
- Rejected labels / instances: Number of labels or instances that the reviewer rejected.
- Total session time: Time spent reviewing.
Objects and classifications table
Each row in the objects and classifications table can be expanded to show statistics for its attributes.
- Class: The class name.
- Rejection rate: Percentage of labels or instances rejected in the review process.
- Reviewed labels / instances: Number of labels or instances of the class that have gone through the review process.
- Accepted labels / instances: Number of labels or instances of the class that have passed the review process.
- Rejected labels / instances: Number of labels or instances of the class that failed the review process.
- Avg. time to annotate: Average time spent annotating this class.
Performance - Task Actions
The Task Actions tab of the Performance dashboard gives a detailed view of your team’s labeling, reviewing, and productivity for tasks, as well as the time spent on those tasks.
You can filter the graphs by the following:
- Date time range
- Workflow stage
- Collaborator
- Dataset
- Data unit title
Performance - Collaborators
The Collaborators tab in the Performance dashboard provides a detailed, at-a-glance view of the time spent by Project collaborators across different Workflow stages.
Models
The Models tab is where you attach and manage models for automated labeling.
See our documentation on automated labeling here.
Export
Use the Export tab to export your data. Learn more by visiting our exporting data page.
Workflow
The Workflow tab lets you view and edit the Project’s Workflow. To begin editing the Workflow, click the Edit button on the canvas.
Settings
The Settings tab allows you to make modifications to your Project using the following tabs:
- Options - Copy a Project, modify Datasets, modify Ontology, upload annotation instructions, modify Project tags, and set a default editor layout.
- Danger zone - Delete your Project.
Copy a Project
To copy a Project:
- Click the Copy project button in the Options section of the Project’s Settings.
- Select the parts of the Project to be copied into the new Project. The Ontology has to be copied and is therefore always selected. All components of the current Project are selected by default.
You can copy any combination of the following assets:
- Datasets: all datasets are copied. New annotation tasks will be created for all videos and image sequences if their labels are not copied over.
- Labels: copy the labels in specified data units. All labels are copied by default. Change the dropdown to Selected Labels, as seen in the image below, to only include specific labels in your new Project. Click the +Advanced settings button to select the state of the data units that labels should be copied for.
- Models: this will copy all the models in your Project along with their training logs.
- Collaborators: copy all Project users with their respective roles. Project admins are copied regardless of whether this is selected.
- Click the Make a copy button to copy the Project with the specified components.
Upload annotation instructions
-
Navigate to the Settings tab of your Project.
-
Click the Add instructions button to upload instructions for your annotators in PDF format.
To ensure the best possible results, provide as much detail as possible about what you would like annotated and how precise bounding boxes should be drawn. For example, instead of saying ‘person’, consider defining what should constitute a person for your annotators - only a full person? A torso? Or should any part of a person in a frame be labeled as a ‘person’? The more specific your annotator instructions, the higher the chances that your annotators perform well.
Project tags
You can add tags to a Project if you are part of an Organization.
Project tags allow you to:
- Flexibly categorize and group your Projects.
- Filter your Projects.
You can add tags to your Projects:
- When creating a Project.
- In the Settings page of a Project.
To add or remove Project tags:
- Navigate to the Options tab and click the Project tags drop-down. All available tags in your Organization are shown.
- Click on a tag to add it to a Project.
Remove a tag from your Project by clicking the same tag again, or clicking the x button next to its name.
Filtering Projects by tags
You can filter your Projects based on the tags they contain.
- Click Projects in the left side navigation.
- Click the ‘Filter by tags’ drop-down and select one or more Project tags. Only Projects with the selected tags are now displayed.
Edit Project Ontology
You can view or switch the Ontology attached to your Project.
- Click the Switch ontology button to switch the Ontology linked to your Project. The resulting pop-up allows you to choose an existing Ontology from a list, or create a new Ontology for this Project.
- Click the View ontology button to view the details of the Ontology attached to the current Project.
Edit Datasets attached to a Project
The Datasets section allows you to attach or detach any number of Datasets to your Project by clicking Manage. You must create a new Dataset in the Datasets section for it to become available in a Project’s settings.
Manage Project collaborators
To manage Project collaborators, select the Team pane in your project Settings. Collaborators can be added individually or as groups of users.
Here you can invite collaborators to the Project, and configure their roles.
To add collaborators to a Project:
- Click the + Invite collaborators button. A dialog appears.
- Select a user role for the collaborator you want to add by selecting an option from the list.
- Type the email address of the user you want to add and select the user from the list.
- Click the Add button to add the user with the specified role.
To add collaborators as a group:
- In the Groups section, click on Manage. The Manage groups dialog appears.
- Click the Select group drop-down and pick a group you want to add to the Project.
- Click the Select Role drop-down to assign a role to the group of collaborators.
- Click Add to add the group to the Project.
To change collaborator roles:
Only Project admins can modify collaborator roles. Admin roles cannot be changed, not even by other admins.
You can assign the following roles to collaborators:
- Annotator: annotators are responsible for labeling. This is the default role for all collaborators.
- Reviewer: for reviewing labeled tasks.
- Annotator & reviewer: a combination of annotator and reviewer.
- Team manager: a team manager can assign tasks to other users, and add collaborators to the project.
- Admin: gives this collaborator full administrative control over this Project. This is an irreversible action.
Custom editor layout
Contact support to gain access to custom editor layouts.
You can customize the default Label Editor layout by uploading a JSON file that defines your preferred arrangement. These JSON files can be added in the Project settings and are applied to all tasks in a Project.
Custom editor layouts depend on a file’s `client_metadata` or its DICOM tags. This means that custom editor layouts only work for files that contain `client_metadata` or DICOM tags. Watch the video tutorials below to learn how it fits together.
To upload an editor layout:
- Navigate to the Project settings.
- Click on Layouts.
- Click Upload JSON.
- Select the JSON file containing the layout you want the Label Editor to have.
JSON file requirements:
The JSON file must follow the JSON schema defined here. The following fields are required:
Complete example
Add client metadata to files
Editor layouts are based on DICOM tags for DICOM files, or on `client_metadata` for all other use cases. We provide templates for common mammography layouts based on DICOM tags here.
This example uses `client_metadata`. The SDK can be used to add client metadata to specific data units in a Dataset, as sketched below.
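A minimal sketch of such a script follows, assuming the SDK’s `DataRow` exposes a `client_metadata` setter and `save()` as in recent SDK versions (verify against the SDK reference for your version). The SSH key path, Dataset hash, file titles, and metadata values are placeholders.

```python
# Hedged sketch: attach client metadata used by a custom editor layout.
# Assumes DataRow exposes a client_metadata setter and save(); verify
# against the Encord SDK reference for your version.
from pathlib import Path

from encord import EncordUserClient

# Placeholders: substitute your own SSH key path and Dataset hash.
user_client = EncordUserClient.create_with_ssh_private_key(
    Path("~/.ssh/encord_key").expanduser().read_text()
)
dataset = user_client.get_dataset("<dataset_hash>")

# Hypothetical file titles and metadata values matching the layout
# description in this guide.
layout_metadata = {
    "scan_front.mp4": {"encord-EditorGridPosition": "A", "encord-LayoutGroup": "patient-001"},
    "scan_side.mp4":  {"encord-EditorGridPosition": "B", "encord-LayoutGroup": "patient-001"},
}

for data_row in dataset.data_rows:
    if data_row.title in layout_metadata:
        data_row.client_metadata = layout_metadata[data_row.title]
        data_row.save()
        print(f"Updated client metadata for {data_row.title}")
```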
Create a JSON file
Create a JSON for specifying the editor layout that suits your needs.
In the JSON file below:
- The `grid` arrangement is configured to display two files side by side in the Label Editor.
- The `gridContent` section specifies that tasks in either position (0 or 1) can have client metadata values for `encord-EditorGridPosition` set to either `A` or `B`.
- The `topLevelGridFilter` is defined as `encord-LayoutGroup`, meaning tasks with matching `encord-LayoutGroup` metadata values are displayed together in the Label Editor, ensuring they appear side by side when they share the same `encord-LayoutGroup` values.
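As a concrete illustration, a layout file along these lines could be generated with a short script. The structure and field values below are assumptions based only on the description above, not the official schema, so verify them against the JSON schema linked in the requirements section before use.

```python
# Hedged sketch: generate a two-pane editor layout file. The structure below
# only mirrors the fields described in this guide (grid, gridContent,
# topLevelGridFilter); verify it against the layout JSON schema before use.
import json
from pathlib import Path

layout = {
    # Assumed shape: two panes side by side, one position per pane.
    "grid": [[0], [1]],
    # Which file goes in which pane, keyed off client metadata values.
    "gridContent": {
        "0": {"encord-EditorGridPosition": "A"},
        "1": {"encord-EditorGridPosition": "B"},
    },
    # Tasks sharing the same encord-LayoutGroup value are shown together.
    "topLevelGridFilter": "encord-LayoutGroup",
}

Path("editor_layout.json").write_text(json.dumps(layout, indent=2))
print(json.dumps(layout, indent=2))
```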
Upload the JSON file
Upload the JSON file to your Project in Encord.
- Navigate to the Project settings.
- Click on Layouts.
- Click Upload JSON.
- Select the JSON file containing the layout you want the Label Editor to have.
Open a task
Open any task in the task queue of your Project.
The JSON file in this example results in the following Label Editor layout.
Delete Project
You can delete your Project by going to the Danger zone tab of the Project settings.
To delete a Project, click the red Delete project button.
Join Projects in your Org
Organization Admins can search for and join any Projects that exist within the Organization.
- Navigate to Projects under the Annotate heading in the Encord platform.
- Select the All Encord projects tab.
- Find the Project you want to join.
- Click Join project to join the Project.
When an Organization Admin joins a Project, they are automatically assigned the Admin user role for that Project.