Project Analytics
The Analytics tab of your Project shows event-based analytics for your Project’s tasks, labels, and users. The Analytics tab has the following views:
- Tasks: View analytics on specific tasks in your Project.
- Labels: View analytics of labels in your Project.
- Collaborators: View collaborator performance in your Project.
The analytics available on the Analytics dashboard vary based on user roles:
- Admins and Team Managers have access to the Tasks, Labels, and Collaborators views, offering a comprehensive overview of team performance.
- Annotators, Reviewers, and Annotator + Reviewer roles can only view the Task Actions, Time Spent, and Label Actions tables, limited to their individual contributions.
Actions performed in the Label Editor or using the SDK are only tracked on the Analytics dashboard if the tasks are actioned (submitted, rejected, or approved).
Project Analytics have the following filters:
Filter | Description |
---|---|
Collaborators | Filter by specific Project collaborators. |
Datasets | Filter tasks based on the dataset they belong to. |
File name includes | Filter by file name. Regex is supported, allowing filtering by prefix or infix in the file title (see the example after this table). |
Event source | Filter by the source of the event. Either SDK or UI. |
Workflow stage | Filter tasks based on their current stage in the Workflow. |
Class | Filter by Ontology class. |
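As an illustration of prefix and infix matching for the File name includes filter, here is a minimal Python sketch. The file names are invented, and standard regex semantics are assumed (the filter's exact regex dialect is not specified above):

```python
import re

# Hypothetical file names, for illustration only.
file_names = ["scan_001.png", "scan_002.png", "city_scan_003.png", "report.pdf"]

prefix_pattern = re.compile(r"^scan_")  # matches names that start with "scan_"
infix_pattern = re.compile(r"scan_")    # matches names containing "scan_" anywhere

print([f for f in file_names if prefix_pattern.search(f)])
# ['scan_001.png', 'scan_002.png']
print([f for f in file_names if infix_pattern.search(f)])
# ['scan_001.png', 'scan_002.png', 'city_scan_003.png']
```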
Tasks View
The Tasks tab of the Analytics dashboard provides a detailed overview of task throughput, stage efficiency, and task timers. It can help answer questions such as:
- How productive was each collaborator in terms of labeling and reviewing tasks over the last week?
- By filtering the data by Collaborators and Date time range, you can see how many tasks each team member worked on and how much time they spent on labeling and reviewing.
- Which Dataset has the most labels added, edited, or deleted in a given Workflow stage?
- You can filter by Datasets and Workflow stage to see which Dataset is being worked on the most and how many labels are being modified at each stage of the process.
The following charts are visible in the Tasks view:
Chart | Description |
---|---|
Task actions | Displays the number of annotation tasks submitted or skipped, and review tasks approved or rejected, over a selected date range. |
Time spent | Shows the total time spent actively working in the Label Editor, providing insights into productivity. |
Labels View
Label analytics are not supported for Text and Audio modalities.
The Labels tab in the Analytics dashboard provides a detailed overview of your team’s labeling, reviewing, and task productivity, including time spent. It can help answer questions such as:
- How many labels were submitted, approved, or rejected by the team over a given period?
- Use the Label actions chart and apply the Date time range filter to view the total number of labels submitted, approved, or rejected within the selected time frame.
- What actions have been taken for specific objects and classifications in the Ontology?
- Refer to the Objects and classifications actions chart and expand classifications within the Ontology column to see detailed statistics for each classification answer.
- How does the team’s productivity in labeling compare across different objects or classifications? Do certain objects take more time to label than others?
- Analyze the Created, Approved, and Rejected columns in the Objects and classifications actions table, and use the Rejection rate to identify objects or classifications that might require additional review or clarification.
- Compare the average time spent per object or classification by using time-tracking metrics alongside these productivity statistics.
The following charts are visible in the Labels view:
Chart | Description |
---|---|
Label actions | Displays the number of labels submitted, approved, or rejected. |
Objects and classifications actions | Shows the actions taken for all objects and classifications in the Ontology. |
The Objects and classifications actions table includes the following columns:
- Ontology: Represents the Ontology class, encompassing both objects and classifications. For classifications, you can expand to view statistics for each classification answer.
- Created: Displays the total number of instances created for this Ontology class. Each instance is counted only once, ensuring that resubmissions of the same label are not double-counted.
- Approved: Displays the total number of instances of this Ontology class that have been approved. Approvals are counted multiple times if a label is approved in multiple review stages or if the task is reopened and reviewed again. Use stage-specific filters to see approvals per review stage.
- Rejected: Displays the number of instances of this Ontology class that have been rejected. Rejections are double-counted if a label is rejected in multiple review stages, rejected again within the same stage, or if the task is reopened and rejected again. Use stage-specific filters to see rejections per review stage.
- Rejection Rate: Calculates the rejection rate percentage of the given Ontology class by dividing the number of rejected labels by the total number of reviewed labels.
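As a worked illustration of the Rejection Rate column, assuming that "the total number of reviewed labels" means approved plus rejected instances (an assumption based on the columns above):

```latex
\text{Rejection rate} = \frac{\text{Rejected}}{\text{Approved} + \text{Rejected}} \times 100\%
```

For example, 5 rejected and 45 approved instances would give a rejection rate of 5 / 50 = 10%. Because approvals and rejections can be counted more than once across review stages, the rate reflects review events rather than unique labels.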
Collaborators View
The Collaborators tab of the Analytics dashboard provides a detailed view of the time spent annotating and reviewing by Project collaborators. It can help answer questions such as:
- How much time did each collaborator spend on annotation / review tasks?
- Use the Time spent chart to see the time distribution for each collaborator across annotation and review tasks. The Annotators and Reviewers tables provide total and average times for each collaborator.
- Which collaborator spent the most time annotating or reviewing tasks in the Project?
- Analyze the Time spent chart to identify the collaborator with the highest time allocation.
The following charts and tables are visible in the Collaborators view:
Chart | Description |
---|---|
Time spent | Displays the distribution of time spent per collaborator per day. |
Annotators | Table view of all relevant task/label actions and timers for each collaborator in the Annotation stages. |
Reviewers | Table view of all relevant task/label actions and timers for each collaborator in the Review stages. |
- Labels refer to objects and classifications.
- Approve actions are double counted if there are multiple review stages.
- Review actions are double counted if multiple review stages are present or tasks get rejected again in the same review stage.
Both tables and all CSV exports are filter-sensitive; they only display information within the selected filter conditions.
The columns shown in the Annotators and Reviewers tables vary depending on whether the Instance labels or Frame labels view is selected.
You can choose which columns should be displayed and included in the CSV export using the Columns selector.
Analytics FAQ
How do I track annotator performance on pre-labeling/model labels?
Annotator performance on pre-labeled or model-generated labels can be tracked using the Edited Labels and Deleted Labels metrics in the Annotators table. These include:
- Labels that were pre-labeled using an Agent stage.
- Labels that were modified by annotators during the annotation process.
Vertex/coordinate changes are not tracked as edits.
Can I export analytics for payroll or performance tracking?
Yes, CSV exports from the Analytics tab provide data that can be used for payroll or performance tracking.
- The Annotators and Reviewers tables contain total time spent, tasks submitted, approved, and rejected.
- Data in exports is filter-sensitive, so you can apply filters before exporting.
- Review time is rounded to the nearest 15 seconds per task.
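For example, a minimal pandas sketch can aggregate an exported Annotators CSV per collaborator. The file name and column headers below are hypothetical placeholders, not the actual export schema; substitute the headers from your own export:

```python
import pandas as pd

# Load a CSV export from the Analytics tab (file name is a placeholder).
df = pd.read_csv("annotators_export.csv")

# Aggregate per collaborator; column names are assumed placeholders.
summary = (
    df.groupby("Collaborator")
      .agg(
          total_hours=("Time spent (s)", lambda s: s.sum() / 3600),
          tasks_submitted=("Tasks submitted", "sum"),
      )
      .reset_index()
)
print(summary)
```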
What is counted as a created label?
A created label is any new label added by an annotator or reviewer. This includes:
- A completely new label drawn or added during annotation.
- Labels created in Edit Review mode.
- Labels submitted using the SDK (counted under the annotator who submitted the task).
What is counted as an edited label?
An edited label is a modification made to an existing label. This includes:
- Changes to attributes or classifications.
- Labels modified from pre-labeling or a previous annotation stage.
Can I customize what data is included in the CSV exports?
No, CSV exports are pre-defined and contain all available metrics. However, you can filter the data before exporting to limit what is included.
How are timers tracked?
Timers track active time spent in the Label Editor:
- Time is rounded to the nearest 15 seconds per task.
- Pauses or inactivity do not count toward the active time.
- The Avg time per task metric is calculated by dividing the total annotation/review time by the number of submitted tasks.
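To make the timer arithmetic concrete, here is a minimal Python sketch of the rounding and averaging described above. The durations are invented sample values, and the exact rounding rule for ties is an assumption:

```python
# Active time per submitted task, in seconds (sample values).
task_durations = [62, 230, 47, 121]

# Round each task's time to the nearest 15 seconds.
rounded = [round(t / 15) * 15 for t in task_durations]

total_time = sum(rounded)
avg_time_per_task = total_time / len(task_durations)  # total time / submitted tasks

print(rounded)             # [60, 225, 45, 120]
print(avg_time_per_task)   # 112.5
```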
Are label events counted if a task is skipped?
No. If a task is skipped, any actions performed before skipping (for example, creating/editing labels) are not recorded in the Analytics dashboard.
How are actions outside of the task queue counted?
Actions performed outside of the task queue, such as label modifications using the SDK, are only tracked if the task is subsequently submitted, rejected, or approved.
How does the SDK filter work?
The Event Source filter allows you to view actions performed using the SDK or UI.
What happens if someone is inactive or takes a break?
- Inactive time is not recorded. The system only tracks time when the annotator is actively working in the Label Editor.
- If an annotator takes a break without closing the task, their session remains open, but inactive time is not counted.