Project Analytics
The Analytics tab of your Project shows event-based analytics for your Project’s tasks, labels, and users. The Analytics tab has the following views:
- Tasks: View analytics on specific tasks in your Project.
- Labels: View analytics of labels in your Project.
- Collaborators: View collaborator performance in your Project.
The analytics available on the Analytics dashboard vary based on user roles:
- Admins and Team Managers have access to the Tasks, Labels, and Collaborators views, offering a comprehensive overview of team performance.
- Annotators, Reviewers, and Annotator + Reviewer roles can only view the Task Actions, Time Spent, and Label Actions tables, limited to their individual contributions.
Actions performed in the Label Editor or using the SDK are only tracked on the Analytics dashboard if the tasks are actioned (submitted, rejected, or approved).
Project Analytics have the following filters:
Filter | Description |
---|---|
Collaborators | Filter by specific Project collaborators. |
Datasets | Filter tasks based on the dataset they belong to. |
File Name | Filter by file name. Regex is supported, allowing filtering by prefix or infix in the file title. |
Event source | Filter by the source of the event: either SDK or UI. |
Workflow Stage | Filter tasks based on their current stage in the Workflow. |
Class | Filter by Ontology class. |
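As an illustration of how a regex-based File Name filter behaves, the sketch below shows prefix and infix matching against a list of file names. This is a minimal example in Python's `re` module; the file names and patterns are invented for the illustration and are not part of the product.

```python
import re

# Hypothetical file names in a Project (invented for this example)
file_names = ["cam_front_001.jpg", "cam_rear_002.jpg", "lidar_scan_003.pcd"]

# Prefix filter: file titles starting with "cam_front"
prefix = re.compile(r"^cam_front")
# Infix filter: file titles containing "rear" anywhere
infix = re.compile(r"rear")

print([f for f in file_names if prefix.search(f)])  # ['cam_front_001.jpg']
print([f for f in file_names if infix.search(f)])   # ['cam_rear_002.jpg']
```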
Tasks View
The Tasks tab of the Analytics dashboard provides a detailed overview of task throughput, stage efficiency, and task timers. It can help answer questions such as:
- How productive was each collaborator in terms of labeling and reviewing tasks over the last week?
  - By filtering the data by Collaborators and Date time range, you can see how many tasks each team member worked on and how much time they spent on labeling and reviewing.
- Which Dataset has the most labels added, edited, or deleted in a given workflow stage?
  - You can filter by Datasets and Workflow stage to see which Dataset is being worked on the most and how many labels are being modified at each stage of the process.
The following charts are visible in the Tasks view:
Chart | Description |
---|---|
Task actions | Displays the number of annotation tasks submitted or skipped, and review tasks approved or rejected, over a selected date range. |
Time spent | Shows the total time spent actively working in the Label Editor, providing insights into productivity. |
Labels View
The Labels tab in the Analytics dashboard provides a detailed overview of your team’s labeling, reviewing, and task productivity, including time spent. It can help answer questions such as:
- How many labels were submitted, approved, or rejected by the team over a given period?
  - Use the Label actions chart and apply the Date time range filter to view the total number of labels submitted, approved, or rejected within the selected time frame.
- What actions have been taken for specific objects and classifications in the Ontology?
  - Refer to the Objects and classifications actions chart and expand classifications within the Ontology column to see detailed statistics for each classification answer.
- How does the team’s productivity in labeling compare across different objects or classifications? Do certain objects take more time to label than others?
  - Analyze the Created, Approved, and Rejected columns in the Objects and classifications actions table, and use the Rejection rate to identify objects or classifications that might require additional review or clarification.
  - Compare the average time spent per object or classification by using time-tracking metrics alongside these productivity statistics.
The following charts are visible in the Labels view:
Chart | Description |
---|---|
Label actions | Displays the number of labels submitted, approved, or rejected. |
Objects and classifications actions | Shows the actions taken for all objects and classifications in the Ontology. |
The Objects and classifications actions table includes the following columns:
- Ontology: Represents the Ontology class, encompassing both objects and classifications. For classifications, you can expand to view statistics for each classification answer.
- Created: Displays the total number of instances created for this Ontology class. Each instance is counted only once, ensuring that resubmissions of the same label are not double-counted.
- Approved: Displays the total number of instances of this Ontology class that have been approved. Approvals are counted multiple times if a label is approved in multiple review stages or if the task is reopened and reviewed again. Use stage-specific filters to see approvals per review stage.
- Rejected: Displays the number of instances of this Ontology class that have been rejected. Rejections are double-counted if a label is rejected in multiple review stages, rejected again within the same stage, or if the task is reopened and rejected again. Use stage-specific filters to see rejections per review stage.
- Rejection Rate: Calculates the rejection rate percentage of the given Ontology class by dividing the number of rejected labels by the total number of reviewed labels.
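The Rejection Rate calculation above can be sketched as follows. This is a minimal illustration; the function name is ours, and it assumes "total reviewed labels" means approved plus rejected labels, per the column definitions above.

```python
def rejection_rate(rejected: int, approved: int) -> float:
    """Rejection rate = rejected labels / total reviewed labels, as a percentage.

    Assumes reviewed = approved + rejected (illustrative, not the product's code).
    """
    reviewed = rejected + approved
    if reviewed == 0:
        return 0.0
    return 100 * rejected / reviewed

print(rejection_rate(rejected=5, approved=15))  # 25.0
```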
Collaborators View
The Collaborators tab in the Analytics dashboard provides a detailed view of the time spent annotating and reviewing by Project collaborators. It can help answer questions such as:
- How much time did each collaborator spend on annotation / review tasks?
  - Use the Time spent chart to see the time distribution for each collaborator across annotation and review tasks. The Annotators and Reviewers tables provide total and average times for each collaborator.
- Which collaborator spent the most time annotating or reviewing tasks in the Project?
  - Analyze the Time spent chart to identify the collaborator with the highest time allocation.
The following charts are visible in the Collaborators view:
Chart | Description |
---|---|
Time spent | Displays the distribution of time spent per collaborator per day. |
Annotators | Table view of all relevant task/label actions and timers for each collaborator in the Annotation stages. |
Reviewers | Table view of all relevant task/label actions and timers for each collaborator in the Review stages. |
Both tables and all CSV exports are filter-sensitive; they only display information within the selected filter conditions.
- Labels refer to objects and classifications.
- Approve actions are double counted if there are multiple review stages.
- Review actions are double counted if multiple review stages are present or tasks get rejected again in the same review stage.
The Annotators table includes the following columns:
- Submitted tasks: Total tasks submitted by the annotator.
- Skipped tasks: Total tasks skipped by the annotator.
- Approved tasks: Tasks submitted by the annotator that were approved in subsequent review stages.
- Rejected tasks: Tasks submitted by the annotator that were rejected during review.
- Task rejection rate: Percentage of the annotator’s submitted tasks that were rejected. If multiple review stages are present, use workflow filters to view stage-specific rejections.
- Created labels: Total new labels submitted by the annotator. Includes any pre-labels imported using the SDK by admins.
- Edited labels: Total existing labels edited by the annotator. This includes pre-labels from an Agent stage, or labels from a previous Annotate stage. Vertex / coordinate changes are not tracked.
- Deleted labels: Total existing labels deleted by the annotator. This includes pre-labels from an Agent stage, or labels from a previous Annotate stage.
- Approved labels: Labels submitted by the annotator that were approved during review.
- Rejected labels: Labels submitted by the annotator that were rejected during review.
- Total annotation time: Total active time spent annotating in the Label Editor, rounded up to the nearest 15 seconds.
- Avg time per task: Average time spent on each submitted annotation task. Calculated using the total active time spent in the Annotate stage divided by the number of submitted tasks.
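The two timer columns above can be sketched as simple calculations. This is a minimal illustration with times in seconds; the function names are ours, and the rounding helper reflects the "rounded up to the nearest 15 seconds" rule stated above.

```python
import math

def round_up_to_15s(seconds: float) -> int:
    # Timers round up to the nearest 15 seconds
    return math.ceil(seconds / 15) * 15

def avg_time_per_task(total_active_seconds: float, submitted_tasks: int) -> float:
    # Avg time per task = total active time in the stage / number of submitted tasks
    if submitted_tasks == 0:
        return 0.0
    return total_active_seconds / submitted_tasks

print(round_up_to_15s(62))          # 75
print(avg_time_per_task(3600, 24))  # 150.0
```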
The Reviewers table includes the following columns:
- Approved tasks: Number of tasks approved by the reviewer.
- Rejected tasks: Number of tasks rejected by the reviewer.
- Task rejection rate: Percentage of reviewed tasks that were rejected by the reviewer.
- Created labels: Total labels created by the reviewer using Edit Review.
- Edited labels: Total labels edited by the reviewer using Edit Review.
- Deleted labels: Total labels deleted by the reviewer using Edit Review.
- Approved labels: Number of labels approved by the reviewer.
- Rejected labels: Number of labels rejected by the reviewer.
- Total review time: Total active time spent reviewing in the Label Editor, rounded up to the nearest 15 seconds.
- Avg time per task: Average time spent on each actioned review task. Calculated using the total active time spent in the Review stage divided by the number of actioned reviews.