Performance

The performance dashboard gives a detailed view of your team's manual labeling activity and productivity.

There are two global filters that can be applied to the entire dashboard:

  • Date range: sets the date boundaries for the statistics shown on the dashboard
  • Labels or instances: sets whether the dashboard statistics are counted in labels or in individual instances

Submissions chart

The submissions chart displays the number of submitted labels or instances over the specified time period. The chart can be filtered to show submissions for specific annotators or classes.

If you filter on both Annotators and Classes, the resulting chart shows submission statistics for the selected annotators on the selected classes only.
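
In other words, the filters intersect: a submission is counted only when it falls inside the global date range and matches both the annotator and the class filter. Here is a minimal sketch of that counting logic, using a hypothetical record layout rather than the dashboard's actual schema:

```python
from datetime import date

# Hypothetical submission records; the field names are illustrative only.
submissions = [
    {"annotator": "alice@example.com", "class": "Car",        "date": date(2023, 5, 1)},
    {"annotator": "alice@example.com", "class": "Pedestrian", "date": date(2023, 5, 2)},
    {"annotator": "bob@example.com",   "class": "Car",        "date": date(2023, 5, 3)},
]

# Global date-range filter plus the two chart filters.
start, end = date(2023, 5, 1), date(2023, 5, 31)
selected_annotators = {"alice@example.com"}
selected_classes = {"Car"}

# A submission is counted only if it falls in the date range AND
# matches both the Annotators and the Classes filter.
filtered = [
    s for s in submissions
    if start <= s["date"] <= end
    and s["annotator"] in selected_annotators
    and s["class"] in selected_classes
]
print(len(filtered))  # -> 1 (only Alice's "Car" submission matches)
```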

Reviews chart

The reviews chart displays the cumulative number of accepted and rejected labels or instances over the specified time period.

Annotators table

The annotators table displays the relevant statistics for every annotator in a project. It can be filtered by class to show annotator statistics for the selected classes only.

Table columns

  • User: The annotator's email
  • Rejection rate: Percentage of their labels or instances that have been rejected in the review process
  • Submitted labels / instances: Number of labels or instances that the annotator has submitted for review
    • Repeated submissions are not counted
  • Accepted labels / instances: Number of labels or instances that the annotator created that passed the review process
  • Rejected labels / instances: Number of labels or instances that the annotator created that were rejected during the review process
    • Note that this can be higher than the number of submitted labels / instances: a label or instance can be rejected multiple times during review, while its submission is only logged once (see the sketch after this table)
  • Total session time: Time spent labeling
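
The dashboard does not publish its exact formulas, but the relationships between these columns can be illustrated with a short sketch. The rejection-rate formula below (rejections as a share of all review decisions) and the event records are assumptions for illustration only:

```python
# Hypothetical review events for one annotator; a label counts as
# submitted once, even if it is resubmitted after a rejection.
events = [
    {"label": "L1", "event": "submit"},
    {"label": "L1", "event": "reject"},  # first rejection
    {"label": "L1", "event": "submit"},  # resubmission: not counted again
    {"label": "L1", "event": "reject"},  # second rejection of the same label
    {"label": "L1", "event": "submit"},
    {"label": "L1", "event": "reject"},  # third rejection
    {"label": "L1", "event": "submit"},
    {"label": "L1", "event": "accept"},
    {"label": "L2", "event": "submit"},
    {"label": "L2", "event": "accept"},
]

# Submitted: each label counted once, regardless of resubmissions.
submitted = len({e["label"] for e in events if e["event"] == "submit"})
accepted = sum(e["event"] == "accept" for e in events)
rejected = sum(e["event"] == "reject" for e in events)

# Assumed formula: rejections as a share of all review decisions.
rejection_rate = rejected / (accepted + rejected)

print(f"Submitted: {submitted}")                # 2
print(f"Accepted: {accepted}")                  # 2
print(f"Rejected: {rejected}")                  # 3
print(f"Rejection rate: {rejection_rate:.0%}")  # 60%
```

Under these assumptions, L1's three rejections make the rejected count (3) exceed the submitted count (2), which is the situation described in the note above.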

Reviewers table

Table columns

  • User: The reviewer's email
  • Rejection rate: Percentage of labels or instances that they rejected in the review process
  • Accepted labels / instances: Number of labels or instances that the reviewer accepted
  • Rejected labels / instances: Number of labels or instances that the reviewer rejected
  • Total session time: Time spent reviewing

Objects and classifications table

Each row in the objects and classifications table can be expanded to show statistics on the nested attributes of the class.

Table columns

  • Class: The class name
  • Rejection rate: Percentage of labels or instances rejected in the review process
  • Reviewed labels / instances: Number of labels or instances of the class that have gone through the review process
  • Accepted labels / instances: Number of labels or instances of the class that have passed the review process
  • Rejected labels / instances: Number of labels or instances of the class that failed the review process
  • Avg. time to annotate: Average time spent annotating this class