Workspace and user model
Every Encord deployment is organized around a Workspace — your organization’s dedicated environment within the platform. All users, projects, datasets, and ontologies live within the Workspace.
Workspace roles
| Role | Capabilities |
|---|---|
| Admin | Full access: manage users, view all projects and datasets, create and delete resources |
| Workforce Manager | Add and manage Taskers, create projects and datasets, manage user groups |
| Member | Create and access projects, datasets, and ontologies they are invited to |
| Tasker | Access only the projects and tasks they are assigned to |
User groups
Workspace Admins and Workforce Managers can create user groups to organize annotators and reviewers into named pools. Groups can be assigned to workflow stages in bulk, making it easier to manage large annotator workforces without configuring permissions task by task.
Project and workflow structure
Projects
An Encord Project brings together:
- One or more Datasets (the data to be labeled)
- An Ontology (the labeling schema)
- A Workflow (the path tasks follow from annotation to completion)
- Collaborators (users and their roles)
Workflows
Workflows define the stages a task passes through — from initial annotation, through one or more review stages, to completion. Encord provides a no-code Workflow builder for creating custom pipelines. Common workflow patterns include:
- Annotate → Complete — single-stage labeling for straightforward tasks
- Annotate → Review → Complete — two-stage with human review
- Annotate → Review → Rework → Review → Complete — iterative review with rework loops
- Consensus — multiple annotators label the same task independently; disagreements are surfaced for resolution
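To make the Consensus pattern concrete, the sketch below quantifies disagreement on a single classification task as simple pairwise percent agreement. This is illustrative only: it is not how Encord computes agreement, and production metrics such as Krippendorff's alpha additionally correct for chance agreement.

```python
from itertools import combinations

def pairwise_agreement(labels: dict[str, str]) -> float:
    """Fraction of annotator pairs that assigned the same label to a task.

    `labels` maps annotator id -> chosen class. Simple percent agreement,
    shown only to illustrate the idea behind consensus disagreement surfacing.
    """
    pairs = list(combinations(labels.values(), 2))
    if not pairs:
        return 1.0  # a single annotator trivially agrees with itself
    return sum(a == b for a, b in pairs) / len(pairs)

# Three annotators label the same task; two agree, one dissents:
score = pairwise_agreement({"ann1": "cat", "ann2": "cat", "ann3": "dog"})
```

A low score on a task is exactly the kind of disagreement a consensus workflow would surface for manual resolution.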
Task management at scale
Queue management
The Queue tab in each Project shows all tasks organized by workflow stage. Admins and Team Managers can:
- Assign tasks to specific annotators or reviewers
- Set task priority (0–100) to control the order in which tasks are served from the queue
- Move tasks between workflow stages manually
- Filter by dataset, assignee, data type, task status, or issue status
- Sort by priority or alphabetically
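The priority-then-alphabetical ordering above can be sketched as a simple sort. This is a local illustration of the ordering logic, not Encord's implementation, and it assumes higher priority values are served first:

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str      # data unit name, used for alphabetical tie-breaking
    priority: int  # 0-100; assumed here that higher values are served first

def queue_order(tasks: list[Task]) -> list[Task]:
    """Sort by descending priority, breaking ties alphabetically by name."""
    return sorted(tasks, key=lambda t: (-t.priority, t.name))

tasks = [Task("scan_b.dcm", 50), Task("scan_a.dcm", 50), Task("urgent.dcm", 90)]
ordered = queue_order(tasks)
```

Here the high-priority task jumps the queue, and the two equal-priority scans fall back to alphabetical order.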
Task assignment strategies
| Strategy | When to use |
|---|---|
| Unassigned (pull model) | Annotators self-assign from the queue; best for large, homogeneous workforces |
| Explicit assignment | Admin assigns specific tasks to specific annotators; best for specialized data or tiered review |
| Selective workflow stages | Specific users are assigned to specific workflow stages; best for multi-stage QA pipelines |
Task statuses
Tasks carry a status that reflects their history:
- New — not yet worked on since being added to the project
- Reopened — rejected during review and returned for re-labeling
- Skipped — passed over by one or more annotators
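These statuses fall out of the workflow's stage transitions. As a rough illustration, a rework loop like Annotate → Review → Rework → Review → Complete can be modeled as a tiny state machine; the stage and action names below are assumptions for the sketch, not Encord's internal representation:

```python
# Illustrative transition table for a rework-loop workflow.
# A "reject" at Review corresponds to a task being Reopened.
TRANSITIONS = {
    ("Annotate", "submit"): "Review",
    ("Review", "approve"): "Complete",
    ("Review", "reject"): "Rework",
    ("Rework", "submit"): "Review",
}

def advance(stage: str, action: str) -> str:
    """Return the next stage, or raise if the action is invalid at this stage."""
    try:
        return TRANSITIONS[(stage, action)]
    except KeyError:
        raise ValueError(f"action {action!r} not allowed in stage {stage!r}")

# A task that is rejected once, reworked, and then approved:
stage = "Annotate"
for action in ["submit", "reject", "submit", "approve"]:
    stage = advance(stage, action)
```

The same table-driven shape extends naturally to single-stage or multi-review pipelines by adding or removing entries.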
Quality assurance
Review workflows
Encord’s review stages allow designated reviewers to inspect, approve, or reject annotated tasks. Rejected tasks are automatically returned to the annotator with issue notes.
Consensus labeling
For high-stakes labeling tasks, Consensus workflows assign the same task to multiple annotators independently. Disagreements between annotations are surfaced automatically, allowing team managers to resolve conflicts and measure inter-annotator agreement. See Consensus Workflows for setup details.
Issues
Annotators and reviewers can raise Issues on specific labels or frames — flagging ambiguities, errors, or edge cases for discussion. Issues can be assigned, tracked, and resolved within the platform.
Benchmark workflows
Benchmark projects allow you to measure annotator accuracy against a gold-standard dataset. This is useful for:
- Screening new annotators before onboarding
- Ongoing quality monitoring for active annotators
- Training annotators on difficult edge cases
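The accuracy comparison a benchmark project performs can be sketched as scoring submissions against gold labels. The function and data shapes below are hypothetical, shown only to make the idea concrete:

```python
def benchmark_accuracy(gold: dict[str, str], submitted: dict[str, str]) -> float:
    """Share of gold-standard tasks the annotator labeled correctly.

    `gold` and `submitted` map task id -> class label. Tasks the annotator
    did not attempt count as incorrect. Illustrative only, not Encord's metric.
    """
    if not gold:
        raise ValueError("empty gold-standard set")
    correct = sum(submitted.get(task) == label for task, label in gold.items())
    return correct / len(gold)

gold = {"t1": "car", "t2": "bus", "t3": "car"}
score = benchmark_accuracy(gold, {"t1": "car", "t2": "car", "t3": "car"})
```

A screening threshold (for example, requiring a score above 0.9 before onboarding) could then be applied to a score like this.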
Analytics and reporting
Project-level analytics
The Analytics tab in each Project provides detailed statistics on labeling and review activity:
- Labels created, approved, and rejected
- Time spent per task, per user, and per stage
- Open issues by annotator or reviewer
- Task throughput over time
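Throughput metrics like these are simple aggregations over labeling events. As a sketch, assuming a hypothetical export of (annotator, day) pairs for each label created, per-user and per-day counts fall out of a Counter:

```python
from collections import Counter
from datetime import date

# Hypothetical export: one (annotator, day) pair per label created.
events = [
    ("alice", date(2024, 5, 1)),
    ("alice", date(2024, 5, 1)),
    ("bob",   date(2024, 5, 1)),
    ("alice", date(2024, 5, 2)),
]

labels_per_user = Counter(user for user, _ in events)  # workload per annotator
labels_per_day = Counter(day for _, day in events)     # throughput over time
```

The same grouping extended by workflow stage or review outcome gives the approval/rejection breakdowns listed above.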
Role-based analytics views
What you see in the Analytics tab depends on your role:
- Annotators and Reviewers see their own personal performance metrics
- Team Managers and Admins see aggregate team metrics and can drill down to individual performance
Workspace-level reporting
Workspace Admins can view activity across all Projects in the Workspace, giving a firmwide view of annotation throughput, workforce utilization, and quality trends.
Project status management
Admins and Team Managers can set a Project’s status to reflect its current phase:
- Not started / In progress / Paused / Completed / Cancelled / Archived
Where to go next
- Workspace Settings — managing users, groups, and workspace configuration
- Workflows and Templates — building and reusing annotation pipelines
- Consensus Workflows — multi-annotator QA and agreement measurement
- Security and Compliance — access controls and audit capabilities

