Compare Model Performance
You have trained your model, and now you are ready to see how it performs. It is time to run a cycle of the Active model optimization workflow.
You may want to compare your model's current performance against its performance before you started using Encord, or after running a number of data curation and label validation cycles. Active supports direct comparison of model prediction performance from within your Active Project.
This process assumes you have already imported your model's predictions into Active at least twice.
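If you import predictions programmatically, the surrounding setup might look like the sketch below. This is a minimal, hedged example: `import_prediction_set` is a hypothetical placeholder for whichever predictions-import route you follow (see the Importing section under Active), and the SSH key path and Project hash are assumptions.

```python
from encord import EncordUserClient

def import_prediction_set(project, name: str, predictions_file: str) -> None:
    """Hypothetical placeholder for the predictions-import route described
    in the Active importing documentation; here it only reports intent."""
    print(f"Would import {predictions_file} into '{project.title}' as prediction set '{name}'")

# Authenticate with the SSH key registered to your Encord account
# (the key path and Project hash below are assumptions).
user_client = EncordUserClient.create_with_ssh_private_key(
    ssh_private_key_path="/path/to/ssh/key"
)
project = user_client.get_project("<project_hash>")

# Model Evaluation needs at least two prediction sets to compare,
# for example one per model checkpoint.
import_prediction_set(project, "baseline-v1", "predictions_v1.json")
import_prediction_set(project, "after-curation-v2", "predictions_v2.json")
```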
- Log in to Encord. The Encord Homepage appears.
- Create a Workflow Project in Annotate.
- Click Active. The Active landing page appears.
- Click an Active Project. The Project opens on the Explorer.
- Click Model Evaluation. The Model Evaluation page appears with Summary displaying.
- Under Overview, select an entry from the Prediction Set dropdown.
- Under Overview, select an entry from the Compare against dropdown.
- Click through the entries on the left side of the Model Evaluation page to view the comparison (the sketch after this list illustrates the kind of metrics being compared).
- Add more data and repeat the data curation, label validation, and model optimization cycles until the model reaches the performance level you require.
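As a rough illustration of the kind of comparison the Model Evaluation page surfaces (not Encord's implementation), the plain-Python sketch below computes per-class precision and recall for two prediction sets from assumed true positive, false positive, and false negative counts:

```python
from dataclasses import dataclass

@dataclass
class ClassCounts:
    tp: int  # true positives
    fp: int  # false positives
    fn: int  # false negatives

def precision(c: ClassCounts) -> float:
    return c.tp / (c.tp + c.fp) if (c.tp + c.fp) else 0.0

def recall(c: ClassCounts) -> float:
    return c.tp / (c.tp + c.fn) if (c.tp + c.fn) else 0.0

# Illustrative numbers only: two prediction sets scored on one class.
baseline = ClassCounts(tp=70, fp=30, fn=40)
retrained = ClassCounts(tp=85, fp=20, fn=25)

for name, counts in [("baseline", baseline), ("retrained", retrained)]:
    print(f"{name}: precision={precision(counts):.2f} recall={recall(counts):.2f}")
```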
This process assumes you are just getting started with Encord and have not trained your model yet. You are using Encord to prepare your data for annotation, label your data, validate your labels, fix any label issues, and then train your model.
- Log in to Encord. The Encord Homepage appears.
- Create a Workflow Project in Annotate.
- Click Active. The Active landing page appears.
- Click an Active Project. The Project opens on the Explorer.
- Click Model Evaluation. The Model Evaluation page appears.
- [Perform data curation on your Project in Active](/platform-documentation/Active/active-tutorials/active-use-cases#data-cleansingcuration).
- Label and review your data in Annotate.
- Sync the Active Project with the updated Annotate Project.
- Perform label validation on your updated and synced Project.
- Send the Project to Annotate.
- Label and review your data in Annotate.
- Retrain your model using the curated and validated data/labels (the SDK sketch after this list shows one way to pull those labels).
- Click the Active Project. The Project opens on the Explorer.
- Click Model Evaluation. The Model Evaluation page appears.
- Under Overview, select an entry from the Prediction Set dropdown.
- Under Overview, select an entry from the Compare against dropdown.
- Click through the entries on the left side of the Model Evaluation page to view the comparison.
- Add more data and repeat the data curation, label validation, and model optimization cycles until the model reaches the performance level you require.
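For the retraining step, one way to pull the curated and validated labels out of the synced Project is through the Encord SDK. A minimal sketch, assuming the key path and Project hash are yours; `list_label_rows_v2`, `initialise_labels`, and `get_object_instances` are the SDK's label-access calls:

```python
from encord import EncordUserClient

# Authenticate and open the Project (key path and hash are assumptions).
user_client = EncordUserClient.create_with_ssh_private_key(
    ssh_private_key_path="/path/to/ssh/key"
)
project = user_client.get_project("<project_hash>")

# Walk every label row, fetch its label content, and count the
# validated object instances available for retraining.
total_objects = 0
for label_row in project.list_label_rows_v2():
    label_row.initialise_labels()  # downloads the actual label content
    objects = label_row.get_object_instances()
    total_objects += len(objects)
    print(f"{label_row.data_title}: {len(objects)} labeled objects")

print(f"Collected {total_objects} labeled objects for retraining")
```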