Import Data from AWS
Your data is uploaded to, and securely stored in, the Files section of Index, where it is organized into folders and sub-folders. Importing your data into Encord is a multi-step process:
- Set up your AWS S3 storage to work with Encord.
- Create an AWS integration in Encord.
- Create a metadata schema in Encord.
- Create a JSON or CSV file specifying the data to import.
- Create a folder to store your data in Encord.
- Upload your data to the folder.
Step 1: Set up AWS
Before you can use the Encord platform with your cloud storage, you need to configure your cloud storage to work with Encord. Once the integration between Encord and your cloud storage is complete, you can use your data in Encord.
To integrate with AWS S3, you need to:
- Create a permission policy for your resources that allows appropriate access to Encord.
- Create a role for Encord and attach the policy so that Encord can access those resources.
- Activate Cross-Origin Resource Sharing (CORS), which allows Encord to access those resources from a web browser (see the sketch after this list).
- Test the integration to make sure it works.
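As a concrete illustration of the CORS step, the sketch below uses boto3 to attach a CORS rule to a bucket. The bucket name, the allowed origin, and the exact rule values are assumptions for illustration only; use the origins and methods given in your Encord integration instructions.

```python
# Hedged sketch: enable CORS on an S3 bucket so Encord's web app can read objects.
# The bucket name and allowed origin below are assumptions, not Encord's official values.
import boto3

s3 = boto3.client("s3")

cors_configuration = {
    "CORSRules": [
        {
            "AllowedOrigins": ["https://app.encord.com"],  # assumed Encord web origin
            "AllowedMethods": ["GET"],
            "AllowedHeaders": ["*"],
            "MaxAgeSeconds": 3000,
        }
    ]
}

# put_bucket_cors replaces any existing CORS configuration on the bucket.
s3.put_bucket_cors(
    Bucket="my-encord-data-bucket",  # hypothetical bucket name
    CORSConfiguration=cors_configuration,
)
```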
You have several options for integrating AWS and Encord.
Step 2: Create AWS-Encord Integration
In the Integrations section of the Encord platform, click +New integration to create a new integration.
Select AWS S3 at the top of the chooser.
Step 3: Create Metadata Schema
Before importing your custom metadata to Encord, we recommend that you import a metadata schema. Encord uses metadata schemas to validate custom metadata uploaded to Encord and to instruct Index and Active how to display your metadata.
For example, if two teams want to use a metadata key called description, team A could use video.description, while team B could use audio.description. Another example could be TeamName.MetadataKey. This approach maintains clarity and avoids key collisions across departments.
Benefits of Using a Metadata Schema
Using a metadata schema provides several benefits:
- Validation: Ensures that all custom metadata conforms to predefined data types, reducing errors during data import and processing.
- Consistency: Maintains uniformity in data types across different datasets and projects, which simplifies data management and analysis.
- Filtering and Sorting: Enhances the ability to filter and sort data efficiently in the Encord platform, enabling more accurate and quick data retrieval.
Metadata Schema Table
Use add_scalar to add a scalar key to your metadata schema.
Scalar Key | Description | Display Benefits |
---|---|---|
boolean | Binary data type with values “true” or “false”. | Filtering by binary values |
datetime | ISO 8601 formatted date and time. | Filtering by time and date |
number | Numeric data type supporting float values. | Filtering by numeric values |
uuid | Customer specified unique identifier for a data unit. | Filtering by customer specified unique identifier |
varchar | Textual data type. Formerly string. string can be used as an alias for varchar, but we STRONGLY RECOMMEND that you use varchar. | Filtering by string |
text | Text data with unlimited length (example: transcripts for audio). Formerly long_string. long_string can be used as an alias for text, but we STRONGLY RECOMMEND that you use text. | Storing and filtering large amounts of text |
Use add_enum and add_enum_options to add an enum and enum options to your metadata schema.
Key | Description | Display Benefits |
---|---|---|
enum | Enumerated type with predefined set of values. | Facilitates categorical filtering and data validation |
Use add_embedding to add an embedding to your metadata schema.
Key | Description | Display Benefits |
---|---|---|
embedding | 512-dimension embeddings for Active, 1 to 4096 dimensions for Index. | Filtering by embeddings, similarity search, 2D scatter plot visualization (Coming Soon) |
Incorrectly specifying a data type in the schema can cause errors when filtering your data in Index or Active. If you encounter errors while filtering, verify your schema is correct. If your schema has errors, correct the errors, re-import the schema, and then re-sync your Active Project.
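Putting the methods above together, here is a minimal sketch of defining a schema with the Encord Python SDK. The client setup, the example keys, and the exact parameter names shown (for example data_type, values, and size) are assumptions; check the SDK reference for the authoritative signatures.

```python
# Hedged sketch: build a metadata schema with add_scalar, add_enum,
# add_enum_options, and add_embedding. Keys, paths, and parameter names
# below are illustrative assumptions, not authoritative SDK usage.
from encord import EncordUserClient

user_client = EncordUserClient.create_with_ssh_private_key(
    ssh_private_key_path="/path/to/ssh-private-key"  # hypothetical key path
)

metadata_schema = user_client.metadata_schema()

# Scalar keys, namespaced per team to avoid collisions (see above).
metadata_schema.add_scalar("video.description", data_type="varchar")
metadata_schema.add_scalar("video.frame_count", data_type="number")
metadata_schema.add_scalar("video.captured_at", data_type="datetime")

# Enum key with an initial set of options, extended later with add_enum_options.
metadata_schema.add_enum("video.category", values=["indoor", "outdoor"])
metadata_schema.add_enum_options("video.category", values=["underwater"])

# Embedding key, for example 512 dimensions for use in Active.
metadata_schema.add_embedding("video.clip_embedding", size=512)

# Persist the schema so Index and Active can validate and display metadata.
metadata_schema.save()
```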
Step 4: Create JSON or CSV for import
All types of data (videos, images, image groups, image sequences, and DICOM) from a private cloud are added to a Dataset in the same way, by using a JSON or CSV file. The file includes links to all images, image groups, videos and DICOM files in your cloud storage.
Create JSON file for import
For detailed information about the JSON file format used for import, go here.
The information provided about each of the following data types is designed to get you up and running as quickly as possible without going too deeply into the why or how. Look at the template for each data type, then the examples, and adjust the examples to suit your needs.
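As a starting point, the sketch below writes a minimal import file for videos stored in S3. The top-level "videos" key and the overall layout are assumptions based on the pattern described here; only object URLs and the skip_duplicate_urls flag are named elsewhere on this page, so compare the result against the full JSON format documentation before importing.

```python
# Hedged sketch: generate a minimal JSON import file for videos in S3.
# The "videos" key and overall layout are assumptions; verify against the
# JSON format documentation before importing. Bucket and file names are placeholders.
import json

import_spec = {
    "videos": [
        {"objectUrl": "https://my-bucket.s3.eu-west-2.amazonaws.com/videos/sample-1.mp4"},
        {"objectUrl": "https://my-bucket.s3.eu-west-2.amazonaws.com/videos/sample-2.mp4"},
    ],
    # When true, object URLs that exactly match existing items in the dataset are skipped.
    "skip_duplicate_urls": True,
}

with open("aws_video_import.json", "w") as f:
    json.dump(import_spec, f, indent=2)
```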
If skip_duplicate_urls is set to true, all object URLs that exactly match existing images/videos in the dataset are skipped.
Use a Multi-Region Access Point
When using a Multi-Region Access Point for your AWS S3 buckets, the JSON file has to be slightly different from the examples provided. Instead of an object’s URL, objects are specified using the ARN of the Multi-Region Access Point followed by the object name. The example below shows how video files from a Multi-Region Access Point would be specified.
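The sketch below follows the same layout as the previous snippet, with each objectUrl replaced by the Multi-Region Access Point ARN followed by the object name. The account ID, access point alias, and file names shown are made-up placeholders.

```python
# Hedged sketch: the same video import structure, but with objects addressed via
# a Multi-Region Access Point ARN followed by the object name.
# The account ID and access point alias below are placeholders.
import json

import_spec = {
    "videos": [
        {"objectUrl": "arn:aws:s3::123456789012:accesspoint/mfzwi23gnjvgw.mrap/videos/sample-1.mp4"},
        {"objectUrl": "arn:aws:s3::123456789012:accesspoint/mfzwi23gnjvgw.mrap/videos/sample-2.mp4"},
    ],
    "skip_duplicate_urls": True,
}

with open("aws_mrap_video_import.json", "w") as f:
    json.dump(import_spec, f, indent=2)
```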
Create CSV file for import
In the CSV file format, the column headers specify which type of data is being uploaded. You can add a single file format at a time, or combine multiple data types in a single CSV file.
Details for each data format are given in the sections below.
- Object URLs can’t contain whitespace.
- For backwards compatibility reasons, a single-column CSV is supported. A file with the single ObjectUrl column is interpreted as a request for video upload. If your objects are of a different type (for example, images), this error displays: “Expected a video, got a file of type XXX”.
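For illustration, the sketch below writes the single-column CSV described in the note above: an ObjectUrl header followed by one object URL per row, which is interpreted as a video upload. The bucket and file names are placeholders, and headers for other data types are not shown; see the per-format sections for the exact columns.

```python
# Hedged sketch: write a single-column CSV with an ObjectUrl header.
# A single-column file like this is interpreted as a request for video upload.
import csv

object_urls = [
    "https://my-bucket.s3.eu-west-2.amazonaws.com/videos/sample-1.mp4",
    "https://my-bucket.s3.eu-west-2.amazonaws.com/videos/sample-2.mp4",
]

with open("aws_video_import.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["ObjectUrl"])  # column header named on this page
    for url in object_urls:
        writer.writerow([url])      # object URLs must not contain whitespace
```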
Step 5: Create a folder
- Navigate to the Files section of Index in the Encord platform.
- Click + New folder. A dialog to create a new folder appears.
- Give the folder a meaningful name and description.
- Click Create to create the folder. The folder is listed in Files.
Step 6: Upload your data to the folder
- Navigate to the Files section of Index in the Encord platform.
- Click + Upload files. A dialog appears.
- Select the folder you created in step 5.
- Click the Import from private cloud option.
- Select the integration you created in step 2 to add your cloud data.