ImageData Objects

class ImageData()
Information about individual images within a single :class:~encord.orm.dataset.DataRow of type :meth:DataType.IMG_GROUP <encord.constants.enums.DataType.IMG_GROUP>. Get this information via the :meth:DataRow.images <encord.orm.dataset.DataRow.images> property.

file_type

@property
def file_type() -> str
The MIME type of the file.

file_size

@property
def file_size() -> int
The size of the file in bytes.

signed_url

@property
def signed_url() -> Optional[str]
The signed URL if one was generated when this class was created.

DataRow Objects

class DataRow(dict, Formatter)
Each individual DataRow is one upload of a video, image group, single image, or DICOM series. This class has dict-style accessors for backwards compatibility. If you are using this class for the first time, prefer the property accessors and setters over the underlying dictionary; mixing the dict-style member functions with the property accessors and setters is discouraged. WARNING: Do NOT use the .data member of this class. Using it can corrupt the data structure.

uid

@property
def uid() -> str
The unique identifier for this data row. Note that the setter does not update the data on the server.

title

@property
def title() -> str
The data title. The setter updates the title. This queues a request for the backend, which will be executed on a call of :meth:.DataRow.save.

data_type

@data_type.setter
def data_type(value: DataType) -> None
DEPRECATED. Do not use this function, as it will never update the data_type on the server.

created_at

@created_at.setter
def created_at(value: datetime) -> None
DEPRECATED. Do not use this function, as it will never update the created_at on the server.

frames_per_second

@property
def frames_per_second() -> Optional[int]
If the data type is :meth:DataType.VIDEO <encord.constants.enums.DataType.VIDEO> this returns the actual number of frames per second for the video. Otherwise, it returns None as a frames_per_second field is not applicable.

duration

@property
def duration() -> Optional[int]
If the data type is :meth:DataType.VIDEO <encord.constants.enums.DataType.VIDEO> this returns the actual duration for the video. Otherwise, it returns None as a duration field is not applicable.

client_metadata

@property
def client_metadata() -> Optional[MappingProxyType]
The currently cached client metadata. To cache the client metadata, use the :meth:~encord.orm.dataset.DataRow.refetch_data() function. The setter updates the custom client metadata. This queues a request for the backend, which will be executed on a call of :meth:.DataRow.save.

width

@property
def width() -> Optional[int]
The actual width of the data asset. This is None for data types of :meth:DataType.IMG_GROUP <encord.constants.enums.DataType.IMG_GROUP> where :meth:is_image_sequence <encord.data.DataRow.is_image_sequence> is False, because each image in this group can have a different dimension. Inspect the :meth:images <encord.data.DataRow.images> to get the width of individual images.

height

@property
def height() -> Optional[int]
The actual height of the data asset. This is None for data types of :meth:DataType.IMG_GROUP <encord.constants.enums.DataType.IMG_GROUP> where :meth:is_image_sequence <encord.data.DataRow.is_image_sequence> is False, because each image in this group can have a different dimension. Inspect the :meth:images <encord.data.DataRow.images> to get the height of individual images.

file_link

@property
def file_link() -> Optional[str]
The permanent file link of the given data asset. When stored in :meth:StorageLocation.CORD_STORAGE <encord.orm.dataset.StorageLocation.CORD_STORAGE>, this will be the internal file path. For private bucket storage locations, this will be the full path to the file. If the data type is DataType.DICOM, this returns None, as no single file is associated with the series.

signed_url

@property
def signed_url() -> Optional[str]
The cached signed URL of the given data asset. To cache the signed URL, use the :meth:~encord.orm.dataset.DataRow.refetch_data() function.

file_size

@property
def file_size() -> int
The file size of the given data asset in bytes.

file_type

@property
def file_type() -> str
The MIME type of the given data asset, as a string.

images_data

@property
def images_data() -> Optional[List[ImageData]]
A list of the cached :class:~encord.orm.dataset.ImageData objects for the given data asset. Fetch the images with appropriate settings in the :meth:~encord.orm.dataset.DataRow.refetch_data() function. If the data type is not :meth:DataType.IMG_GROUP <encord.constants.enums.DataType.IMG_GROUP> then this returns None.
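The per-image metadata can then be aggregated. A minimal sketch, using a local stand-in dataclass that carries the ImageData properties documented above (not the SDK class itself):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ImageDataSketch:
    """Stand-in carrying the ImageData properties documented above."""
    file_type: str
    file_size: int
    signed_url: Optional[str] = None

def total_group_size(images: List[ImageDataSketch]) -> int:
    """Sum the byte sizes of every image in an image group."""
    return sum(image.file_size for image in images)

images = [
    ImageDataSketch(file_type="image/jpeg", file_size=120_000),
    ImageDataSketch(file_type="image/png", file_size=80_000),
]
assert total_group_size(images) == 200_000
```

With the real SDK, the list would come from DataRow.images_data after a refetch_data() call with images_data_fetch_options set.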

is_optimised_image_group

@property
@deprecated("0.1.98", ".is_image_sequence")
def is_optimised_image_group() -> Optional[bool]
If the data type is :meth:DataType.IMG_GROUP <encord.constants.enums.DataType.IMG_GROUP>, returns whether this is a performance-optimised image group. Returns None for other data types. DEPRECATED: This method is deprecated and will be removed in an upcoming library version. Please use :meth:.is_image_sequence instead.

is_image_sequence

@property
def is_image_sequence() -> Optional[bool]
If the data type is :meth:DataType.IMG_GROUP <encord.constants.enums.DataType.IMG_GROUP>, returns whether this is an image sequence. Returns None for other data types. For more details, refer to the :ref:documentation on image sequences <https://docs.encord.com/docs/annotate-supported-data#image-sequences>.

backing_item_uuid

@property
def backing_item_uuid() -> UUID
The id of the :class:encord.storage.StorageItem that underlies this data row. See also :meth:encord.user_client.EncordUserClient.get_storage_item.

refetch_data

def refetch_data(
        *,
        signed_url: bool = False,
        images_data_fetch_options: Optional[ImagesDataFetchOptions] = None,
        client_metadata: bool = False)
Fetches all the most up-to-date data. If any of the parameters are falsy, the current values will not be updated. Arguments:
  • signed_url - If True, this will fetch a generated signed url of the data asset.
  • images_data_fetch_options - If not None, this will fetch the image data of the data asset. You can additionally specify what to fetch with the :class:.ImagesDataFetchOptions class.
  • client_metadata - If True, this will fetch the client metadata of the data asset.
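The caching behaviour described above (falsy parameters leave cached values untouched) can be sketched with a local stand-in class; the fetched URL and metadata values below are fabricated placeholders, not real SDK responses:

```python
from typing import Optional

class DataRowSketch:
    """Local stand-in illustrating the documented refetch_data() caching
    semantics; the real class lives in encord.orm.dataset."""

    def __init__(self) -> None:
        self._signed_url: Optional[str] = None
        self._client_metadata: Optional[dict] = None

    @property
    def signed_url(self) -> Optional[str]:
        # Cached value only; reading never triggers a network call.
        return self._signed_url

    @property
    def client_metadata(self) -> Optional[dict]:
        return self._client_metadata

    def refetch_data(self, *, signed_url: bool = False,
                     client_metadata: bool = False) -> None:
        # Falsy parameters leave the current cached values untouched.
        if signed_url:
            self._signed_url = "https://example.invalid/signed"  # placeholder
        if client_metadata:
            self._client_metadata = {"source": "demo"}           # placeholder

row = DataRowSketch()
assert row.signed_url is None              # nothing cached yet
row.refetch_data(client_metadata=True)
assert row.signed_url is None              # signed_url flag was falsy: unchanged
row.refetch_data(signed_url=True)
assert row.signed_url == "https://example.invalid/signed"
```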

save

def save() -> None
Sync the local state to the server, if updates have been made. This is a blocking function. The newest values from the Encord server will update the current :class:.DataRow object.
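The queue-then-flush pattern that the property setters and save() describe can be sketched as follows (a local stand-in with a fake server-side field, not the SDK implementation):

```python
class DataRowSaveSketch:
    """Local stand-in for the queue-then-flush pattern: property writes are
    staged locally and only synced to the (fake) server on save()."""

    def __init__(self, title: str) -> None:
        self._title = title
        self._dirty = False
        self.server_title = title  # stands in for server-side state

    @property
    def title(self) -> str:
        return self._title

    @title.setter
    def title(self, value: str) -> None:
        self._title = value
        self._dirty = True  # queued, not yet sent

    def save(self) -> None:
        # Blocking sync in the real SDK; here we just flush the staged value.
        if self._dirty:
            self.server_title = self._title
            self._dirty = False

row = DataRowSaveSketch("scan-001")
row.title = "scan-001-reviewed"
assert row.server_title == "scan-001"            # still queued
row.save()
assert row.server_title == "scan-001-reviewed"   # synced
```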

DataRows Objects

@dataclasses.dataclass(frozen=True)
class DataRows(dict, Formatter)
A helper class that forms the request for filtered dataset rows. Not intended to be used directly.

DatasetInfo Objects

@dataclasses.dataclass(frozen=True)
class DatasetInfo()
This class represents a dataset in the context of listing datasets.

Dataset Objects

class Dataset(dict, Formatter)

__init__

def __init__(title: str,
             storage_location: str,
             data_rows: List[DataRow],
             dataset_hash: str,
             description: Optional[str] = None,
             backing_folder_uuid: Optional[UUID] = None)
DEPRECATED - prefer using the :class:encord.dataset.Dataset class instead. This class has dict-style accessors for backwards compatibility. If you are using this class for the first time, prefer the property accessors and setters over the underlying dictionary; mixing the dict-style member functions with the property accessors and setters is discouraged. WARNING: Do NOT use the .data member of this class. Using it can corrupt the data structure.

AddPrivateDataResponse Objects

@dataclasses.dataclass(frozen=True)
class AddPrivateDataResponse(Formatter)
Response of add_private_data_to_dataset

CreateDatasetResponse Objects

class CreateDatasetResponse(dict, Formatter)

__init__

def __init__(title: str, storage_location: int, dataset_hash: str,
             user_hash: str, backing_folder_uuid: Optional[UUID])
This class has dict-style accessors for backwards compatibility. If you are using this class for the first time, prefer the property accessors and setters over the underlying dictionary; mixing the dict-style member functions with the property accessors and setters is discouraged. WARNING: Do NOT use the .data member of this class. Using it can corrupt the data structure.

StorageLocation Objects

class StorageLocation(IntEnum)

NEW_STORAGE

This is a placeholder for a new storage location that is not yet supported by your SDK version. Please update your SDK to the latest version.

DatasetType

For backwards compatibility

DatasetData Objects

class DatasetData(base_orm.BaseORM)
Video base ORM.

SignedVideoURL Objects

class SignedVideoURL(base_orm.BaseORM)
A signed URL object with supporting information.

SignedImageURL Objects

class SignedImageURL(base_orm.BaseORM)
A signed URL object with supporting information.

SignedImagesURL Objects

class SignedImagesURL(base_orm.BaseListORM)
A signed URL object with supporting information.

SignedAudioURL Objects

class SignedAudioURL(base_orm.BaseORM)
A signed URL object with supporting information.

SignedDicomURL Objects

class SignedDicomURL(base_orm.BaseORM)
A signed URL object with supporting information.

SignedDicomsURL Objects

class SignedDicomsURL(base_orm.BaseListORM)
A signed URL object with supporting information.

Video Objects

class Video(base_orm.BaseORM)
A video object with supporting information.

ImageGroup Objects

class ImageGroup(base_orm.BaseORM)
An image group object with supporting information.

Image Objects

class Image(base_orm.BaseORM)
An image object with supporting information.

SingleImage Objects

class SingleImage(Image)
For native single image upload.

Audio Objects

class Audio(base_orm.BaseORM)
An audio object with supporting information.

Images Objects

@dataclasses.dataclass(frozen=True)
class Images()
Uploading multiple images in a batch mode.

ReEncodeVideoTask Objects

class ReEncodeVideoTask(BaseDTO)
A re-encode video task with supporting information.

DatasetAccessSettings Objects

@dataclasses.dataclass
class DatasetAccessSettings()
Settings for using the dataset object.

fetch_client_metadata

Whether client metadata should be retrieved for each data_row.

ImagesDataFetchOptions Objects

@dataclasses.dataclass
class ImagesDataFetchOptions()

fetch_signed_urls

Whether to fetch signed URLs for each individual image. Only set this to True if you need to download the images.

LongPollingStatus Objects

class LongPollingStatus(str, Enum)

PENDING

Job will automatically start soon (waiting in queue) or already started processing.

DONE

Job has finished successfully (possibly with ignored errors if ignore_errors=True). If ignore_errors=False was specified in :meth:encord.dataset.Dataset.add_private_data_to_dataset_start, the job will only have the status DONE if there were no errors. If ignore_errors=True was specified, the job will always show the status DONE once complete and will never show the ERROR status, even though some errors may have been ignored. Information about the number of errors and the stringified exceptions is available in the units_error_count: int and errors: List[str] attributes.

ERROR

Job has completed with errors. This can only happen if ignore_errors was set to False. Information about errors is available in the units_error_count: int and errors: List[str] attributes.

CANCELLED

Job was cancelled explicitly by the user through the Encord UI or via the Encord SDK using the add_data_to_folder_job_cancel method. In the context of this status:
  • The job may have been partially processed, but it was explicitly interrupted before completion by a user action.
  • Cancellation can occur either manually through the Encord UI or programmatically using the SDK method add_data_to_folder_job_cancel.
  • Once a job is cancelled, no further processing will occur, and any processed data before the cancellation will be available.
  • The presence of cancelled data units (units_cancelled_count) indicates that some data upload units were interrupted and cancelled before completion.
  • If ignore_errors was set to True, the job may continue despite errors, and cancellation will only apply to the unprocessed units.
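The statuses above suggest a simple polling loop: keep polling while the job is PENDING, then act on the terminal status. A minimal sketch, using a local mirror of the enum and a fake poll function standing in for the long-polling SDK call:

```python
from enum import Enum

class LongPollingStatus(str, Enum):
    """Local mirror of the statuses documented above (the real enum lives
    in the encord SDK)."""
    PENDING = "PENDING"
    DONE = "DONE"
    ERROR = "ERROR"
    CANCELLED = "CANCELLED"

def wait_for_terminal_status(poll):
    """Poll until the job leaves PENDING, then return the terminal status."""
    while True:
        status = poll()
        if status is not LongPollingStatus.PENDING:
            return status

# Fake poll function: yields PENDING twice, then DONE.
statuses = iter([LongPollingStatus.PENDING,
                 LongPollingStatus.PENDING,
                 LongPollingStatus.DONE])
result = wait_for_terminal_status(lambda: next(statuses))
assert result is LongPollingStatus.DONE
```

In real code the poll callable would wrap the SDK's long-polling request, and DONE would still need an errors check when ignore_errors=True was used.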

DataUnitError Objects

class DataUnitError(BaseDTO)
A description of an error for an individual upload item.

object_urls

The URLs involved: a single item for videos and images; a list of frames for image groups and DICOM series.

error

The error message

subtask_uuid

Opaque ID of the process. Please quote this when contacting Encord support.

action_description

Human-readable description of the action that failed (e.g. 'Uploading DICOM series').

DatasetDataLongPolling Objects

class DatasetDataLongPolling(BaseDTO)
Response of the upload job's long polling request. Note: an upload job consists of job units, where a job unit can be a video, image group, DICOM series, or a single image.

status

Status of the upload job. Documented in detail in :meth:LongPollingStatus.

data_hashes_with_titles

Information about data which was added to the dataset.

errors

Stringified list of exceptions.

data_unit_errors

Structured list of per-item upload errors. See :class:DataUnitError for more details.

units_pending_count

Number of upload job units that have pending status.

units_done_count

Number of upload job units that have done status.

units_error_count

Number of upload job units that have error status.

units_cancelled_count

Number of upload job units that have been cancelled.
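The four units_* counters together describe the job's progress; once no units are pending, every unit has reached a terminal state. A minimal sketch with a stand-in dataclass (the real class is a BaseDTO in the encord SDK; the error string below is fabricated):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PollingCountsSketch:
    """Stand-in for the units_* counters and errors list documented above."""
    units_pending_count: int = 0
    units_done_count: int = 0
    units_error_count: int = 0
    units_cancelled_count: int = 0
    errors: List[str] = field(default_factory=list)

def all_units_settled(counts: PollingCountsSketch) -> bool:
    # No pending units means every job unit reached a terminal state.
    return counts.units_pending_count == 0

counts = PollingCountsSketch(
    units_done_count=3,
    units_error_count=1,
    errors=["Uploading DICOM series: bad frame"],  # fabricated example
)
assert all_units_settled(counts)
assert counts.units_error_count == len(counts.errors)
```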

CreateDatasetPayload Objects

class CreateDatasetPayload(BaseDTO)

create_backing_folder

Creates a legacy "mirror" dataset and its backing folder in one go.

legacy_call

This field will be removed soon.

CreateDatasetResponseV2 Objects

class CreateDatasetResponseV2(BaseDTO)

backing_folder_uuid

A non-None value indicates a legacy "mirror" dataset was created.

DatasetWithUserRole Objects

class DatasetWithUserRole(BaseDTO)

storage_location

Legacy field: datasets can now contain data from mixed storage locations.

backing_folder_uuid

If set, this indicates a legacy "mirror" dataset.