Welcome to the ZenML SDK Docs

Actions

Actions allow configuring a given action for later execution.

Alerter

Alerters allow you to send alerts from within your pipeline.

This is useful for getting notified immediately when failures happen, as well as for general monitoring and reporting.
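
The snippet below is a minimal sketch of how an alerter might be used from inside pipeline steps. It assumes that an alerter component is registered in the active stack and that Client().active_stack.alerter resolves to it; adapt it to your own setup.

from zenml import step
from zenml.client import Client


@step
def notify_on_completion(metric: float) -> None:
    """Posts a short status message through the configured alerter."""
    # Assumption: the active stack exposes its alerter component here.
    alerter = Client().active_stack.alerter
    if alerter is None:
        # No alerter registered in the stack; nothing to do.
        return
    alerter.post(message=f"Training finished with metric={metric:.3f}")


@step
def approve_deployment() -> bool:
    """Gates deployment on a human response collected via the alerter."""
    alerter = Client().active_stack.alerter
    if alerter is None:
        return False
    return alerter.ask(question="Deploy the newly trained model?")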

BaseAlerter

Bases: StackComponent, ABC

Base class for all ZenML alerters.

Source code in src/zenml/alerter/base_alerter.py
class BaseAlerter(StackComponent, ABC):
    """Base class for all ZenML alerters."""

    @property
    def config(self) -> BaseAlerterConfig:
        """Returns the `BaseAlerterConfig` config.

        Returns:
            The configuration.
        """
        return cast(BaseAlerterConfig, self._config)

    def post(
        self, message: str, params: Optional[BaseAlerterStepParameters] = None
    ) -> bool:
        """Post a message to a chat service.

        Args:
            message: Message to be posted.
            params: Optional parameters of this function.

        Returns:
            bool: True if operation succeeded, else False.
        """
        return True

    def ask(
        self, question: str, params: Optional[BaseAlerterStepParameters] = None
    ) -> bool:
        """Post a message to a chat service and wait for approval.

        This can be useful to easily get a human in the loop, e.g., when
        deploying models.

        Args:
            question: Question to ask (message to be posted).
            params: Optional parameters of this function.

        Returns:
            bool: True if operation succeeded and was approved, else False.
        """
        return True

config property

Returns the BaseAlerterConfig config.

Returns:

Type Description
BaseAlerterConfig

The configuration.

ask(question, params=None)

Post a message to a chat service and wait for approval.

This can be useful to easily get a human in the loop, e.g., when deploying models.

Parameters:

Name Type Description Default
question str

Question to ask (message to be posted).

required
params Optional[BaseAlerterStepParameters]

Optional parameters of this function.

None

Returns:

Name Type Description
bool bool

True if operation succeeded and was approved, else False.

Source code in src/zenml/alerter/base_alerter.py
def ask(
    self, question: str, params: Optional[BaseAlerterStepParameters] = None
) -> bool:
    """Post a message to a chat service and wait for approval.

    This can be useful to easily get a human in the loop, e.g., when
    deploying models.

    Args:
        question: Question to ask (message to be posted).
        params: Optional parameters of this function.

    Returns:
        bool: True if operation succeeded and was approved, else False.
    """
    return True

post(message, params=None)

Post a message to a chat service.

Parameters:

Name Type Description Default
message str

Message to be posted.

required
params Optional[BaseAlerterStepParameters]

Optional parameters of this function.

None

Returns:

Name Type Description
bool bool

True if operation succeeded, else False.

Source code in src/zenml/alerter/base_alerter.py
def post(
    self, message: str, params: Optional[BaseAlerterStepParameters] = None
) -> bool:
    """Post a message to a chat service.

    Args:
        message: Message to be posted.
        params: Optional parameters of this function.

    Returns:
        bool: True if operation succeeded, else False.
    """
    return True

BaseAlerterConfig

Bases: StackComponentConfig

Base config for alerters.

Source code in src/zenml/alerter/base_alerter.py
class BaseAlerterConfig(StackComponentConfig):
    """Base config for alerters."""

BaseAlerterFlavor

Bases: Flavor, ABC

Base class for all ZenML alerter flavors.

Source code in src/zenml/alerter/base_alerter.py
class BaseAlerterFlavor(Flavor, ABC):
    """Base class for all ZenML alerter flavors."""

    @property
    def type(self) -> StackComponentType:
        """Returns the flavor type.

        Returns:
            The flavor type.
        """
        return StackComponentType.ALERTER

    @property
    def config_class(self) -> Type[BaseAlerterConfig]:
        """Returns BaseAlerterConfig class.

        Returns:
            The BaseAlerterConfig class.
        """
        return BaseAlerterConfig

    @property
    def implementation_class(self) -> Type[BaseAlerter]:
        """Implementation class.

        Returns:
            The implementation class.
        """
        return BaseAlerter

config_class property

Returns BaseAlerterConfig class.

Returns:

Type Description
Type[BaseAlerterConfig]

The BaseAlerterConfig class.

implementation_class property

Implementation class.

Returns:

Type Description
Type[BaseAlerter]

The implementation class.

type property

Returns the flavor type.

Returns:

Type Description
StackComponentType

The flavor type.

BaseAlerterStepParameters

Bases: BaseModel

Step parameters definition for all alerters.

Source code in src/zenml/alerter/base_alerter.py
class BaseAlerterStepParameters(BaseModel):
    """Step parameters definition for all alerters."""

Analytics

The 'analytics' module of ZenML.

alias(user_id, previous_id)

Alias user IDs.

Parameters:

Name Type Description Default
user_id UUID

The user ID.

required
previous_id UUID

Previous ID for the alias.

required

Returns:

Type Description
bool

True if the event was sent successfully, False if not.

Source code in src/zenml/analytics/__init__.py
def alias(user_id: UUID, previous_id: UUID) -> bool:  # type: ignore[return]
    """Alias user IDs.

    Args:
        user_id: The user ID.
        previous_id: Previous ID for the alias.

    Returns:
        True if event is sent successfully, False is not.
    """
    from zenml.analytics.context import AnalyticsContext

    with AnalyticsContext() as analytics:
        return analytics.alias(user_id=user_id, previous_id=previous_id)

group(group_id, group_metadata=None)

Attach metadata to a segment group.

Parameters:

Name Type Description Default
group_id UUID

ID of the group.

required
group_metadata Optional[Dict[str, Any]]

Metadata to attach to the group.

None

Returns:

Type Description
bool

True if event is sent successfully, False if not.

Source code in src/zenml/analytics/__init__.py
def group(  # type: ignore[return]
    group_id: UUID,
    group_metadata: Optional[Dict[str, Any]] = None,
) -> bool:
    """Attach metadata to a segment group.

    Args:
        group_id: ID of the group.
        group_metadata: Metadata to attach to the group.

    Returns:
        True if event is sent successfully, False if not.
    """
    from zenml.analytics.context import AnalyticsContext

    with AnalyticsContext() as analytics:
        return analytics.group(group_id=group_id, traits=group_metadata)

identify(metadata=None)

Attach metadata to user directly.

Parameters:

Name Type Description Default
metadata Optional[Dict[str, Any]]

Dict of metadata to attach to the user.

None

Returns:

Type Description
bool

True if the event was sent successfully, False if not.

Source code in src/zenml/analytics/__init__.py
def identify(  # type: ignore[return]
    metadata: Optional[Dict[str, Any]] = None
) -> bool:
    """Attach metadata to user directly.

    Args:
        metadata: Dict of metadata to attach to the user.

    Returns:
        True if event is sent successfully, False is not.
    """
    from zenml.analytics.context import AnalyticsContext

    if metadata is None:
        return False

    with AnalyticsContext() as analytics:
        return analytics.identify(traits=metadata)

track(event, metadata=None)

Track segment event if user opted-in.

Parameters:

Name Type Description Default
event AnalyticsEvent

Name of event to track in segment.

required
metadata Optional[Dict[str, Any]]

Dict of metadata to track.

None

Returns:

Type Description
bool

True if event is sent successfully, False if not.

Source code in src/zenml/analytics/__init__.py
def track(  # type: ignore[return]
    event: "AnalyticsEvent",
    metadata: Optional[Dict[str, Any]] = None,
) -> bool:
    """Track segment event if user opted-in.

    Args:
        event: Name of event to track in segment.
        metadata: Dict of metadata to track.

    Returns:
        True if event is sent successfully, False if not.
    """
    from zenml.analytics.context import AnalyticsContext

    if metadata is None:
        metadata = {}

    metadata.setdefault("event_success", True)

    with AnalyticsContext() as analytics:
        return analytics.track(event=event, properties=metadata)
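
For orientation, the module-level helpers documented above can be called directly with the signatures shown; the sketch below uses made-up IDs and traits. Whether anything is actually sent depends on the analytics opt-in handled by AnalyticsContext.

from uuid import uuid4

from zenml.analytics import alias, group, identify

# Attach traits to the current user, attach traits to a group, and link two user IDs.
identify(metadata={"role": "data-scientist"})
group(group_id=uuid4(), group_metadata={"team": "ml-platform"})
alias(user_id=uuid4(), previous_id=uuid4())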

Annotators

Initialization of the ZenML annotator stack component.

BaseAnnotator

Bases: StackComponent, ABC

Base class for all ZenML annotators.

Source code in src/zenml/annotators/base_annotator.py
class BaseAnnotator(StackComponent, ABC):
    """Base class for all ZenML annotators."""

    @property
    def config(self) -> BaseAnnotatorConfig:
        """Returns the `BaseAnnotatorConfig` config.

        Returns:
            The configuration.
        """
        return cast(BaseAnnotatorConfig, self._config)

    @abstractmethod
    def get_url(self) -> str:
        """Gets the URL of the annotation interface.

        Returns:
            The URL of the annotation interface.
        """

    @abstractmethod
    def get_url_for_dataset(self, dataset_name: str) -> str:
        """Gets the URL of the annotation interface for a specific dataset.

        Args:
            dataset_name: name of the dataset.

        Returns:
            The URL of the dataset annotation interface.
        """

    @abstractmethod
    def get_datasets(self) -> List[Any]:
        """Gets the datasets currently available for annotation.

        Returns:
            The datasets currently available for annotation.
        """

    @abstractmethod
    def get_dataset_names(self) -> List[str]:
        """Gets the names of the datasets currently available for annotation.

        Returns:
            The names of the datasets currently available for annotation.
        """

    @abstractmethod
    def get_dataset_stats(self, dataset_name: str) -> Tuple[int, int]:
        """Gets the statistics of a dataset.

        Args:
            dataset_name: name of the dataset.

        Returns:
            A tuple containing (labeled_task_count, unlabeled_task_count) for
                the dataset.
        """

    @abstractmethod
    def launch(self, **kwargs: Any) -> None:
        """Launches the annotation interface.

        Args:
            **kwargs: Additional keyword arguments to pass to the
                annotation client.
        """

    @abstractmethod
    def add_dataset(self, **kwargs: Any) -> Any:
        """Registers a dataset for annotation.

        Args:
            **kwargs: keyword arguments.

        Returns:
            The dataset or confirmation object on adding the dataset.
        """

    @abstractmethod
    def get_dataset(self, **kwargs: Any) -> Any:
        """Gets the dataset with the given name.

        Args:
            **kwargs: keyword arguments.

        Returns:
            The dataset with the given name.
        """

    @abstractmethod
    def delete_dataset(self, **kwargs: Any) -> None:
        """Deletes a dataset.

        Args:
            **kwargs: keyword arguments.
        """

    @abstractmethod
    def get_labeled_data(self, **kwargs: Any) -> Any:
        """Gets the labeled data for the given dataset.

        Args:
            **kwargs: keyword arguments.

        Returns:
            The labeled data for the given dataset.
        """

    @abstractmethod
    def get_unlabeled_data(self, **kwargs: str) -> Any:
        """Gets the unlabeled data for the given dataset.

        Args:
            **kwargs: Additional keyword arguments to pass to the Label Studio client.

        Returns:
            The unlabeled data for the given dataset.
        """

config property

Returns the BaseAnnotatorConfig config.

Returns:

Type Description
BaseAnnotatorConfig

The configuration.

add_dataset(**kwargs) abstractmethod

Registers a dataset for annotation.

Parameters:

Name Type Description Default
**kwargs Any

keyword arguments.

{}

Returns:

Type Description
Any

The dataset or confirmation object on adding the dataset.

Source code in src/zenml/annotators/base_annotator.py
@abstractmethod
def add_dataset(self, **kwargs: Any) -> Any:
    """Registers a dataset for annotation.

    Args:
        **kwargs: keyword arguments.

    Returns:
        The dataset or confirmation object on adding the dataset.
    """

delete_dataset(**kwargs) abstractmethod

Deletes a dataset.

Parameters:

Name Type Description Default
**kwargs Any

keyword arguments.

{}
Source code in src/zenml/annotators/base_annotator.py
@abstractmethod
def delete_dataset(self, **kwargs: Any) -> None:
    """Deletes a dataset.

    Args:
        **kwargs: keyword arguments.
    """

get_dataset(**kwargs) abstractmethod

Gets the dataset with the given name.

Parameters:

Name Type Description Default
**kwargs Any

keyword arguments.

{}

Returns:

Type Description
Any

The dataset with the given name.

Source code in src/zenml/annotators/base_annotator.py
@abstractmethod
def get_dataset(self, **kwargs: Any) -> Any:
    """Gets the dataset with the given name.

    Args:
        **kwargs: keyword arguments.

    Returns:
        The dataset with the given name.
    """

get_dataset_names() abstractmethod

Gets the names of the datasets currently available for annotation.

Returns:

Type Description
List[str]

The names of the datasets currently available for annotation.

Source code in src/zenml/annotators/base_annotator.py
@abstractmethod
def get_dataset_names(self) -> List[str]:
    """Gets the names of the datasets currently available for annotation.

    Returns:
        The names of the datasets currently available for annotation.
    """

get_dataset_stats(dataset_name) abstractmethod

Gets the statistics of a dataset.

Parameters:

Name Type Description Default
dataset_name str

name of the dataset.

required

Returns:

Type Description
Tuple[int, int]

A tuple containing (labeled_task_count, unlabeled_task_count) for the dataset.

Source code in src/zenml/annotators/base_annotator.py
@abstractmethod
def get_dataset_stats(self, dataset_name: str) -> Tuple[int, int]:
    """Gets the statistics of a dataset.

    Args:
        dataset_name: name of the dataset.

    Returns:
        A tuple containing (labeled_task_count, unlabeled_task_count) for
            the dataset.
    """

get_datasets() abstractmethod

Gets the datasets currently available for annotation.

Returns:

Type Description
List[Any]

The datasets currently available for annotation.

Source code in src/zenml/annotators/base_annotator.py
@abstractmethod
def get_datasets(self) -> List[Any]:
    """Gets the datasets currently available for annotation.

    Returns:
        The datasets currently available for annotation.
    """

get_labeled_data(**kwargs) abstractmethod

Gets the labeled data for the given dataset.

Parameters:

Name Type Description Default
**kwargs Any

keyword arguments.

{}

Returns:

Type Description
Any

The labeled data for the given dataset.

Source code in src/zenml/annotators/base_annotator.py
@abstractmethod
def get_labeled_data(self, **kwargs: Any) -> Any:
    """Gets the labeled data for the given dataset.

    Args:
        **kwargs: keyword arguments.

    Returns:
        The labeled data for the given dataset.
    """

get_unlabeled_data(**kwargs) abstractmethod

Gets the unlabeled data for the given dataset.

Parameters:

Name Type Description Default
**kwargs str

Additional keyword arguments to pass to the Label Studio client.

{}

Returns:

Type Description
Any

The unlabeled data for the given dataset.

Source code in src/zenml/annotators/base_annotator.py
@abstractmethod
def get_unlabeled_data(self, **kwargs: str) -> Any:
    """Gets the unlabeled data for the given dataset.

    Args:
        **kwargs: Additional keyword arguments to pass to the Label Studio client.

    Returns:
        The unlabeled data for the given dataset.
    """

get_url() abstractmethod

Gets the URL of the annotation interface.

Returns:

Type Description
str

The URL of the annotation interface.

Source code in src/zenml/annotators/base_annotator.py
@abstractmethod
def get_url(self) -> str:
    """Gets the URL of the annotation interface.

    Returns:
        The URL of the annotation interface.
    """

get_url_for_dataset(dataset_name) abstractmethod

Gets the URL of the annotation interface for a specific dataset.

Parameters:

Name Type Description Default
dataset_name str

name of the dataset.

required

Returns:

Type Description
str

The URL of the dataset annotation interface.

Source code in src/zenml/annotators/base_annotator.py
@abstractmethod
def get_url_for_dataset(self, dataset_name: str) -> str:
    """Gets the URL of the annotation interface for a specific dataset.

    Args:
        dataset_name: name of the dataset.

    Returns:
        The URL of the dataset annotation interface.
    """

launch(**kwargs) abstractmethod

Launches the annotation interface.

Parameters:

Name Type Description Default
**kwargs Any

Additional keyword arguments to pass to the annotation client.

{}
Source code in src/zenml/annotators/base_annotator.py
@abstractmethod
def launch(self, **kwargs: Any) -> None:
    """Launches the annotation interface.

    Args:
        **kwargs: Additional keyword arguments to pass to the
            annotation client.
    """

Artifact Stores

ZenML's artifact-store stores artifacts in a file system.

In ZenML, the inputs and outputs that pass through any step are treated as artifacts, and, as its name suggests, an ArtifactStore is the place where these artifacts get stored.

Out of the box, ZenML comes with the BaseArtifactStore and LocalArtifactStore implementations. While the BaseArtifactStore establishes an interface for people who want to extend it to their needs, the LocalArtifactStore is a simple implementation for a local setup.

Moreover, additional artifact stores can be found in specific integration modules, such as the GCPArtifactStore in the gcp integration and the AzureArtifactStore in the azure integration.
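
For a custom artifact store, the configuration class must at minimum declare which URI schemes it supports. The sketch below mirrors the SUPPORTED_SCHEMES example used by the BaseArtifactStoreConfig validator; the MyArtifactStoreConfig name and the s3:// scheme are chosen purely for illustration.

from typing import ClassVar, Set

from zenml.artifact_stores.base_artifact_store import BaseArtifactStoreConfig


class MyArtifactStoreConfig(BaseArtifactStoreConfig):
    """Config for a hypothetical custom artifact store."""

    # Paths handed to this store must start with one of these schemes.
    SUPPORTED_SCHEMES: ClassVar[Set[str]] = {"s3://"}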

BaseArtifactStore

Bases: StackComponent

Base class for all ZenML artifact stores.

Source code in src/zenml/artifact_stores/base_artifact_store.py
class BaseArtifactStore(StackComponent):
    """Base class for all ZenML artifact stores."""

    @property
    def config(self) -> BaseArtifactStoreConfig:
        """Returns the `BaseArtifactStoreConfig` config.

        Returns:
            The configuration.
        """
        return cast(BaseArtifactStoreConfig, self._config)

    @property
    def path(self) -> str:
        """The path to the artifact store.

        Returns:
            The path.
        """
        return self.config.path

    @property
    def custom_cache_key(self) -> Optional[bytes]:
        """Custom cache key.

        Any artifact store can override this property in case they need
        additional control over the caching behavior.

        Returns:
            Custom cache key.
        """
        return None

    # --- User interface ---
    @abstractmethod
    def open(self, path: PathType, mode: str = "r") -> Any:
        """Open a file at the given path.

        Args:
            path: The path of the file to open.
            mode: The mode to open the file.

        Returns:
            The file object.
        """

    @abstractmethod
    def copyfile(
        self, src: PathType, dst: PathType, overwrite: bool = False
    ) -> None:
        """Copy a file from the source to the destination.

        Args:
            src: The source path.
            dst: The destination path.
            overwrite: Whether to overwrite the destination file if it exists.
        """

    @abstractmethod
    def exists(self, path: PathType) -> bool:
        """Checks if a path exists.

        Args:
            path: The path to check.

        Returns:
            `True` if the path exists.
        """

    @abstractmethod
    def glob(self, pattern: PathType) -> List[PathType]:
        """Gets the paths that match a glob pattern.

        Args:
            pattern: The glob pattern.

        Returns:
            The list of paths that match the pattern.
        """

    @abstractmethod
    def isdir(self, path: PathType) -> bool:
        """Returns whether the given path points to a directory.

        Args:
            path: The path to check.

        Returns:
            `True` if the path points to a directory.
        """

    @abstractmethod
    def listdir(self, path: PathType) -> List[PathType]:
        """Returns a list of files under a given directory in the filesystem.

        Args:
            path: The path to list.

        Returns:
            The list of files under the given path.
        """

    @abstractmethod
    def makedirs(self, path: PathType) -> None:
        """Make a directory at the given path, recursively creating parents.

        Args:
            path: The path to create.
        """

    @abstractmethod
    def mkdir(self, path: PathType) -> None:
        """Make a directory at the given path; parent directory must exist.

        Args:
            path: The path to create.
        """

    @abstractmethod
    def remove(self, path: PathType) -> None:
        """Remove the file at the given path. Dangerous operation.

        Args:
            path: The path to remove.
        """

    @abstractmethod
    def rename(
        self, src: PathType, dst: PathType, overwrite: bool = False
    ) -> None:
        """Rename source file to destination file.

        Args:
            src: The source path.
            dst: The destination path.
            overwrite: Whether to overwrite the destination file if it exists.
        """

    @abstractmethod
    def rmtree(self, path: PathType) -> None:
        """Deletes dir recursively. Dangerous operation.

        Args:
            path: The path to delete.
        """

    @abstractmethod
    def stat(self, path: PathType) -> Any:
        """Return the stat descriptor for a given file path.

        Args:
            path: The path to check.

        Returns:
            The stat descriptor.
        """

    @abstractmethod
    def size(self, path: PathType) -> Optional[int]:
        """Get the size of a file in bytes.

        Args:
            path: The path to the file.

        Returns:
            The size of the file in bytes or `None` if the artifact store
            does not implement the `size` method.
        """
        logger.warning(
            "Cannot get size of file '%s' since the artifact store %s does not "
            "implement the `size` method.",
            path,
            self.__class__.__name__,
        )
        return None

    @abstractmethod
    def walk(
        self,
        top: PathType,
        topdown: bool = True,
        onerror: Optional[Callable[..., None]] = None,
    ) -> Iterable[Tuple[PathType, List[PathType], List[PathType]]]:
        """Return an iterator that walks the contents of the given directory.

        Args:
            top: The path to walk.
            topdown: Whether to walk the top-down or bottom-up.
            onerror: The error handler.

        Returns:
            The iterator that walks the contents of the given directory.
        """

    # --- Internal interface ---
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        """Initiate the Pydantic object and register the corresponding filesystem.

        Args:
            *args: The positional arguments to pass to the Pydantic object.
            **kwargs: The keyword arguments to pass to the Pydantic object.
        """
        super(BaseArtifactStore, self).__init__(*args, **kwargs)

        # If running in a ZenML server environment, we don't register
        # the filesystems. We always use the artifact stores directly.
        if ENV_ZENML_SERVER not in os.environ:
            self._register()

    def _register(self) -> None:
        """Create and register a filesystem within the filesystem registry."""
        from zenml.io.filesystem import BaseFilesystem
        from zenml.io.filesystem_registry import default_filesystem_registry
        from zenml.io.local_filesystem import LocalFilesystem

        overloads: Dict[str, Any] = {
            "SUPPORTED_SCHEMES": self.config.SUPPORTED_SCHEMES,
        }
        for abc_method in inspect.getmembers(BaseArtifactStore):
            if getattr(abc_method[1], "__isabstractmethod__", False):
                sanitized_method = _sanitize_paths(
                    getattr(self, abc_method[0]), self.path
                )
                # prepare overloads for filesystem methods
                overloads[abc_method[0]] = staticmethod(sanitized_method)

                # decorate artifact store methods
                setattr(
                    self,
                    abc_method[0],
                    sanitized_method,
                )

        # Local filesystem is always registered, no point in doing it again.
        if isinstance(self, LocalFilesystem):
            return

        filesystem_class = type(
            self.__class__.__name__, (BaseFilesystem,), overloads
        )

        default_filesystem_registry.register(filesystem_class)

    def _remove_previous_file_versions(self, path: PathType) -> None:
        """Remove all file versions but the latest in the given path.

        Method is useful for logs stored in versioned file systems
        like AWS S3.

        Args:
            path: The path to the file.
        """
        return

config property

Returns the BaseArtifactStoreConfig config.

Returns:

Type Description
BaseArtifactStoreConfig

The configuration.

custom_cache_key property

Custom cache key.

Any artifact store can override this property in case they need additional control over the caching behavior.

Returns:

Type Description
Optional[bytes]

Custom cache key.

path property

The path to the artifact store.

Returns:

Type Description
str

The path.

__init__(*args, **kwargs)

Initiate the Pydantic object and register the corresponding filesystem.

Parameters:

Name Type Description Default
*args Any

The positional arguments to pass to the Pydantic object.

()
**kwargs Any

The keyword arguments to pass to the Pydantic object.

{}
Source code in src/zenml/artifact_stores/base_artifact_store.py
def __init__(self, *args: Any, **kwargs: Any) -> None:
    """Initiate the Pydantic object and register the corresponding filesystem.

    Args:
        *args: The positional arguments to pass to the Pydantic object.
        **kwargs: The keyword arguments to pass to the Pydantic object.
    """
    super(BaseArtifactStore, self).__init__(*args, **kwargs)

    # If running in a ZenML server environment, we don't register
    # the filesystems. We always use the artifact stores directly.
    if ENV_ZENML_SERVER not in os.environ:
        self._register()

copyfile(src, dst, overwrite=False) abstractmethod

Copy a file from the source to the destination.

Parameters:

Name Type Description Default
src PathType

The source path.

required
dst PathType

The destination path.

required
overwrite bool

Whether to overwrite the destination file if it exists.

False
Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def copyfile(
    self, src: PathType, dst: PathType, overwrite: bool = False
) -> None:
    """Copy a file from the source to the destination.

    Args:
        src: The source path.
        dst: The destination path.
        overwrite: Whether to overwrite the destination file if it exists.
    """

exists(path) abstractmethod

Checks if a path exists.

Parameters:

Name Type Description Default
path PathType

The path to check.

required

Returns:

Type Description
bool

True if the path exists.

Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def exists(self, path: PathType) -> bool:
    """Checks if a path exists.

    Args:
        path: The path to check.

    Returns:
        `True` if the path exists.
    """

glob(pattern) abstractmethod

Gets the paths that match a glob pattern.

Parameters:

Name Type Description Default
pattern PathType

The glob pattern.

required

Returns:

Type Description
List[PathType]

The list of paths that match the pattern.

Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def glob(self, pattern: PathType) -> List[PathType]:
    """Gets the paths that match a glob pattern.

    Args:
        pattern: The glob pattern.

    Returns:
        The list of paths that match the pattern.
    """

isdir(path) abstractmethod

Returns whether the given path points to a directory.

Parameters:

Name Type Description Default
path PathType

The path to check.

required

Returns:

Type Description
bool

True if the path points to a directory.

Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def isdir(self, path: PathType) -> bool:
    """Returns whether the given path points to a directory.

    Args:
        path: The path to check.

    Returns:
        `True` if the path points to a directory.
    """

listdir(path) abstractmethod

Returns a list of files under a given directory in the filesystem.

Parameters:

Name Type Description Default
path PathType

The path to list.

required

Returns:

Type Description
List[PathType]

The list of files under the given path.

Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def listdir(self, path: PathType) -> List[PathType]:
    """Returns a list of files under a given directory in the filesystem.

    Args:
        path: The path to list.

    Returns:
        The list of files under the given path.
    """

makedirs(path) abstractmethod

Make a directory at the given path, recursively creating parents.

Parameters:

Name Type Description Default
path PathType

The path to create.

required
Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def makedirs(self, path: PathType) -> None:
    """Make a directory at the given path, recursively creating parents.

    Args:
        path: The path to create.
    """

mkdir(path) abstractmethod

Make a directory at the given path; parent directory must exist.

Parameters:

Name Type Description Default
path PathType

The path to create.

required
Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def mkdir(self, path: PathType) -> None:
    """Make a directory at the given path; parent directory must exist.

    Args:
        path: The path to create.
    """

open(path, mode='r') abstractmethod

Open a file at the given path.

Parameters:

Name Type Description Default
path PathType

The path of the file to open.

required
mode str

The mode to open the file.

'r'

Returns:

Type Description
Any

The file object.

Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def open(self, path: PathType, mode: str = "r") -> Any:
    """Open a file at the given path.

    Args:
        path: The path of the file to open.
        mode: The mode to open the file.

    Returns:
        The file object.
    """

remove(path) abstractmethod

Remove the file at the given path. Dangerous operation.

Parameters:

Name Type Description Default
path PathType

The path to remove.

required
Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def remove(self, path: PathType) -> None:
    """Remove the file at the given path. Dangerous operation.

    Args:
        path: The path to remove.
    """

rename(src, dst, overwrite=False) abstractmethod

Rename source file to destination file.

Parameters:

Name Type Description Default
src PathType

The source path.

required
dst PathType

The destination path.

required
overwrite bool

Whether to overwrite the destination file if it exists.

False
Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def rename(
    self, src: PathType, dst: PathType, overwrite: bool = False
) -> None:
    """Rename source file to destination file.

    Args:
        src: The source path.
        dst: The destination path.
        overwrite: Whether to overwrite the destination file if it exists.
    """

rmtree(path) abstractmethod

Deletes dir recursively. Dangerous operation.

Parameters:

Name Type Description Default
path PathType

The path to delete.

required
Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def rmtree(self, path: PathType) -> None:
    """Deletes dir recursively. Dangerous operation.

    Args:
        path: The path to delete.
    """

size(path) abstractmethod

Get the size of a file in bytes.

Parameters:

Name Type Description Default
path PathType

The path to the file.

required

Returns:

Type Description
Optional[int]

The size of the file in bytes, or None if the artifact store does not implement the size method.

Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def size(self, path: PathType) -> Optional[int]:
    """Get the size of a file in bytes.

    Args:
        path: The path to the file.

    Returns:
        The size of the file in bytes or `None` if the artifact store
        does not implement the `size` method.
    """
    logger.warning(
        "Cannot get size of file '%s' since the artifact store %s does not "
        "implement the `size` method.",
        path,
        self.__class__.__name__,
    )
    return None

stat(path) abstractmethod

Return the stat descriptor for a given file path.

Parameters:

Name Type Description Default
path PathType

The path to check.

required

Returns:

Type Description
Any

The stat descriptor.

Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def stat(self, path: PathType) -> Any:
    """Return the stat descriptor for a given file path.

    Args:
        path: The path to check.

    Returns:
        The stat descriptor.
    """

walk(top, topdown=True, onerror=None) abstractmethod

Return an iterator that walks the contents of the given directory.

Parameters:

Name Type Description Default
top PathType

The path to walk.

required
topdown bool

Whether to walk the top-down or bottom-up.

True
onerror Optional[Callable[..., None]]

The error handler.

None

Returns:

Type Description
Iterable[Tuple[PathType, List[PathType], List[PathType]]]

The iterator that walks the contents of the given directory.

Source code in src/zenml/artifact_stores/base_artifact_store.py
@abstractmethod
def walk(
    self,
    top: PathType,
    topdown: bool = True,
    onerror: Optional[Callable[..., None]] = None,
) -> Iterable[Tuple[PathType, List[PathType], List[PathType]]]:
    """Return an iterator that walks the contents of the given directory.

    Args:
        top: The path to walk.
        topdown: Whether to walk the top-down or bottom-up.
        onerror: The error handler.

    Returns:
        The iterator that walks the contents of the given directory.
    """

BaseArtifactStoreConfig

Bases: StackComponentConfig

Config class for BaseArtifactStore.

Source code in src/zenml/artifact_stores/base_artifact_store.py
class BaseArtifactStoreConfig(StackComponentConfig):
    """Config class for `BaseArtifactStore`."""

    path: str

    SUPPORTED_SCHEMES: ClassVar[Set[str]]
    IS_IMMUTABLE_FILESYSTEM: ClassVar[bool] = False

    @model_validator(mode="before")
    @classmethod
    @before_validator_handler
    def _ensure_artifact_store(cls, data: Dict[str, Any]) -> Dict[str, Any]:
        """Validator function for the Artifact Stores.

        Checks whether supported schemes are defined and the given path is
        supported.

        Args:
            data: the input data to construct the artifact store.

        Returns:
            The validated values.

        Raises:
            ArtifactStoreInterfaceError: If the scheme is not supported.
        """
        try:
            getattr(cls, "SUPPORTED_SCHEMES")
        except AttributeError:
            raise ArtifactStoreInterfaceError(
                textwrap.dedent(
                    """
                    When you are working with any classes which subclass from
                    zenml.artifact_store.BaseArtifactStore please make sure
                    that your class has a ClassVar named `SUPPORTED_SCHEMES`
                    which should hold a set of supported file schemes such
                    as {"s3://"} or {"gcs://"}.

                    Example:

                    class MyArtifactStoreConfig(BaseArtifactStoreConfig):
                        ...
                        # Class Variables
                        SUPPORTED_SCHEMES: ClassVar[Set[str]] = {"s3://"}
                        ...
                    """
                )
            )

        if "path" in data:
            data["path"] = data["path"].strip("'\"`")
            if not any(
                data["path"].startswith(i) for i in cls.SUPPORTED_SCHEMES
            ):
                raise ArtifactStoreInterfaceError(
                    f"The path: '{data['path']}' you defined for your "
                    f"artifact store is not supported by the implementation of "
                    f"{cls.schema()['title']}, because it does not start with "
                    f"one of its supported schemes: {cls.SUPPORTED_SCHEMES}."
                )

        return data

BaseArtifactStoreFlavor

Bases: Flavor

Base class for artifact store flavors.

Source code in src/zenml/artifact_stores/base_artifact_store.py
class BaseArtifactStoreFlavor(Flavor):
    """Base class for artifact store flavors."""

    @property
    def type(self) -> StackComponentType:
        """Returns the flavor type.

        Returns:
            The flavor type.
        """
        return StackComponentType.ARTIFACT_STORE

    @property
    def config_class(self) -> Type[StackComponentConfig]:
        """Config class for this flavor.

        Returns:
            The config class.
        """
        return BaseArtifactStoreConfig

    @property
    @abstractmethod
    def implementation_class(self) -> Type["BaseArtifactStore"]:
        """Implementation class.

        Returns:
            The implementation class.
        """

config_class property

Config class for this flavor.

Returns:

Type Description
Type[StackComponentConfig]

The config class.

implementation_class abstractmethod property

Implementation class.

Returns:

Type Description
Type[BaseArtifactStore]

The implementation class.

type property

Returns the flavor type.

Returns:

Type Description
StackComponentType

The flavor type.
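
A concrete artifact store flavor only needs to supply a name, a config class and an implementation class. The sketch below uses hypothetical MyArtifactStoreConfig and MyArtifactStore classes, assumed to live in a user package, to show the shape.

from typing import Type

from zenml.artifact_stores.base_artifact_store import (
    BaseArtifactStore,
    BaseArtifactStoreConfig,
    BaseArtifactStoreFlavor,
)


class MyArtifactStoreFlavor(BaseArtifactStoreFlavor):
    """Flavor tying a hypothetical config and store implementation together."""

    @property
    def name(self) -> str:
        return "my_custom_store"

    @property
    def config_class(self) -> Type[BaseArtifactStoreConfig]:
        from my_package.config import MyArtifactStoreConfig  # hypothetical module

        return MyArtifactStoreConfig

    @property
    def implementation_class(self) -> Type[BaseArtifactStore]:
        from my_package.store import MyArtifactStore  # hypothetical module

        return MyArtifactStore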

LocalArtifactStore

Bases: LocalFilesystem, BaseArtifactStore

Artifact Store for local artifacts.

All methods are inherited from the default LocalFilesystem.

Source code in src/zenml/artifact_stores/local_artifact_store.py
class LocalArtifactStore(LocalFilesystem, BaseArtifactStore):
    """Artifact Store for local artifacts.

    All methods are inherited from the default `LocalFilesystem`.
    """

    _path: Optional[str] = None

    @staticmethod
    def get_default_local_path(id_: "UUID") -> str:
        """Returns the default local path for a local artifact store.

        Args:
            id_: The id of the local artifact store.

        Returns:
            str: The default local path.
        """
        return os.path.join(
            GlobalConfiguration().local_stores_path,
            str(id_),
        )

    @property
    def path(self) -> str:
        """Returns the path to the local artifact store.

        If the user has not defined a path in the config, this will create a
        sub-folder in the global config directory.

        Returns:
            The path to the local artifact store.
        """
        if self._path:
            return self._path

        if self.config.path:
            self._path = self.config.path
        else:
            self._path = self.get_default_local_path(self.id)
        io_utils.create_dir_recursive_if_not_exists(self._path)
        return self._path

    @property
    def local_path(self) -> Optional[str]:
        """Returns the local path of the artifact store.

        Returns:
            The local path of the artifact store.
        """
        return self.path

    @property
    def custom_cache_key(self) -> Optional[bytes]:
        """Custom cache key.

        The client ID is returned here to invalidate caching when using the same
        local artifact store on multiple client machines.

        Returns:
            Custom cache key.
        """
        return GlobalConfiguration().user_id.bytes

custom_cache_key property

Custom cache key.

The client ID is returned here to invalidate caching when using the same local artifact store on multiple client machines.

Returns:

Type Description
Optional[bytes]

Custom cache key.

local_path property

Returns the local path of the artifact store.

Returns:

Type Description
Optional[str]

The local path of the artifact store.

path property

Returns the path to the local artifact store.

If the user has not defined a path in the config, this will create a sub-folder in the global config directory.

Returns:

Type Description
str

The path to the local artifact store.

get_default_local_path(id_) staticmethod

Returns the default local path for a local artifact store.

Parameters:

Name Type Description Default
id_ UUID

The id of the local artifact store.

required

Returns:

Name Type Description
str str

The default local path.

Source code in src/zenml/artifact_stores/local_artifact_store.py
@staticmethod
def get_default_local_path(id_: "UUID") -> str:
    """Returns the default local path for a local artifact store.

    Args:
        id_: The id of the local artifact store.

    Returns:
        str: The default local path.
    """
    return os.path.join(
        GlobalConfiguration().local_stores_path,
        str(id_),
    )

LocalArtifactStoreConfig

Bases: BaseArtifactStoreConfig

Config class for the local artifact store.

Attributes:

Name Type Description
path str

The path to the local artifact store.

Source code in src/zenml/artifact_stores/local_artifact_store.py
class LocalArtifactStoreConfig(BaseArtifactStoreConfig):
    """Config class for the local artifact store.

    Attributes:
        path: The path to the local artifact store.
    """

    SUPPORTED_SCHEMES: ClassVar[Set[str]] = {""}

    path: str = ""

    @field_validator("path")
    @classmethod
    def ensure_path_local(cls, path: str) -> str:
        """Pydantic validator which ensures that the given path is a local path.

        Args:
            path: The path to validate.

        Returns:
            str: The validated (local) path.

        Raises:
            ArtifactStoreInterfaceError: If the given path is not a local path.
        """
        remote_prefixes = ["gs://", "hdfs://", "s3://", "az://", "abfs://"]
        if any(path.startswith(prefix) for prefix in remote_prefixes):
            raise ArtifactStoreInterfaceError(
                f"The path '{path}' you defined for your local artifact store "
                f"starts with a remote prefix."
            )
        return path

    @property
    def is_local(self) -> bool:
        """Checks if this stack component is running locally.

        Returns:
            True if this config is for a local component, False otherwise.
        """
        return True

is_local property

Checks if this stack component is running locally.

Returns:

Type Description
bool

True if this config is for a local component, False otherwise.

ensure_path_local(path) classmethod

Pydantic validator which ensures that the given path is a local path.

Parameters:

Name Type Description Default
path str

The path to validate.

required

Returns:

Name Type Description
str str

The validated (local) path.

Raises:

Type Description
ArtifactStoreInterfaceError

If the given path is not a local path.

Source code in src/zenml/artifact_stores/local_artifact_store.py
@field_validator("path")
@classmethod
def ensure_path_local(cls, path: str) -> str:
    """Pydantic validator which ensures that the given path is a local path.

    Args:
        path: The path to validate.

    Returns:
        str: The validated (local) path.

    Raises:
        ArtifactStoreInterfaceError: If the given path is not a local path.
    """
    remote_prefixes = ["gs://", "hdfs://", "s3://", "az://", "abfs://"]
    if any(path.startswith(prefix) for prefix in remote_prefixes):
        raise ArtifactStoreInterfaceError(
            f"The path '{path}' you defined for your local artifact store "
            f"starts with a remote prefix."
        )
    return path
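
A quick illustration of the validator above, assuming LocalArtifactStoreConfig can be constructed on its own: a plain local path passes through unchanged, while a path with a remote prefix is rejected.

from zenml.artifact_stores.local_artifact_store import LocalArtifactStoreConfig

config = LocalArtifactStoreConfig(path="/tmp/zenml-artifacts")
print(config.path, config.is_local)  # /tmp/zenml-artifacts True

try:
    LocalArtifactStoreConfig(path="s3://my-bucket/artifacts")
except Exception as e:
    # Raised as ArtifactStoreInterfaceError (possibly wrapped by pydantic).
    print(f"rejected: {e}")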

LocalArtifactStoreFlavor

Bases: BaseArtifactStoreFlavor

Class for the LocalArtifactStoreFlavor.

Source code in src/zenml/artifact_stores/local_artifact_store.py
class LocalArtifactStoreFlavor(BaseArtifactStoreFlavor):
    """Class for the `LocalArtifactStoreFlavor`."""

    @property
    def name(self) -> str:
        """Returns the name of the artifact store flavor.

        Returns:
            str: The name of the artifact store flavor.
        """
        return "local"

    @property
    def docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_docs_url()

    @property
    def sdk_docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_sdk_docs_url()

    @property
    def logo_url(self) -> str:
        """A url to represent the flavor in the dashboard.

        Returns:
            The flavor logo.
        """
        return "https://public-flavor-logos.s3.eu-central-1.amazonaws.com/artifact_store/local.svg"

    @property
    def config_class(self) -> Type[LocalArtifactStoreConfig]:
        """Config class for this flavor.

        Returns:
            The config class.
        """
        return LocalArtifactStoreConfig

    @property
    def implementation_class(self) -> Type[LocalArtifactStore]:
        """Implementation class.

        Returns:
            The implementation class.
        """
        return LocalArtifactStore

config_class property

Config class for this flavor.

Returns:

Type Description
Type[LocalArtifactStoreConfig]

The config class.

docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

implementation_class property

Implementation class.

Returns:

Type Description
Type[LocalArtifactStore]

The implementation class.

logo_url property

A url to represent the flavor in the dashboard.

Returns:

Type Description
str

The flavor logo.

name property

Returns the name of the artifact store flavor.

Returns:

Name Type Description
str str

The name of the artifact store flavor.

sdk_docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

Artifacts

Client Lazy Loader

Lazy loading functionality for Client methods.

ClientLazyLoader

Bases: BaseModel

Lazy loader for Client methods.

Source code in src/zenml/client_lazy_loader.py
class ClientLazyLoader(BaseModel):
    """Lazy loader for Client methods."""

    method_name: str
    call_chain: List[_CallStep] = []
    exclude_next_call: bool = False

    def __getattr__(self, name: str) -> "ClientLazyLoader":
        """Get attribute not defined in ClientLazyLoader.

        Args:
            name: Name of the attribute to get.

        Returns:
            self
        """
        self_ = ClientLazyLoader(
            method_name=self.method_name, call_chain=self.call_chain.copy()
        )
        # workaround to protect from infinitely looping over in deepcopy called in invocations
        if name != "__deepcopy__":
            self_.call_chain.append(_CallStep(attribute_name=name))
        else:
            self_.exclude_next_call = True
        return self_

    def __call__(self, *args: Any, **kwargs: Any) -> "ClientLazyLoader":
        """Call mocked attribute.

        Args:
            args: Positional arguments.
            kwargs: Keyword arguments.

        Returns:
            self
        """
        # workaround to protect from infinitely looping over in deepcopy called in invocations
        if not self.exclude_next_call:
            self.call_chain.append(
                _CallStep(is_call=True, call_args=args, call_kwargs=kwargs)
            )
        self.exclude_next_call = False
        return self

    def __getitem__(self, item: Any) -> "ClientLazyLoader":
        """Get item from mocked attribute.

        Args:
            item: Item to get.

        Returns:
            self
        """
        self.call_chain.append(_CallStep(selector=item))
        return self

    def evaluate(self) -> Any:
        """Evaluate lazy loaded Client method.

        Returns:
            Evaluated lazy loader chain of calls.
        """
        from zenml.client import Client

        def _iterate_over_lazy_chain(
            self: "ClientLazyLoader", self_: Any, call_chain_: List[_CallStep]
        ) -> Any:
            next_step = call_chain_.pop(0)
            try:
                if next_step.is_call:
                    self_ = self_(
                        *next_step.call_args, **next_step.call_kwargs
                    )
                elif next_step.selector:
                    self_ = self_[next_step.selector]
                elif next_step.attribute_name:
                    self_ = getattr(self_, next_step.attribute_name)
                else:
                    raise ValueError(
                        "Invalid call chain. Reach out to the ZenML team."
                    )
            except Exception as e:
                logger.debug(
                    f"Failed to evaluate lazy load chain `{self.method_name}` "
                    f"+ `{next_step}` + `{self.call_chain}`."
                )
                msg = f"`{self.method_name}("
                if next_step:
                    for arg in next_step.call_args:
                        msg += f"'{arg}',"
                    for k, v in next_step.call_kwargs.items():
                        msg += f"{k}='{v}',"
                    msg = msg[:-1]
                msg += f")` failed during lazy load with error: {e}"
                logger.error(msg)
                raise RuntimeError(msg)
            return self_

        self_ = getattr(Client(), self.method_name)
        call_chain_ = self.call_chain.copy()
        while call_chain_:
            self_ = _iterate_over_lazy_chain(self, self_, call_chain_)
        return self_

__call__(*args, **kwargs)

Call mocked attribute.

Parameters:

Name Type Description Default
args Any

Positional arguments.

()
kwargs Any

Keyword arguments.

{}

Returns:

Type Description
ClientLazyLoader

self

Source code in src/zenml/client_lazy_loader.py
def __call__(self, *args: Any, **kwargs: Any) -> "ClientLazyLoader":
    """Call mocked attribute.

    Args:
        args: Positional arguments.
        kwargs: Keyword arguments.

    Returns:
        self
    """
    # workaround to protect from infinitely looping over in deepcopy called in invocations
    if not self.exclude_next_call:
        self.call_chain.append(
            _CallStep(is_call=True, call_args=args, call_kwargs=kwargs)
        )
    self.exclude_next_call = False
    return self

__getattr__(name)

Get attribute not defined in ClientLazyLoader.

Parameters:

Name Type Description Default
name str

Name of the attribute to get.

required

Returns:

Type Description
ClientLazyLoader

self

Source code in src/zenml/client_lazy_loader.py
def __getattr__(self, name: str) -> "ClientLazyLoader":
    """Get attribute not defined in ClientLazyLoader.

    Args:
        name: Name of the attribute to get.

    Returns:
        self
    """
    self_ = ClientLazyLoader(
        method_name=self.method_name, call_chain=self.call_chain.copy()
    )
    # workaround to protect against infinite looping when deepcopy is called in invocations
    if name != "__deepcopy__":
        self_.call_chain.append(_CallStep(attribute_name=name))
    else:
        self_.exclude_next_call = True
    return self_

__getitem__(item)

Get item from mocked attribute.

Parameters:

Name Type Description Default
item Any

Item to get.

required

Returns:

Type Description
ClientLazyLoader

self

Source code in src/zenml/client_lazy_loader.py
def __getitem__(self, item: Any) -> "ClientLazyLoader":
    """Get item from mocked attribute.

    Args:
        item: Item to get.

    Returns:
        self
    """
    self.call_chain.append(_CallStep(selector=item))
    return self

evaluate()

Evaluate lazy loaded Client method.

Returns:

Type Description
Any

Evaluated lazy loader chain of calls.

Source code in src/zenml/client_lazy_loader.py
def evaluate(self) -> Any:
    """Evaluate lazy loaded Client method.

    Returns:
        Evaluated lazy loader chain of calls.
    """
    from zenml.client import Client

    def _iterate_over_lazy_chain(
        self: "ClientLazyLoader", self_: Any, call_chain_: List[_CallStep]
    ) -> Any:
        next_step = call_chain_.pop(0)
        try:
            if next_step.is_call:
                self_ = self_(
                    *next_step.call_args, **next_step.call_kwargs
                )
            elif next_step.selector:
                self_ = self_[next_step.selector]
            elif next_step.attribute_name:
                self_ = getattr(self_, next_step.attribute_name)
            else:
                raise ValueError(
                    "Invalid call chain. Reach out to the ZenML team."
                )
        except Exception as e:
            logger.debug(
                f"Failed to evaluate lazy load chain `{self.method_name}` "
                f"+ `{next_step}` + `{self.call_chain}`."
            )
            msg = f"`{self.method_name}("
            if next_step:
                for arg in next_step.call_args:
                    msg += f"'{arg}',"
                for k, v in next_step.call_kwargs.items():
                    msg += f"{k}='{v}',"
                msg = msg[:-1]
            msg += f")` failed during lazy load with error: {e}"
            logger.error(msg)
            raise RuntimeError(msg)
        return self_

    self_ = getattr(Client(), self.method_name)
    call_chain_ = self.call_chain.copy()
    while call_chain_:
        self_ = _iterate_over_lazy_chain(self, self_, call_chain_)
    return self_

client_lazy_loader(method_name, *args, **kwargs)

Lazy loader for Client methods helper.

Usage:

def get_something(self, arg1: Any) -> SomeResponse:
    if cll := client_lazy_loader("get_something", arg1):
        return cll  # type: ignore[return-value]
    return SomeResponse()

Parameters:

Name Type Description Default
method_name str

The name of the method to be called.

required
*args Any

The arguments to be passed to the method.

()
**kwargs Any

The keyword arguments to be passed to the method.

{}

Returns:

Type Description
Optional[ClientLazyLoader]

The result of the method call.

Source code in src/zenml/client_lazy_loader.py
def client_lazy_loader(
    method_name: str, *args: Any, **kwargs: Any
) -> Optional[ClientLazyLoader]:
    """Lazy loader for Client methods helper.

    Usage:
    ```
    def get_something(self, arg1: Any) -> SomeResponse:
        if cll := client_lazy_loader("get_something", arg1):
            return cll  # type: ignore[return-value]
        return SomeResponse()
    ```

    Args:
        method_name: The name of the method to be called.
        *args: The arguments to be passed to the method.
        **kwargs: The keyword arguments to be passed to the method.

    Returns:
        The result of the method call.
    """
    from zenml import get_pipeline_context

    try:
        get_pipeline_context()
        cll = ClientLazyLoader(
            method_name=method_name,
        )
        return cll(*args, **kwargs)
    except RuntimeError:
        return None
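
A small usage note on the guard itself: outside of a pipeline definition `get_pipeline_context()` raises, the `RuntimeError` is swallowed, and the helper returns `None`, so the calling Client method falls through to its eager implementation; inside a pipeline definition it returns a `ClientLazyLoader` with the initial call already recorded. A minimal check of the eager branch (the method name is just an example):

from zenml.client_lazy_loader import client_lazy_loader

# No active pipeline context here, so the helper returns None and the
# calling Client method would execute its normal, eager code path.
assert client_lazy_loader("get_artifact_version", "my_artifact") is None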

evaluate_all_lazy_load_args_in_client_methods(cls)

Class wrapper to evaluate lazy loader arguments of all methods.

Parameters:

Name Type Description Default
cls Type[Client]

The class to wrap.

required

Returns:

Type Description
Type[Client]

Wrapped class.

Source code in src/zenml/client_lazy_loader.py
def evaluate_all_lazy_load_args_in_client_methods(
    cls: Type["Client"],
) -> Type["Client"]:
    """Class wrapper to evaluate lazy loader arguments of all methods.

    Args:
        cls: The class to wrap.

    Returns:
        Wrapped class.
    """
    import inspect

    def _evaluate_args(
        func: Callable[..., Any], is_instance_method: bool
    ) -> Any:
        @functools.wraps(func)
        def _inner(*args: Any, **kwargs: Any) -> Any:
            args_ = list(args)
            if not is_instance_method:
                from zenml.client import Client

                if args and isinstance(args[0], Client):
                    args_ = list(args[1:])

            for i in range(len(args_)):
                if isinstance(args_[i], dict):
                    with contextlib.suppress(ValueError):
                        args_[i] = ClientLazyLoader(**args_[i]).evaluate()
                elif isinstance(args_[i], ClientLazyLoader):
                    args_[i] = args_[i].evaluate()

            for k, v in kwargs.items():
                if isinstance(v, dict):
                    with contextlib.suppress(ValueError):
                        kwargs[k] = ClientLazyLoader(**v).evaluate()

            return func(*args_, **kwargs)

        return _inner

    def _decorate() -> Type["Client"]:
        for name, fn in inspect.getmembers(cls, inspect.isfunction):
            setattr(
                cls,
                name,
                _evaluate_args(fn, "self" in inspect.getfullargspec(fn).args),
            )
        return cls

    return _decorate()
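
The important transformation the wrapper applies is turning serialized lazy loaders back into real values before the wrapped Client method runs. An illustrative sketch, assuming `ClientLazyLoader` is a pydantic model (so `model_dump()` is available), that `Client.get_pipeline` exists, and that "my_pipeline" names an existing pipeline:

from zenml.client_lazy_loader import ClientLazyLoader

lazy = ClientLazyLoader(method_name="get_pipeline")("my_pipeline")

# A lazy loader that survived serialization (e.g. as part of a step config)
# arrives as a plain dict; the wrapper reconstructs and evaluates it.
as_dict = lazy.model_dump()
restored = ClientLazyLoader(**as_dict).evaluate()   # -> the real pipeline response

# A lazy loader passed directly is simply evaluated in place.
direct = lazy.evaluate()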

Client

Client implementation.

Client

ZenML client class.

The ZenML client manages configuration options for ZenML stacks as well as their components.

Source code in src/zenml/client.py
@evaluate_all_lazy_load_args_in_client_methods
class Client(metaclass=ClientMetaClass):
    """ZenML client class.

    The ZenML client manages configuration options for ZenML stacks as well
    as their components.
    """

    _active_user: Optional["UserResponse"] = None
    _active_project: Optional["ProjectResponse"] = None
    _active_stack: Optional["StackResponse"] = None

    def __init__(
        self,
        root: Optional[Path] = None,
    ) -> None:
        """Initializes the global client instance.

        Client is a singleton class: only one instance can exist. Calling
        this constructor multiple times will always yield the same instance (see
        the exception below).

        The `root` argument is only meant for internal use and testing
        purposes. User code must never pass it to the constructor.
        When a custom `root` value is passed, an anonymous Client instance
        is created and returned independently of the Client singleton; it
        has no effect on the rest of the ZenML core code.

        Instead of creating a new Client instance to reflect a different
        repository root, to change the active root in the global Client,
        call `Client().activate_root(<new-root>)`.

        Args:
            root: (internal use) custom root directory for the client. If
                no path is given, the repository root is determined using the
                environment variable `ZENML_REPOSITORY_PATH` (if set) and by
                recursively searching in the parent directories of the
                current working directory. Only used to initialize new
                clients internally.
        """
        self._root: Optional[Path] = None
        self._config: Optional[ClientConfiguration] = None

        self._set_active_root(root)

    @classmethod
    def get_instance(cls) -> Optional["Client"]:
        """Return the Client singleton instance.

        Returns:
            The Client singleton instance or None, if the Client hasn't
            been initialized yet.
        """
        return cls._global_client
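
    # Illustrative usage, not part of the class: because Client is a
    # singleton, repeated construction yields the same object.
    #
    #     client_a = Client()
    #     client_b = Client()
    #     assert client_a is client_b
    #     assert Client.get_instance() is client_a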

    @classmethod
    def _reset_instance(cls, client: Optional["Client"] = None) -> None:
        """Reset the Client singleton instance.

        This method is only meant for internal use and testing purposes.

        Args:
            client: The Client instance to set as the global singleton.
                If None, the global Client singleton is reset to an empty
                value.
        """
        cls._global_client = client

    def _set_active_root(self, root: Optional[Path] = None) -> None:
        """Set the supplied path as the repository root.

        If a client configuration is found at the given path or the
        discovered repository path, it is loaded and used to initialize
        the client.
        If no client configuration is found, the global configuration is
        used instead to manage the active stack, project etc.

        Args:
            root: The path to set as the active repository root. If not set,
                the repository root is determined using the environment
                variable `ZENML_REPOSITORY_PATH` (if set) and by recursively
                searching in the parent directories of the current working
                directory.
        """
        enable_warnings = handle_bool_env_var(
            ENV_ZENML_ENABLE_REPO_INIT_WARNINGS, False
        )
        self._root = self.find_repository(
            root, enable_warnings=enable_warnings
        )

        if not self._root:
            self._config = None
            if enable_warnings:
                logger.info("Running without an active repository root.")
        else:
            logger.debug("Using repository root %s.", self._root)
            self._config = self._load_config()

        # Sanitize the client configuration to reflect the current
        # settings
        self._sanitize_config()

    def _config_path(self) -> Optional[str]:
        """Path to the client configuration file.

        Returns:
            Path to the client configuration file or None if the client
            root has not been initialized yet.
        """
        if not self.config_directory:
            return None
        return str(self.config_directory / "config.yaml")

    def _sanitize_config(self) -> None:
        """Sanitize and save the client configuration.

        This method is called to ensure that the client configuration
        doesn't contain outdated information, such as an active stack or
        project that no longer exists.
        """
        if not self._config:
            return

        active_project, active_stack = self.zen_store.validate_active_config(
            self._config.active_project_id,
            self._config.active_stack_id,
            config_name="repo",
        )
        self._config.set_active_stack(active_stack)
        if active_project:
            self._config.set_active_project(active_project)

    def _load_config(self) -> Optional[ClientConfiguration]:
        """Loads the client configuration from disk.

        This happens if the client has an active root and the configuration
        file exists. If the configuration file doesn't exist, an empty
        configuration is returned.

        Returns:
            Loaded client configuration or None if the client does not
            have an active root.
        """
        config_path = self._config_path()
        if not config_path:
            return None

        # load the client configuration file if it exists, otherwise use
        # an empty configuration as default
        if fileio.exists(config_path):
            logger.debug(f"Loading client configuration from {config_path}.")
        else:
            logger.debug(
                "No client configuration file found, creating default "
                "configuration."
            )

        return ClientConfiguration(config_file=config_path)

    @staticmethod
    def initialize(
        root: Optional[Path] = None,
    ) -> None:
        """Initializes a new ZenML repository at the given path.

        Args:
            root: The root directory where the repository should be created.
                If None, the current working directory is used.

        Raises:
            InitializationException: If the root directory already contains a
                ZenML repository.
        """
        root = root or Path.cwd()
        logger.debug("Initializing new repository at path %s.", root)
        if Client.is_repository_directory(root):
            raise InitializationException(
                f"Found existing ZenML repository at path '{root}'."
            )

        config_directory = str(root / REPOSITORY_DIRECTORY_NAME)
        io_utils.create_dir_recursive_if_not_exists(config_directory)
        # Initialize the repository configuration at the custom path
        Client(root=root)
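
    # Illustrative usage, not part of the class: initializing a repository
    # programmatically, roughly what `zenml init` does on the CLI.
    # The path below is a placeholder.
    #
    #     from pathlib import Path
    #     Client.initialize(Path("/path/to/project"))
    #     assert Client.is_repository_directory(Path("/path/to/project"))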

    @property
    def uses_local_configuration(self) -> bool:
        """Check if the client is using a local configuration.

        Returns:
            True if the client is using a local configuration,
            False otherwise.
        """
        return self._config is not None

    @staticmethod
    def is_repository_directory(path: Path) -> bool:
        """Checks whether a ZenML client exists at the given path.

        Args:
            path: The path to check.

        Returns:
            True if a ZenML client exists at the given path,
            False otherwise.
        """
        config_dir = path / REPOSITORY_DIRECTORY_NAME
        return fileio.isdir(str(config_dir))

    @staticmethod
    def find_repository(
        path: Optional[Path] = None, enable_warnings: bool = False
    ) -> Optional[Path]:
        """Search for a ZenML repository directory.

        Args:
            path: Optional path to look for the repository. If no path is
                given, this function tries to find the repository using the
                environment variable `ZENML_REPOSITORY_PATH` (if set) and
                recursively searching in the parent directories of the current
                working directory.
            enable_warnings: If `True`, warnings are printed if the repository
                root cannot be found.

        Returns:
            Absolute path to a ZenML repository directory or None if no
            repository directory was found.
        """
        if not path:
            # try to get path from the environment variable
            env_var_path = os.getenv(ENV_ZENML_REPOSITORY_PATH)
            if env_var_path:
                path = Path(env_var_path)

        if path:
            # explicit path via parameter or environment variable, don't search
            # parent directories
            search_parent_directories = False
            warning_message = (
                f"Unable to find ZenML repository at path '{path}'. Make sure "
                f"to create a ZenML repository by calling `zenml init` when "
                f"specifying an explicit repository path in code or via the "
                f"environment variable '{ENV_ZENML_REPOSITORY_PATH}'."
            )
        else:
            # try to find the repository in the parent directories of the
            # current working directory
            path = Path.cwd()
            search_parent_directories = True
            warning_message = (
                f"Unable to find ZenML repository in your current working "
                f"directory ({path}) or any parent directories. If you "
                f"want to use an existing repository which is in a different "
                f"location, set the environment variable "
                f"'{ENV_ZENML_REPOSITORY_PATH}'. If you want to create a new "
                f"repository, run `zenml init`."
            )

        def _find_repository_helper(path_: Path) -> Optional[Path]:
            """Recursively search parent directories for a ZenML repository.

            Args:
                path_: The path to search.

            Returns:
                Absolute path to a ZenML repository directory or None if no
                repository directory was found.
            """
            if Client.is_repository_directory(path_):
                return path_

            if not search_parent_directories or io_utils.is_root(str(path_)):
                return None

            return _find_repository_helper(path_.parent)

        repository_path = _find_repository_helper(path)

        if repository_path:
            return repository_path.resolve()
        if enable_warnings:
            logger.warning(warning_message)
        return None
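
    # Illustrative usage, not part of the class: the repository is looked up
    # from an explicit path first, then the ZENML_REPOSITORY_PATH environment
    # variable, then by walking up from the current working directory.
    # The path below is a placeholder.
    #
    #     import os
    #     os.environ["ZENML_REPOSITORY_PATH"] = "/path/to/repo"
    #     repo_root = Client.find_repository()
    #     # -> Path("/path/to/repo") if a repository directory exists there,
    #     #    otherwise None.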

    @staticmethod
    def is_inside_repository(file_path: str) -> bool:
        """Returns whether a file is inside the active ZenML repository.

        Args:
            file_path: A file path.

        Returns:
            True if the file is inside the active ZenML repository, False
            otherwise.
        """
        if repo_path := Client.find_repository():
            return repo_path in Path(file_path).resolve().parents
        return False

    @property
    def zen_store(self) -> "BaseZenStore":
        """Shortcut to return the global zen store.

        Returns:
            The global zen store.
        """
        return GlobalConfiguration().zen_store

    @property
    def root(self) -> Optional[Path]:
        """The root directory of this client.

        Returns:
            The root directory of this client, or None, if the client
            has not been initialized.
        """
        return self._root

    @property
    def config_directory(self) -> Optional[Path]:
        """The configuration directory of this client.

        Returns:
            The configuration directory of this client, or None, if the
            client doesn't have an active root.
        """
        return self.root / REPOSITORY_DIRECTORY_NAME if self.root else None

    def activate_root(self, root: Optional[Path] = None) -> None:
        """Set the active repository root directory.

        Args:
            root: The path to set as the active repository root. If not set,
                the repository root is determined using the environment
                variable `ZENML_REPOSITORY_PATH` (if set) and by recursively
                searching in the parent directories of the current working
                directory.
        """
        self._set_active_root(root)

    def set_active_project(
        self, project_name_or_id: Union[str, UUID]
    ) -> "ProjectResponse":
        """Set the project for the local client.

        Args:
            project_name_or_id: The name or ID of the project to set active.

        Returns:
            The model of the active project.
        """
        project = self.zen_store.get_project(
            project_name_or_id=project_name_or_id
        )  # raises KeyError
        if self._config:
            self._config.set_active_project(project)
            # Sanitize the client configuration to reflect the current
            # settings
            self._sanitize_config()
        else:
            # set the active project globally only if the client doesn't use
            # a local configuration
            GlobalConfiguration().set_active_project(project)
        return project
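
    # Illustrative usage sketch (not part of the Client source), assuming a
    # project named "my_project" has already been registered:
    #
    #   from zenml.client import Client
    #
    #   client = Client()
    #   project = client.set_active_project("my_project")
    #   print(project.id)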

    # ----------------------------- Server Settings ----------------------------

    def get_settings(self, hydrate: bool = True) -> ServerSettingsResponse:
        """Get the server settings.

        Args:
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The server settings.
        """
        return self.zen_store.get_server_settings(hydrate=hydrate)

    def update_server_settings(
        self,
        updated_name: Optional[str] = None,
        updated_logo_url: Optional[str] = None,
        updated_enable_analytics: Optional[bool] = None,
        updated_enable_announcements: Optional[bool] = None,
        updated_enable_updates: Optional[bool] = None,
        updated_onboarding_state: Optional[Dict[str, Any]] = None,
    ) -> ServerSettingsResponse:
        """Update the server settings.

        Args:
            updated_name: Updated name for the server.
            updated_logo_url: Updated logo URL for the server.
            updated_enable_analytics: Updated value whether to enable
                analytics for the server.
            updated_enable_announcements: Updated value whether to display
                announcements about ZenML.
            updated_enable_updates: Updated value whether to display updates
                about ZenML.
            updated_onboarding_state: Updated onboarding state for the server.

        Returns:
            The updated server settings.
        """
        update_model = ServerSettingsUpdate(
            server_name=updated_name,
            logo_url=updated_logo_url,
            enable_analytics=updated_enable_analytics,
            display_announcements=updated_enable_announcements,
            display_updates=updated_enable_updates,
            onboarding_state=updated_onboarding_state,
        )
        return self.zen_store.update_server_settings(update_model)
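
    # Illustrative usage sketch (not part of the Client source): renaming the
    # server and disabling analytics via the update model built above. The
    # server name is a placeholder.
    #
    #   from zenml.client import Client
    #
    #   Client().update_server_settings(
    #       updated_name="staging-server",
    #       updated_enable_analytics=False,
    #   )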

    # ---------------------------------- Users ---------------------------------

    def create_user(
        self,
        name: str,
        password: Optional[str] = None,
        is_admin: bool = False,
    ) -> UserResponse:
        """Create a new user.

        Args:
            name: The name of the user.
            password: The password of the user. If not provided, the user will
                be created with an empty password.
            is_admin: Whether the user should be an admin.

        Returns:
            The model of the created user.
        """
        user = UserRequest(
            name=name, password=password or None, is_admin=is_admin
        )
        user.active = (
            password != "" if self.zen_store.type != StoreType.REST else True
        )
        created_user = self.zen_store.create_user(user=user)

        return created_user
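
    # Illustrative usage sketch (not part of the Client source): creating a
    # non-admin user with a password. The name and password are placeholders.
    #
    #   from zenml.client import Client
    #
    #   user = Client().create_user(
    #       name="alice",
    #       password="change-me",
    #       is_admin=False,
    #   )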

    def get_user(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> UserResponse:
        """Gets a user.

        Args:
            name_id_or_prefix: The name or ID of the user.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The user.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_user,
            list_method=self.list_users,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def list_users(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        external_user_id: Optional[str] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        full_name: Optional[str] = None,
        email: Optional[str] = None,
        active: Optional[bool] = None,
        email_opted_in: Optional[bool] = None,
        hydrate: bool = False,
    ) -> Page[UserResponse]:
        """List all users.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of users to filter by.
            external_user_id: Use the external user id for filtering.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: Use the username for filtering
            full_name: Use the user full name for filtering
            email: Use the user email for filtering
            active: Use the user active status for filtering
            email_opted_in: Use the user opt in status for filtering
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of users.
        """
        return self.zen_store.list_users(
            UserFilter(
                sort_by=sort_by,
                page=page,
                size=size,
                logical_operator=logical_operator,
                id=id,
                external_user_id=external_user_id,
                created=created,
                updated=updated,
                name=name,
                full_name=full_name,
                email=email,
                active=active,
                email_opted_in=email_opted_in,
            ),
            hydrate=hydrate,
        )

    def update_user(
        self,
        name_id_or_prefix: Union[str, UUID],
        updated_name: Optional[str] = None,
        updated_full_name: Optional[str] = None,
        updated_email: Optional[str] = None,
        updated_email_opt_in: Optional[bool] = None,
        updated_password: Optional[str] = None,
        old_password: Optional[str] = None,
        updated_is_admin: Optional[bool] = None,
        updated_metadata: Optional[Dict[str, Any]] = None,
        updated_default_project_id: Optional[UUID] = None,
        active: Optional[bool] = None,
    ) -> UserResponse:
        """Update a user.

        Args:
            name_id_or_prefix: The name or ID of the user to update.
            updated_name: The new name of the user.
            updated_full_name: The new full name of the user.
            updated_email: The new email of the user.
            updated_email_opt_in: The new email opt-in status of the user.
            updated_password: The new password of the user.
            old_password: The old password of the user. Required for password
                update.
            updated_is_admin: Whether the user should be an admin.
            updated_metadata: The new metadata for the user.
            updated_default_project_id: The new default project ID for the user.
            active: Use to activate or deactivate the user.

        Returns:
            The updated user.

        Raises:
            ValidationError: If the old password is not provided when updating
                the password.
        """
        user = self.get_user(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )
        user_update = UserUpdate(name=updated_name or user.name)
        if updated_full_name:
            user_update.full_name = updated_full_name
        if updated_email is not None:
            user_update.email = updated_email
            user_update.email_opted_in = (
                updated_email_opt_in or user.email_opted_in
            )
        if updated_email_opt_in is not None:
            user_update.email_opted_in = updated_email_opt_in
        if updated_password is not None:
            user_update.password = updated_password
            if old_password is None:
                raise ValidationError(
                    "Old password is required to update the password."
                )
            user_update.old_password = old_password
        if updated_is_admin is not None:
            user_update.is_admin = updated_is_admin
        if active is not None:
            user_update.active = active

        if updated_metadata is not None:
            user_update.user_metadata = updated_metadata

        if updated_default_project_id is not None:
            user_update.default_project_id = updated_default_project_id

        return self.zen_store.update_user(
            user_id=user.id, user_update=user_update
        )
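
    # Illustrative usage sketch (not part of the Client source): changing a
    # user's password. As documented above, `old_password` must be provided
    # together with `updated_password`, otherwise a ValidationError is raised.
    # The values are placeholders.
    #
    #   from zenml.client import Client
    #
    #   Client().update_user(
    #       name_id_or_prefix="alice",
    #       updated_password="new-secret",
    #       old_password="change-me",
    #   )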

    @_fail_for_sql_zen_store
    def deactivate_user(self, name_id_or_prefix: str) -> "UserResponse":
        """Deactivate a user and generate an activation token.

        Args:
            name_id_or_prefix: The name or ID of the user to deactivate.

        Returns:
            The deactivated user.
        """
        from zenml.zen_stores.rest_zen_store import RestZenStore

        user = self.get_user(name_id_or_prefix, allow_name_prefix_match=False)
        assert isinstance(self.zen_store, RestZenStore)
        return self.zen_store.deactivate_user(user_name_or_id=user.name)

    def delete_user(self, name_id_or_prefix: str) -> None:
        """Delete a user.

        Args:
            name_id_or_prefix: The name or ID of the user to delete.
        """
        user = self.get_user(name_id_or_prefix, allow_name_prefix_match=False)
        self.zen_store.delete_user(user_name_or_id=user.name)

    @property
    def active_user(self) -> "UserResponse":
        """Get the user that is currently in use.

        Returns:
            The active user.
        """
        if self._active_user is None:
            self._active_user = self.zen_store.get_user(include_private=True)
        return self._active_user

    # -------------------------------- Projects ------------------------------

    def create_project(
        self,
        name: str,
        description: str,
        display_name: Optional[str] = None,
    ) -> ProjectResponse:
        """Create a new project.

        Args:
            name: Name of the project.
            description: Description of the project.
            display_name: Display name of the project.

        Returns:
            The created project.
        """
        return self.zen_store.create_project(
            ProjectRequest(
                name=name,
                description=description,
                display_name=display_name or "",
            )
        )
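
    # Illustrative usage sketch (not part of the Client source): registering a
    # project and then activating it. The project name is a placeholder.
    #
    #   from zenml.client import Client
    #
    #   client = Client()
    #   client.create_project(
    #       name="fraud-detection",
    #       description="Fraud detection pipelines",
    #   )
    #   client.set_active_project("fraud-detection")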

    def get_project(
        self,
        name_id_or_prefix: Optional[Union[UUID, str]],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> ProjectResponse:
        """Gets a project.

        Args:
            name_id_or_prefix: The name or ID of the project.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The project
        """
        if not name_id_or_prefix:
            return self.active_project
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_project,
            list_method=self.list_projects,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def list_projects(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        display_name: Optional[str] = None,
        hydrate: bool = False,
    ) -> Page[ProjectResponse]:
        """List all projects.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the project ID to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: Use the project name for filtering
            display_name: Use the project display name for filtering
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            Page of projects
        """
        return self.zen_store.list_projects(
            ProjectFilter(
                sort_by=sort_by,
                page=page,
                size=size,
                logical_operator=logical_operator,
                id=id,
                created=created,
                updated=updated,
                name=name,
                display_name=display_name,
            ),
            hydrate=hydrate,
        )

    def update_project(
        self,
        name_id_or_prefix: Optional[Union[UUID, str]],
        new_name: Optional[str] = None,
        new_display_name: Optional[str] = None,
        new_description: Optional[str] = None,
    ) -> ProjectResponse:
        """Update a project.

        Args:
            name_id_or_prefix: Name, ID or prefix of the project to update.
            new_name: New name of the project.
            new_display_name: New display name of the project.
            new_description: New description of the project.

        Returns:
            The updated project.
        """
        project = self.get_project(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )
        project_update = ProjectUpdate(
            name=new_name or project.name,
            display_name=new_display_name or project.display_name,
        )
        if new_description:
            project_update.description = new_description
        return self.zen_store.update_project(
            project_id=project.id,
            project_update=project_update,
        )

    def delete_project(self, name_id_or_prefix: str) -> None:
        """Delete a project.

        Args:
            name_id_or_prefix: The name or ID of the project to delete.

        Raises:
            IllegalOperationError: If the project to delete is the active
                project.
        """
        project = self.get_project(
            name_id_or_prefix, allow_name_prefix_match=False
        )
        if self.active_project.id == project.id:
            raise IllegalOperationError(
                f"Project '{name_id_or_prefix}' cannot be deleted since "
                "it is currently active. Please set another project as "
                "active first."
            )
        self.zen_store.delete_project(project_name_or_id=project.id)

    @property
    def active_project(self) -> ProjectResponse:
        """Get the currently active project of the local client.

        If no active project is configured locally for the client, the
        active project in the global configuration is used instead.

        Returns:
            The active project.

        Raises:
            RuntimeError: If the active project is not set.
        """
        if project_id := os.environ.get(ENV_ZENML_ACTIVE_PROJECT_ID):
            if not self._active_project or self._active_project.id != UUID(
                project_id
            ):
                self._active_project = self.get_project(project_id)

            return self._active_project

        from zenml.constants import DEFAULT_PROJECT_NAME

        # If running in a ZenML server environment, the active project is
        # not relevant
        if ENV_ZENML_SERVER in os.environ:
            return self.get_project(DEFAULT_PROJECT_NAME)

        project = (
            self._config.active_project if self._config else None
        ) or GlobalConfiguration().get_active_project()
        if not project:
            raise RuntimeError(
                "No active project is configured. Run "
                "`zenml project set <NAME>` to set the active "
                "project."
            )

        if project.name != DEFAULT_PROJECT_NAME:
            if not self.zen_store.get_store_info().is_pro_server():
                logger.warning(
                    f"You are running with a non-default project "
                    f"'{project.name}'. The ZenML project feature is "
                    "available only in ZenML Pro. Pipelines, pipeline runs and "
                    "artifacts produced in this project will not be "
                    "accessible through the dashboard. Please visit "
                    "https://zenml.io/pro to learn more."
                )
        return project

    # --------------------------------- Stacks ---------------------------------

    def create_stack(
        self,
        name: str,
        components: Mapping[StackComponentType, Union[str, UUID]],
        stack_spec_file: Optional[str] = None,
        labels: Optional[Dict[str, Any]] = None,
    ) -> StackResponse:
        """Registers a stack and its components.

        Args:
            name: The name of the stack to register.
            components: dictionary which maps component types to component
                names or IDs
            stack_spec_file: path to the stack spec file
            labels: The labels of the stack.

        Returns:
            The model of the registered stack.
        """
        stack_components = {}

        for c_type, c_identifier in components.items():
            # Skip non-existent components.
            if not c_identifier:
                continue

            # Get the component.
            component = self.get_stack_component(
                name_id_or_prefix=c_identifier,
                component_type=c_type,
            )
            stack_components[c_type] = [component.id]

        stack = StackRequest(
            name=name,
            components=stack_components,
            stack_spec_path=stack_spec_file,
            labels=labels,
        )

        self._validate_stack_configuration(stack=stack)

        return self.zen_store.create_stack(stack=stack)
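
    # Illustrative usage sketch (not part of the Client source): registering a
    # stack from two already registered components, referenced by name. The
    # component names are placeholders.
    #
    #   from zenml.client import Client
    #   from zenml.enums import StackComponentType
    #
    #   stack = Client().create_stack(
    #       name="local-stack",
    #       components={
    #           StackComponentType.ORCHESTRATOR: "default",
    #           StackComponentType.ARTIFACT_STORE: "default",
    #       },
    #   )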

    def get_stack(
        self,
        name_id_or_prefix: Optional[Union[UUID, str]] = None,
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> StackResponse:
        """Get a stack by name, ID or prefix.

        If no name, ID or prefix is provided, the active stack is returned.

        Args:
            name_id_or_prefix: The name, ID or prefix of the stack.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The stack.
        """
        if name_id_or_prefix is not None:
            return self._get_entity_by_id_or_name_or_prefix(
                get_method=self.zen_store.get_stack,
                list_method=self.list_stacks,
                name_id_or_prefix=name_id_or_prefix,
                allow_name_prefix_match=allow_name_prefix_match,
                hydrate=hydrate,
            )
        else:
            return self.active_stack_model

    def list_stacks(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        description: Optional[str] = None,
        component_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        component: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[StackResponse]:
        """Lists all stacks.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of stacks to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            description: Use the stack description for filtering
            component_id: The id of the component to filter by.
            user: The name/ID of the user to filter by.
            component: The name/ID of the component to filter by.
            name: The name of the stack to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of stacks.
        """
        stack_filter_model = StackFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            component_id=component_id,
            user=user,
            component=component,
            name=name,
            description=description,
            id=id,
            created=created,
            updated=updated,
        )
        return self.zen_store.list_stacks(stack_filter_model, hydrate=hydrate)

    def update_stack(
        self,
        name_id_or_prefix: Optional[Union[UUID, str]] = None,
        name: Optional[str] = None,
        stack_spec_file: Optional[str] = None,
        labels: Optional[Dict[str, Any]] = None,
        description: Optional[str] = None,
        component_updates: Optional[
            Dict[StackComponentType, List[Union[UUID, str]]]
        ] = None,
    ) -> StackResponse:
        """Updates a stack and its components.

        Args:
            name_id_or_prefix: The name, id or prefix of the stack to update.
            name: the new name of the stack.
            stack_spec_file: path to the stack spec file.
            labels: The new labels of the stack component.
            description: the new description of the stack.
            component_updates: dictionary which maps stack component types to
                lists of new stack component names or ids.

        Returns:
            The model of the updated stack.

        Raises:
            EntityExistsError: If the stack name is already taken.
        """
        # First, get the stack
        stack = self.get_stack(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )

        # Create the update model
        update_model = StackUpdate(
            stack_spec_path=stack_spec_file,
        )

        if name:
            if self.list_stacks(name=name):
                raise EntityExistsError(
                    "There are already existing stacks with the name "
                    f"'{name}'."
                )

            update_model.name = name

        if description:
            update_model.description = description

        # Get the current components
        if component_updates:
            components_dict = stack.components.copy()

            for component_type, component_id_list in component_updates.items():
                if component_id_list is not None:
                    components_dict[component_type] = [
                        self.get_stack_component(
                            name_id_or_prefix=component_id,
                            component_type=component_type,
                        )
                        for component_id in component_id_list
                    ]

            update_model.components = {
                c_type: [c.id for c in c_list]
                for c_type, c_list in components_dict.items()
            }

        if labels is not None:
            existing_labels = stack.labels or {}
            existing_labels.update(labels)

            existing_labels = {
                k: v for k, v in existing_labels.items() if v is not None
            }
            update_model.labels = existing_labels

        updated_stack = self.zen_store.update_stack(
            stack_id=stack.id,
            stack_update=update_model,
        )
        if updated_stack.id == self.active_stack_model.id:
            if self._config:
                self._config.set_active_stack(updated_stack)
            else:
                GlobalConfiguration().set_active_stack(updated_stack)
        return updated_stack

    def delete_stack(
        self, name_id_or_prefix: Union[str, UUID], recursive: bool = False
    ) -> None:
        """Deregisters a stack.

        Args:
            name_id_or_prefix: The name, ID or prefix of the stack
                to deregister.
            recursive: If `True`, all components of the stack which are not
                associated with any other stack will also be deleted.

        Raises:
            ValueError: If the stack is the currently active stack for this
                client.
        """
        stack = self.get_stack(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )

        if stack.id == self.active_stack_model.id:
            raise ValueError(
                f"Unable to deregister active stack '{stack.name}'. Make "
                f"sure to designate a new active stack before deleting this "
                f"one."
            )

        cfg = GlobalConfiguration()
        if stack.id == cfg.active_stack_id:
            raise ValueError(
                f"Unable to deregister '{stack.name}' as it is the active "
                f"stack within your global configuration. Make "
                f"sure to designate a new active stack before deleting this "
                f"one."
            )

        if recursive:
            stack_components_free_for_deletion = []

            # Get all stack components associated with this stack
            for component_type, component_model in stack.components.items():
                # Get stack associated with the stack component

                stacks = self.list_stacks(
                    component_id=component_model[0].id, size=2, page=1
                )

                # Check if the stack component is part of another stack
                if len(stacks) == 1 and stack.id == stacks[0].id:
                    stack_components_free_for_deletion.append(
                        (component_type, component_model)
                    )

            self.delete_stack(stack.id)

            for (
                stack_component_type,
                stack_component_model,
            ) in stack_components_free_for_deletion:
                self.delete_stack_component(
                    stack_component_model[0].name, stack_component_type
                )

            logger.info("Deregistered stack with name '%s'.", stack.name)
            return

        self.zen_store.delete_stack(stack_id=stack.id)
        logger.info("Deregistered stack with name '%s'.", stack.name)
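
    # Illustrative usage sketch (not part of the Client source): deregistering
    # a stack together with any of its components that are not used by another
    # stack, via the `recursive` flag handled above. The stack name is a
    # placeholder and must not be the active stack.
    #
    #   from zenml.client import Client
    #
    #   Client().delete_stack("old-stack", recursive=True)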

    @property
    def active_stack(self) -> "Stack":
        """The active stack for this client.

        Returns:
            The active stack for this client.
        """
        from zenml.stack.stack import Stack

        return Stack.from_model(self.active_stack_model)

    @property
    def active_stack_model(self) -> StackResponse:
        """The model of the active stack for this client.

        If no active stack is configured locally for the client, the active
        stack in the global configuration is used instead.

        Returns:
            The model of the active stack for this client.
        """
        if env_stack_id := os.environ.get(ENV_ZENML_ACTIVE_STACK_ID):
            if not self._active_stack or self._active_stack.id != UUID(
                env_stack_id
            ):
                self._active_stack = self.get_stack(env_stack_id)

            return self._active_stack

        stack_id: Optional[UUID] = None

        if self._config:
            if self._config._active_stack:
                return self._config._active_stack

            stack_id = self._config.active_stack_id

        if not stack_id:
            # Initialize the zen store so the global config loads the active
            # stack
            _ = GlobalConfiguration().zen_store
            if active_stack := GlobalConfiguration()._active_stack:
                return active_stack

            stack_id = GlobalConfiguration().get_active_stack_id()

        return self.get_stack(stack_id)

    def activate_stack(
        self, stack_name_id_or_prefix: Union[str, UUID]
    ) -> None:
        """Sets the stack as active.

        Args:
            stack_name_id_or_prefix: The name, ID or prefix of the stack
                to activate.

        Raises:
            KeyError: If the stack is not registered.
        """
        # Make sure the stack is registered
        try:
            stack = self.get_stack(name_id_or_prefix=stack_name_id_or_prefix)
        except KeyError as e:
            raise KeyError(
                f"Stack '{stack_name_id_or_prefix}' cannot be activated since "
                f"it is not registered yet. Please register it first."
            ) from e

        if self._config:
            self._config.set_active_stack(stack=stack)

        else:
            # set the active stack globally only if the client doesn't use
            # a local configuration
            GlobalConfiguration().set_active_stack(stack=stack)
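
    # Illustrative usage sketch (not part of the Client source): switching the
    # active stack and inspecting it. The stack name is a placeholder.
    #
    #   from zenml.client import Client
    #
    #   client = Client()
    #   client.activate_stack("local-stack")
    #   print(client.active_stack_model.name)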

    def _validate_stack_configuration(self, stack: StackRequest) -> None:
        """Validates the configuration of a stack.

        Args:
            stack: The stack to validate.
        """
        local_components: List[str] = []
        remote_components: List[str] = []
        assert stack.components is not None
        for component_type, components in stack.components.items():
            component_flavor: Union[FlavorResponse, str]

            for component in components:
                if isinstance(component, UUID):
                    component_response = self.get_stack_component(
                        name_id_or_prefix=component,
                        component_type=component_type,
                    )
                    component_config = component_response.configuration
                    component_flavor = component_response.flavor
                else:
                    component_config = component.configuration
                    component_flavor = component.flavor

                # Create and validate the configuration
                from zenml.stack.utils import (
                    validate_stack_component_config,
                    warn_if_config_server_mismatch,
                )

                configuration = validate_stack_component_config(
                    configuration_dict=component_config,
                    flavor=component_flavor,
                    component_type=component_type,
                    # Always enforce validation of custom flavors
                    validate_custom_flavors=True,
                )
                # Guaranteed to not be None by setting
                # `validate_custom_flavors=True` above
                assert configuration is not None
                warn_if_config_server_mismatch(configuration)
                flavor_name = (
                    component_flavor.name
                    if isinstance(component_flavor, FlavorResponse)
                    else component_flavor
                )
                if configuration.is_local:
                    local_components.append(
                        f"{component_type.value}: {flavor_name}"
                    )
                elif configuration.is_remote:
                    remote_components.append(
                        f"{component_type.value}: {flavor_name}"
                    )

        if local_components and remote_components:
            logger.warning(
                f"You are configuring a stack that is composed of components "
                f"that are relying on local resources "
                f"({', '.join(local_components)}) as well as "
                f"components that are running remotely "
                f"({', '.join(remote_components)}). This is not recommended as "
                f"it can lead to unexpected behavior, especially if the remote "
                f"components need to access the local resources. Please make "
                f"sure that your stack is configured correctly, or try to use "
                f"component flavors or configurations that do not require "
                f"local resources."
            )

    # ----------------------------- Services -----------------------------------

    def create_service(
        self,
        config: "ServiceConfig",
        service_type: ServiceType,
        model_version_id: Optional[UUID] = None,
    ) -> ServiceResponse:
        """Registers a service.

        Args:
            config: The configuration of the service.
            service_type: The type of the service.
            model_version_id: The ID of the model version to associate with the
                service.

        Returns:
            The registered service.
        """
        service_request = ServiceRequest(
            name=config.service_name,
            service_type=service_type,
            config=config.model_dump(),
            project=self.active_project.id,
            model_version_id=model_version_id,
        )
        # Register the service
        return self.zen_store.create_service(service_request)

    def get_service(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
        type: Optional[str] = None,
        project: Optional[Union[str, UUID]] = None,
    ) -> ServiceResponse:
        """Gets a service.

        Args:
            name_id_or_prefix: The name or ID of the service.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            type: The type of the service.
            project: The project name/ID to filter by.

        Returns:
            The service.
        """

        def type_scoped_list_method(
            hydrate: bool = True,
            **kwargs: Any,
        ) -> Page[ServiceResponse]:
            """Call `zen_store.list_services` with type scoping.

            Args:
                hydrate: Flag deciding whether to hydrate the output model(s)
                    by including metadata fields in the response.
                **kwargs: Keyword arguments to pass to `ServiceFilterModel`.

            Returns:
                The type-scoped list of services.
            """
            service_filter_model = ServiceFilter(**kwargs)
            if type:
                service_filter_model.set_type(type=type)
            return self.zen_store.list_services(
                filter_model=service_filter_model,
                hydrate=hydrate,
            )

        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_service,
            list_method=type_scoped_list_method,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            project=project,
            hydrate=hydrate,
        )

    def list_services(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        type: Optional[str] = None,
        flavor: Optional[str] = None,
        user: Optional[Union[UUID, str]] = None,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = False,
        running: Optional[bool] = None,
        service_name: Optional[str] = None,
        pipeline_name: Optional[str] = None,
        pipeline_run_id: Optional[str] = None,
        pipeline_step_name: Optional[str] = None,
        model_version_id: Optional[Union[str, UUID]] = None,
        config: Optional[Dict[str, Any]] = None,
    ) -> Page[ServiceResponse]:
        """List all services.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of services to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            type: Use the service type for filtering
            flavor: Use the service flavor for filtering
            project: The project name/ID to filter by.
            user: Filter by user name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            running: Use the running status for filtering
            pipeline_name: Use the pipeline name for filtering
            service_name: Use the service name or model name
                for filtering
            pipeline_step_name: Use the pipeline step name for filtering
            model_version_id: Use the model version id for filtering
            config: Use the config for filtering
            pipeline_run_id: Use the pipeline run id for filtering

        Returns:
            A page of services.
        """
        service_filter_model = ServiceFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            type=type,
            flavor=flavor,
            project=project or self.active_project.id,
            user=user,
            running=running,
            name=service_name,
            pipeline_name=pipeline_name,
            pipeline_step_name=pipeline_step_name,
            model_version_id=model_version_id,
            pipeline_run_id=pipeline_run_id,
            config=dict_to_bytes(config) if config else None,
        )
        return self.zen_store.list_services(
            filter_model=service_filter_model, hydrate=hydrate
        )
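
    # Illustrative usage sketch (not part of the Client source): listing the
    # currently running services in the active project.
    #
    #   from zenml.client import Client
    #
    #   page = Client().list_services(running=True, size=20)
    #   for service in page.items:
    #       print(service.name)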

    def update_service(
        self,
        id: UUID,
        name: Optional[str] = None,
        service_source: Optional[str] = None,
        admin_state: Optional[ServiceState] = None,
        status: Optional[Dict[str, Any]] = None,
        endpoint: Optional[Dict[str, Any]] = None,
        labels: Optional[Dict[str, str]] = None,
        prediction_url: Optional[str] = None,
        health_check_url: Optional[str] = None,
        model_version_id: Optional[UUID] = None,
    ) -> ServiceResponse:
        """Update a service.

        Args:
            id: The ID of the service to update.
            name: The new name of the service.
            admin_state: The new admin state of the service.
            status: The new status of the service.
            endpoint: The new endpoint of the service.
            service_source: The new service source of the service.
            labels: The new labels of the service.
            prediction_url: The new prediction url of the service.
            health_check_url: The new health check url of the service.
            model_version_id: The new model version id of the service.

        Returns:
            The updated service.
        """
        service_update = ServiceUpdate()
        if name:
            service_update.name = name
        if service_source:
            service_update.service_source = service_source
        if admin_state:
            service_update.admin_state = admin_state
        if status:
            service_update.status = status
        if endpoint:
            service_update.endpoint = endpoint
        if labels:
            service_update.labels = labels
        if prediction_url:
            service_update.prediction_url = prediction_url
        if health_check_url:
            service_update.health_check_url = health_check_url
        if model_version_id:
            service_update.model_version_id = model_version_id
        return self.zen_store.update_service(
            service_id=id, update=service_update
        )

    def delete_service(
        self,
        name_id_or_prefix: UUID,
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Delete a service.

        Args:
            name_id_or_prefix: The name or ID of the service to delete.
            project: The project name/ID to filter by.
        """
        service = self.get_service(
            name_id_or_prefix,
            allow_name_prefix_match=False,
            project=project,
        )
        self.zen_store.delete_service(service_id=service.id)

    # -------------------------------- Components ------------------------------

    def get_stack_component(
        self,
        component_type: StackComponentType,
        name_id_or_prefix: Optional[Union[str, UUID]] = None,
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> ComponentResponse:
        """Fetches a registered stack component.

        If the name_id_or_prefix is provided, it will try to fetch the component
        with the corresponding identifier. If not, it will try to fetch the
        active component of the given type.

        Args:
            component_type: The type of the component to fetch
            name_id_or_prefix: The name, ID or prefix of the component
                to fetch.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The registered stack component.

        Raises:
            KeyError: If no name_id_or_prefix is provided and no such component
                is part of the active stack.
        """
        # If no `name_id_or_prefix` provided, try to get the active component.
        if not name_id_or_prefix:
            components = self.active_stack_model.components.get(
                component_type, None
            )
            if components:
                return components[0]
            raise KeyError(
                "No name_id_or_prefix provided and there is no active "
                f"{component_type} in the current active stack."
            )

        # Else, try to fetch the component with an explicit type filter
        def type_scoped_list_method(
            hydrate: bool = False,
            **kwargs: Any,
        ) -> Page[ComponentResponse]:
            """Call `zen_store.list_stack_components` with type scoping.

            Args:
                hydrate: Flag deciding whether to hydrate the output model(s)
                    by including metadata fields in the response.
                **kwargs: Keyword arguments to pass to `ComponentFilterModel`.

            Returns:
                The type-scoped list of components.
            """
            component_filter_model = ComponentFilter(**kwargs)
            component_filter_model.set_scope_type(
                component_type=component_type
            )
            return self.zen_store.list_stack_components(
                component_filter_model=component_filter_model,
                hydrate=hydrate,
            )

        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_stack_component,
            list_method=type_scoped_list_method,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )
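
    # Illustrative usage sketch (not part of the Client source): fetching the
    # orchestrator of the active stack by omitting `name_id_or_prefix`, which
    # triggers the fallback branch above.
    #
    #   from zenml.client import Client
    #   from zenml.enums import StackComponentType
    #
    #   orchestrator = Client().get_stack_component(
    #       component_type=StackComponentType.ORCHESTRATOR,
    #   )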

    def list_stack_components(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        flavor: Optional[str] = None,
        type: Optional[str] = None,
        connector_id: Optional[Union[str, UUID]] = None,
        stack_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[ComponentResponse]:
        """Lists all registered stack components.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of components to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            flavor: Use the component flavor for filtering
            type: Use the component type for filtering
            connector_id: The id of the connector to filter by.
            stack_id: The id of the stack to filter by.
            name: The name of the component to filter by.
            user: The ID or name of the user to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of stack components.
        """
        component_filter_model = ComponentFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            connector_id=connector_id,
            stack_id=stack_id,
            name=name,
            flavor=flavor,
            type=type,
            id=id,
            created=created,
            updated=updated,
            user=user,
        )

        return self.zen_store.list_stack_components(
            component_filter_model=component_filter_model, hydrate=hydrate
        )

    def create_stack_component(
        self,
        name: str,
        flavor: str,
        component_type: StackComponentType,
        configuration: Dict[str, str],
        labels: Optional[Dict[str, Any]] = None,
    ) -> "ComponentResponse":
        """Registers a stack component.

        Args:
            name: The name of the stack component.
            flavor: The flavor of the stack component.
            component_type: The type of the stack component.
            configuration: The configuration of the stack component.
            labels: The labels of the stack component.

        Returns:
            The model of the registered component.
        """
        from zenml.stack.utils import (
            validate_stack_component_config,
            warn_if_config_server_mismatch,
        )

        validated_config = validate_stack_component_config(
            configuration_dict=configuration,
            flavor=flavor,
            component_type=component_type,
            # Always enforce validation of custom flavors
            validate_custom_flavors=True,
        )
        # Guaranteed to not be None by setting
        # `validate_custom_flavors=True` above
        assert validated_config is not None
        warn_if_config_server_mismatch(validated_config)

        create_component_model = ComponentRequest(
            name=name,
            type=component_type,
            flavor=flavor,
            configuration=validated_config.model_dump(
                mode="json", exclude_unset=True
            ),
            labels=labels,
        )

        # Register the new model
        return self.zen_store.create_stack_component(
            component=create_component_model
        )
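
    # Illustrative usage sketch (not part of the Client source): registering a
    # local artifact store component. The component name is a placeholder and
    # the `path` option is assumed to be supported by the `local` flavor.
    #
    #   from zenml.client import Client
    #   from zenml.enums import StackComponentType
    #
    #   component = Client().create_stack_component(
    #       name="my-artifact-store",
    #       flavor="local",
    #       component_type=StackComponentType.ARTIFACT_STORE,
    #       configuration={"path": "/tmp/zenml-artifacts"},
    #   )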

    def update_stack_component(
        self,
        name_id_or_prefix: Optional[Union[UUID, str]],
        component_type: StackComponentType,
        name: Optional[str] = None,
        configuration: Optional[Dict[str, Any]] = None,
        labels: Optional[Dict[str, Any]] = None,
        disconnect: Optional[bool] = None,
        connector_id: Optional[UUID] = None,
        connector_resource_id: Optional[str] = None,
    ) -> ComponentResponse:
        """Updates a stack component.

        Args:
            name_id_or_prefix: The name, id or prefix of the stack component to
                update.
            component_type: The type of the stack component to update.
            name: The new name of the stack component.
            configuration: The new configuration of the stack component.
            labels: The new labels of the stack component.
            disconnect: Whether to disconnect the stack component from its
                service connector.
            connector_id: The new connector id of the stack component.
            connector_resource_id: The new connector resource id of the
                stack component.

        Returns:
            The updated stack component.

        Raises:
            EntityExistsError: If the new name is already taken.
        """
        # Get the existing component model
        component = self.get_stack_component(
            name_id_or_prefix=name_id_or_prefix,
            component_type=component_type,
            allow_name_prefix_match=False,
        )

        update_model = ComponentUpdate()

        if name is not None:
            existing_components = self.list_stack_components(
                name=name,
                type=component_type,
            )
            if existing_components.total > 0:
                raise EntityExistsError(
                    f"There are already existing components with the "
                    f"name '{name}'."
                )
            update_model.name = name

        if configuration is not None:
            existing_configuration = component.configuration
            existing_configuration.update(configuration)
            existing_configuration = {
                k: v
                for k, v in existing_configuration.items()
                if v is not None
            }

            from zenml.stack.utils import (
                validate_stack_component_config,
                warn_if_config_server_mismatch,
            )

            validated_config = validate_stack_component_config(
                configuration_dict=existing_configuration,
                flavor=component.flavor,
                component_type=component.type,
                # Always enforce validation of custom flavors
                validate_custom_flavors=True,
            )
            # Guaranteed to not be None by setting
            # `validate_custom_flavors=True` above
            assert validated_config is not None
            warn_if_config_server_mismatch(validated_config)

            update_model.configuration = validated_config.model_dump(
                mode="json", exclude_unset=True
            )

        if labels is not None:
            existing_labels = component.labels or {}
            existing_labels.update(labels)

            existing_labels = {
                k: v for k, v in existing_labels.items() if v is not None
            }
            update_model.labels = existing_labels

        if disconnect:
            update_model.connector = None
            update_model.connector_resource_id = None
        else:
            existing_component = self.get_stack_component(
                name_id_or_prefix=name_id_or_prefix,
                component_type=component_type,
                allow_name_prefix_match=False,
            )
            update_model.connector = connector_id
            update_model.connector_resource_id = connector_resource_id
            if connector_id is None and existing_component.connector:
                update_model.connector = existing_component.connector.id
                update_model.connector_resource_id = (
                    existing_component.connector_resource_id
                )

        # Send the updated component to the ZenStore
        return self.zen_store.update_stack_component(
            component_id=component.id,
            component_update=update_model,
        )

    def delete_stack_component(
        self,
        name_id_or_prefix: Union[str, UUID],
        component_type: StackComponentType,
    ) -> None:
        """Deletes a registered stack component.

        Args:
            name_id_or_prefix: The name, ID or prefix of the component
                to delete.
            component_type: The type of the component to delete.
        """
        component = self.get_stack_component(
            name_id_or_prefix=name_id_or_prefix,
            component_type=component_type,
            allow_name_prefix_match=False,
        )

        self.zen_store.delete_stack_component(component_id=component.id)
        logger.info(
            "Deregistered stack component (type: %s) with name '%s'.",
            component.type,
            component.name,
        )

    # --------------------------------- Flavors --------------------------------

    def create_flavor(
        self,
        source: str,
        component_type: StackComponentType,
    ) -> FlavorResponse:
        """Creates a new flavor.

        Args:
            source: The flavor to create.
            component_type: The type of the flavor.

        Returns:
            The created flavor (in model form).

        Raises:
            ValueError: in case the config_schema of the flavor is too large.
        """
        from zenml.stack.flavor import validate_flavor_source

        flavor = validate_flavor_source(
            source=source, component_type=component_type
        )()

        if len(flavor.config_schema) > TEXT_FIELD_MAX_LENGTH:
            raise ValueError(
                "Json representation of configuration schema "
                "exceeds max length. This could be caused by an "
                "overly long docstring on the flavor's "
                "configuration class."
            )

        flavor_request = flavor.to_model(integration="custom", is_custom=True)
        return self.zen_store.create_flavor(flavor=flavor_request)
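
    # Illustrative usage sketch (not part of the Client source): registering a
    # custom flavor from its source path. The module and class names are
    # placeholders for a user-defined flavor implementation.
    #
    #   from zenml.client import Client
    #   from zenml.enums import StackComponentType
    #
    #   Client().create_flavor(
    #       source="my_module.MyOrchestratorFlavor",
    #       component_type=StackComponentType.ORCHESTRATOR,
    #   )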

    def get_flavor(
        self,
        name_id_or_prefix: str,
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> FlavorResponse:
        """Get a stack component flavor.

        Args:
            name_id_or_prefix: The name, ID or prefix of the flavor to get.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The stack component flavor.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_flavor,
            list_method=self.list_flavors,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def list_flavors(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        type: Optional[str] = None,
        integration: Optional[str] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[FlavorResponse]:
        """Fetches all the flavor models.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of flavors to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            user: Filter by user name/ID.
            name: The name of the flavor to filter by.
            type: The type of the flavor to filter by.
            integration: The integration of the flavor to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A list of all the flavor models.
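
        Example (illustrative; filters flavors by component type):
        ```python
        from zenml.client import Client

        flavors = Client().list_flavors(type="orchestrator")
        for flavor in flavors.items:
            print(flavor.name, flavor.integration)
        ```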
        """
        flavor_filter_model = FlavorFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            user=user,
            name=name,
            type=type,
            integration=integration,
            id=id,
            created=created,
            updated=updated,
        )
        return self.zen_store.list_flavors(
            flavor_filter_model=flavor_filter_model, hydrate=hydrate
        )

    def delete_flavor(self, name_id_or_prefix: str) -> None:
        """Deletes a flavor.

        Args:
            name_id_or_prefix: The name, id or prefix of the id for the
                flavor to delete.
        """
        flavor = self.get_flavor(
            name_id_or_prefix, allow_name_prefix_match=False
        )
        self.zen_store.delete_flavor(flavor_id=flavor.id)

        logger.info(f"Deleted flavor '{flavor.name}' of type '{flavor.type}'.")

    def get_flavors_by_type(
        self, component_type: "StackComponentType"
    ) -> Page[FlavorResponse]:
        """Fetches the list of flavors for a stack component type.

        Args:
            component_type: The type of the component to fetch.

        Returns:
            The list of flavors.
        """
        logger.debug(f"Fetching the flavors of type {component_type}.")

        return self.list_flavors(
            type=component_type,
        )

    def get_flavor_by_name_and_type(
        self, name: str, component_type: "StackComponentType"
    ) -> FlavorResponse:
        """Fetches a registered flavor.

        Args:
            component_type: The type of the component to fetch.
            name: The name of the flavor to fetch.

        Returns:
            The registered flavor.

        Raises:
            KeyError: If no flavor exists for the given type and name.
        """
        logger.debug(
            f"Fetching the flavor of type {component_type} with name {name}."
        )

        if not (
            flavors := self.list_flavors(
                type=component_type, name=name, hydrate=True
            ).items
        ):
            raise KeyError(
                f"No flavor with name '{name}' and type '{component_type}' "
                "exists."
            )
        if len(flavors) > 1:
            raise KeyError(
                f"More than one flavor with name {name} and type "
                f"{component_type} exists."
            )

        return flavors[0]

    # ------------------------------- Pipelines --------------------------------

    def list_pipelines(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        latest_run_status: Optional[str] = None,
        project: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        tag: Optional[str] = None,
        tags: Optional[List[str]] = None,
        hydrate: bool = False,
    ) -> Page[PipelineResponse]:
        """List all pipelines.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of pipeline to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: The name of the pipeline to filter by.
            latest_run_status: Filter by the status of the latest run of a
                pipeline.
            project: The project name/ID to filter by.
            user: The name/ID of the user to filter by.
            tag: Tag to filter by.
            tags: Tags to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of pipelines fitting the filter description.
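
        Example (illustrative; the pipeline name is a placeholder):
        ```python
        from zenml.client import Client

        page = Client().list_pipelines(name="training_pipeline", size=10)
        for pipeline in page.items:
            print(pipeline.id, pipeline.name)
        ```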
        """
        pipeline_filter_model = PipelineFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            latest_run_status=latest_run_status,
            project=project or self.active_project.id,
            user=user,
            tag=tag,
            tags=tags,
        )
        return self.zen_store.list_pipelines(
            pipeline_filter_model=pipeline_filter_model,
            hydrate=hydrate,
        )

    def get_pipeline(
        self,
        name_id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
    ) -> PipelineResponse:
        """Get a pipeline by name, id or prefix.

        Args:
            name_id_or_prefix: The name, ID or ID prefix of the pipeline.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The pipeline.
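
        Example (illustrative; "training_pipeline" is a placeholder name):
        ```python
        from zenml.client import Client

        pipeline = Client().get_pipeline("training_pipeline")
        print(pipeline.id, pipeline.name)
        ```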
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_pipeline,
            list_method=self.list_pipelines,
            name_id_or_prefix=name_id_or_prefix,
            project=project,
            hydrate=hydrate,
        )

    def delete_pipeline(
        self,
        name_id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Delete a pipeline.

        Args:
            name_id_or_prefix: The name, ID or ID prefix of the pipeline.
            project: The project name/ID to filter by.
        """
        pipeline = self.get_pipeline(
            name_id_or_prefix=name_id_or_prefix, project=project
        )
        self.zen_store.delete_pipeline(pipeline_id=pipeline.id)

    @_fail_for_sql_zen_store
    def trigger_pipeline(
        self,
        pipeline_name_or_id: Union[str, UUID, None] = None,
        run_configuration: Union[
            PipelineRunConfiguration, Dict[str, Any], None
        ] = None,
        config_path: Optional[str] = None,
        template_id: Optional[UUID] = None,
        stack_name_or_id: Union[str, UUID, None] = None,
        synchronous: bool = False,
        project: Optional[Union[str, UUID]] = None,
    ) -> PipelineRunResponse:
        """Trigger a pipeline from the server.

        Usage examples:
        * Run the latest runnable template for a pipeline:
        ```python
        Client().trigger_pipeline(pipeline_name_or_id=<NAME>)
        ```
        * Run the latest runnable template for a pipeline on a specific stack:
        ```python
        Client().trigger_pipeline(
            pipeline_name_or_id=<NAME>,
            stack_name_or_id=<STACK_NAME_OR_ID>
        )
        ```
        * Run a specific template:
        ```python
        Client().trigger_pipeline(template_id=<ID>)
        ```

        Args:
            pipeline_name_or_id: Name or ID of the pipeline. If this is
                specified, the latest runnable template for this pipeline will
                be used for the run (Runnable here means that the build
                associated with the template is for a remote stack without any
                custom flavor stack components). If not given, a template ID
                that should be run needs to be specified.
            run_configuration: Configuration for the run. Either this or a
                path to a config file can be specified.
            config_path: Path to a YAML configuration file. This file will be
                parsed as a `PipelineRunConfiguration` object. Either this or
                the configuration in code can be specified.
            template_id: ID of the template to run. Either this or a pipeline
                can be specified.
            stack_name_or_id: Name or ID of the stack on which to run the
                pipeline. If not specified, this method will try to find a
                runnable template on any stack.
            synchronous: If `True`, this method will wait until the triggered
                run is finished.
            project: The project name/ID to filter by.

        Raises:
            RuntimeError: If triggering the pipeline failed.

        Returns:
            Model of the pipeline run.
        """
        from zenml.pipelines.run_utils import (
            validate_run_config_is_runnable_from_server,
            validate_stack_is_runnable_from_server,
            wait_for_pipeline_run_to_finish,
        )

        if Counter([template_id, pipeline_name_or_id])[None] != 1:
            raise RuntimeError(
                "You need to specify exactly one of pipeline or template "
                "to trigger."
            )

        if run_configuration and config_path:
            raise RuntimeError(
                "Only config path or runtime configuration can be specified."
            )

        if config_path:
            run_configuration = PipelineRunConfiguration.from_yaml(config_path)

        if isinstance(run_configuration, Dict):
            run_configuration = PipelineRunConfiguration.model_validate(
                run_configuration
            )

        if run_configuration:
            validate_run_config_is_runnable_from_server(run_configuration)

        if template_id:
            if stack_name_or_id:
                logger.warning(
                    "Template ID and stack specified, ignoring the stack and "
                    "using stack associated with the template instead."
                )

            run = self.zen_store.run_template(
                template_id=template_id,
                run_configuration=run_configuration,
            )
        else:
            assert pipeline_name_or_id
            pipeline = self.get_pipeline(name_id_or_prefix=pipeline_name_or_id)

            stack = None
            if stack_name_or_id:
                stack = self.get_stack(
                    stack_name_or_id, allow_name_prefix_match=False
                )
                validate_stack_is_runnable_from_server(
                    zen_store=self.zen_store, stack=stack
                )

            templates = depaginate(
                self.list_run_templates,
                pipeline_id=pipeline.id,
                stack_id=stack.id if stack else None,
                project=project or pipeline.project_id,
            )

            for template in templates:
                if not template.build:
                    continue

                stack = template.build.stack
                if not stack:
                    continue

                try:
                    validate_stack_is_runnable_from_server(
                        zen_store=self.zen_store, stack=stack
                    )
                except ValueError:
                    continue

                run = self.zen_store.run_template(
                    template_id=template.id,
                    run_configuration=run_configuration,
                )
                break
            else:
                raise RuntimeError(
                    "Unable to find a runnable template for the given stack "
                    "and pipeline."
                )

        if synchronous:
            run = wait_for_pipeline_run_to_finish(run_id=run.id)

        return run

    # -------------------------------- Builds ----------------------------------

    def get_build(
        self,
        id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
    ) -> PipelineBuildResponse:
        """Get a build by id or prefix.

        Args:
            id_or_prefix: The id or id prefix of the build.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The build.

        Raises:
            KeyError: If no build was found for the given id or prefix.
            ZenKeyError: If multiple builds were found that match the given
                id or prefix.
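
        Example (illustrative; the ID prefix is a placeholder):
        ```python
        from zenml.client import Client

        # Any unique prefix of the build ID works as well as the full UUID.
        build = Client().get_build("3fa85f64")
        print(build.id)
        ```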
        """
        from zenml.utils.uuid_utils import is_valid_uuid

        # First interpret as full UUID
        if is_valid_uuid(id_or_prefix):
            if not isinstance(id_or_prefix, UUID):
                id_or_prefix = UUID(id_or_prefix, version=4)

            return self.zen_store.get_build(
                id_or_prefix,
                hydrate=hydrate,
            )

        list_kwargs: Dict[str, Any] = dict(
            id=f"startswith:{id_or_prefix}",
            hydrate=hydrate,
        )
        scope = ""
        if project:
            list_kwargs["project"] = project
            scope = f" in project {project}"

        entity = self.list_builds(**list_kwargs)

        # If only a single entity is found, return it.
        if entity.total == 1:
            return entity.items[0]

        # If no entity is found, raise an error.
        if entity.total == 0:
            raise KeyError(
                f"No builds have been found that have either an id or prefix "
                f"that matches the provided string '{id_or_prefix}'{scope}."
            )

        raise ZenKeyError(
            f"{entity.total} builds have been found{scope} that have "
            f"an ID that matches the provided "
            f"string '{id_or_prefix}':\n"
            f"{entity.items}.\n"
            f"Please use the id to uniquely identify "
            f"only one of the builds."
        )

    def list_builds(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        project: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        pipeline_id: Optional[Union[str, UUID]] = None,
        stack_id: Optional[Union[str, UUID]] = None,
        container_registry_id: Optional[Union[UUID, str]] = None,
        is_local: Optional[bool] = None,
        contains_code: Optional[bool] = None,
        zenml_version: Optional[str] = None,
        python_version: Optional[str] = None,
        checksum: Optional[str] = None,
        stack_checksum: Optional[str] = None,
        duration: Optional[Union[int, str]] = None,
        hydrate: bool = False,
    ) -> Page[PipelineBuildResponse]:
        """List all builds.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of the build to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            project: The project name/ID to filter by.
            user: Filter by user name/ID.
            pipeline_id: The id of the pipeline to filter by.
            stack_id: The id of the stack to filter by.
            container_registry_id: The id of the container registry to
                filter by.
            is_local: Use to filter local builds.
            contains_code: Use to filter builds that contain code.
            zenml_version: The version of ZenML to filter by.
            python_version: The Python version to filter by.
            checksum: The build checksum to filter by.
            stack_checksum: The stack checksum to filter by.
            duration: The duration of the build in seconds to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page with builds fitting the filter description
        """
        build_filter_model = PipelineBuildFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            project=project or self.active_project.id,
            user=user,
            pipeline_id=pipeline_id,
            stack_id=stack_id,
            container_registry_id=container_registry_id,
            is_local=is_local,
            contains_code=contains_code,
            zenml_version=zenml_version,
            python_version=python_version,
            checksum=checksum,
            stack_checksum=stack_checksum,
            duration=duration,
        )
        return self.zen_store.list_builds(
            build_filter_model=build_filter_model,
            hydrate=hydrate,
        )

    def delete_build(
        self, id_or_prefix: str, project: Optional[Union[str, UUID]] = None
    ) -> None:
        """Delete a build.

        Args:
            id_or_prefix: The id or id prefix of the build.
            project: The project name/ID to filter by.
        """
        build = self.get_build(id_or_prefix=id_or_prefix, project=project)
        self.zen_store.delete_build(build_id=build.id)

    # --------------------------------- Event Sources -------------------------

    @_fail_for_sql_zen_store
    def create_event_source(
        self,
        name: str,
        configuration: Dict[str, Any],
        flavor: str,
        event_source_subtype: PluginSubType,
        description: str = "",
    ) -> EventSourceResponse:
        """Registers an event source.

        Args:
            name: The name of the event source to create.
            configuration: Configuration for this event source.
            flavor: The flavor of event source.
            event_source_subtype: The event source subtype.
            description: The description of the event source.

        Returns:
            The model of the registered event source.
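
        Example (a minimal sketch; the flavor name and the configuration
        keys are placeholders that depend on the event source plugin):
        ```python
        from zenml.client import Client
        from zenml.enums import PluginSubType

        event_source = Client().create_event_source(
            name="my_event_source",
            configuration={"repository": "my-org/my-repo"},
            flavor="github",
            event_source_subtype=PluginSubType.WEBHOOK,
            description="Example webhook event source",
        )
        ```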
        """
        event_source = EventSourceRequest(
            name=name,
            configuration=configuration,
            description=description,
            flavor=flavor,
            plugin_type=PluginType.EVENT_SOURCE,
            plugin_subtype=event_source_subtype,
            project=self.active_project.id,
        )

        return self.zen_store.create_event_source(event_source=event_source)

    @_fail_for_sql_zen_store
    def get_event_source(
        self,
        name_id_or_prefix: Union[UUID, str],
        allow_name_prefix_match: bool = True,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
    ) -> EventSourceResponse:
        """Get an event source by name, ID or prefix.

        Args:
            name_id_or_prefix: The name, ID or prefix of the event source.
            allow_name_prefix_match: If True, allow matching by name prefix.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The event_source.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_event_source,
            list_method=self.list_event_sources,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            project=project,
            hydrate=hydrate,
        )

    def list_event_sources(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        flavor: Optional[str] = None,
        event_source_type: Optional[str] = None,
        project: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[EventSourceResponse]:
        """Lists all event_sources.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of event_sources to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            project: The project name/ID to filter by.
            user: Filter by user name/ID.
            name: The name of the event_source to filter by.
            flavor: The flavor of the event_source to filter by.
            event_source_type: The subtype of the event_source to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of event_sources.
        """
        event_source_filter_model = EventSourceFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            project=project or self.active_project.id,
            user=user,
            name=name,
            flavor=flavor,
            plugin_subtype=event_source_type,
            id=id,
            created=created,
            updated=updated,
        )
        return self.zen_store.list_event_sources(
            event_source_filter_model, hydrate=hydrate
        )

    @_fail_for_sql_zen_store
    def update_event_source(
        self,
        name_id_or_prefix: Union[UUID, str],
        name: Optional[str] = None,
        description: Optional[str] = None,
        configuration: Optional[Dict[str, Any]] = None,
        rotate_secret: Optional[bool] = None,
        is_active: Optional[bool] = None,
        project: Optional[Union[str, UUID]] = None,
    ) -> EventSourceResponse:
        """Updates an event_source.

        Args:
            name_id_or_prefix: The name, id or prefix of the event_source to update.
            name: the new name of the event_source.
            description: the new description of the event_source.
            configuration: The event source configuration.
            rotate_secret: Allows rotating the secret. If True, the response
                will contain the new secret value.
            is_active: Allows activating/deactivating the event source.
            project: The project name/ID to filter by.

        Returns:
            The model of the updated event_source.

        Raises:
            EntityExistsError: If the event_source name is already taken.
        """
        # First, get the event source
        event_source = self.get_event_source(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
            project=project,
        )

        # Create the update model
        update_model = EventSourceUpdate(
            name=name,
            description=description,
            configuration=configuration,
            rotate_secret=rotate_secret,
            is_active=is_active,
        )

        if name:
            if self.list_event_sources(name=name):
                raise EntityExistsError(
                    "There are already existing event_sources with the name "
                    f"'{name}'."
                )

        updated_event_source = self.zen_store.update_event_source(
            event_source_id=event_source.id,
            event_source_update=update_model,
        )
        return updated_event_source

    @_fail_for_sql_zen_store
    def delete_event_source(
        self,
        name_id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Deletes an event_source.

        Args:
            name_id_or_prefix: The name, id or prefix id of the event_source
                to deregister.
            project: The project name/ID to filter by.
        """
        event_source = self.get_event_source(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
            project=project,
        )

        self.zen_store.delete_event_source(event_source_id=event_source.id)
        logger.info("Deleted event_source with name '%s'.", event_source.name)

    # --------------------------------- Actions -------------------------

    @_fail_for_sql_zen_store
    def create_action(
        self,
        name: str,
        flavor: str,
        action_type: PluginSubType,
        configuration: Dict[str, Any],
        service_account_id: UUID,
        auth_window: Optional[int] = None,
        description: str = "",
    ) -> ActionResponse:
        """Create an action.

        Args:
            name: The name of the action.
            flavor: The flavor of the action.
            action_type: The action subtype.
            configuration: The action configuration.
            service_account_id: The service account that is used to execute the
                action.
            auth_window: The time window in minutes for which the service
                account is authorized to execute the action. Set this to 0 to
                authorize the service account indefinitely (not recommended).
            description: The description of the action.

        Returns:
            The created action.
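
        Example (a minimal sketch; the flavor, action subtype, configuration
        and service account ID are placeholders):
        ```python
        from uuid import UUID

        from zenml.client import Client
        from zenml.enums import PluginSubType

        action = Client().create_action(
            name="run-training-template",
            flavor="builtin",
            action_type=PluginSubType.PIPELINE_RUN,
            configuration={"template_id": "3fa85f64-5717-4562-b3fc-2c963f66afa6"},
            service_account_id=UUID("3fa85f64-5717-4562-b3fc-2c963f66afa6"),
            auth_window=60,
        )
        ```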
        """
        action = ActionRequest(
            name=name,
            description=description,
            flavor=flavor,
            plugin_subtype=action_type,
            configuration=configuration,
            service_account_id=service_account_id,
            auth_window=auth_window,
            project=self.active_project.id,
        )

        return self.zen_store.create_action(action=action)

    @_fail_for_sql_zen_store
    def get_action(
        self,
        name_id_or_prefix: Union[UUID, str],
        allow_name_prefix_match: bool = True,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
    ) -> ActionResponse:
        """Get an action by name, ID or prefix.

        Args:
            name_id_or_prefix: The name, ID or prefix of the action.
            allow_name_prefix_match: If True, allow matching by name prefix.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The action.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_action,
            list_method=self.list_actions,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            project=project,
            hydrate=hydrate,
        )

    @_fail_for_sql_zen_store
    def list_actions(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        flavor: Optional[str] = None,
        action_type: Optional[str] = None,
        project: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[ActionResponse]:
        """List actions.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of the action to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            project: The project name/ID to filter by.
            user: Filter by user name/ID.
            name: The name of the action to filter by.
            flavor: The flavor of the action to filter by.
            action_type: The type of the action to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of actions.
        """
        filter_model = ActionFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            project=project or self.active_project.id,
            user=user,
            name=name,
            id=id,
            flavor=flavor,
            plugin_subtype=action_type,
            created=created,
            updated=updated,
        )
        return self.zen_store.list_actions(filter_model, hydrate=hydrate)

    @_fail_for_sql_zen_store
    def update_action(
        self,
        name_id_or_prefix: Union[UUID, str],
        name: Optional[str] = None,
        description: Optional[str] = None,
        configuration: Optional[Dict[str, Any]] = None,
        service_account_id: Optional[UUID] = None,
        auth_window: Optional[int] = None,
        project: Optional[Union[str, UUID]] = None,
    ) -> ActionResponse:
        """Update an action.

        Args:
            name_id_or_prefix: The name, id or prefix of the action to update.
            name: The new name of the action.
            description: The new description of the action.
            configuration: The new configuration of the action.
            service_account_id: The new service account that is used to execute
                the action.
            auth_window: The new time window in minutes for which the service
                account is authorized to execute the action. Set this to 0 to
                authorize the service account indefinitely (not recommended).
            project: The project name/ID to filter by.

        Returns:
            The updated action.
        """
        action = self.get_action(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
            project=project,
        )

        update_model = ActionUpdate(
            name=name,
            description=description,
            configuration=configuration,
            service_account_id=service_account_id,
            auth_window=auth_window,
        )

        return self.zen_store.update_action(
            action_id=action.id,
            action_update=update_model,
        )

    @_fail_for_sql_zen_store
    def delete_action(
        self,
        name_id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Delete an action.

        Args:
            name_id_or_prefix: The name, id or prefix id of the action
                to delete.
            project: The project name/ID to filter by.
        """
        action = self.get_action(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
            project=project,
        )

        self.zen_store.delete_action(action_id=action.id)
        logger.info("Deleted action with name '%s'.", action.name)

    # --------------------------------- Triggers -------------------------

    @_fail_for_sql_zen_store
    def create_trigger(
        self,
        name: str,
        event_source_id: UUID,
        event_filter: Dict[str, Any],
        action_id: UUID,
        description: str = "",
    ) -> TriggerResponse:
        """Registers a trigger.

        Args:
            name: The name of the trigger to create.
            event_source_id: The ID of the event source.
            event_filter: The event filter configuration.
            action_id: The ID of the action that should be triggered.
            description: The description of the trigger.

        Returns:
            The created trigger.
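
        Example (a minimal sketch; the event source, action and event
        filter values are placeholders and must already exist):
        ```python
        from zenml.client import Client

        client = Client()
        event_source = client.get_event_source("my_event_source")
        action = client.get_action("run-training-template")
        trigger = client.create_trigger(
            name="run-on-push",
            event_source_id=event_source.id,
            event_filter={"event_type": "push_event"},
            action_id=action.id,
            description="Run training on every push event",
        )
        ```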
        """
        trigger = TriggerRequest(
            name=name,
            description=description,
            event_source_id=event_source_id,
            event_filter=event_filter,
            action_id=action_id,
            project=self.active_project.id,
        )

        return self.zen_store.create_trigger(trigger=trigger)

    @_fail_for_sql_zen_store
    def get_trigger(
        self,
        name_id_or_prefix: Union[UUID, str],
        allow_name_prefix_match: bool = True,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
    ) -> TriggerResponse:
        """Get a trigger by name, ID or prefix.

        Args:
            name_id_or_prefix: The name, ID or prefix of the trigger.
            allow_name_prefix_match: If True, allow matching by name prefix.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The trigger.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_trigger,
            list_method=self.list_triggers,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            project=project,
            hydrate=hydrate,
        )

    @_fail_for_sql_zen_store
    def list_triggers(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        event_source_id: Optional[UUID] = None,
        action_id: Optional[UUID] = None,
        event_source_flavor: Optional[str] = None,
        event_source_subtype: Optional[str] = None,
        action_flavor: Optional[str] = None,
        action_subtype: Optional[str] = None,
        project: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[TriggerResponse]:
        """Lists all triggers.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of triggers to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            project: The project name/ID to filter by.
            user: Filter by user name/ID.
            name: The name of the trigger to filter by.
            event_source_id: The event source associated with the trigger.
            action_id: The action associated with the trigger.
            event_source_flavor: Flavor of the event source associated with the
                trigger.
            event_source_subtype: Type of the event source associated with the
                trigger.
            action_flavor: Flavor of the action associated with the trigger.
            action_subtype: Type of the action associated with the trigger.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of triggers.
        """
        trigger_filter_model = TriggerFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            project=project or self.active_project.id,
            user=user,
            name=name,
            event_source_id=event_source_id,
            action_id=action_id,
            event_source_flavor=event_source_flavor,
            event_source_subtype=event_source_subtype,
            action_flavor=action_flavor,
            action_subtype=action_subtype,
            id=id,
            created=created,
            updated=updated,
        )
        return self.zen_store.list_triggers(
            trigger_filter_model, hydrate=hydrate
        )

    @_fail_for_sql_zen_store
    def update_trigger(
        self,
        name_id_or_prefix: Union[UUID, str],
        name: Optional[str] = None,
        description: Optional[str] = None,
        event_filter: Optional[Dict[str, Any]] = None,
        is_active: Optional[bool] = None,
        project: Optional[Union[str, UUID]] = None,
    ) -> TriggerResponse:
        """Updates a trigger.

        Args:
            name_id_or_prefix: The name, id or prefix of the trigger to update.
            name: the new name of the trigger.
            description: the new description of the trigger.
            event_filter: The event filter configuration.
            is_active: Whether the trigger is active or not.
            project: The project name/ID to filter by.

        Returns:
            The model of the updated trigger.

        Raises:
            EntityExistsError: If the trigger name is already taken.
        """
        # First, get the trigger
        trigger = self.get_trigger(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
            project=project,
        )

        # Create the update model
        update_model = TriggerUpdate(
            name=name,
            description=description,
            event_filter=event_filter,
            is_active=is_active,
        )

        if name:
            if self.list_triggers(name=name):
                raise EntityExistsError(
                    "There is already an existing trigger with the name "
                    f"'{name}'."
                )

        updated_trigger = self.zen_store.update_trigger(
            trigger_id=trigger.id,
            trigger_update=update_model,
        )
        return updated_trigger

    @_fail_for_sql_zen_store
    def delete_trigger(
        self,
        name_id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Deletes a trigger.

        Args:
            name_id_or_prefix: The name, id or prefix id of the trigger
                to deregister.
            project: The project name/ID to filter by.
        """
        trigger = self.get_trigger(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
            project=project,
        )

        self.zen_store.delete_trigger(trigger_id=trigger.id)
        logger.info("Deleted trigger with name '%s'.", trigger.name)

    # ------------------------------ Deployments -------------------------------

    def get_deployment(
        self,
        id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
    ) -> PipelineDeploymentResponse:
        """Get a deployment by id or prefix.

        Args:
            id_or_prefix: The id or id prefix of the deployment.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The deployment.

        Raises:
            KeyError: If no deployment was found for the given id or prefix.
            ZenKeyError: If multiple deployments were found that match the given
                id or prefix.
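
        Example (illustrative; the ID prefix is a placeholder):
        ```python
        from zenml.client import Client

        deployment = Client().get_deployment("3fa85f64")
        print(deployment.id)
        ```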
        """
        from zenml.utils.uuid_utils import is_valid_uuid

        # First interpret as full UUID
        if is_valid_uuid(id_or_prefix):
            id_ = (
                UUID(id_or_prefix)
                if isinstance(id_or_prefix, str)
                else id_or_prefix
            )
            return self.zen_store.get_deployment(id_, hydrate=hydrate)

        list_kwargs: Dict[str, Any] = dict(
            id=f"startswith:{id_or_prefix}",
            hydrate=hydrate,
        )
        scope = ""
        if project:
            list_kwargs["project"] = project
            scope = f" in project {project}"

        entity = self.list_deployments(**list_kwargs)

        # If only a single entity is found, return it.
        if entity.total == 1:
            return entity.items[0]

        # If no entity is found, raise an error.
        if entity.total == 0:
            raise KeyError(
                f"No deployments have been found that have either an id or "
                f"prefix that matches the provided string '{id_or_prefix}'{scope}."
            )

        raise ZenKeyError(
            f"{entity.total} deployments have been found{scope} that have "
            f"an ID that matches the provided "
            f"string '{id_or_prefix}':\n"
            f"{entity.items}.\n"
            f"Please use the id to uniquely identify "
            f"only one of the deployments."
        )

    def list_deployments(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        project: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        pipeline_id: Optional[Union[str, UUID]] = None,
        stack_id: Optional[Union[str, UUID]] = None,
        build_id: Optional[Union[str, UUID]] = None,
        template_id: Optional[Union[str, UUID]] = None,
        hydrate: bool = False,
    ) -> Page[PipelineDeploymentResponse]:
        """List all deployments.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of the deployment to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            project: The project name/ID to filter by.
            user: Filter by user name/ID.
            pipeline_id: The id of the pipeline to filter by.
            stack_id: The id of the stack to filter by.
            build_id: The id of the build to filter by.
            template_id: The ID of the template to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page with deployments fitting the filter description
        """
        deployment_filter_model = PipelineDeploymentFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            project=project or self.active_project.id,
            user=user,
            pipeline_id=pipeline_id,
            stack_id=stack_id,
            build_id=build_id,
            template_id=template_id,
        )
        return self.zen_store.list_deployments(
            deployment_filter_model=deployment_filter_model,
            hydrate=hydrate,
        )

    def delete_deployment(
        self,
        id_or_prefix: str,
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Delete a deployment.

        Args:
            id_or_prefix: The id or id prefix of the deployment.
            project: The project name/ID to filter by.
        """
        deployment = self.get_deployment(
            id_or_prefix=id_or_prefix,
            project=project,
            hydrate=False,
        )
        self.zen_store.delete_deployment(deployment_id=deployment.id)

    # ------------------------------ Run templates -----------------------------

    def create_run_template(
        self,
        name: str,
        deployment_id: UUID,
        description: Optional[str] = None,
        tags: Optional[List[str]] = None,
    ) -> RunTemplateResponse:
        """Create a run template.

        Args:
            name: The name of the run template.
            deployment_id: ID of the deployment which this template should be
                based off of.
            description: The description of the run template.
            tags: Tags associated with the run template.

        Returns:
            The created run template.
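
        Example (a minimal sketch; the deployment is assumed to already
        exist and its ID prefix is a placeholder):
        ```python
        from zenml.client import Client

        client = Client()
        deployment = client.get_deployment("3fa85f64")
        template = client.create_run_template(
            name="nightly-training",
            deployment_id=deployment.id,
            tags=["nightly"],
        )
        ```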
        """
        return self.zen_store.create_run_template(
            template=RunTemplateRequest(
                name=name,
                description=description,
                source_deployment_id=deployment_id,
                tags=tags,
                project=self.active_project.id,
            )
        )

    def get_run_template(
        self,
        name_id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
    ) -> RunTemplateResponse:
        """Get a run template.

        Args:
            name_id_or_prefix: Name/ID/ID prefix of the template to get.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The run template.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_run_template,
            list_method=functools.partial(
                self.list_run_templates, hidden=None
            ),
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
            project=project,
            hydrate=hydrate,
        )

    def list_run_templates(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        id: Optional[Union[UUID, str]] = None,
        name: Optional[str] = None,
        hidden: Optional[bool] = False,
        tag: Optional[str] = None,
        project: Optional[Union[str, UUID]] = None,
        pipeline_id: Optional[Union[str, UUID]] = None,
        build_id: Optional[Union[str, UUID]] = None,
        stack_id: Optional[Union[str, UUID]] = None,
        code_repository_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        pipeline: Optional[Union[UUID, str]] = None,
        stack: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[RunTemplateResponse]:
        """Get a page of run templates.

        Args:
            sort_by: The column to sort by.
            page: The page of items.
            size: The maximum size of all pages.
            logical_operator: Which logical operator to use [and, or].
            created: Filter by the creation date.
            updated: Filter by the last updated date.
            id: Filter by run template ID.
            name: Filter by run template name.
            hidden: Filter by run template hidden status.
            tag: Filter by run template tags.
            project: Filter by project name/ID.
            pipeline_id: Filter by pipeline ID.
            build_id: Filter by build ID.
            stack_id: Filter by stack ID.
            code_repository_id: Filter by code repository ID.
            user: Filter by user name/ID.
            pipeline: Filter by pipeline name/ID.
            stack: Filter by stack name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of run templates.
        """
        filter = RunTemplateFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            created=created,
            updated=updated,
            id=id,
            name=name,
            hidden=hidden,
            tag=tag,
            project=project or self.active_project.id,
            pipeline_id=pipeline_id,
            build_id=build_id,
            stack_id=stack_id,
            code_repository_id=code_repository_id,
            user=user,
            pipeline=pipeline,
            stack=stack,
        )

        return self.zen_store.list_run_templates(
            template_filter_model=filter, hydrate=hydrate
        )

    def update_run_template(
        self,
        name_id_or_prefix: Union[str, UUID],
        name: Optional[str] = None,
        description: Optional[str] = None,
        hidden: Optional[bool] = None,
        add_tags: Optional[List[str]] = None,
        remove_tags: Optional[List[str]] = None,
        project: Optional[Union[str, UUID]] = None,
    ) -> RunTemplateResponse:
        """Update a run template.

        Args:
            name_id_or_prefix: Name/ID/ID prefix of the template to update.
            name: The new name of the run template.
            description: The new description of the run template.
            hidden: The new hidden status of the run template.
            add_tags: Tags to add to the run template.
            remove_tags: Tags to remove from the run template.
            project: The project name/ID to filter by.

        Returns:
            The updated run template.
        """
        if is_valid_uuid(name_id_or_prefix):
            template_id = (
                UUID(name_id_or_prefix)
                if isinstance(name_id_or_prefix, str)
                else name_id_or_prefix
            )
        else:
            template_id = self.get_run_template(
                name_id_or_prefix,
                project=project,
                hydrate=False,
            ).id

        return self.zen_store.update_run_template(
            template_id=template_id,
            template_update=RunTemplateUpdate(
                name=name,
                description=description,
                hidden=hidden,
                add_tags=add_tags,
                remove_tags=remove_tags,
            ),
        )

    def delete_run_template(
        self,
        name_id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Delete a run template.

        Args:
            name_id_or_prefix: Name/ID/ID prefix of the template to delete.
            project: The project name/ID to filter by.
        """
        if is_valid_uuid(name_id_or_prefix):
            template_id = (
                UUID(name_id_or_prefix)
                if isinstance(name_id_or_prefix, str)
                else name_id_or_prefix
            )
        else:
            template_id = self.get_run_template(
                name_id_or_prefix,
                project=project,
                hydrate=False,
            ).id

        self.zen_store.delete_run_template(template_id=template_id)

    # ------------------------------- Schedules --------------------------------

    def get_schedule(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
    ) -> ScheduleResponse:
        """Get a schedule by name, id or prefix.

        Args:
            name_id_or_prefix: The name, id or prefix of the schedule.
            allow_name_prefix_match: If True, allow matching by name prefix.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The schedule.
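
        Example (illustrative; the schedule name is a placeholder):
        ```python
        from zenml.client import Client

        schedule = Client().get_schedule("daily_training_schedule")
        print(schedule.cron_expression)
        ```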
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_schedule,
            list_method=self.list_schedules,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            project=project,
            hydrate=hydrate,
        )

    def list_schedules(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        project: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        pipeline_id: Optional[Union[str, UUID]] = None,
        orchestrator_id: Optional[Union[str, UUID]] = None,
        active: Optional[Union[str, bool]] = None,
        cron_expression: Optional[str] = None,
        start_time: Optional[Union[datetime, str]] = None,
        end_time: Optional[Union[datetime, str]] = None,
        interval_second: Optional[int] = None,
        catchup: Optional[Union[str, bool]] = None,
        hydrate: bool = False,
        run_once_start_time: Optional[Union[datetime, str]] = None,
    ) -> Page[ScheduleResponse]:
        """List schedules.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of schedules to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: The name of the schedule to filter by.
            project: The project name/ID to filter by.
            user: Filter by user name/ID.
            pipeline_id: The id of the pipeline to filter by.
            orchestrator_id: The id of the orchestrator to filter by.
            active: Use to filter by active status.
            cron_expression: Use to filter by cron expression.
            start_time: Use to filter by start time.
            end_time: Use to filter by end time.
            interval_second: Use to filter by interval second.
            catchup: Use to filter by catchup.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            run_once_start_time: Use to filter by run once start time.

        Returns:
            A page of schedules.
        """
        schedule_filter_model = ScheduleFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            project=project or self.active_project.id,
            user=user,
            pipeline_id=pipeline_id,
            orchestrator_id=orchestrator_id,
            active=active,
            cron_expression=cron_expression,
            start_time=start_time,
            end_time=end_time,
            interval_second=interval_second,
            catchup=catchup,
            run_once_start_time=run_once_start_time,
        )
        return self.zen_store.list_schedules(
            schedule_filter_model=schedule_filter_model,
            hydrate=hydrate,
        )
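
    # Illustrative usage sketch (not part of the class): fetching and listing
    # schedules through the client. The schedule name "daily_training" is a
    # placeholder, not a value from this codebase.
    #
    #   from zenml.client import Client
    #
    #   client = Client()
    #   schedule = client.get_schedule("daily_training")
    #   active_schedules = client.list_schedules(active=True, sort_by="desc:created")
    #   for item in active_schedules.items:
    #       print(item.name, item.cron_expression)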

    def delete_schedule(
        self,
        name_id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Delete a schedule.

        Args:
            name_id_or_prefix: The name, id or prefix id of the schedule
                to delete.
            project: The project name/ID to filter by.
        """
        schedule = self.get_schedule(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
            project=project,
        )
        logger.warning(
            f"Deleting schedule '{name_id_or_prefix}'... This will only delete "
            "the reference of the schedule from ZenML. Please make sure to "
            "manually stop/delete this schedule in your orchestrator as well!"
        )
        self.zen_store.delete_schedule(schedule_id=schedule.id)
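
    # Illustrative usage sketch: deleting a schedule only removes the ZenML
    # reference; as the warning above notes, the schedule must also be stopped
    # or deleted in the orchestrator itself. The schedule name is a placeholder.
    #
    #   client = Client()
    #   client.delete_schedule("daily_training")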

    # ----------------------------- Pipeline runs ------------------------------

    def get_pipeline_run(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
        include_full_metadata: bool = False,
    ) -> PipelineRunResponse:
        """Gets a pipeline run by name, ID, or prefix.

        Args:
            name_id_or_prefix: Name, ID, or prefix of the pipeline run.
            allow_name_prefix_match: If True, allow matching by name prefix.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            include_full_metadata: If True, include metadata of all steps in
                the response.

        Returns:
            The pipeline run.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_run,
            list_method=self.list_pipeline_runs,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            project=project,
            hydrate=hydrate,
            include_full_metadata=include_full_metadata,
        )

    def list_pipeline_runs(
        self,
        sort_by: str = "desc:created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        project: Optional[Union[str, UUID]] = None,
        pipeline_id: Optional[Union[str, UUID]] = None,
        pipeline_name: Optional[str] = None,
        stack_id: Optional[Union[str, UUID]] = None,
        schedule_id: Optional[Union[str, UUID]] = None,
        build_id: Optional[Union[str, UUID]] = None,
        deployment_id: Optional[Union[str, UUID]] = None,
        code_repository_id: Optional[Union[str, UUID]] = None,
        template_id: Optional[Union[str, UUID]] = None,
        model_version_id: Optional[Union[str, UUID]] = None,
        orchestrator_run_id: Optional[str] = None,
        status: Optional[str] = None,
        start_time: Optional[Union[datetime, str]] = None,
        end_time: Optional[Union[datetime, str]] = None,
        unlisted: Optional[bool] = None,
        templatable: Optional[bool] = None,
        tag: Optional[str] = None,
        tags: Optional[List[str]] = None,
        user: Optional[Union[UUID, str]] = None,
        run_metadata: Optional[List[str]] = None,
        pipeline: Optional[Union[UUID, str]] = None,
        code_repository: Optional[Union[UUID, str]] = None,
        model: Optional[Union[UUID, str]] = None,
        stack: Optional[Union[UUID, str]] = None,
        stack_component: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
        include_full_metadata: bool = False,
    ) -> Page[PipelineRunResponse]:
        """List all pipeline runs.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: The id of the runs to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            project: The project name/ID to filter by.
            pipeline_id: The id of the pipeline to filter by.
            pipeline_name: DEPRECATED. Use `pipeline` instead to filter by
                pipeline name.
            stack_id: The id of the stack to filter by.
            schedule_id: The id of the schedule to filter by.
            build_id: The id of the build to filter by.
            deployment_id: The id of the deployment to filter by.
            code_repository_id: The id of the code repository to filter by.
            template_id: The ID of the template to filter by.
            model_version_id: The ID of the model version to filter by.
            orchestrator_run_id: The run id of the orchestrator to filter by.
            name: The name of the run to filter by.
            status: The status of the pipeline run
            start_time: The start_time for the pipeline run
            end_time: The end_time for the pipeline run
            unlisted: If the runs should be unlisted or not.
            templatable: If the runs should be templatable or not.
            tag: Tag to filter by.
            tags: Tags to filter by.
            user: The name/ID of the user to filter by.
            run_metadata: The run_metadata of the run to filter by.
            pipeline: The name/ID of the pipeline to filter by.
            code_repository: Filter by code repository name/ID.
            model: Filter by model name/ID.
            stack: Filter by stack name/ID.
            stack_component: Filter by stack component name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            include_full_metadata: If True, include metadata of all steps in
                the response.

        Returns:
            A page with Pipeline Runs fitting the filter description
        """
        runs_filter_model = PipelineRunFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            project=project or self.active_project.id,
            pipeline_id=pipeline_id,
            pipeline_name=pipeline_name,
            schedule_id=schedule_id,
            build_id=build_id,
            deployment_id=deployment_id,
            code_repository_id=code_repository_id,
            template_id=template_id,
            model_version_id=model_version_id,
            orchestrator_run_id=orchestrator_run_id,
            stack_id=stack_id,
            status=status,
            start_time=start_time,
            end_time=end_time,
            tag=tag,
            tags=tags,
            unlisted=unlisted,
            user=user,
            run_metadata=run_metadata,
            pipeline=pipeline,
            code_repository=code_repository,
            stack=stack,
            model=model,
            stack_component=stack_component,
            templatable=templatable,
        )
        return self.zen_store.list_runs(
            runs_filter_model=runs_filter_model,
            hydrate=hydrate,
            include_full_metadata=include_full_metadata,
        )
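
    # Illustrative usage sketch: listing the most recent completed runs of a
    # pipeline. The pipeline name is a placeholder; `status` values follow the
    # ExecutionStatus enum (e.g. "completed", "failed").
    #
    #   client = Client()
    #   runs = client.list_pipeline_runs(
    #       pipeline="training_pipeline",
    #       status="completed",
    #       sort_by="desc:created",
    #       size=10,
    #   )
    #   for run in runs.items:
    #       print(run.name, run.status)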

    def delete_pipeline_run(
        self,
        name_id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Deletes a pipeline run.

        Args:
            name_id_or_prefix: Name, ID, or prefix of the pipeline run.
            project: The project name/ID to filter by.
        """
        run = self.get_pipeline_run(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
            project=project,
        )
        self.zen_store.delete_run(run_id=run.id)

    # -------------------------------- Step run --------------------------------

    def get_run_step(
        self,
        step_run_id: UUID,
        hydrate: bool = True,
    ) -> StepRunResponse:
        """Get a step run by ID.

        Args:
            step_run_id: The ID of the step run to get.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The step run.
        """
        return self.zen_store.get_run_step(
            step_run_id,
            hydrate=hydrate,
        )

    def list_run_steps(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        cache_key: Optional[str] = None,
        code_hash: Optional[str] = None,
        status: Optional[str] = None,
        start_time: Optional[Union[datetime, str]] = None,
        end_time: Optional[Union[datetime, str]] = None,
        pipeline_run_id: Optional[Union[str, UUID]] = None,
        deployment_id: Optional[Union[str, UUID]] = None,
        original_step_run_id: Optional[Union[str, UUID]] = None,
        project: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        model_version_id: Optional[Union[str, UUID]] = None,
        model: Optional[Union[UUID, str]] = None,
        run_metadata: Optional[List[str]] = None,
        hydrate: bool = False,
    ) -> Page[StepRunResponse]:
        """List all pipelines.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of step runs to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            start_time: Use to filter by the time when the step started running
            end_time: Use to filter by the time when the step finished running
            project: The project name/ID to filter by.
            user: Filter by user name/ID.
            pipeline_run_id: The id of the pipeline run to filter by.
            deployment_id: The id of the deployment to filter by.
            original_step_run_id: The id of the original step run to filter by.
            model_version_id: The ID of the model version to filter by.
            model: Filter by model name/ID.
            name: The name of the step run to filter by.
            cache_key: The cache key of the step run to filter by.
            code_hash: The code hash of the step run to filter by.
            status: The status of the step run to filter by.
            run_metadata: Filter by run metadata.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page with step runs fitting the filter description
        """
        step_run_filter_model = StepRunFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            cache_key=cache_key,
            code_hash=code_hash,
            pipeline_run_id=pipeline_run_id,
            deployment_id=deployment_id,
            original_step_run_id=original_step_run_id,
            status=status,
            created=created,
            updated=updated,
            start_time=start_time,
            end_time=end_time,
            name=name,
            project=project or self.active_project.id,
            user=user,
            model_version_id=model_version_id,
            model=model,
            run_metadata=run_metadata,
        )
        return self.zen_store.list_run_steps(
            step_run_filter_model=step_run_filter_model,
            hydrate=hydrate,
        )
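
    # Illustrative usage sketch: inspecting the step runs of a single pipeline
    # run. The run name is a placeholder.
    #
    #   client = Client()
    #   run = client.get_pipeline_run("training_pipeline-2024_01_01-00_00_00")
    #   steps = client.list_run_steps(pipeline_run_id=run.id)
    #   for step in steps.items:
    #       print(step.name, step.status)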

    # ------------------------------- Artifacts -------------------------------

    def get_artifact(
        self,
        name_id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = False,
    ) -> ArtifactResponse:
        """Get an artifact by name, id or prefix.

        Args:
            name_id_or_prefix: The name, ID or prefix of the artifact to get.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The artifact.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_artifact,
            list_method=self.list_artifacts,
            name_id_or_prefix=name_id_or_prefix,
            project=project,
            hydrate=hydrate,
        )

    def list_artifacts(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        has_custom_name: Optional[bool] = None,
        user: Optional[Union[UUID, str]] = None,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = False,
        tag: Optional[str] = None,
        tags: Optional[List[str]] = None,
    ) -> Page[ArtifactResponse]:
        """Get a list of artifacts.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of the artifact to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: The name of the artifact to filter by.
            has_custom_name: Filter artifacts with/without custom names.
            user: Filter by user name or ID.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            tag: Filter artifacts by tag.
            tags: Tags to filter by.

        Returns:
            A list of artifacts.
        """
        artifact_filter_model = ArtifactFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            has_custom_name=has_custom_name,
            tag=tag,
            tags=tags,
            user=user,
            project=project or self.active_project.id,
        )
        return self.zen_store.list_artifacts(
            artifact_filter_model,
            hydrate=hydrate,
        )
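
    # Illustrative usage sketch: looking up artifacts by name and tag. The
    # artifact name and tag are placeholders.
    #
    #   client = Client()
    #   artifacts = client.list_artifacts(name="model", tag="production")
    #   print([a.name for a in artifacts.items])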

    def update_artifact(
        self,
        name_id_or_prefix: Union[str, UUID],
        new_name: Optional[str] = None,
        add_tags: Optional[List[str]] = None,
        remove_tags: Optional[List[str]] = None,
        has_custom_name: Optional[bool] = None,
        project: Optional[Union[str, UUID]] = None,
    ) -> ArtifactResponse:
        """Update an artifact.

        Args:
            name_id_or_prefix: The name, ID or prefix of the artifact to update.
            new_name: The new name of the artifact.
            add_tags: Tags to add to the artifact.
            remove_tags: Tags to remove from the artifact.
            has_custom_name: Whether the artifact has a custom name.
            project: The project name/ID to filter by.

        Returns:
            The updated artifact.
        """
        artifact = self.get_artifact(
            name_id_or_prefix=name_id_or_prefix,
            project=project,
        )
        artifact_update = ArtifactUpdate(
            name=new_name,
            add_tags=add_tags,
            remove_tags=remove_tags,
            has_custom_name=has_custom_name,
        )
        return self.zen_store.update_artifact(
            artifact_id=artifact.id, artifact_update=artifact_update
        )
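
    # Illustrative usage sketch: renaming an artifact and adjusting its tags in
    # one call. All names and tags are placeholders.
    #
    #   client = Client()
    #   client.update_artifact(
    #       name_id_or_prefix="model",
    #       new_name="classifier",
    #       add_tags=["production"],
    #       remove_tags=["staging"],
    #   )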

    def delete_artifact(
        self,
        name_id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Delete an artifact.

        Args:
            name_id_or_prefix: The name, ID or prefix of the artifact to delete.
            project: The project name/ID to filter by.
        """
        artifact = self.get_artifact(
            name_id_or_prefix=name_id_or_prefix,
            project=project,
        )
        self.zen_store.delete_artifact(artifact_id=artifact.id)
        logger.info(f"Deleted artifact '{artifact.name}'.")

    def prune_artifacts(
        self,
        only_versions: bool = True,
        delete_from_artifact_store: bool = False,
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Delete all unused artifacts and artifact versions.

        Args:
            only_versions: Only delete artifact versions, keeping the artifacts
                themselves.
            delete_from_artifact_store: Also delete the underlying artifact
                data from the artifact store.
            project: The project name/ID to filter by.
        """
        if delete_from_artifact_store:
            unused_artifact_versions = depaginate(
                self.list_artifact_versions,
                only_unused=True,
                project=project,
            )
            for unused_artifact_version in unused_artifact_versions:
                self._delete_artifact_from_artifact_store(
                    unused_artifact_version
                )

        project = project or self.active_project.id

        self.zen_store.prune_artifact_versions(
            project_name_or_id=project, only_versions=only_versions
        )
        logger.info("All unused artifacts and artifact versions deleted.")
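
    # Illustrative usage sketch: pruning everything that is no longer
    # referenced by any run, including the underlying data in the artifact
    # store. Deleting from the artifact store requires local access to it.
    #
    #   client = Client()
    #   client.prune_artifacts(only_versions=False, delete_from_artifact_store=True)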

    # --------------------------- Artifact Versions ---------------------------

    def get_artifact_version(
        self,
        name_id_or_prefix: Union[str, UUID],
        version: Optional[str] = None,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
    ) -> ArtifactVersionResponse:
        """Get an artifact version by ID or artifact name.

        Args:
            name_id_or_prefix: Either the ID of the artifact version or the
                name of the artifact.
            version: The version of the artifact to get. Only used if
                `name_id_or_prefix` is the name of the artifact. If not
                specified, the latest version is returned.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The artifact version.
        """
        from zenml import get_step_context

        if cll := client_lazy_loader(
            method_name="get_artifact_version",
            name_id_or_prefix=name_id_or_prefix,
            version=version,
            project=project,
            hydrate=hydrate,
        ):
            return cll  # type: ignore[return-value]

        artifact = self._get_entity_version_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_artifact_version,
            list_method=self.list_artifact_versions,
            name_id_or_prefix=name_id_or_prefix,
            version=version,
            project=project,
            hydrate=hydrate,
        )
        try:
            step_run = get_step_context().step_run
            client = Client()
            client.zen_store.update_run_step(
                step_run_id=step_run.id,
                step_run_update=StepRunUpdate(
                    loaded_artifact_versions={artifact.name: artifact.id}
                ),
            )
        except RuntimeError:
            pass  # Cannot link to step run if called outside a step
        return artifact
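
    # Illustrative usage sketch: fetching a specific or the latest version of
    # an artifact and materializing it. The artifact name is a placeholder, and
    # `load()` on the returned version is assumed to be available as on other
    # artifact version responses.
    #
    #   client = Client()
    #   latest = client.get_artifact_version("model")
    #   v3 = client.get_artifact_version("model", version="3")
    #   model = latest.load()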

    def list_artifact_versions(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        artifact: Optional[Union[str, UUID]] = None,
        name: Optional[str] = None,
        version: Optional[Union[str, int]] = None,
        version_number: Optional[int] = None,
        artifact_store_id: Optional[Union[str, UUID]] = None,
        type: Optional[ArtifactType] = None,
        data_type: Optional[str] = None,
        uri: Optional[str] = None,
        materializer: Optional[str] = None,
        project: Optional[Union[str, UUID]] = None,
        model_version_id: Optional[Union[str, UUID]] = None,
        only_unused: Optional[bool] = False,
        has_custom_name: Optional[bool] = None,
        user: Optional[Union[UUID, str]] = None,
        model: Optional[Union[UUID, str]] = None,
        pipeline_run: Optional[Union[UUID, str]] = None,
        run_metadata: Optional[List[str]] = None,
        tag: Optional[str] = None,
        tags: Optional[List[str]] = None,
        hydrate: bool = False,
    ) -> Page[ArtifactVersionResponse]:
        """Get a list of artifact versions.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of the artifact version to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            artifact: The name or ID of the artifact to filter by.
            name: The name of the artifact to filter by.
            version: The version of the artifact to filter by.
            version_number: The version number of the artifact to filter by.
            artifact_store_id: The id of the artifact store to filter by.
            type: The type of the artifact to filter by.
            data_type: The data type of the artifact to filter by.
            uri: The uri of the artifact to filter by.
            materializer: The materializer of the artifact to filter by.
            project: The project name/ID to filter by.
            model_version_id: Filter by model version ID.
            only_unused: Only return artifact versions that are not used in
                any pipeline runs.
            has_custom_name: Filter artifacts with/without custom names.
            tag: A tag to filter by.
            tags: Tags to filter by.
            user: Filter by user name or ID.
            model: Filter by model name or ID.
            pipeline_run: Filter by pipeline run name or ID.
            run_metadata: Filter by run metadata.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A list of artifact versions.
        """
        if name:
            artifact = name

        artifact_version_filter_model = ArtifactVersionFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            artifact=artifact,
            version=str(version) if version else None,
            version_number=version_number,
            artifact_store_id=artifact_store_id,
            type=type,
            data_type=data_type,
            uri=uri,
            materializer=materializer,
            project=project or self.active_project.id,
            model_version_id=model_version_id,
            only_unused=only_unused,
            has_custom_name=has_custom_name,
            tag=tag,
            tags=tags,
            user=user,
            model=model,
            pipeline_run=pipeline_run,
            run_metadata=run_metadata,
        )
        return self.zen_store.list_artifact_versions(
            artifact_version_filter_model,
            hydrate=hydrate,
        )
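
    # Illustrative usage sketch: finding artifact versions that no pipeline run
    # references anymore, e.g. as candidates for cleanup.
    #
    #   client = Client()
    #   unused = client.list_artifact_versions(only_unused=True)
    #   for version in unused.items:
    #       print(version.artifact.name, version.version)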

    def update_artifact_version(
        self,
        name_id_or_prefix: Union[str, UUID],
        version: Optional[str] = None,
        add_tags: Optional[List[str]] = None,
        remove_tags: Optional[List[str]] = None,
        project: Optional[Union[str, UUID]] = None,
    ) -> ArtifactVersionResponse:
        """Update an artifact version.

        Args:
            name_id_or_prefix: The name, ID or prefix of the artifact to update.
            version: The version of the artifact to update. Only used if
                `name_id_or_prefix` is the name of the artifact. If not
                specified, the latest version is updated.
            add_tags: Tags to add to the artifact version.
            remove_tags: Tags to remove from the artifact version.
            project: The project name/ID to filter by.

        Returns:
            The updated artifact version.
        """
        artifact_version = self.get_artifact_version(
            name_id_or_prefix=name_id_or_prefix,
            version=version,
            project=project,
        )
        artifact_version_update = ArtifactVersionUpdate(
            add_tags=add_tags, remove_tags=remove_tags
        )
        return self.zen_store.update_artifact_version(
            artifact_version_id=artifact_version.id,
            artifact_version_update=artifact_version_update,
        )

    def delete_artifact_version(
        self,
        name_id_or_prefix: Union[str, UUID],
        version: Optional[str] = None,
        delete_metadata: bool = True,
        delete_from_artifact_store: bool = False,
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Delete an artifact version.

        By default, this will delete only the metadata of the artifact from the
        database, not the actual object stored in the artifact store.

        Args:
            name_id_or_prefix: The ID of the artifact version, or the name or
                prefix of the artifact to delete.
            version: The version of the artifact to delete.
            delete_metadata: If True, delete the metadata of the artifact
                version from the database.
            delete_from_artifact_store: If True, delete the artifact object
                itself from the artifact store.
            project: The project name/ID to filter by.
        """
        artifact_version = self.get_artifact_version(
            name_id_or_prefix=name_id_or_prefix,
            version=version,
            project=project,
        )
        if delete_from_artifact_store:
            self._delete_artifact_from_artifact_store(
                artifact_version=artifact_version
            )
        if delete_metadata:
            self._delete_artifact_version(artifact_version=artifact_version)

    def _delete_artifact_version(
        self, artifact_version: ArtifactVersionResponse
    ) -> None:
        """Delete the metadata of an artifact version from the database.

        Args:
            artifact_version: The artifact version to delete.

        Raises:
            ValueError: If the artifact version is still used in any runs.
        """
        if artifact_version not in depaginate(
            self.list_artifact_versions, only_unused=True
        ):
            raise ValueError(
                "The metadata of artifact versions that are used in runs "
                "cannot be deleted. Please delete all runs that use this "
                "artifact first."
            )
        self.zen_store.delete_artifact_version(artifact_version.id)
        logger.info(
            f"Deleted version '{artifact_version.version}' of artifact "
            f"'{artifact_version.artifact.name}'."
        )

    def _delete_artifact_from_artifact_store(
        self, artifact_version: ArtifactVersionResponse
    ) -> None:
        """Delete an artifact object from the artifact store.

        Args:
            artifact_version: The artifact version to delete.

        Raises:
            Exception: If the artifact store is inaccessible.
        """
        from zenml.artifact_stores.base_artifact_store import BaseArtifactStore
        from zenml.stack.stack_component import StackComponent

        if not artifact_version.artifact_store_id:
            logger.warning(
                f"Artifact '{artifact_version.uri}' does not have an artifact "
                "store associated with it. Skipping deletion from artifact "
                "store."
            )
            return
        try:
            artifact_store_model = self.get_stack_component(
                component_type=StackComponentType.ARTIFACT_STORE,
                name_id_or_prefix=artifact_version.artifact_store_id,
            )
            artifact_store = StackComponent.from_model(artifact_store_model)
            assert isinstance(artifact_store, BaseArtifactStore)
            artifact_store.rmtree(artifact_version.uri)
        except Exception as e:
            logger.error(
                f"Failed to delete artifact '{artifact_version.uri}' from the "
                "artifact store. This might happen if your local client "
                "does not have access to the artifact store or does not "
                "have the required integrations installed. Full error: "
                f"{e}"
            )
            raise e
        else:
            logger.info(
                f"Deleted artifact '{artifact_version.uri}' from the artifact "
                "store."
            )

    # ------------------------------ Run Metadata ------------------------------

    def create_run_metadata(
        self,
        metadata: Dict[str, "MetadataType"],
        resources: List[RunMetadataResource],
        stack_component_id: Optional[UUID] = None,
        publisher_step_id: Optional[UUID] = None,
    ) -> None:
        """Create run metadata.

        Args:
            metadata: The metadata to create as a dictionary of key-value pairs.
            resources: The list of IDs and types of the resources for which the
                metadata was produced.
            stack_component_id: The ID of the stack component that produced
                the metadata.
            publisher_step_id: The ID of the step execution that publishes
                this metadata automatically.
        """
        from zenml.metadata.metadata_types import get_metadata_type

        values: Dict[str, "MetadataType"] = {}
        types: Dict[str, "MetadataTypeEnum"] = {}
        for key, value in metadata.items():
            # Skip metadata that is too large to be stored in the database.
            if len(json.dumps(value)) > TEXT_FIELD_MAX_LENGTH:
                logger.warning(
                    f"Metadata value for key '{key}' is too large to be "
                    "stored in the database. Skipping."
                )
                continue
            # Skip metadata that is not of a supported type.
            try:
                metadata_type = get_metadata_type(value)
            except ValueError as e:
                logger.warning(
                    f"Metadata value for key '{key}' is not of a supported "
                    f"type. Skipping. Full error: {e}"
                )
                continue
            values[key] = value
            types[key] = metadata_type

        run_metadata = RunMetadataRequest(
            project=self.active_project.id,
            resources=resources,
            stack_component_id=stack_component_id,
            publisher_step_id=publisher_step_id,
            values=values,
            types=types,
        )
        self.zen_store.create_run_metadata(run_metadata)
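
    # Illustrative usage sketch: attaching metadata to a pipeline run. The
    # import paths for RunMetadataResource and MetadataResourceTypes are
    # assumptions based on the models used above; the run name and metadata
    # values are placeholders.
    #
    #   from zenml.enums import MetadataResourceTypes
    #   from zenml.models import RunMetadataResource
    #
    #   client = Client()
    #   run = client.get_pipeline_run("training_pipeline-2024_01_01-00_00_00")
    #   client.create_run_metadata(
    #       metadata={"accuracy": 0.92},
    #       resources=[
    #           RunMetadataResource(
    #               id=run.id, type=MetadataResourceTypes.PIPELINE_RUN
    #           )
    #       ],
    #   )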

    # -------------------------------- Secrets ---------------------------------

    def create_secret(
        self,
        name: str,
        values: Dict[str, str],
        private: bool = False,
    ) -> SecretResponse:
        """Creates a new secret.

        Args:
            name: The name of the secret.
            values: The values of the secret.
            private: Whether the secret is private. A private secret is only
                accessible to the user who created it.

        Returns:
            The created secret (in model form).

        Raises:
            NotImplementedError: If centralized secrets management is not
                enabled.
        """
        create_secret_request = SecretRequest(
            name=name,
            values=values,
            private=private,
        )
        try:
            return self.zen_store.create_secret(secret=create_secret_request)
        except NotImplementedError:
            raise NotImplementedError(
                "centralized secrets management is not supported or explicitly "
                "disabled in the target ZenML deployment."
            )
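
    # Illustrative usage sketch: registering a secret with two values. The
    # secret name and keys are placeholders.
    #
    #   client = Client()
    #   client.create_secret(
    #       name="aws_credentials",
    #       values={"aws_access_key_id": "...", "aws_secret_access_key": "..."},
    #   )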

    def get_secret(
        self,
        name_id_or_prefix: Union[str, UUID],
        private: Optional[bool] = None,
        allow_partial_name_match: bool = True,
        allow_partial_id_match: bool = True,
        hydrate: bool = True,
    ) -> SecretResponse:
        """Get a secret.

        Get a secret identified by a name, ID or prefix of the name or ID, and
        optionally a private status.

        If a private status is not provided, privately scoped secrets will be
        searched for first, followed by publicly scoped secrets. When a name or
        prefix is used instead of a UUID value, each private status is first
        searched for an exact match, then for an ID prefix or name substring
        match, before moving on to the next private status.

        Args:
            name_id_or_prefix: The name, ID or prefix to the id of the secret
                to get.
            private: Whether the secret is private. If not set, all secrets will
                be searched for, prioritizing privately scoped secrets.
            allow_partial_name_match: If True, allow partial name matches.
            allow_partial_id_match: If True, allow partial ID matches.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The secret.

        Raises:
            KeyError: If no secret is found.
            ZenKeyError: If multiple secrets are found.
            NotImplementedError: If centralized secrets management is not
                enabled.
        """
        from zenml.utils.uuid_utils import is_valid_uuid

        try:
            # First interpret as full UUID
            if is_valid_uuid(name_id_or_prefix):
                # Fetch by ID; filter by private status if provided
                secret = self.zen_store.get_secret(
                    secret_id=UUID(name_id_or_prefix)
                    if isinstance(name_id_or_prefix, str)
                    else name_id_or_prefix,
                    hydrate=hydrate,
                )
                if private is not None and secret.private != private:
                    raise KeyError(
                        f"No secret found with ID {str(name_id_or_prefix)}"
                    )

                return secret
        except NotImplementedError:
            raise NotImplementedError(
                "centralized secrets management is not supported or explicitly "
                "disabled in the target ZenML deployment."
            )

        # If not a UUID, try to find by name and then by prefix
        assert not isinstance(name_id_or_prefix, UUID)

        # Private statuses to search in order of priority
        search_private_statuses = (
            [False, True] if private is None else [private]
        )

        secrets = self.list_secrets(
            logical_operator=LogicalOperators.OR,
            name=f"contains:{name_id_or_prefix}"
            if allow_partial_name_match
            else f"equals:{name_id_or_prefix}",
            id=f"startswith:{name_id_or_prefix}"
            if allow_partial_id_match
            else None,
            hydrate=hydrate,
        )

        for search_private_status in search_private_statuses:
            partial_matches: List[SecretResponse] = []
            for secret in secrets.items:
                if secret.private != search_private_status:
                    continue
                # Exact match
                if secret.name == name_id_or_prefix:
                    # Need to fetch the secret again to get the secret values
                    return self.zen_store.get_secret(
                        secret_id=secret.id,
                        hydrate=hydrate,
                    )
                # Partial match
                partial_matches.append(secret)

            if len(partial_matches) > 1:
                match_summary = "\n".join(
                    [
                        f"[{secret.id}]: name = {secret.name}"
                        for secret in partial_matches
                    ]
                )
                raise ZenKeyError(
                    f"{len(partial_matches)} secrets have been found that have "
                    f"a name or ID that matches the provided "
                    f"string '{name_id_or_prefix}':\n"
                    f"{match_summary}.\n"
                    f"Please use the id to uniquely identify "
                    f"only one of the secrets."
                )

            # If only a single secret is found, return it
            if len(partial_matches) == 1:
                # Need to fetch the secret again to get the secret values
                return self.zen_store.get_secret(
                    secret_id=partial_matches[0].id,
                    hydrate=hydrate,
                )
        private_status = ""
        if private is not None:
            private_status = "private " if private else "public "
        msg = (
            f"No {private_status}secret found with name, ID or prefix "
            f"'{name_id_or_prefix}'"
        )

        raise KeyError(msg)
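
    # Illustrative usage sketch: resolving a secret by name or ID prefix and
    # reading its values. `secret_values` is assumed to expose the plain-text
    # values on the hydrated response; the secret name is a placeholder.
    #
    #   client = Client()
    #   secret = client.get_secret("aws_credentials")
    #   access_key = secret.secret_values["aws_access_key_id"]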

    def list_secrets(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        private: Optional[bool] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[SecretResponse]:
        """Fetches all the secret models.

        The returned secrets do not contain the secret values. To get the
        secret values, use `get_secret` individually for each secret.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of secrets to filter by.
            created: Use to filter secrets by time of creation
            updated: Use the last updated date for filtering
            name: The name of the secret to filter by.
            private: The private status of the secret to filter by.
            user: Filter by user name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A list of all the secret models without the secret values.

        Raises:
            NotImplementedError: If centralized secrets management is not
                enabled.
        """
        secret_filter_model = SecretFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            user=user,
            name=name,
            private=private,
            id=id,
            created=created,
            updated=updated,
        )
        try:
            return self.zen_store.list_secrets(
                secret_filter_model=secret_filter_model,
                hydrate=hydrate,
            )
        except NotImplementedError:
            raise NotImplementedError(
                "centralized secrets management is not supported or explicitly "
                "disabled in the target ZenML deployment."
            )

    def update_secret(
        self,
        name_id_or_prefix: Union[str, UUID],
        private: Optional[bool] = None,
        new_name: Optional[str] = None,
        update_private: Optional[bool] = None,
        add_or_update_values: Optional[Dict[str, str]] = None,
        remove_values: Optional[List[str]] = None,
    ) -> SecretResponse:
        """Updates a secret.

        Args:
            name_id_or_prefix: The name, id or prefix of the id for the
                secret to update.
            private: The private status of the secret to update.
            new_name: The new name of the secret.
            update_private: New value used to update the private status of the
                secret.
            add_or_update_values: The values to add or update.
            remove_values: The values to remove.

        Returns:
            The updated secret.

        Raises:
            KeyError: If trying to remove a value that doesn't exist.
            ValueError: If a key is provided in both add_or_update_values and
                remove_values.
        """
        secret = self.get_secret(
            name_id_or_prefix=name_id_or_prefix,
            private=private,
            # Don't allow partial name matches, but allow partial ID matches
            allow_partial_name_match=False,
            allow_partial_id_match=True,
            hydrate=True,
        )

        secret_update = SecretUpdate(name=new_name or secret.name)

        if update_private is not None:
            secret_update.private = update_private
        values: Dict[str, Optional[SecretStr]] = {}
        if add_or_update_values:
            values.update(
                {
                    key: SecretStr(value)
                    for key, value in add_or_update_values.items()
                }
            )
        if remove_values:
            for key in remove_values:
                if key not in secret.values:
                    raise KeyError(
                        f"Cannot remove value '{key}' from secret "
                        f"'{secret.name}' because it does not exist."
                    )
                if key in values:
                    raise ValueError(
                        f"Key '{key}' is supplied both in the values to add or "
                        f"update and the values to be removed."
                    )
                values[key] = None
        if values:
            secret_update.values = values

        return Client().zen_store.update_secret(
            secret_id=secret.id, secret_update=secret_update
        )
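
    # Illustrative usage sketch: rotating one value of a secret and dropping
    # another. The secret name and keys are placeholders.
    #
    #   client = Client()
    #   client.update_secret(
    #       name_id_or_prefix="aws_credentials",
    #       add_or_update_values={"aws_secret_access_key": "..."},
    #       remove_values=["aws_session_token"],
    #   )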

    def delete_secret(
        self, name_id_or_prefix: str, private: Optional[bool] = None
    ) -> None:
        """Deletes a secret.

        Args:
            name_id_or_prefix: The name or ID of the secret.
            private: The private status of the secret to delete.
        """
        secret = self.get_secret(
            name_id_or_prefix=name_id_or_prefix,
            private=private,
            # Don't allow partial name matches, but allow partial ID matches
            allow_partial_name_match=False,
            allow_partial_id_match=True,
        )

        self.zen_store.delete_secret(secret_id=secret.id)

    def get_secret_by_name_and_private_status(
        self,
        name: str,
        private: Optional[bool] = None,
        hydrate: bool = True,
    ) -> SecretResponse:
        """Fetches a registered secret with a given name and optional private status.

        This is a version of get_secret that restricts the search to a given
        name and an optional private status, without doing any prefix or UUID
        matching.

        If no private status is provided, the search will be done first for
        private secrets, then for public secrets.

        Args:
            name: The name of the secret to get.
            private: The private status of the secret to get.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The registered secret.

        Raises:
            KeyError: If no secret exists for the given name in the given scope.
        """
        logger.debug(
            f"Fetching the secret with name '{name}' and private status "
            f"'{private}'."
        )

        # Private statuses to search in order of priority
        search_private_statuses = (
            [False, True] if private is None else [private]
        )

        for search_private_status in search_private_statuses:
            secrets = self.list_secrets(
                logical_operator=LogicalOperators.AND,
                name=f"equals:{name}",
                private=search_private_status,
                hydrate=hydrate,
            )

            if len(secrets.items) >= 1:
                # Need to fetch the secret again to get the secret values
                return self.zen_store.get_secret(
                    secret_id=secrets.items[0].id, hydrate=hydrate
                )

        private_status = ""
        if private is not None:
            private_status = "private " if private else "public "
        msg = f"No {private_status}secret with name '{name}' was found"

        raise KeyError(msg)

    def list_secrets_by_private_status(
        self,
        private: bool,
        hydrate: bool = False,
    ) -> Page[SecretResponse]:
        """Fetches the list of secrets with a given private status.

        The returned secrets do not contain the secret values. To get the
        secret values, use `get_secret` individually for each secret.

        Args:
            private: The private status of the secrets to search for.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The list of secrets in the given scope without the secret values.
        """
        logger.debug(f"Fetching the secrets with private status '{private}'.")

        return self.list_secrets(private=private, hydrate=hydrate)

    def backup_secrets(
        self,
        ignore_errors: bool = True,
        delete_secrets: bool = False,
    ) -> None:
        """Backs up all secrets to the configured backup secrets store.

        Args:
            ignore_errors: Whether to ignore individual errors during the backup
                process and attempt to backup all secrets.
            delete_secrets: Whether to delete the secrets that have been
                successfully backed up from the primary secrets store. Setting
                this flag effectively moves all secrets from the primary secrets
                store to the backup secrets store.
        """
        self.zen_store.backup_secrets(
            ignore_errors=ignore_errors, delete_secrets=delete_secrets
        )

    def restore_secrets(
        self,
        ignore_errors: bool = False,
        delete_secrets: bool = False,
    ) -> None:
        """Restore all secrets from the configured backup secrets store.

        Args:
            ignore_errors: Whether to ignore individual errors during the
                restore process and attempt to restore all secrets.
            delete_secrets: Whether to delete the secrets that have been
                successfully restored from the backup secrets store. Setting
                this flag effectively moves all secrets from the backup secrets
                store to the primary secrets store.
        """
        self.zen_store.restore_secrets(
            ignore_errors=ignore_errors, delete_secrets=delete_secrets
        )

    # --------------------------- Code repositories ---------------------------

    @staticmethod
    def _validate_code_repository_config(
        source: Source, config: Dict[str, Any]
    ) -> None:
        """Validate a code repository config.

        Args:
            source: The code repository source.
            config: The code repository config.

        Raises:
            RuntimeError: If the provided config is invalid.
        """
        from zenml.code_repositories import BaseCodeRepository

        code_repo_class: Type[BaseCodeRepository] = (
            source_utils.load_and_validate_class(
                source=source, expected_class=BaseCodeRepository
            )
        )
        try:
            code_repo_class.validate_config(config)
        except Exception as e:
            raise RuntimeError(
                "Failed to validate code repository config."
            ) from e

    def create_code_repository(
        self,
        name: str,
        config: Dict[str, Any],
        source: Source,
        description: Optional[str] = None,
        logo_url: Optional[str] = None,
    ) -> CodeRepositoryResponse:
        """Create a new code repository.

        Args:
            name: Name of the code repository.
            config: The configuration for the code repository.
            source: The code repository implementation source.
            description: The code repository description.
            logo_url: URL of a logo (png, jpg or svg) for the code repository.

        Returns:
            The created code repository.
        """
        self._validate_code_repository_config(source=source, config=config)
        repo_request = CodeRepositoryRequest(
            project=self.active_project.id,
            name=name,
            config=config,
            source=source,
            description=description,
            logo_url=logo_url,
        )
        return self.zen_store.create_code_repository(
            code_repository=repo_request
        )

    def get_code_repository(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
    ) -> CodeRepositoryResponse:
        """Get a code repository by name, id or prefix.

        Args:
            name_id_or_prefix: The name, ID or ID prefix of the code repository.
            allow_name_prefix_match: If True, allow matching by name prefix.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The code repository.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_code_repository,
            list_method=self.list_code_repositories,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
            project=project,
        )

    def list_code_repositories(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        project: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[CodeRepositoryResponse]:
        """List all code repositories.

        Args:
            sort_by: The column to sort by.
            page: The page of items.
            size: The maximum size of all pages.
            logical_operator: Which logical operator to use [and, or].
            id: Use the id of the code repository to filter by.
            created: Use to filter by time of creation.
            updated: Use the last updated date for filtering.
            name: The name of the code repository to filter by.
            project: The project name/ID to filter by.
            user: Filter by user name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of code repositories matching the filter description.
        """
        filter_model = CodeRepositoryFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            project=project or self.active_project.id,
            user=user,
        )
        return self.zen_store.list_code_repositories(
            filter_model=filter_model,
            hydrate=hydrate,
        )

    def update_code_repository(
        self,
        name_id_or_prefix: Union[UUID, str],
        name: Optional[str] = None,
        description: Optional[str] = None,
        logo_url: Optional[str] = None,
        config: Optional[Dict[str, Any]] = None,
        project: Optional[Union[str, UUID]] = None,
    ) -> CodeRepositoryResponse:
        """Update a code repository.

        Args:
            name_id_or_prefix: Name, ID or prefix of the code repository to
                update.
            name: New name of the code repository.
            description: New description of the code repository.
            logo_url: New logo URL of the code repository.
            config: New configuration options for the code repository. Will
                be used to update the existing configuration values. To remove
                values from the existing configuration, set the value for that
                key to `None`.
            project: The project name/ID to filter by.

        Returns:
            The updated code repository.
        """
        repo = self.get_code_repository(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
            project=project,
        )
        update = CodeRepositoryUpdate(
            name=name, description=description, logo_url=logo_url
        )
        if config is not None:
            combined_config = repo.config
            combined_config.update(config)
            combined_config = {
                k: v for k, v in combined_config.items() if v is not None
            }

            self._validate_code_repository_config(
                source=repo.source, config=combined_config
            )
            update.config = combined_config

        return self.zen_store.update_code_repository(
            code_repository_id=repo.id, update=update
        )

    def delete_code_repository(
        self,
        name_id_or_prefix: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Delete a code repository.

        Args:
            name_id_or_prefix: The name, ID or prefix of the code repository.
            project: The project name/ID to filter by.
        """
        repo = self.get_code_repository(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
            project=project,
        )
        self.zen_store.delete_code_repository(code_repository_id=repo.id)

    # --------------------------- Service Connectors ---------------------------

    def create_service_connector(
        self,
        name: str,
        connector_type: str,
        resource_type: Optional[str] = None,
        auth_method: Optional[str] = None,
        configuration: Optional[Dict[str, str]] = None,
        resource_id: Optional[str] = None,
        description: str = "",
        expiration_seconds: Optional[int] = None,
        expires_at: Optional[datetime] = None,
        expires_skew_tolerance: Optional[int] = None,
        labels: Optional[Dict[str, str]] = None,
        auto_configure: bool = False,
        verify: bool = True,
        list_resources: bool = True,
        register: bool = True,
    ) -> Tuple[
        Optional[
            Union[
                ServiceConnectorResponse,
                ServiceConnectorRequest,
            ]
        ],
        Optional[ServiceConnectorResourcesModel],
    ]:
        """Create, validate and/or register a service connector.

        Args:
            name: The name of the service connector.
            connector_type: The service connector type.
            resource_type: The resource type for the service connector.
            auth_method: The authentication method of the service connector.
                May be omitted if auto-configuration is used.
            configuration: The configuration of the service connector.
            resource_id: The resource id of the service connector.
            description: The description of the service connector.
            expiration_seconds: The expiration time of the service connector,
                in seconds.
            expires_at: The UTC expiration time of the service connector.
            expires_skew_tolerance: The allowed expiration skew for the service
                connector credentials.
            labels: The labels of the service connector.
            auto_configure: Whether to automatically configure the service
                connector from the local environment.
            verify: Whether to verify that the service connector configuration
                and credentials can be used to gain access to the resource.
            list_resources: Whether to also list the resources that the service
                connector can give access to (if verify is True).
            register: Whether to register the service connector or not.

        Returns:
            The model of the registered service connector and the resources
            that the service connector can give access to (if verify is True).

        Raises:
            ValueError: If the arguments are invalid.
            KeyError: If the service connector type is not found.
            NotImplementedError: If auto-configuration is not supported or
                not implemented for the service connector type.
            AuthorizationException: If the connector verification failed due
                to authorization issues.
        """
        from zenml.service_connectors.service_connector_registry import (
            service_connector_registry,
        )

        connector_instance: Optional[ServiceConnector] = None
        connector_resources: Optional[ServiceConnectorResourcesModel] = None

        # Get the service connector type class
        try:
            connector = self.zen_store.get_service_connector_type(
                connector_type=connector_type,
            )
        except KeyError:
            raise KeyError(
                f"Service connector type {connector_type} not found."
                "Please check that you have installed all required "
                "Python packages and ZenML integrations and try again."
            )

        if not resource_type and len(connector.resource_types) == 1:
            resource_type = connector.resource_types[0].resource_type

        # If auto_configure is set, we will try to automatically configure the
        # service connector from the local environment
        if auto_configure:
            if not connector.supports_auto_configuration:
                raise NotImplementedError(
                    f"The {connector.name} service connector type "
                    "does not support auto-configuration."
                )
            if not connector.local:
                raise NotImplementedError(
                    f"The {connector.name} service connector type "
                    "implementation is not available locally. Please "
                    "check that you have installed all required Python "
                    "packages and ZenML integrations and try again, or "
                    "skip auto-configuration."
                )

            assert connector.connector_class is not None

            connector_instance = connector.connector_class.auto_configure(
                resource_type=resource_type,
                auth_method=auth_method,
                resource_id=resource_id,
            )
            assert connector_instance is not None
            connector_request = connector_instance.to_model(
                name=name,
                description=description or "",
                labels=labels,
            )

            if verify:
                # Prefer to verify the connector config server-side if the
                # implementation is available there, because it ensures
                # that the connector can be shared with other users or used
                # from other machines and because some auth methods rely on the
                # server-side authentication environment
                if connector.remote:
                    connector_resources = (
                        self.zen_store.verify_service_connector_config(
                            connector_request,
                            list_resources=list_resources,
                        )
                    )
                else:
                    connector_resources = connector_instance.verify(
                        list_resources=list_resources,
                    )

                if connector_resources.error:
                    # Raise an exception if the connector verification failed
                    raise AuthorizationException(connector_resources.error)

        else:
            if not auth_method:
                if len(connector.auth_methods) == 1:
                    auth_method = connector.auth_methods[0].auth_method
                else:
                    raise ValueError(
                        f"Multiple authentication methods are available for "
                        f"the {connector.name} service connector type. Please "
                        f"specify one of the following: "
                        f"{list(connector.auth_method_dict.keys())}."
                    )

            connector_request = ServiceConnectorRequest(
                name=name,
                connector_type=connector_type,
                description=description,
                auth_method=auth_method,
                expiration_seconds=expiration_seconds,
                expires_at=expires_at,
                expires_skew_tolerance=expires_skew_tolerance,
                labels=labels or {},
            )
            # Validate and configure the resources
            connector_request.validate_and_configure_resources(
                connector_type=connector,
                resource_types=resource_type,
                resource_id=resource_id,
                configuration=configuration,
            )
            if verify:
                # Prefer to verify the connector config server-side if the
                # implementation is available there, because it ensures
                # that the connector can be shared with other users or used
                # from other machines and because some auth methods rely on the
                # server-side authentication environment
                if connector.remote:
                    connector_resources = (
                        self.zen_store.verify_service_connector_config(
                            connector_request,
                            list_resources=list_resources,
                        )
                    )
                else:
                    connector_instance = (
                        service_connector_registry.instantiate_connector(
                            model=connector_request
                        )
                    )
                    connector_resources = connector_instance.verify(
                        list_resources=list_resources,
                    )

                if connector_resources.error:
                    # Raise an exception if the connector verification failed
                    raise AuthorizationException(connector_resources.error)

                # For resource types that don't support multi-instances, it's
                # better to save the default resource ID in the connector, if
                # available. Otherwise, we'll need to instantiate the connector
                # again to get the default resource ID.
                connector_request.resource_id = (
                    connector_request.resource_id
                    or connector_resources.get_default_resource_id()
                )

        if not register:
            return connector_request, connector_resources

        # Register the new service connector
        connector_response = self.zen_store.create_service_connector(
            service_connector=connector_request
        )

        if connector_resources:
            connector_resources.id = connector_response.id
            connector_resources.name = connector_response.name
            connector_resources.connector_type = (
                connector_response.connector_type
            )

        return connector_response, connector_resources
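    # Illustrative sketch (not part of the original source): registering and
    # verifying a service connector. The connector type, auth method and
    # configuration keys shown here are hypothetical placeholders.
    #
    #     client = Client()
    #     connector, resources = client.create_service_connector(
    #         name="my-aws-connector",
    #         connector_type="aws",
    #         auth_method="secret-key",
    #         configuration={
    #             "aws_access_key_id": "...",
    #             "aws_secret_access_key": "...",
    #         },
    #         verify=True,    # check credentials before registering
    #         register=True,  # set to False to only validate
    #     )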

    def get_service_connector(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        load_secrets: bool = False,
        hydrate: bool = True,
    ) -> ServiceConnectorResponse:
        """Fetches a registered service connector.

        Args:
            name_id_or_prefix: The name, ID or prefix of the service connector
                to fetch.
            allow_name_prefix_match: If True, allow matching by name prefix.
            load_secrets: If True, load the secrets for the service connector.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The registered service connector.
        """
        connector = self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_service_connector,
            list_method=self.list_service_connectors,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

        if load_secrets and connector.secret_id:
            client = Client()
            try:
                secret = client.get_secret(
                    name_id_or_prefix=connector.secret_id,
                    allow_partial_id_match=False,
                    allow_partial_name_match=False,
                )
            except KeyError as err:
                logger.error(
                    "Unable to retrieve secret values associated with "
                    f"service connector '{connector.name}': {err}"
                )
            else:
                # Add secret values to connector configuration
                connector.secrets.update(secret.values)

        return connector
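    # Illustrative sketch (not part of the original source): fetching a
    # connector together with its secret values. "my-aws-connector" is a
    # hypothetical name.
    #
    #     connector = Client().get_service_connector(
    #         name_id_or_prefix="my-aws-connector",
    #         load_secrets=True,  # merge secret values into connector.secrets
    #     )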

    def list_service_connectors(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        connector_type: Optional[str] = None,
        auth_method: Optional[str] = None,
        resource_type: Optional[str] = None,
        resource_id: Optional[str] = None,
        user: Optional[Union[UUID, str]] = None,
        labels: Optional[Dict[str, Optional[str]]] = None,
        secret_id: Optional[Union[str, UUID]] = None,
        hydrate: bool = False,
    ) -> Page[ServiceConnectorResponse]:
        """Lists all registered service connectors.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: The id of the service connector to filter by.
            created: Filter service connectors by time of creation
            updated: Use the last updated date for filtering
            connector_type: Use the service connector type for filtering
            auth_method: Use the service connector auth method for filtering
            resource_type: Filter service connectors by the resource type that
                they can give access to.
            resource_id: Filter service connectors by the resource id that
                they can give access to.
            user: Filter by user name/ID.
            name: The name of the service connector to filter by.
            labels: The labels of the service connector to filter by.
            secret_id: Filter by the id of the secret that is referenced by the
                service connector.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of service connectors.
        """
        connector_filter_model = ServiceConnectorFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            user=user,
            name=name,
            connector_type=connector_type,
            auth_method=auth_method,
            resource_type=resource_type,
            resource_id=resource_id,
            id=id,
            created=created,
            updated=updated,
            labels=labels,
            secret_id=secret_id,
        )
        return self.zen_store.list_service_connectors(
            filter_model=connector_filter_model,
            hydrate=hydrate,
        )
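    # Illustrative sketch (not part of the original source): listing connectors
    # that can provide access to a given resource type. The resource type
    # string is a hypothetical example.
    #
    #     page = Client().list_service_connectors(
    #         resource_type="s3-bucket",
    #         sort_by="created",
    #         size=20,
    #     )
    #     for connector in page.items:
    #         print(connector.name, connector.connector_type)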

    def update_service_connector(
        self,
        name_id_or_prefix: Union[UUID, str],
        name: Optional[str] = None,
        auth_method: Optional[str] = None,
        resource_type: Optional[str] = None,
        configuration: Optional[Dict[str, str]] = None,
        resource_id: Optional[str] = None,
        description: Optional[str] = None,
        expires_at: Optional[datetime] = None,
        expires_skew_tolerance: Optional[int] = None,
        expiration_seconds: Optional[int] = None,
        labels: Optional[Dict[str, Optional[str]]] = None,
        verify: bool = True,
        list_resources: bool = True,
        update: bool = True,
    ) -> Tuple[
        Optional[
            Union[
                ServiceConnectorResponse,
                ServiceConnectorUpdate,
            ]
        ],
        Optional[ServiceConnectorResourcesModel],
    ]:
        """Validate and/or register an updated service connector.

        If the `resource_type`, `resource_id` and `expiration_seconds`
        parameters are set to their "empty" values (empty string for resource
        type and resource ID, 0 for expiration seconds), the existing values
        will be removed from the service connector. Setting them to None or
        omitting them will not affect the existing values.

        If supplied, the `configuration` parameter is a full replacement of the
        existing configuration rather than a partial update.

        Labels can be updated or removed by setting the label value to None.

        Args:
            name_id_or_prefix: The name, id or prefix of the service connector
                to update.
            name: The new name of the service connector.
            auth_method: The new authentication method of the service connector.
            resource_type: The new resource type for the service connector.
                If set to the empty string, the existing resource type will be
                removed.
            configuration: The new configuration of the service connector. If
                set, this needs to be a full replacement of the existing
                configuration rather than a partial update.
            resource_id: The new resource id of the service connector.
                If set to the empty string, the existing resource ID will be
                removed.
            description: The description of the service connector.
            expires_at: The new UTC expiration time of the service connector.
            expires_skew_tolerance: The allowed expiration skew for the service
                connector credentials.
            expiration_seconds: The expiration time of the service connector.
                If set to 0, the existing expiration time will be removed.
            labels: The labels of the service connector to update or remove.
                If a label value is set to None, the label will be removed.
            verify: Whether to verify that the service connector configuration
                and credentials can be used to gain access to the resource.
            list_resources: Whether to also list the resources that the service
                connector can give access to (if verify is True).
            update: Whether to update the service connector or not.

        Returns:
            The model of the registered service connector and the resources
            that the service connector can give access to (if verify is True).

        Raises:
            AuthorizationException: If the service connector verification
                fails due to invalid credentials or insufficient permissions.
        """
        from zenml.service_connectors.service_connector_registry import (
            service_connector_registry,
        )

        connector_model = self.get_service_connector(
            name_id_or_prefix,
            allow_name_prefix_match=False,
            load_secrets=True,
        )

        connector_instance: Optional[ServiceConnector] = None
        connector_resources: Optional[ServiceConnectorResourcesModel] = None

        if isinstance(connector_model.connector_type, str):
            connector = self.get_service_connector_type(
                connector_model.connector_type
            )
        else:
            connector = connector_model.connector_type

        resource_types: Optional[Union[str, List[str]]] = None
        if resource_type == "":
            resource_types = None
        elif resource_type is None:
            resource_types = connector_model.resource_types
        else:
            resource_types = resource_type

        if not resource_type and len(connector.resource_types) == 1:
            resource_types = connector.resource_types[0].resource_type

        if resource_id == "":
            resource_id = None
        elif resource_id is None:
            resource_id = connector_model.resource_id

        if expiration_seconds == 0:
            expiration_seconds = None
        elif expiration_seconds is None:
            expiration_seconds = connector_model.expiration_seconds

        connector_update = ServiceConnectorUpdate(
            name=name or connector_model.name,
            connector_type=connector.connector_type,
            description=description or connector_model.description,
            auth_method=auth_method or connector_model.auth_method,
            expires_at=expires_at,
            expires_skew_tolerance=expires_skew_tolerance,
            expiration_seconds=expiration_seconds,
        )

        # Validate and configure the resources
        if configuration is not None:
            # The supplied configuration is a drop-in replacement for the
            # existing configuration and secrets
            connector_update.validate_and_configure_resources(
                connector_type=connector,
                resource_types=resource_types,
                resource_id=resource_id,
                configuration=configuration,
            )
        else:
            connector_update.validate_and_configure_resources(
                connector_type=connector,
                resource_types=resource_types,
                resource_id=resource_id,
                configuration=connector_model.configuration,
                secrets=connector_model.secrets,
            )

        # Add the labels
        if labels is not None:
            # Apply the new label values, but don't keep any labels that
            # have been set to None in the update
            connector_update.labels = {
                **{
                    label: value
                    for label, value in connector_model.labels.items()
                    if label not in labels
                },
                **{
                    label: value
                    for label, value in labels.items()
                    if value is not None
                },
            }
        else:
            connector_update.labels = connector_model.labels

        if verify:
            # Prefer to verify the connector config server-side if the
            # implementation is available there, because it ensures
            # that the connector can be shared with other users or used
            # from other machines and because some auth methods rely on the
            # server-side authentication environment

            # Convert the update model to a request model for validation
            connector_request_dict = connector_update.model_dump()
            connector_request = ServiceConnectorRequest.model_validate(
                connector_request_dict
            )

            if connector.remote:
                connector_resources = (
                    self.zen_store.verify_service_connector_config(
                        service_connector=connector_request,
                        list_resources=list_resources,
                    )
                )
            else:
                connector_instance = (
                    service_connector_registry.instantiate_connector(
                        model=connector_request,
                    )
                )
                connector_resources = connector_instance.verify(
                    list_resources=list_resources
                )

            if connector_resources.error:
                raise AuthorizationException(connector_resources.error)

            # For resource types that don't support multi-instances, it's
            # better to save the default resource ID in the connector, if
            # available. Otherwise, we'll need to instantiate the connector
            # again to get the default resource ID.
            connector_update.resource_id = (
                connector_update.resource_id
                or connector_resources.get_default_resource_id()
            )

        if not update:
            return connector_update, connector_resources

        # Update the model
        connector_response = self.zen_store.update_service_connector(
            service_connector_id=connector_model.id,
            update=connector_update,
        )

        if connector_resources:
            connector_resources.id = connector_response.id
            connector_resources.name = connector_response.name
            connector_resources.connector_type = (
                connector_response.connector_type
            )

        return connector_response, connector_resources
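    # Illustrative sketch (not part of the original source): updating a
    # connector. Per the docstring above, empty-string / zero values remove the
    # existing resource ID or expiration, while None leaves them untouched, and
    # a supplied configuration fully replaces the existing one. The connector
    # name and labels below are hypothetical placeholders.
    #
    #     connector, resources = Client().update_service_connector(
    #         name_id_or_prefix="my-aws-connector",
    #         resource_id="",        # remove the pinned resource ID
    #         expiration_seconds=0,  # remove the expiration time
    #         labels={"team": "ml", "deprecated": None},  # None removes a label
    #         verify=True,
    #     )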

    def delete_service_connector(
        self,
        name_id_or_prefix: Union[str, UUID],
    ) -> None:
        """Deletes a registered service connector.

        Args:
            name_id_or_prefix: The ID or name of the service connector to delete.
        """
        service_connector = self.get_service_connector(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
        )

        self.zen_store.delete_service_connector(
            service_connector_id=service_connector.id
        )
        logger.info(
            "Removed service connector (type: %s) with name '%s'.",
            service_connector.type,
            service_connector.name,
        )

    def verify_service_connector(
        self,
        name_id_or_prefix: Union[UUID, str],
        resource_type: Optional[str] = None,
        resource_id: Optional[str] = None,
        list_resources: bool = True,
    ) -> "ServiceConnectorResourcesModel":
        """Verifies if a service connector has access to one or more resources.

        Args:
            name_id_or_prefix: The name, id or prefix of the service connector
                to verify.
            resource_type: The type of the resource for which to verify access.
                If not provided, the resource type from the service connector
                configuration will be used.
            resource_id: The ID of the resource for which to verify access. If
                not provided, the resource ID from the service connector
                configuration will be used.
            list_resources: Whether to list the resources that the service
                connector has access to.

        Returns:
            The list of resources that the service connector has access to,
            scoped to the supplied resource type and ID, if provided.

        Raises:
            AuthorizationException: If the service connector does not have
                access to the resources.
        """
        from zenml.service_connectors.service_connector_registry import (
            service_connector_registry,
        )

        # Get the service connector model
        service_connector = self.get_service_connector(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
        )

        connector_type = self.get_service_connector_type(
            service_connector.type
        )

        # Prefer to verify the connector config server-side if the
        # implementation is available there, because it ensures
        # that the connector can be shared with other users or used
        # from other machines and because some auth methods rely on the
        # server-side authentication environment
        if connector_type.remote:
            connector_resources = self.zen_store.verify_service_connector(
                service_connector_id=service_connector.id,
                resource_type=resource_type,
                resource_id=resource_id,
                list_resources=list_resources,
            )
        else:
            connector_instance = (
                service_connector_registry.instantiate_connector(
                    model=service_connector
                )
            )
            connector_resources = connector_instance.verify(
                resource_type=resource_type,
                resource_id=resource_id,
                list_resources=list_resources,
            )

        if connector_resources.error:
            raise AuthorizationException(connector_resources.error)

        return connector_resources
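    # Illustrative sketch (not part of the original source): verifying access
    # to a specific resource. The connector name, resource type and resource ID
    # are hypothetical placeholders.
    #
    #     resources = Client().verify_service_connector(
    #         name_id_or_prefix="my-aws-connector",
    #         resource_type="s3-bucket",
    #         resource_id="my-bucket",
    #     )  # raises AuthorizationException if access cannot be established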

    def login_service_connector(
        self,
        name_id_or_prefix: Union[UUID, str],
        resource_type: Optional[str] = None,
        resource_id: Optional[str] = None,
        **kwargs: Any,
    ) -> "ServiceConnector":
        """Use a service connector to authenticate a local client/SDK.

        Args:
            name_id_or_prefix: The name, id or prefix of the service connector
                to use.
            resource_type: The type of the resource to connect to. If not
                provided, the resource type from the service connector
                configuration will be used.
            resource_id: The ID of a particular resource instance to configure
                the local client to connect to. If the connector instance is
                already configured with a resource ID that is not the same or
                equivalent to the one requested, a `ValueError` exception is
                raised. May be omitted for connectors and resource types that do
                not support multiple resource instances.
            kwargs: Additional implementation specific keyword arguments to use
                to configure the client.

        Returns:
            The service connector client instance that was used to configure the
            local client.
        """
        connector_client = self.get_service_connector_client(
            name_id_or_prefix=name_id_or_prefix,
            resource_type=resource_type,
            resource_id=resource_id,
            verify=False,
        )

        connector_client.configure_local_client(
            **kwargs,
        )

        return connector_client
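    # Illustrative sketch (not part of the original source): configuring a
    # locally installed client/SDK through a connector. The connector name and
    # resource type are hypothetical placeholders.
    #
    #     Client().login_service_connector(
    #         name_id_or_prefix="my-kubernetes-connector",
    #         resource_type="kubernetes-cluster",
    #     )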

    def get_service_connector_client(
        self,
        name_id_or_prefix: Union[UUID, str],
        resource_type: Optional[str] = None,
        resource_id: Optional[str] = None,
        verify: bool = False,
    ) -> "ServiceConnector":
        """Get the client side of a service connector instance to use with a local client.

        Args:
            name_id_or_prefix: The name, id or prefix of the service connector
                to use.
            resource_type: The type of the resource to connect to. If not
                provided, the resource type from the service connector
                configuration will be used.
            resource_id: The ID of a particular resource instance to configure
                the local client to connect to. If the connector instance is
                already configured with a resource ID that is not the same or
                equivalent to the one requested, a `ValueError` exception is
                raised. May be omitted for connectors and resource types that do
                not support multiple resource instances.
            verify: Whether to verify that the service connector configuration
                and credentials can be used to gain access to the resource.

        Returns:
            The client side of the indicated service connector instance that can
            be used to connect to the resource locally.
        """
        from zenml.service_connectors.service_connector_registry import (
            service_connector_registry,
        )

        # Get the service connector model
        service_connector = self.get_service_connector(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
        )

        connector_type = self.get_service_connector_type(
            service_connector.type
        )

        # Prefer to fetch the connector client from the server if the
        # implementation is available there, because some auth methods rely on
        # the server-side authentication environment
        if connector_type.remote:
            connector_client_model = (
                self.zen_store.get_service_connector_client(
                    service_connector_id=service_connector.id,
                    resource_type=resource_type,
                    resource_id=resource_id,
                )
            )

            connector_client = (
                service_connector_registry.instantiate_connector(
                    model=connector_client_model
                )
            )

            if verify:
                # Verify the connector client on the local machine, because the
                # server-side implementation may not be able to do so
                connector_client.verify()
        else:
            connector_instance = (
                service_connector_registry.instantiate_connector(
                    model=service_connector
                )
            )

            # Fetch the connector client
            connector_client = connector_instance.get_connector_client(
                resource_type=resource_type,
                resource_id=resource_id,
            )

        return connector_client
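    # Illustrative sketch (not part of the original source): obtaining the
    # client side of a connector for local use. The connector name and resource
    # type are hypothetical placeholders.
    #
    #     connector_client = Client().get_service_connector_client(
    #         name_id_or_prefix="my-aws-connector",
    #         resource_type="s3-bucket",
    #         verify=True,  # verify locally before using the client
    #     )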

    def list_service_connector_resources(
        self,
        connector_type: Optional[str] = None,
        resource_type: Optional[str] = None,
        resource_id: Optional[str] = None,
    ) -> List[ServiceConnectorResourcesModel]:
        """List resources that can be accessed by service connectors.

        Args:
            connector_type: The type of service connector to filter by.
            resource_type: The type of resource to filter by.
            resource_id: The ID of a particular resource instance to filter by.

        Returns:
            The matching list of resources that available service
            connectors have access to.
        """
        return self.zen_store.list_service_connector_resources(
            ServiceConnectorFilter(
                connector_type=connector_type,
                resource_type=resource_type,
                resource_id=resource_id,
            )
        )

    def list_service_connector_types(
        self,
        connector_type: Optional[str] = None,
        resource_type: Optional[str] = None,
        auth_method: Optional[str] = None,
    ) -> List[ServiceConnectorTypeModel]:
        """Get a list of service connector types.

        Args:
            connector_type: Filter by connector type.
            resource_type: Filter by resource type.
            auth_method: Filter by authentication method.

        Returns:
            List of service connector types.
        """
        return self.zen_store.list_service_connector_types(
            connector_type=connector_type,
            resource_type=resource_type,
            auth_method=auth_method,
        )

    def get_service_connector_type(
        self,
        connector_type: str,
    ) -> ServiceConnectorTypeModel:
        """Returns the requested service connector type.

        Args:
            connector_type: the service connector type identifier.

        Returns:
            The requested service connector type.
        """
        return self.zen_store.get_service_connector_type(
            connector_type=connector_type,
        )

    #########
    # Model
    #########

    def create_model(
        self,
        name: str,
        license: Optional[str] = None,
        description: Optional[str] = None,
        audience: Optional[str] = None,
        use_cases: Optional[str] = None,
        limitations: Optional[str] = None,
        trade_offs: Optional[str] = None,
        ethics: Optional[str] = None,
        tags: Optional[List[str]] = None,
        save_models_to_registry: bool = True,
    ) -> ModelResponse:
        """Creates a new model in Model Control Plane.

        Args:
            name: The name of the model.
            license: The license under which the model is created.
            description: The description of the model.
            audience: The target audience of the model.
            use_cases: The use cases of the model.
            limitations: The known limitations of the model.
            trade_offs: The tradeoffs of the model.
            ethics: The ethical implications of the model.
            tags: Tags associated with the model.
            save_models_to_registry: Whether to save the model to the
                registry.

        Returns:
            The newly created model.
        """
        return self.zen_store.create_model(
            model=ModelRequest(
                name=name,
                license=license,
                description=description,
                audience=audience,
                use_cases=use_cases,
                limitations=limitations,
                trade_offs=trade_offs,
                ethics=ethics,
                tags=tags,
                project=self.active_project.id,
                save_models_to_registry=save_models_to_registry,
            )
        )
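    # Illustrative sketch (not part of the original source): registering a new
    # model in the Model Control Plane. All field values are hypothetical.
    #
    #     model = Client().create_model(
    #         name="churn-predictor",
    #         license="Apache-2.0",
    #         description="Predicts customer churn",
    #         tags=["tabular", "classification"],
    #     )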

    def delete_model(
        self,
        model_name_or_id: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
    ) -> None:
        """Deletes a model from Model Control Plane.

        Args:
            model_name_or_id: name or id of the model to be deleted.
            project: The project name/ID to filter by.
        """
        model = self.get_model(
            model_name_or_id=model_name_or_id, project=project
        )
        self.zen_store.delete_model(model_id=model.id)

    def update_model(
        self,
        model_name_or_id: Union[str, UUID],
        name: Optional[str] = None,
        license: Optional[str] = None,
        description: Optional[str] = None,
        audience: Optional[str] = None,
        use_cases: Optional[str] = None,
        limitations: Optional[str] = None,
        trade_offs: Optional[str] = None,
        ethics: Optional[str] = None,
        add_tags: Optional[List[str]] = None,
        remove_tags: Optional[List[str]] = None,
        save_models_to_registry: Optional[bool] = None,
        project: Optional[Union[str, UUID]] = None,
    ) -> ModelResponse:
        """Updates an existing model in Model Control Plane.

        Args:
            model_name_or_id: name or id of the model to be updated.
            name: The name of the model.
            license: The license under which the model is created.
            description: The description of the model.
            audience: The target audience of the model.
            use_cases: The use cases of the model.
            limitations: The known limitations of the model.
            trade_offs: The tradeoffs of the model.
            ethics: The ethical implications of the model.
            add_tags: Tags to add to the model.
            remove_tags: Tags to remove from the model.
            save_models_to_registry: Whether to save the model to the
                registry.
            project: The project name/ID to filter by.

        Returns:
            The updated model.
        """
        model = self.get_model(
            model_name_or_id=model_name_or_id, project=project
        )
        return self.zen_store.update_model(
            model_id=model.id,
            model_update=ModelUpdate(
                name=name,
                license=license,
                description=description,
                audience=audience,
                use_cases=use_cases,
                limitations=limitations,
                trade_offs=trade_offs,
                ethics=ethics,
                add_tags=add_tags,
                remove_tags=remove_tags,
                save_models_to_registry=save_models_to_registry,
            ),
        )

    def get_model(
        self,
        model_name_or_id: Union[str, UUID],
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
        bypass_lazy_loader: bool = False,
    ) -> ModelResponse:
        """Get an existing model from Model Control Plane.

        Args:
            model_name_or_id: name or id of the model to be retrieved.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            bypass_lazy_loader: Whether to bypass the lazy loader.

        Returns:
            The model of interest.
        """
        if not bypass_lazy_loader:
            if cll := client_lazy_loader(
                "get_model",
                model_name_or_id=model_name_or_id,
                hydrate=hydrate,
                project=project,
            ):
                return cll  # type: ignore[return-value]

        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_model,
            list_method=self.list_models,
            name_id_or_prefix=model_name_or_id,
            project=project,
            hydrate=hydrate,
        )

    def list_models(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        id: Optional[Union[UUID, str]] = None,
        user: Optional[Union[UUID, str]] = None,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = False,
        tag: Optional[str] = None,
        tags: Optional[List[str]] = None,
    ) -> Page[ModelResponse]:
        """Get models by filter from Model Control Plane.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: The name of the model to filter by.
            id: The id of the model to filter by.
            user: Filter by user name/ID.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            tag: The tag of the model to filter by.
            tags: Tags to filter by.

        Returns:
            A page object with all models.
        """
        filter = ModelFilter(
            name=name,
            id=id,
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            created=created,
            updated=updated,
            tag=tag,
            tags=tags,
            user=user,
            project=project or self.active_project.id,
        )

        return self.zen_store.list_models(
            model_filter_model=filter, hydrate=hydrate
        )

    #################
    # Model Versions
    #################

    def create_model_version(
        self,
        model_name_or_id: Union[str, UUID],
        name: Optional[str] = None,
        description: Optional[str] = None,
        tags: Optional[List[str]] = None,
        project: Optional[Union[str, UUID]] = None,
    ) -> ModelVersionResponse:
        """Creates a new model version in Model Control Plane.

        Args:
            model_name_or_id: the name or id of the model in which to create
                the model version.
            name: the name of the Model Version to be created.
            description: the description of the Model Version to be created.
            tags: Tags associated with the model.
            project: The project name/ID to filter by.

        Returns:
            The newly created model version.
        """
        model = self.get_model(
            model_name_or_id=model_name_or_id, project=project
        )
        return self.zen_store.create_model_version(
            model_version=ModelVersionRequest(
                name=name,
                description=description,
                project=model.project_id,
                model=model.id,
                tags=tags,
            )
        )
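    # Illustrative sketch (not part of the original source): creating a new
    # version of an existing model. The model and version names are
    # hypothetical placeholders.
    #
    #     version = Client().create_model_version(
    #         model_name_or_id="churn-predictor",
    #         name="2024-06-01",
    #         tags=["candidate"],
    #     )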

    def delete_model_version(
        self,
        model_version_id: UUID,
    ) -> None:
        """Deletes a model version from Model Control Plane.

        Args:
            model_version_id: Id of the model version to be deleted.
        """
        self.zen_store.delete_model_version(
            model_version_id=model_version_id,
        )

    def get_model_version(
        self,
        model_name_or_id: Optional[Union[str, UUID]] = None,
        model_version_name_or_number_or_id: Optional[
            Union[str, int, ModelStages, UUID]
        ] = None,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
    ) -> ModelVersionResponse:
        """Get an existing model version from Model Control Plane.

        Args:
            model_name_or_id: name or id of the model containing the model
                version.
            model_version_name_or_number_or_id: name, id, stage or number of
                the model version to be retrieved. If omitted, the latest
                version is retrieved.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The model version of interest.

        Raises:
            RuntimeError: In case method inputs don't adhere to restrictions.
            KeyError: In case no model version with the identifiers exists.
            ValueError: In case retrieval is attempted using a non-UUID model
                version identifier and no model identifier is provided.
        """
        if (
            not is_valid_uuid(model_version_name_or_number_or_id)
            and model_name_or_id is None
        ):
            raise ValueError(
                "No model identifier provided and model version identifier "
                f"`{model_version_name_or_number_or_id}` is not a valid UUID."
            )
        if cll := client_lazy_loader(
            "get_model_version",
            model_name_or_id=model_name_or_id,
            model_version_name_or_number_or_id=model_version_name_or_number_or_id,
            project=project,
            hydrate=hydrate,
        ):
            return cll  # type: ignore[return-value]

        if model_version_name_or_number_or_id is None:
            model_version_name_or_number_or_id = ModelStages.LATEST

        if isinstance(model_version_name_or_number_or_id, UUID):
            return self.zen_store.get_model_version(
                model_version_id=model_version_name_or_number_or_id,
                hydrate=hydrate,
            )
        elif isinstance(model_version_name_or_number_or_id, int):
            model_versions = self.zen_store.list_model_versions(
                model_version_filter_model=ModelVersionFilter(
                    model=model_name_or_id,
                    number=model_version_name_or_number_or_id,
                    project=project or self.active_project.id,
                ),
                hydrate=hydrate,
            ).items
        elif isinstance(model_version_name_or_number_or_id, str):
            if model_version_name_or_number_or_id == ModelStages.LATEST:
                model_versions = self.zen_store.list_model_versions(
                    model_version_filter_model=ModelVersionFilter(
                        model=model_name_or_id,
                        sort_by=f"{SorterOps.DESCENDING}:number",
                        project=project or self.active_project.id,
                    ),
                    hydrate=hydrate,
                ).items

                if len(model_versions) > 0:
                    model_versions = [model_versions[0]]
                else:
                    model_versions = []
            elif model_version_name_or_number_or_id in ModelStages.values():
                model_versions = self.zen_store.list_model_versions(
                    model_version_filter_model=ModelVersionFilter(
                        model=model_name_or_id,
                        stage=model_version_name_or_number_or_id,
                        project=project or self.active_project.id,
                    ),
                    hydrate=hydrate,
                ).items
            else:
                model_versions = self.zen_store.list_model_versions(
                    model_version_filter_model=ModelVersionFilter(
                        model=model_name_or_id,
                        name=model_version_name_or_number_or_id,
                        project=project or self.active_project.id,
                    ),
                    hydrate=hydrate,
                ).items
        else:
            raise RuntimeError(
                f"The model version identifier "
                f"`{model_version_name_or_number_or_id}` is not"
                f"of the correct type."
            )

        if len(model_versions) == 1:
            return model_versions[0]
        elif len(model_versions) == 0:
            raise KeyError(
                f"No model version found for model "
                f"`{model_name_or_id}` with version identifier "
                f"`{model_version_name_or_number_or_id}`."
            )
        else:
            raise RuntimeError(
                f"The model version identifier "
                f"`{model_version_name_or_number_or_id}` is not"
                f"unique for model `{model_name_or_id}`."
            )
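    # Illustrative sketches (not part of the original source): the version
    # identifier may be a UUID, a version number, a stage, a version name, or
    # omitted (latest). The model name is a hypothetical placeholder.
    #
    #     client = Client()
    #     latest = client.get_model_version("churn-predictor")
    #     by_number = client.get_model_version("churn-predictor", 3)
    #     by_stage = client.get_model_version(
    #         "churn-predictor", ModelStages.PRODUCTION
    #     )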

    def list_model_versions(
        self,
        model: Optional[Union[str, UUID]] = None,
        model_name_or_id: Optional[Union[str, UUID]] = None,
        sort_by: str = "number",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        id: Optional[Union[UUID, str]] = None,
        number: Optional[int] = None,
        stage: Optional[Union[str, ModelStages]] = None,
        run_metadata: Optional[List[str]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
        tag: Optional[str] = None,
        tags: Optional[List[str]] = None,
        project: Optional[Union[str, UUID]] = None,
    ) -> Page[ModelVersionResponse]:
        """Get model versions by filter from Model Control Plane.

        Args:
            model: The model to filter by.
            model_name_or_id: name or id of the model containing the model
                version.
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: name of the model version.
            id: id of the model version.
            number: number of the model version.
            stage: stage of the model version.
            run_metadata: run metadata of the model version.
            user: Filter by user name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            tag: The tag to filter by.
            tags: Tags to filter by.
            project: The project name/ID to filter by.

        Returns:
            A page object with all model versions.
        """
        if model_name_or_id:
            logger.warning(
                "The `model_name_or_id` argument is deprecated. "
                "Please use the `model` argument instead."
            )
            if model is None:
                model = model_name_or_id
            else:
                logger.warning(
                    "Ignoring `model_name_or_id` argument as `model` argument "
                    "was also provided."
                )

        model_version_filter_model = ModelVersionFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            created=created,
            updated=updated,
            name=name,
            id=id,
            number=number,
            stage=stage,
            run_metadata=run_metadata,
            tag=tag,
            tags=tags,
            user=user,
            model=model,
            project=project or self.active_project.id,
        )

        return self.zen_store.list_model_versions(
            model_version_filter_model=model_version_filter_model,
            hydrate=hydrate,
        )

    def update_model_version(
        self,
        model_name_or_id: Union[str, UUID],
        version_name_or_id: Union[str, UUID],
        stage: Optional[Union[str, ModelStages]] = None,
        force: bool = False,
        name: Optional[str] = None,
        description: Optional[str] = None,
        add_tags: Optional[List[str]] = None,
        remove_tags: Optional[List[str]] = None,
        project: Optional[Union[str, UUID]] = None,
    ) -> ModelVersionResponse:
        """Get all model versions by filter.

        Args:
            model_name_or_id: The name or ID of the model containing model version.
            version_name_or_id: The name or ID of model version to be updated.
            stage: Target model version stage to be set.
            force: Whether existing model version in target stage should be
                silently archived or an error should be raised.
            name: Target model version name to be set.
            description: Target model version description to be set.
            add_tags: Tags to add to the model version.
            remove_tags: Tags to remove from the model version.
            project: The project name/ID to filter by.

        Returns:
            An updated model version.
        """
        if not is_valid_uuid(model_name_or_id):
            model = self.get_model(model_name_or_id, project=project)
            model_name_or_id = model.id
            project = project or model.project_id
        if not is_valid_uuid(version_name_or_id):
            version_name_or_id = self.get_model_version(
                model_name_or_id, version_name_or_id, project=project
            ).id

        return self.zen_store.update_model_version(
            model_version_id=version_name_or_id,  # type:ignore[arg-type]
            model_version_update_model=ModelVersionUpdate(
                stage=stage,
                force=force,
                name=name,
                description=description,
                add_tags=add_tags,
                remove_tags=remove_tags,
            ),
        )
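    # Illustrative sketch (not part of the original source): promoting a model
    # version to a stage. The model/version names are hypothetical
    # placeholders; `force=True` archives any version currently in the target
    # stage, per the docstring above.
    #
    #     Client().update_model_version(
    #         model_name_or_id="churn-predictor",
    #         version_name_or_id="2024-06-01",
    #         stage=ModelStages.PRODUCTION,
    #         force=True,
    #     )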

    #################################################
    # Model Versions Artifacts
    #################################################

    def list_model_version_artifact_links(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        model_version_id: Optional[Union[UUID, str]] = None,
        artifact_version_id: Optional[Union[UUID, str]] = None,
        artifact_name: Optional[str] = None,
        only_data_artifacts: Optional[bool] = None,
        only_model_artifacts: Optional[bool] = None,
        only_deployment_artifacts: Optional[bool] = None,
        has_custom_name: Optional[bool] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[ModelVersionArtifactResponse]:
        """Get model version to artifact links by filter in Model Control Plane.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            model_version_id: Use the model version id for filtering
            artifact_version_id: Use the artifact id for filtering
            artifact_name: Use the artifact name for filtering
            only_data_artifacts: Use to filter by data artifacts
            only_model_artifacts: Use to filter by model artifacts
            only_deployment_artifacts: Use to filter by deployment artifacts
            has_custom_name: Filter artifacts with/without custom names.
            user: Filter by user name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of all model version to artifact links.
        """
        return self.zen_store.list_model_version_artifact_links(
            ModelVersionArtifactFilter(
                sort_by=sort_by,
                logical_operator=logical_operator,
                page=page,
                size=size,
                created=created,
                updated=updated,
                model_version_id=model_version_id,
                artifact_version_id=artifact_version_id,
                artifact_name=artifact_name,
                only_data_artifacts=only_data_artifacts,
                only_model_artifacts=only_model_artifacts,
                only_deployment_artifacts=only_deployment_artifacts,
                has_custom_name=has_custom_name,
                user=user,
            ),
            hydrate=hydrate,
        )

    def delete_model_version_artifact_link(
        self, model_version_id: UUID, artifact_version_id: UUID
    ) -> None:
        """Delete model version to artifact link in Model Control Plane.

        Args:
            model_version_id: The id of the model version holding the link.
            artifact_version_id: The id of the artifact version to be deleted.

        Raises:
            RuntimeError: If more than one artifact link is found for given filters.
        """
        artifact_links = self.list_model_version_artifact_links(
            model_version_id=model_version_id,
            artifact_version_id=artifact_version_id,
        )
        if artifact_links.items:
            if artifact_links.total > 1:
                raise RuntimeError(
                    "More than one artifact link found for the given model "
                    f"version `{model_version_id}` and artifact version "
                    f"`{artifact_version_id}`. This should not happen and "
                    "might indicate a corrupted state of your ZenML database. "
                    "Please seek support via Community Slack."
                )
            self.zen_store.delete_model_version_artifact_link(
                model_version_id=model_version_id,
                model_version_artifact_link_name_or_id=artifact_links.items[
                    0
                ].id,
            )

    def delete_all_model_version_artifact_links(
        self, model_version_id: UUID, only_links: bool
    ) -> None:
        """Delete all model version to artifact links in Model Control Plane.

        Args:
            model_version_id: The id of the model version holding the link.
            only_links: If true, only delete the link to the artifact.
        """
        self.zen_store.delete_all_model_version_artifact_links(
            model_version_id, only_links
        )

    #################################################
    # Model Versions Pipeline Runs
    #
    # Only view capabilities are exposed via client.
    #################################################

    def list_model_version_pipeline_run_links(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        model_version_id: Optional[Union[UUID, str]] = None,
        pipeline_run_id: Optional[Union[UUID, str]] = None,
        pipeline_run_name: Optional[str] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[ModelVersionPipelineRunResponse]:
        """Get all model version to pipeline run links by filter.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            model_version_id: Use the model version id for filtering
            pipeline_run_id: Use the pipeline run id for filtering
            pipeline_run_name: Use the pipeline run name for filtering
            user: Filter by user name or ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response

        Returns:
            A page of all model version to pipeline run links.
        """
        return self.zen_store.list_model_version_pipeline_run_links(
            ModelVersionPipelineRunFilter(
                sort_by=sort_by,
                logical_operator=logical_operator,
                page=page,
                size=size,
                created=created,
                updated=updated,
                model_version_id=model_version_id,
                pipeline_run_id=pipeline_run_id,
                pipeline_run_name=pipeline_run_name,
                user=user,
            ),
            hydrate=hydrate,
        )

    # --------------------------- Authorized Devices ---------------------------

    def list_authorized_devices(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        expires: Optional[Union[datetime, str]] = None,
        client_id: Union[UUID, str, None] = None,
        status: Union[OAuthDeviceStatus, str, None] = None,
        trusted_device: Union[bool, str, None] = None,
        user: Optional[Union[UUID, str]] = None,
        failed_auth_attempts: Union[int, str, None] = None,
        last_login: Optional[Union[datetime, str, None]] = None,
        hydrate: bool = False,
    ) -> Page[OAuthDeviceResponse]:
        """List all authorized devices.

        Args:
            sort_by: The column to sort by.
            page: The page of items.
            size: The maximum size of all pages.
            logical_operator: Which logical operator to use [and, or].
            id: Use the id of the authorized device to filter by.
            created: Use to filter by time of creation.
            updated: Use the last updated date for filtering.
            expires: Use the expiration date for filtering.
            client_id: Use the client id for filtering.
            status: Use the status for filtering.
            user: Filter by user name/ID.
            trusted_device: Use the trusted device flag for filtering.
            failed_auth_attempts: Use the failed auth attempts for filtering.
            last_login: Use the last login date for filtering.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of authorized devices matching the filter.
        """
        filter_model = OAuthDeviceFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            expires=expires,
            client_id=client_id,
            user=user,
            status=status,
            trusted_device=trusted_device,
            failed_auth_attempts=failed_auth_attempts,
            last_login=last_login,
        )
        return self.zen_store.list_authorized_devices(
            filter_model=filter_model,
            hydrate=hydrate,
        )

    def get_authorized_device(
        self,
        id_or_prefix: Union[UUID, str],
        allow_id_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> OAuthDeviceResponse:
        """Get an authorized device by id or prefix.

        Args:
            id_or_prefix: The ID or ID prefix of the authorized device.
            allow_id_prefix_match: If True, allow matching by ID prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The requested authorized device.

        Raises:
            KeyError: If no authorized device is found with the given ID or
                prefix.
        """
        if isinstance(id_or_prefix, str):
            try:
                id_or_prefix = UUID(id_or_prefix)
            except ValueError:
                if not allow_id_prefix_match:
                    raise KeyError(
                        f"No authorized device found with id or prefix "
                        f"'{id_or_prefix}'."
                    )
        if isinstance(id_or_prefix, UUID):
            return self.zen_store.get_authorized_device(
                id_or_prefix, hydrate=hydrate
            )
        return self._get_entity_by_prefix(
            get_method=self.zen_store.get_authorized_device,
            list_method=self.list_authorized_devices,
            partial_id_or_name=id_or_prefix,
            allow_name_prefix_match=False,
            hydrate=hydrate,
        )

    def update_authorized_device(
        self,
        id_or_prefix: Union[UUID, str],
        locked: Optional[bool] = None,
    ) -> OAuthDeviceResponse:
        """Update an authorized device.

        Args:
            id_or_prefix: The ID or ID prefix of the authorized device.
            locked: Whether to lock or unlock the authorized device.

        Returns:
            The updated authorized device.
        """
        device = self.get_authorized_device(
            id_or_prefix=id_or_prefix, allow_id_prefix_match=False
        )
        return self.zen_store.update_authorized_device(
            device_id=device.id,
            update=OAuthDeviceUpdate(
                locked=locked,
            ),
        )

    def delete_authorized_device(
        self,
        id_or_prefix: Union[str, UUID],
    ) -> None:
        """Delete an authorized device.

        Args:
            id_or_prefix: The ID or ID prefix of the authorized device.
        """
        device = self.get_authorized_device(
            id_or_prefix=id_or_prefix,
            allow_id_prefix_match=False,
        )
        self.zen_store.delete_authorized_device(device.id)

    # --------------------------- Trigger Executions ---------------------------

    def get_trigger_execution(
        self,
        trigger_execution_id: UUID,
        hydrate: bool = True,
    ) -> TriggerExecutionResponse:
        """Get a trigger execution by ID.

        Args:
            trigger_execution_id: The ID of the trigger execution to get.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The trigger execution.
        """
        return self.zen_store.get_trigger_execution(
            trigger_execution_id=trigger_execution_id, hydrate=hydrate
        )

    def list_trigger_executions(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        trigger_id: Optional[UUID] = None,
        user: Optional[Union[UUID, str]] = None,
        project: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[TriggerExecutionResponse]:
        """List all trigger executions matching the given filter criteria.

        Args:
            sort_by: The column to sort by.
            page: The page of items.
            size: The maximum size of all pages.
            logical_operator: Which logical operator to use [and, or].
            trigger_id: ID of the trigger to filter by.
            user: Filter by user name/ID.
            project: Filter by project name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A list of all trigger executions matching the filter criteria.
        """
        filter_model = TriggerExecutionFilter(
            trigger_id=trigger_id,
            sort_by=sort_by,
            page=page,
            size=size,
            user=user,
            logical_operator=logical_operator,
            project=project or self.active_project.id,
        )
        return self.zen_store.list_trigger_executions(
            trigger_execution_filter_model=filter_model, hydrate=hydrate
        )

    def delete_trigger_execution(self, trigger_execution_id: UUID) -> None:
        """Delete a trigger execution.

        Args:
            trigger_execution_id: The ID of the trigger execution to delete.
        """
        self.zen_store.delete_trigger_execution(
            trigger_execution_id=trigger_execution_id
        )

    # ---- utility prefix matching get functions -----

    def _get_entity_by_id_or_name_or_prefix(
        self,
        get_method: Callable[..., AnyResponse],
        list_method: Callable[..., Page[AnyResponse]],
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
        **kwargs: Any,
    ) -> AnyResponse:
        """Fetches an entity using the id, name, or partial id/name.

        Args:
            get_method: The method to use to fetch the entity by id.
            list_method: The method to use to fetch all entities.
            name_id_or_prefix: The id, name or partial id of the entity to
                fetch.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            project: The project name/ID to filter by.
            **kwargs: Additional keyword arguments to pass to the get and list
                methods.

        Returns:
            The entity with the given name, id or partial id.

        Raises:
            ZenKeyError: If there is more than one entity with that name
                or id prefix.
        """
        from zenml.utils.uuid_utils import is_valid_uuid

        entity_label = get_method.__name__.replace("get_", "") + "s"

        # First interpret as full UUID
        if is_valid_uuid(name_id_or_prefix):
            return get_method(name_id_or_prefix, hydrate=hydrate, **kwargs)

        # If not a UUID, try to find by name
        assert not isinstance(name_id_or_prefix, UUID)
        list_kwargs: Dict[str, Any] = dict(
            name=f"equals:{name_id_or_prefix}",
            hydrate=hydrate,
            **kwargs,
        )
        scope = ""
        if project:
            scope = f"in project {project} "
            list_kwargs["project"] = project
        entity = list_method(**list_kwargs)

        # If only a single entity is found, return it
        if entity.total == 1:
            return entity.items[0]

        # If still no match, try with prefix now
        if entity.total == 0:
            return self._get_entity_by_prefix(
                get_method=get_method,
                list_method=list_method,
                partial_id_or_name=name_id_or_prefix,
                allow_name_prefix_match=allow_name_prefix_match,
                project=project,
                hydrate=hydrate,
            )

        # If more than one entity with the same name is found, raise an error.
        formatted_entity_items = [
            f"- {item.name}: (id: {item.id})\n"
            if hasattr(item, "name")
            else f"- {item.id}\n"
            for item in entity.items
        ]
        raise ZenKeyError(
            f"{entity.total} {entity_label} have been found {scope}that have "
            f"a name that matches the provided "
            f"string '{name_id_or_prefix}':\n"
            f"{formatted_entity_items}.\n"
            f"Please use the id to uniquely identify "
            f"only one of the {entity_label}s."
        )

    def _get_entity_version_by_id_or_name_or_prefix(
        self,
        get_method: Callable[..., AnyResponse],
        list_method: Callable[..., Page[AnyResponse]],
        name_id_or_prefix: Union[str, UUID],
        version: Optional[str],
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
    ) -> "AnyResponse":
        """Fetches an entity version using the id, name, or partial id/name.

        Args:
            get_method: The method to use to fetch the entity version by id.
            list_method: The method to use to fetch all entity versions.
            name_id_or_prefix: The id, name or partial id of the entity
                version to fetch.
            version: The version to fetch. Ignored if an ID or ID prefix is
                given.
            project: The project name/ID to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The entity version matching the given name, id or partial id.

        Raises:
            KeyError: If no entity version is found for the given name, ID
                or prefix.
            ZenKeyError: If more than one entity version matches the given
                ID prefix.
        """
        from zenml.utils.uuid_utils import is_valid_uuid

        entity_label = get_method.__name__.replace("get_", "") + "s"

        if is_valid_uuid(name_id_or_prefix):
            if version:
                logger.warning(
                    "You specified both an ID as well as a version of the "
                    f"{entity_label}. Ignoring the version and fetching the "
                    f"{entity_label} by ID."
                )
            if not isinstance(name_id_or_prefix, UUID):
                name_id_or_prefix = UUID(name_id_or_prefix, version=4)

            return get_method(name_id_or_prefix, hydrate=hydrate)

        assert not isinstance(name_id_or_prefix, UUID)
        list_kwargs: Dict[str, Any] = dict(
            size=1,
            sort_by="desc:created",
            name=name_id_or_prefix,
            version=version,
            hydrate=hydrate,
        )
        scope = ""
        if project:
            scope = f" in project {project}"
            list_kwargs["project"] = project
        exact_name_matches = list_method(**list_kwargs)

        if len(exact_name_matches) == 1:
            # If the name matches exactly, use the explicitly specified version
            # or fallback to the latest if not given
            return exact_name_matches.items[0]

        partial_id_matches = list_method(
            id=f"startswith:{name_id_or_prefix}",
            hydrate=hydrate,
        )
        if partial_id_matches.total == 1:
            if version:
                logger.warning(
                    "You specified both a partial ID as well as a version of "
                    f"the {entity_label}. Ignoring the version and fetching "
                    f"the {entity_label} by partial ID."
                )
            return partial_id_matches[0]
        elif partial_id_matches.total == 0:
            raise KeyError(
                f"No {entity_label} found for name, ID or prefix "
                f"{name_id_or_prefix}{scope}."
            )
        else:
            raise ZenKeyError(
                f"{partial_id_matches.total} {entity_label} have been found"
                f"{scope} that have an id prefix that matches the provided "
                f"string '{name_id_or_prefix}':\n"
                f"{partial_id_matches.items}.\n"
                f"Please provide more characters to uniquely identify "
                f"only one of the {entity_label}s."
            )

    def _get_entity_by_prefix(
        self,
        get_method: Callable[..., AnyResponse],
        list_method: Callable[..., Page[AnyResponse]],
        partial_id_or_name: str,
        allow_name_prefix_match: bool,
        project: Optional[Union[str, UUID]] = None,
        hydrate: bool = True,
        **kwargs: Any,
    ) -> AnyResponse:
        """Fetches an entity using a partial ID or name.

        Args:
            get_method: The method to use to fetch the entity by id.
            list_method: The method to use to fetch all entities.
            partial_id_or_name: The partial ID or name of the entity to fetch.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            project: The project name/ID to filter by.
            **kwargs: Additional keyword arguments to pass to the get and list
                methods.

        Returns:
            The entity with the given partial ID or name.

        Raises:
            KeyError: If no entity with the given partial ID or name is found.
            ZenKeyError: If there is more than one entity with that partial ID
                or name.
        """
        list_method_args: Dict[str, Any] = {
            "logical_operator": LogicalOperators.OR,
            "id": f"startswith:{partial_id_or_name}",
            "hydrate": hydrate,
            **kwargs,
        }
        if allow_name_prefix_match:
            list_method_args["name"] = f"startswith:{partial_id_or_name}"
        scope = ""
        if project:
            scope = f" in project {project}"
            list_method_args["project"] = project

        entity = list_method(**list_method_args)

        # If only a single entity is found, return it.
        if entity.total == 1:
            return entity.items[0]

        irregular_plurals = {"code_repository": "code_repositories"}
        entity_label = irregular_plurals.get(
            get_method.__name__.replace("get_", ""),
            get_method.__name__.replace("get_", "") + "s",
        )

        prefix_description = (
            "a name/ID prefix" if allow_name_prefix_match else "an ID prefix"
        )
        # If no entity is found, raise an error.
        if entity.total == 0:
            raise KeyError(
                f"No {entity_label} have been found{scope} that have "
                f"{prefix_description} that matches the provided string "
                f"'{partial_id_or_name}'."
            )

        # If more than one entity is found, raise an error.
        ambiguous_entities: List[str] = []
        for model in entity.items:
            model_name = getattr(model, "name", None)
            if model_name:
                ambiguous_entities.append(f"{model_name}: {model.id}")
            else:
                ambiguous_entities.append(str(model.id))
        raise ZenKeyError(
            f"{entity.total} {entity_label} have been found{scope} that have "
            f"{prefix_description} that matches the provided "
            f"string '{partial_id_or_name}':\n"
            f"{ambiguous_entities}.\n"
            f"Please provide more characters to uniquely identify "
            f"only one of the {entity_label}s."
        )

    # ---------------------------- Service Accounts ----------------------------

    def create_service_account(
        self,
        name: str,
        description: str = "",
    ) -> ServiceAccountResponse:
        """Create a new service account.

        Args:
            name: The name of the service account.
            description: The description of the service account.

        Returns:
            The created service account.
        """
        service_account = ServiceAccountRequest(
            name=name, description=description, active=True
        )
        created_service_account = self.zen_store.create_service_account(
            service_account=service_account
        )

        return created_service_account

    def get_service_account(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> ServiceAccountResponse:
        """Gets a service account.

        Args:
            name_id_or_prefix: The name or ID of the service account.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The ServiceAccount
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_service_account,
            list_method=self.list_service_accounts,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def list_service_accounts(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        description: Optional[str] = None,
        active: Optional[bool] = None,
        hydrate: bool = False,
    ) -> Page[ServiceAccountResponse]:
        """List all service accounts.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of the service account to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: Use the service account name for filtering
            description: Use the service account description for filtering
            active: Use the service account active status for filtering
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The list of service accounts matching the filter description.
        """
        return self.zen_store.list_service_accounts(
            ServiceAccountFilter(
                sort_by=sort_by,
                page=page,
                size=size,
                logical_operator=logical_operator,
                id=id,
                created=created,
                updated=updated,
                name=name,
                description=description,
                active=active,
            ),
            hydrate=hydrate,
        )

    def update_service_account(
        self,
        name_id_or_prefix: Union[str, UUID],
        updated_name: Optional[str] = None,
        description: Optional[str] = None,
        active: Optional[bool] = None,
    ) -> ServiceAccountResponse:
        """Update a service account.

        Args:
            name_id_or_prefix: The name or ID of the service account to update.
            updated_name: The new name of the service account.
            description: The new description of the service account.
            active: The new active status of the service account.

        Returns:
            The updated service account.
        """
        service_account = self.get_service_account(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )
        service_account_update = ServiceAccountUpdate(
            name=updated_name,
            description=description,
            active=active,
        )

        return self.zen_store.update_service_account(
            service_account_name_or_id=service_account.id,
            service_account_update=service_account_update,
        )

    def delete_service_account(
        self,
        name_id_or_prefix: Union[str, UUID],
    ) -> None:
        """Delete a service account.

        Args:
            name_id_or_prefix: The name or ID of the service account to delete.
        """
        service_account = self.get_service_account(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )
        self.zen_store.delete_service_account(
            service_account_name_or_id=service_account.id
        )

    # -------------------------------- API Keys --------------------------------

    def create_api_key(
        self,
        service_account_name_id_or_prefix: Union[str, UUID],
        name: str,
        description: str = "",
        set_key: bool = False,
    ) -> APIKeyResponse:
        """Create a new API key and optionally set it as the active API key.

        Args:
            service_account_name_id_or_prefix: The name, ID or prefix of the
                service account to create the API key for.
            name: Name of the API key.
            description: The description of the API key.
            set_key: Whether to set the created API key as the active API key.

        Returns:
            The created API key.
        """
        service_account = self.get_service_account(
            name_id_or_prefix=service_account_name_id_or_prefix,
            allow_name_prefix_match=False,
        )
        request = APIKeyRequest(
            name=name,
            description=description,
        )
        api_key = self.zen_store.create_api_key(
            service_account_id=service_account.id, api_key=request
        )
        assert api_key.key is not None

        if set_key:
            self.set_api_key(key=api_key.key)

        return api_key

    def set_api_key(self, key: str) -> None:
        """Configure the client with an API key.

        Args:
            key: The API key to use.

        Raises:
            NotImplementedError: If the client is not connected to a ZenML
                server.
        """
        from zenml.login.credentials_store import get_credentials_store
        from zenml.zen_stores.rest_zen_store import RestZenStore

        zen_store = self.zen_store
        if not zen_store.TYPE == StoreType.REST:
            raise NotImplementedError(
                "API key configuration is only supported if connected to a "
                "ZenML server."
            )

        credentials_store = get_credentials_store()
        assert isinstance(zen_store, RestZenStore)

        credentials_store.set_api_key(server_url=zen_store.url, api_key=key)

        # Force a re-authentication to start using the new API key
        # right away.
        zen_store.authenticate(force=True)

    def list_api_keys(
        self,
        service_account_name_id_or_prefix: Union[str, UUID],
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        description: Optional[str] = None,
        active: Optional[bool] = None,
        last_login: Optional[Union[datetime, str]] = None,
        last_rotated: Optional[Union[datetime, str]] = None,
        hydrate: bool = False,
    ) -> Page[APIKeyResponse]:
        """List all API keys.

        Args:
            service_account_name_id_or_prefix: The name, ID or prefix of the
                service account to list the API keys for.
            sort_by: The column to sort by.
            page: The page of items.
            size: The maximum size of all pages.
            logical_operator: Which logical operator to use [and, or].
            id: Use the id of the API key to filter by.
            created: Use to filter by time of creation.
            updated: Use the last updated date for filtering.
            name: The name of the API key to filter by.
            description: The description of the API key to filter by.
            active: Whether the API key is active or not.
            last_login: The last time the API key was used.
            last_rotated: The last time the API key was rotated.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of API keys matching the filter description.
        """
        service_account = self.get_service_account(
            name_id_or_prefix=service_account_name_id_or_prefix,
            allow_name_prefix_match=False,
        )
        filter_model = APIKeyFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            description=description,
            active=active,
            last_login=last_login,
            last_rotated=last_rotated,
        )
        return self.zen_store.list_api_keys(
            service_account_id=service_account.id,
            filter_model=filter_model,
            hydrate=hydrate,
        )

    def get_api_key(
        self,
        service_account_name_id_or_prefix: Union[str, UUID],
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> APIKeyResponse:
        """Get an API key by name, id or prefix.

        Args:
            service_account_name_id_or_prefix: The name, ID or prefix of the
                service account to get the API key for.
            name_id_or_prefix: The name, ID or ID prefix of the API key.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The API key.
        """
        service_account = self.get_service_account(
            name_id_or_prefix=service_account_name_id_or_prefix,
            allow_name_prefix_match=False,
        )

        def get_api_key_method(
            api_key_name_or_id: str, hydrate: bool = True
        ) -> APIKeyResponse:
            return self.zen_store.get_api_key(
                service_account_id=service_account.id,
                api_key_name_or_id=api_key_name_or_id,
                hydrate=hydrate,
            )

        def list_api_keys_method(
            hydrate: bool = True,
            **filter_args: Any,
        ) -> Page[APIKeyResponse]:
            return self.list_api_keys(
                service_account_name_id_or_prefix=service_account.id,
                hydrate=hydrate,
                **filter_args,
            )

        return self._get_entity_by_id_or_name_or_prefix(
            get_method=get_api_key_method,
            list_method=list_api_keys_method,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def update_api_key(
        self,
        service_account_name_id_or_prefix: Union[str, UUID],
        name_id_or_prefix: Union[UUID, str],
        name: Optional[str] = None,
        description: Optional[str] = None,
        active: Optional[bool] = None,
    ) -> APIKeyResponse:
        """Update an API key.

        Args:
            service_account_name_id_or_prefix: The name, ID or prefix of the
                service account to update the API key for.
            name_id_or_prefix: Name, ID or prefix of the API key to update.
            name: New name of the API key.
            description: New description of the API key.
            active: Whether the API key is active or not.

        Returns:
            The updated API key.
        """
        api_key = self.get_api_key(
            service_account_name_id_or_prefix=service_account_name_id_or_prefix,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
        )
        update = APIKeyUpdate(
            name=name, description=description, active=active
        )
        return self.zen_store.update_api_key(
            service_account_id=api_key.service_account.id,
            api_key_name_or_id=api_key.id,
            api_key_update=update,
        )

    def rotate_api_key(
        self,
        service_account_name_id_or_prefix: Union[str, UUID],
        name_id_or_prefix: Union[UUID, str],
        retain_period_minutes: int = 0,
        set_key: bool = False,
    ) -> APIKeyResponse:
        """Rotate an API key.

        Args:
            service_account_name_id_or_prefix: The name, ID or prefix of the
                service account to rotate the API key for.
            name_id_or_prefix: Name, ID or prefix of the API key to update.
            retain_period_minutes: The number of minutes to retain the old API
                key for. If set to 0, the old API key will be invalidated.
            set_key: Whether to set the rotated API key as the active API key.

        Returns:
            The updated API key.
        """
        api_key = self.get_api_key(
            service_account_name_id_or_prefix=service_account_name_id_or_prefix,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
        )
        rotate_request = APIKeyRotateRequest(
            retain_period_minutes=retain_period_minutes
        )
        new_key = self.zen_store.rotate_api_key(
            service_account_id=api_key.service_account.id,
            api_key_name_or_id=api_key.id,
            rotate_request=rotate_request,
        )
        assert new_key.key is not None
        if set_key:
            self.set_api_key(key=new_key.key)

        return new_key

    def delete_api_key(
        self,
        service_account_name_id_or_prefix: Union[str, UUID],
        name_id_or_prefix: Union[str, UUID],
    ) -> None:
        """Delete an API key.

        Args:
            service_account_name_id_or_prefix: The name, ID or prefix of the
                service account to delete the API key for.
            name_id_or_prefix: The name, ID or prefix of the API key.
        """
        api_key = self.get_api_key(
            service_account_name_id_or_prefix=service_account_name_id_or_prefix,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
        )
        self.zen_store.delete_api_key(
            service_account_id=api_key.service_account.id,
            api_key_name_or_id=api_key.id,
        )

    # ---------------------------------- Tags ----------------------------------
    def create_tag(
        self,
        name: str,
        exclusive: bool = False,
        color: Optional[Union[str, ColorVariants]] = None,
    ) -> TagResponse:
        """Creates a new tag.

        Args:
            name: the name of the tag.
            exclusive: the boolean to decide whether the tag is an exclusive tag.
                An exclusive tag means that the tag can exist only for a single:
                    - pipeline run within the scope of a pipeline
                    - artifact version within the scope of an artifact
                    - run template
            color: the color of the tag

        Returns:
            The newly created tag.
        """
        request_model = TagRequest(name=name, exclusive=exclusive)

        if color is not None:
            request_model.color = ColorVariants(color)

        return self.zen_store.create_tag(tag=request_model)

    def delete_tag(
        self,
        tag_name_or_id: Union[str, UUID],
    ) -> None:
        """Deletes a tag.

        Args:
            tag_name_or_id: name or id of the tag to be deleted.
        """
        self.zen_store.delete_tag(
            tag_name_or_id=tag_name_or_id,
        )

    def update_tag(
        self,
        tag_name_or_id: Union[str, UUID],
        name: Optional[str] = None,
        exclusive: Optional[bool] = None,
        color: Optional[Union[str, ColorVariants]] = None,
    ) -> TagResponse:
        """Updates an existing tag.

        Args:
            tag_name_or_id: name or UUID of the tag to be updated.
            name: the name of the tag.
            exclusive: the boolean to decide whether the tag is an exclusive tag.
                An exclusive tag means that the tag can exist only for a single:
                    - pipeline run within the scope of a pipeline
                    - artifact version within the scope of an artifact
                    - run template
            color: the color of the tag

        Returns:
            The updated tag.
        """
        update_model = TagUpdate()

        if name is not None:
            update_model.name = name

        if exclusive is not None:
            update_model.exclusive = exclusive

        if color is not None:
            if isinstance(color, str):
                update_model.color = ColorVariants(color)
            else:
                update_model.color = color

        return self.zen_store.update_tag(
            tag_name_or_id=tag_name_or_id,
            tag_update_model=update_model,
        )

    def get_tag(
        self,
        tag_name_or_id: Union[str, UUID],
        hydrate: bool = True,
    ) -> TagResponse:
        """Get an existing tag.

        Args:
            tag_name_or_id: name or id of the tag to be retrieved.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The tag of interest.
        """
        return self.zen_store.get_tag(
            tag_name_or_id=tag_name_or_id,
            hydrate=hydrate,
        )

    def list_tags(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        user: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        color: Optional[Union[str, ColorVariants]] = None,
        exclusive: Optional[bool] = None,
        resource_type: Optional[Union[str, TaggableResourceTypes]] = None,
        hydrate: bool = False,
    ) -> Page[TagResponse]:
        """Get tags by filter.

        Args:
            sort_by: The column to sort by.
            page: The page of items.
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or].
            id: Use the id of the tag to filter by.
            user: Use the user to filter by.
            created: Use to filter by time of creation.
            updated: Use the last updated date for filtering.
            name: The name of the tag.
            color: The color of the tag.
            exclusive: Flag indicating whether the tag is exclusive.
            resource_type: Filter tags associated with a specific resource type.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of all tags.
        """
        return self.zen_store.list_tags(
            tag_filter_model=TagFilter(
                sort_by=sort_by,
                page=page,
                size=size,
                logical_operator=logical_operator,
                id=id,
                user=user,
                created=created,
                updated=updated,
                name=name,
                color=color,
                exclusive=exclusive,
                resource_type=resource_type,
            ),
            hydrate=hydrate,
        )

    def attach_tag(
        self,
        tag_name_or_id: Union[str, UUID],
        resources: List[TagResource],
    ) -> None:
        """Attach a tag to resources.

        Args:
            tag_name_or_id: name or id of the tag to be attached.
            resources: the resources to attach the tag to.
        """
        if isinstance(tag_name_or_id, str):
            try:
                tag_model = self.create_tag(name=tag_name_or_id)
            except EntityExistsError:
                tag_model = self.get_tag(tag_name_or_id)
        else:
            tag_model = self.get_tag(tag_name_or_id)

        self.zen_store.batch_create_tag_resource(
            tag_resources=[
                TagResourceRequest(
                    tag_id=tag_model.id,
                    resource_id=resource.id,
                    resource_type=resource.type,
                )
                for resource in resources
            ]
        )

    def detach_tag(
        self,
        tag_name_or_id: Union[str, UUID],
        resources: List[TagResource],
    ) -> None:
        """Detach a tag from resources.

        Args:
            tag_name_or_id: name or id of the tag to be detached.
            resources: the resources to detach the tag from.
        """
        tag_model = self.get_tag(tag_name_or_id)

        self.zen_store.batch_delete_tag_resource(
            tag_resources=[
                TagResourceRequest(
                    tag_id=tag_model.id,
                    resource_id=resource.id,
                    resource_type=resource.type,
                )
                for resource in resources
            ]
        )

active_project property

Get the currently active project of the local client.

If no active project is configured locally for the client, the active project in the global configuration is used instead.

Returns:

Type Description
ProjectResponse

The active project.

Raises:

Type Description
RuntimeError

If the active project is not set.
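
A minimal usage sketch (not part of the generated reference; assumes a client with an active project, stack and user already configured):

from zenml.client import Client

client = Client()
# Read the active context configured for this client.
print(client.active_project.name)
print(client.active_stack_model.name)
print(client.active_user.name)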

active_stack property

The active stack for this client.

Returns:

Type Description
Stack

The active stack for this client.

active_stack_model property

The model of the active stack for this client.

If no active stack is configured locally for the client, the active stack in the global configuration is used instead.

Returns:

Type Description
StackResponse

The model of the active stack for this client.

active_user property

Get the user that is currently in use.

Returns:

Type Description
UserResponse

The active user.

config_directory property

The configuration directory of this client.

Returns:

Type Description
Optional[Path]

The configuration directory of this client, or None, if the client doesn't have an active root.

root property

The root directory of this client.

Returns:

Type Description
Optional[Path]

The root directory of this client, or None, if the client has not been initialized.

uses_local_configuration property

Check if the client is using a local configuration.

Returns:

Type Description
bool

True if the client is using a local configuration, False otherwise.

zen_store property

Shortcut to return the global zen store.

Returns:

Type Description
BaseZenStore

The global zen store.
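
As a rough illustration (assumes the client is already connected to a store), the property simply exposes the store backing the client:

from zenml.client import Client
from zenml.enums import StoreType

store = Client().zen_store
# For example, check whether the client talks to a ZenML server over REST.
print(store.TYPE == StoreType.REST)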

__init__(root=None)

Initializes the global client instance.

Client is a singleton class: only one instance can exist. Calling this constructor multiple times will always yield the same instance (see the exception below).

The root argument is only meant for internal use and testing purposes. User code must never pass it to the constructor. When a custom root value is passed, an anonymous Client instance is created and returned independently of the Client singleton and that will have no effect as far as the rest of the ZenML core code is concerned.

Instead of creating a new Client instance to reflect a different repository root, to change the active root in the global Client, call Client().activate_root(<new-root>).

Parameters:

Name Type Description Default
root Optional[Path]

(internal use) custom root directory for the client. If no path is given, the repository root is determined using the environment variable ZENML_REPOSITORY_PATH (if set) and by recursively searching in the parent directories of the current working directory. Only used to initialize new clients internally.

None
Source code in src/zenml/client.py
def __init__(
    self,
    root: Optional[Path] = None,
) -> None:
    """Initializes the global client instance.

    Client is a singleton class: only one instance can exist. Calling
    this constructor multiple times will always yield the same instance (see
    the exception below).

    The `root` argument is only meant for internal use and testing purposes.
    User code must never pass it to the constructor.
    When a custom `root` value is passed, an anonymous Client instance
    is created and returned independently of the Client singleton and
    that will have no effect as far as the rest of the ZenML core code is
    concerned.

    Instead of creating a new Client instance to reflect a different
    repository root, to change the active root in the global Client,
    call `Client().activate_root(<new-root>)`.

    Args:
        root: (internal use) custom root directory for the client. If
            no path is given, the repository root is determined using the
            environment variable `ZENML_REPOSITORY_PATH` (if set) and by
            recursively searching in the parent directories of the
            current working directory. Only used to initialize new
            clients internally.
    """
    self._root: Optional[Path] = None
    self._config: Optional[ClientConfiguration] = None

    self._set_active_root(root)
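
A brief sketch of the singleton behaviour described above (illustrative only):

from zenml.client import Client

# Calling the constructor repeatedly yields the same instance.
assert Client() is Client()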

activate_root(root=None)

Set the active repository root directory.

Parameters:

Name Type Description Default
root Optional[Path]

The path to set as the active repository root. If not set, the repository root is determined using the environment variable ZENML_REPOSITORY_PATH (if set) and by recursively searching in the parent directories of the current working directory.

None
Source code in src/zenml/client.py
def activate_root(self, root: Optional[Path] = None) -> None:
    """Set the active repository root directory.

    Args:
        root: The path to set as the active repository root. If not set,
            the repository root is determined using the environment
            variable `ZENML_REPOSITORY_PATH` (if set) and by recursively
            searching in the parent directories of the current working
            directory.
    """
    self._set_active_root(root)
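
For example, to point the global client at a different repository root (the path below is a placeholder):

from pathlib import Path
from zenml.client import Client

# Change the active root of the global client instead of constructing a new one.
Client().activate_root(Path("/path/to/zenml/repo"))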

activate_stack(stack_name_id_or_prefix)

Sets the stack as active.

Parameters:

Name Type Description Default
stack_name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the stack to activate.

required

Raises:

Type Description
KeyError

If the stack is not registered.

Source code in src/zenml/client.py
def activate_stack(
    self, stack_name_id_or_prefix: Union[str, UUID]
) -> None:
    """Sets the stack as active.

    Args:
        stack_name_id_or_prefix: The name, ID or prefix of the stack to
            activate.

    Raises:
        KeyError: If the stack is not registered.
    """
    # Make sure the stack is registered
    try:
        stack = self.get_stack(name_id_or_prefix=stack_name_id_or_prefix)
    except KeyError as e:
        raise KeyError(
            f"Stack '{stack_name_id_or_prefix}' cannot be activated since "
            f"it is not registered yet. Please register it first."
        ) from e

    if self._config:
        self._config.set_active_stack(stack=stack)

    else:
        # set the active stack globally only if the client doesn't use
        # a local configuration
        GlobalConfiguration().set_active_stack(stack=stack)
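
A hedged usage sketch (the stack name "local-stack" is a placeholder and must already be registered, otherwise a KeyError is raised as documented above):

from zenml.client import Client

client = Client()
client.activate_stack("local-stack")
print(client.active_stack_model.name)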

attach_tag(tag_name_or_id, resources)

Attach a tag to resources.

Parameters:

Name Type Description Default
tag_name_or_id Union[str, UUID]

name or id of the tag to be attached.

required
resources List[TagResource]

the resources to attach the tag to.

required
Source code in src/zenml/client.py
def attach_tag(
    self,
    tag_name_or_id: Union[str, UUID],
    resources: List[TagResource],
) -> None:
    """Attach a tag to resources.

    Args:
        tag_name_or_id: name or id of the tag to be attached.
        resources: the resources to attach the tag to.
    """
    if isinstance(tag_name_or_id, str):
        try:
            tag_model = self.create_tag(name=tag_name_or_id)
        except EntityExistsError:
            tag_model = self.get_tag(tag_name_or_id)
    else:
        tag_model = self.get_tag(tag_name_or_id)

    self.zen_store.batch_create_tag_resource(
        tag_resources=[
            TagResourceRequest(
                tag_id=tag_model.id,
                resource_id=resource.id,
                resource_type=resource.type,
            )
            for resource in resources
        ]
    )
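
An illustrative sketch (the artifact name is a placeholder, and the import paths for TagResource and TaggableResourceTypes are assumptions that may differ between ZenML versions):

from zenml.client import Client
from zenml.enums import TaggableResourceTypes  # assumed import path
from zenml.models import TagResource  # assumed import path

client = Client()
artifact_version = client.get_artifact_version("my_dataset")  # placeholder name
client.attach_tag(
    tag_name_or_id="validated",
    resources=[
        TagResource(
            id=artifact_version.id,
            type=TaggableResourceTypes.ARTIFACT_VERSION,
        )
    ],
)

If the tag does not exist yet, it is created on the fly, as shown in the source above.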

backup_secrets(ignore_errors=True, delete_secrets=False)

Backs up all secrets to the configured backup secrets store.

Parameters:

Name Type Description Default
ignore_errors bool

Whether to ignore individual errors during the backup process and attempt to backup all secrets.

True
delete_secrets bool

Whether to delete the secrets that have been successfully backed up from the primary secrets store. Setting this flag effectively moves all secrets from the primary secrets store to the backup secrets store.

False
Source code in src/zenml/client.py
def backup_secrets(
    self,
    ignore_errors: bool = True,
    delete_secrets: bool = False,
) -> None:
    """Backs up all secrets to the configured backup secrets store.

    Args:
        ignore_errors: Whether to ignore individual errors during the backup
            process and attempt to backup all secrets.
        delete_secrets: Whether to delete the secrets that have been
            successfully backed up from the primary secrets store. Setting
            this flag effectively moves all secrets from the primary secrets
            store to the backup secrets store.
    """
    self.zen_store.backup_secrets(
        ignore_errors=ignore_errors, delete_secrets=delete_secrets
    )
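
For example (illustrative; requires a backup secrets store to be configured for the server):

from zenml.client import Client

# Copy all secrets into the backup store while keeping them in the primary store.
Client().backup_secrets(ignore_errors=True, delete_secrets=False)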

create_action(name, flavor, action_type, configuration, service_account_id, auth_window=None, description='')

Create an action.

Parameters:

Name Type Description Default
name str

The name of the action.

required
flavor str

The flavor of the action.

required
action_type PluginSubType

The action subtype.

required
configuration Dict[str, Any]

The action configuration.

required
service_account_id UUID

The service account that is used to execute the action.

required
auth_window Optional[int]

The time window in minutes for which the service account is authorized to execute the action. Set this to 0 to authorize the service account indefinitely (not recommended).

None
description str

The description of the action.

''

Returns:

Type Description
ActionResponse

The created action

Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def create_action(
    self,
    name: str,
    flavor: str,
    action_type: PluginSubType,
    configuration: Dict[str, Any],
    service_account_id: UUID,
    auth_window: Optional[int] = None,
    description: str = "",
) -> ActionResponse:
    """Create an action.

    Args:
        name: The name of the action.
        flavor: The flavor of the action.
        action_type: The action subtype.
        configuration: The action configuration.
        service_account_id: The service account that is used to execute the
            action.
        auth_window: The time window in minutes for which the service
            account is authorized to execute the action. Set this to 0 to
            authorize the service account indefinitely (not recommended).
        description: The description of the action.

    Returns:
        The created action
    """
    action = ActionRequest(
        name=name,
        description=description,
        flavor=flavor,
        plugin_subtype=action_type,
        configuration=configuration,
        service_account_id=service_account_id,
        auth_window=auth_window,
        project=self.active_project.id,
    )

    return self.zen_store.create_action(action=action)
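
A hedged sketch of registering an action (requires a ZenML server; the flavor, subtype, configuration keys and service account ID below are placeholders, not values prescribed by ZenML):

from uuid import UUID
from zenml.client import Client
from zenml.enums import PluginSubType

action = Client().create_action(
    name="retrain-on-drift",
    flavor="builtin",  # placeholder flavor
    action_type=PluginSubType.PIPELINE_RUN,  # assumed subtype value
    configuration={"template_id": "..."},  # placeholder configuration
    service_account_id=UUID("11111111-1111-1111-1111-111111111111"),  # placeholder
    auth_window=60,
)
print(action.id)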

create_api_key(service_account_name_id_or_prefix, name, description='', set_key=False)

Create a new API key and optionally set it as the active API key.

Parameters:

Name Type Description Default
service_account_name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the service account to create the API key for.

required
name str

Name of the API key.

required
description str

The description of the API key.

''
set_key bool

Whether to set the created API key as the active API key.

False

Returns:

Type Description
APIKeyResponse

The created API key.

Source code in src/zenml/client.py
def create_api_key(
    self,
    service_account_name_id_or_prefix: Union[str, UUID],
    name: str,
    description: str = "",
    set_key: bool = False,
) -> APIKeyResponse:
    """Create a new API key and optionally set it as the active API key.

    Args:
        service_account_name_id_or_prefix: The name, ID or prefix of the
            service account to create the API key for.
        name: Name of the API key.
        description: The description of the API key.
        set_key: Whether to set the created API key as the active API key.

    Returns:
        The created API key.
    """
    service_account = self.get_service_account(
        name_id_or_prefix=service_account_name_id_or_prefix,
        allow_name_prefix_match=False,
    )
    request = APIKeyRequest(
        name=name,
        description=description,
    )
    api_key = self.zen_store.create_api_key(
        service_account_id=service_account.id, api_key=request
    )
    assert api_key.key is not None

    if set_key:
        self.set_api_key(key=api_key.key)

    return api_key
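
A short usage sketch with illustrative names; the secret key value is only returned in the creation response, so it should be stored securely right away.

from zenml.client import Client

client = Client()

# Assumes a service account named "ci-runner" already exists (see
# create_service_account below); otherwise create it first.
api_key = client.create_api_key(
    service_account_name_id_or_prefix="ci-runner",
    name="ci-key",
    description="Key for automated pipeline runs",
    set_key=False,
)
print(api_key.key)  # the secret key value returned at creation time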

create_code_repository(name, config, source, description=None, logo_url=None)

Create a new code repository.

Parameters:

Name Type Description Default
name str

Name of the code repository.

required
config Dict[str, Any]

The configuration for the code repository.

required
source Source

The code repository implementation source.

required
description Optional[str]

The code repository description.

None
logo_url Optional[str]

URL of a logo (png, jpg or svg) for the code repository.

None

Returns:

Type Description
CodeRepositoryResponse

The created code repository.

Source code in src/zenml/client.py, lines 5099-5130
def create_code_repository(
    self,
    name: str,
    config: Dict[str, Any],
    source: Source,
    description: Optional[str] = None,
    logo_url: Optional[str] = None,
) -> CodeRepositoryResponse:
    """Create a new code repository.

    Args:
        name: Name of the code repository.
        config: The configuration for the code repository.
        source: The code repository implementation source.
        description: The code repository description.
        logo_url: URL of a logo (png, jpg or svg) for the code repository.

    Returns:
        The created code repository.
    """
    self._validate_code_repository_config(source=source, config=config)
    repo_request = CodeRepositoryRequest(
        project=self.active_project.id,
        name=name,
        config=config,
        source=source,
        description=description,
        logo_url=logo_url,
    )
    return self.zen_store.create_code_repository(
        code_repository=repo_request
    )

create_event_source(name, configuration, flavor, event_source_subtype, description='')

Registers an event source.

Parameters:

Name Type Description Default
name str

The name of the event source to create.

required
configuration Dict[str, Any]

Configuration for this event source.

required
flavor str

The flavor of event source.

required
event_source_subtype PluginSubType

The event source subtype.

required
description str

The description of the event source.

''

Returns:

Type Description
EventSourceResponse

The model of the registered event source.

Source code in src/zenml/client.py, lines 2750-2781
@_fail_for_sql_zen_store
def create_event_source(
    self,
    name: str,
    configuration: Dict[str, Any],
    flavor: str,
    event_source_subtype: PluginSubType,
    description: str = "",
) -> EventSourceResponse:
    """Registers an event source.

    Args:
        name: The name of the event source to create.
        configuration: Configuration for this event source.
        flavor: The flavor of event source.
        event_source_subtype: The event source subtype.
        description: The description of the event source.

    Returns:
        The model of the registered event source.
    """
    event_source = EventSourceRequest(
        name=name,
        configuration=configuration,
        description=description,
        flavor=flavor,
        plugin_type=PluginType.EVENT_SOURCE,
        plugin_subtype=event_source_subtype,
        project=self.active_project.id,
    )

    return self.zen_store.create_event_source(event_source=event_source)

create_flavor(source, component_type)

Creates a new flavor.

Parameters:

Name Type Description Default
source str

The flavor to create.

required
component_type StackComponentType

The type of the flavor.

required

Returns:

Type Description
FlavorResponse

The created flavor (in model form).

Raises:

Type Description
ValueError

in case the config_schema of the flavor is too large.

Source code in src/zenml/client.py, lines 2160-2192
def create_flavor(
    self,
    source: str,
    component_type: StackComponentType,
) -> FlavorResponse:
    """Creates a new flavor.

    Args:
        source: The flavor to create.
        component_type: The type of the flavor.

    Returns:
        The created flavor (in model form).

    Raises:
        ValueError: in case the config_schema of the flavor is too large.
    """
    from zenml.stack.flavor import validate_flavor_source

    flavor = validate_flavor_source(
        source=source, component_type=component_type
    )()

    if len(flavor.config_schema) > TEXT_FIELD_MAX_LENGTH:
        raise ValueError(
            "JSON representation of the configuration schema "
            "exceeds the maximum length. This could be caused by "
            "an overly long docstring on the flavor's "
            "configuration class."
        )

    flavor_request = flavor.to_model(integration="custom", is_custom=True)
    return self.zen_store.create_flavor(flavor=flavor_request)
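
A minimal sketch of registering a custom flavor; the module and class name below are hypothetical and must point at a flavor class importable from your own code base.

from zenml.client import Client
from zenml.enums import StackComponentType

client = Client()

flavor = client.create_flavor(
    source="my_flavors.MyOrchestratorFlavor",  # hypothetical source path
    component_type=StackComponentType.ORCHESTRATOR,
)
print(flavor.name)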

create_model(name, license=None, description=None, audience=None, use_cases=None, limitations=None, trade_offs=None, ethics=None, tags=None, save_models_to_registry=True)

Creates a new model in Model Control Plane.

Parameters:

Name Type Description Default
name str

The name of the model.

required
license Optional[str]

The license under which the model is created.

None
description Optional[str]

The description of the model.

None
audience Optional[str]

The target audience of the model.

None
use_cases Optional[str]

The use cases of the model.

None
limitations Optional[str]

The known limitations of the model.

None
trade_offs Optional[str]

The tradeoffs of the model.

None
ethics Optional[str]

The ethical implications of the model.

None
tags Optional[List[str]]

Tags associated with the model.

None
save_models_to_registry bool

Whether to save the model to the registry.

True

Returns:

Type Description
ModelResponse

The newly created model.

Source code in src/zenml/client.py, lines 6118-6163
def create_model(
    self,
    name: str,
    license: Optional[str] = None,
    description: Optional[str] = None,
    audience: Optional[str] = None,
    use_cases: Optional[str] = None,
    limitations: Optional[str] = None,
    trade_offs: Optional[str] = None,
    ethics: Optional[str] = None,
    tags: Optional[List[str]] = None,
    save_models_to_registry: bool = True,
) -> ModelResponse:
    """Creates a new model in Model Control Plane.

    Args:
        name: The name of the model.
        license: The license under which the model is created.
        description: The description of the model.
        audience: The target audience of the model.
        use_cases: The use cases of the model.
        limitations: The known limitations of the model.
        trade_offs: The tradeoffs of the model.
        ethics: The ethical implications of the model.
        tags: Tags associated with the model.
        save_models_to_registry: Whether to save the model to the
            registry.

    Returns:
        The newly created model.
    """
    return self.zen_store.create_model(
        model=ModelRequest(
            name=name,
            license=license,
            description=description,
            audience=audience,
            use_cases=use_cases,
            limitations=limitations,
            trade_offs=trade_offs,
            ethics=ethics,
            tags=tags,
            project=self.active_project.id,
            save_models_to_registry=save_models_to_registry,
        )
    )
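
A usage sketch with illustrative values:

from zenml.client import Client

client = Client()

model = client.create_model(
    name="churn-classifier",
    license="Apache-2.0",
    description="Predicts customer churn from usage data.",
    tags=["classification", "tabular"],
)
print(model.id)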

create_model_version(model_name_or_id, name=None, description=None, tags=None, project=None)

Creates a new model version in Model Control Plane.

Parameters:

Name Type Description Default
model_name_or_id Union[str, UUID]

the name or id of the model to create model version in.

required
name Optional[str]

the name of the Model Version to be created.

None
description Optional[str]

the description of the Model Version to be created.

None
tags Optional[List[str]]

Tags associated with the model.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Returns:

Type Description
ModelVersionResponse

The newly created model version.

Source code in src/zenml/client.py, lines 6334-6366
def create_model_version(
    self,
    model_name_or_id: Union[str, UUID],
    name: Optional[str] = None,
    description: Optional[str] = None,
    tags: Optional[List[str]] = None,
    project: Optional[Union[str, UUID]] = None,
) -> ModelVersionResponse:
    """Creates a new model version in Model Control Plane.

    Args:
        model_name_or_id: the name or id of the model to create model
            version in.
        name: the name of the Model Version to be created.
        description: the description of the Model Version to be created.
        tags: Tags associated with the model.
        project: The project name/ID to filter by.

    Returns:
        The newly created model version.
    """
    model = self.get_model(
        model_name_or_id=model_name_or_id, project=project
    )
    return self.zen_store.create_model_version(
        model_version=ModelVersionRequest(
            name=name,
            description=description,
            project=model.project_id,
            model=model.id,
            tags=tags,
        )
    )
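
A usage sketch, assuming a model named "churn-classifier" was already created as in the create_model example above:

from zenml.client import Client

client = Client()

version = client.create_model_version(
    model_name_or_id="churn-classifier",
    name="2024-q3",
    description="Retrained on Q3 data.",
    tags=["candidate"],
)
print(version.id)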

create_project(name, description, display_name=None)

Create a new project.

Parameters:

Name Type Description Default
name str

Name of the project.

required
description str

Description of the project.

required
display_name Optional[str]

Display name of the project.

None

Returns:

Type Description
ProjectResponse

The created project.

Source code in src/zenml/client.py, lines 978-1000
def create_project(
    self,
    name: str,
    description: str,
    display_name: Optional[str] = None,
) -> ProjectResponse:
    """Create a new project.

    Args:
        name: Name of the project.
        description: Description of the project.
        display_name: Display name of the project.

    Returns:
        The created project.
    """
    return self.zen_store.create_project(
        ProjectRequest(
            name=name,
            description=description,
            display_name=display_name or "",
        )
    )
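
A usage sketch with illustrative values:

from zenml.client import Client

client = Client()

project = client.create_project(
    name="fraud-detection",
    description="Pipelines and models for the fraud detection use case.",
    display_name="Fraud Detection",
)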

create_run_metadata(metadata, resources, stack_component_id=None, publisher_step_id=None)

Create run metadata.

Parameters:

Name Type Description Default
metadata Dict[str, MetadataType]

The metadata to create as a dictionary of key-value pairs.

required
resources List[RunMetadataResource]

The list of IDs and types of the resources for which the metadata was produced.

required
stack_component_id Optional[UUID]

The ID of the stack component that produced the metadata.

None
publisher_step_id Optional[UUID]

The ID of the step execution that publishes this metadata automatically.

None
Source code in src/zenml/client.py, lines 4583-4633
def create_run_metadata(
    self,
    metadata: Dict[str, "MetadataType"],
    resources: List[RunMetadataResource],
    stack_component_id: Optional[UUID] = None,
    publisher_step_id: Optional[UUID] = None,
) -> None:
    """Create run metadata.

    Args:
        metadata: The metadata to create as a dictionary of key-value pairs.
        resources: The list of IDs and types of the resources for which the
            metadata was produced.
        stack_component_id: The ID of the stack component that produced
            the metadata.
        publisher_step_id: The ID of the step execution that publishes
            this metadata automatically.
    """
    from zenml.metadata.metadata_types import get_metadata_type

    values: Dict[str, "MetadataType"] = {}
    types: Dict[str, "MetadataTypeEnum"] = {}
    for key, value in metadata.items():
        # Skip metadata that is too large to be stored in the database.
        if len(json.dumps(value)) > TEXT_FIELD_MAX_LENGTH:
            logger.warning(
                f"Metadata value for key '{key}' is too large to be "
                "stored in the database. Skipping."
            )
            continue
        # Skip metadata that is not of a supported type.
        try:
            metadata_type = get_metadata_type(value)
        except ValueError as e:
            logger.warning(
                f"Metadata value for key '{key}' is not of a supported "
                f"type. Skipping. Full error: {e}"
            )
            continue
        values[key] = value
        types[key] = metadata_type

    run_metadata = RunMetadataRequest(
        project=self.active_project.id,
        resources=resources,
        stack_component_id=stack_component_id,
        publisher_step_id=publisher_step_id,
        values=values,
        types=types,
    )
    self.zen_store.create_run_metadata(run_metadata)

create_run_template(name, deployment_id, description=None, tags=None)

Create a run template.

Parameters:

Name Type Description Default
name str

The name of the run template.

required
deployment_id UUID

ID of the deployment which this template should be based off of.

required
description Optional[str]

The description of the run template.

None
tags Optional[List[str]]

Tags associated with the run template.

None

Returns:

Type Description
RunTemplateResponse

The created run template.

Source code in src/zenml/client.py, lines 3497-3524
def create_run_template(
    self,
    name: str,
    deployment_id: UUID,
    description: Optional[str] = None,
    tags: Optional[List[str]] = None,
) -> RunTemplateResponse:
    """Create a run template.

    Args:
        name: The name of the run template.
        deployment_id: ID of the deployment which this template should be
            based off of.
        description: The description of the run template.
        tags: Tags associated with the run template.

    Returns:
        The created run template.
    """
    return self.zen_store.create_run_template(
        template=RunTemplateRequest(
            name=name,
            description=description,
            source_deployment_id=deployment_id,
            tags=tags,
            project=self.active_project.id,
        )
    )
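
A usage sketch; the deployment ID is a placeholder and must refer to an existing pipeline deployment in your project:

from uuid import UUID

from zenml.client import Client

client = Client()

template = client.create_run_template(
    name="nightly-training",
    deployment_id=UUID("12345678-1234-5678-1234-567812345678"),  # placeholder
    description="Template for the nightly training run.",
    tags=["scheduled"],
)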

create_secret(name, values, private=False)

Creates a new secret.

Parameters:

Name Type Description Default
name str

The name of the secret.

required
values Dict[str, str]

The values of the secret.

required
private bool

Whether the secret is private. A private secret is only accessible to the user who created it.

False

Returns:

Type Description
SecretResponse

The created secret (in model form).

Raises:

Type Description
NotImplementedError

If centralized secrets management is not enabled.

Source code in src/zenml/client.py, lines 4637-4669
def create_secret(
    self,
    name: str,
    values: Dict[str, str],
    private: bool = False,
) -> SecretResponse:
    """Creates a new secret.

    Args:
        name: The name of the secret.
        values: The values of the secret.
        private: Whether the secret is private. A private secret is only
            accessible to the user who created it.

    Returns:
        The created secret (in model form).

    Raises:
        NotImplementedError: If centralized secrets management is not
            enabled.
    """
    create_secret_request = SecretRequest(
        name=name,
        values=values,
        private=private,
    )
    try:
        return self.zen_store.create_secret(secret=create_secret_request)
    except NotImplementedError:
        raise NotImplementedError(
            "centralized secrets management is not supported or explicitly "
            "disabled in the target ZenML deployment."
        )
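
A usage sketch; this requires a ZenML deployment with centralized secrets management enabled, and the secret values shown are placeholders:

from zenml.client import Client

client = Client()

secret = client.create_secret(
    name="hf-token",
    values={"token": "<your-token>"},
    private=True,
)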

create_service(config, service_type, model_version_id=None)

Registers a service.

Parameters:

Name Type Description Default
config ServiceConfig

The configuration of the service.

required
service_type ServiceType

The type of the service.

required
model_version_id Optional[UUID]

The ID of the model version to associate with the service.

None

Returns:

Type Description
ServiceResponse

The registered service.

Source code in src/zenml/client.py, lines 1610-1635
def create_service(
    self,
    config: "ServiceConfig",
    service_type: ServiceType,
    model_version_id: Optional[UUID] = None,
) -> ServiceResponse:
    """Registers a service.

    Args:
        config: The configuration of the service.
        service_type: The type of the service.
        model_version_id: The ID of the model version to associate with the
            service.

    Returns:
        The registered service.
    """
    service_request = ServiceRequest(
        name=config.service_name,
        service_type=service_type,
        config=config.model_dump(),
        project=self.active_project.id,
        model_version_id=model_version_id,
    )
    # Register the service
    return self.zen_store.create_service(service_request)

create_service_account(name, description='')

Create a new service account.

Parameters:

Name Type Description Default
name str

The name of the service account.

required
description str

The description of the service account.

''

Returns:

Type Description
ServiceAccountResponse

The created service account.

Source code in src/zenml/client.py, lines 7258-7279
def create_service_account(
    self,
    name: str,
    description: str = "",
) -> ServiceAccountResponse:
    """Create a new service account.

    Args:
        name: The name of the service account.
        description: The description of the service account.

    Returns:
        The created service account.
    """
    service_account = ServiceAccountRequest(
        name=name, description=description, active=True
    )
    created_service_account = self.zen_store.create_service_account(
        service_account=service_account
    )

    return created_service_account

create_service_connector(name, connector_type, resource_type=None, auth_method=None, configuration=None, resource_id=None, description='', expiration_seconds=None, expires_at=None, expires_skew_tolerance=None, labels=None, auto_configure=False, verify=True, list_resources=True, register=True)

Create, validate and/or register a service connector.

Parameters:

Name Type Description Default
name str

The name of the service connector.

required
connector_type str

The service connector type.

required
auth_method Optional[str]

The authentication method of the service connector. May be omitted if auto-configuration is used.

None
resource_type Optional[str]

The resource type for the service connector.

None
configuration Optional[Dict[str, str]]

The configuration of the service connector.

None
resource_id Optional[str]

The resource id of the service connector.

None
description str

The description of the service connector.

''
expiration_seconds Optional[int]

The expiration time of the service connector.

None
expires_at Optional[datetime]

The expiration time of the service connector.

None
expires_skew_tolerance Optional[int]

The allowed expiration skew for the service connector credentials.

None
labels Optional[Dict[str, str]]

The labels of the service connector.

None
auto_configure bool

Whether to automatically configure the service connector from the local environment.

False
verify bool

Whether to verify that the service connector configuration and credentials can be used to gain access to the resource.

True
list_resources bool

Whether to also list the resources that the service connector can give access to (if verify is True).

True
register bool

Whether to register the service connector or not.

True

Returns:

Type Description
Optional[Union[ServiceConnectorResponse, ServiceConnectorRequest]]

The model of the registered service connector and the resources

Optional[ServiceConnectorResourcesModel]

that the service connector can give access to (if verify is True).

Raises:

Type Description
ValueError

If the arguments are invalid.

KeyError

If the service connector type is not found.

NotImplementedError

If auto-configuration is not supported or not implemented for the service connector type.

AuthorizationException

If the connector verification failed due to authorization issues.

Source code in src/zenml/client.py, lines 5280-5497
def create_service_connector(
    self,
    name: str,
    connector_type: str,
    resource_type: Optional[str] = None,
    auth_method: Optional[str] = None,
    configuration: Optional[Dict[str, str]] = None,
    resource_id: Optional[str] = None,
    description: str = "",
    expiration_seconds: Optional[int] = None,
    expires_at: Optional[datetime] = None,
    expires_skew_tolerance: Optional[int] = None,
    labels: Optional[Dict[str, str]] = None,
    auto_configure: bool = False,
    verify: bool = True,
    list_resources: bool = True,
    register: bool = True,
) -> Tuple[
    Optional[
        Union[
            ServiceConnectorResponse,
            ServiceConnectorRequest,
        ]
    ],
    Optional[ServiceConnectorResourcesModel],
]:
    """Create, validate and/or register a service connector.

    Args:
        name: The name of the service connector.
        connector_type: The service connector type.
        auth_method: The authentication method of the service connector.
            May be omitted if auto-configuration is used.
        resource_type: The resource type for the service connector.
        configuration: The configuration of the service connector.
        resource_id: The resource id of the service connector.
        description: The description of the service connector.
        expiration_seconds: The expiration time of the service connector.
        expires_at: The expiration time of the service connector.
        expires_skew_tolerance: The allowed expiration skew for the service
            connector credentials.
        labels: The labels of the service connector.
        auto_configure: Whether to automatically configure the service
            connector from the local environment.
        verify: Whether to verify that the service connector configuration
            and credentials can be used to gain access to the resource.
        list_resources: Whether to also list the resources that the service
            connector can give access to (if verify is True).
        register: Whether to register the service connector or not.

    Returns:
        The model of the registered service connector and the resources
        that the service connector can give access to (if verify is True).

    Raises:
        ValueError: If the arguments are invalid.
        KeyError: If the service connector type is not found.
        NotImplementedError: If auto-configuration is not supported or
            not implemented for the service connector type.
        AuthorizationException: If the connector verification failed due
            to authorization issues.
    """
    from zenml.service_connectors.service_connector_registry import (
        service_connector_registry,
    )

    connector_instance: Optional[ServiceConnector] = None
    connector_resources: Optional[ServiceConnectorResourcesModel] = None

    # Get the service connector type class
    try:
        connector = self.zen_store.get_service_connector_type(
            connector_type=connector_type,
        )
    except KeyError:
        raise KeyError(
            f"Service connector type {connector_type} not found."
            "Please check that you have installed all required "
            "Python packages and ZenML integrations and try again."
        )

    if not resource_type and len(connector.resource_types) == 1:
        resource_type = connector.resource_types[0].resource_type

    # If auto_configure is set, we will try to automatically configure the
    # service connector from the local environment
    if auto_configure:
        if not connector.supports_auto_configuration:
            raise NotImplementedError(
                f"The {connector.name} service connector type "
                "does not support auto-configuration."
            )
        if not connector.local:
            raise NotImplementedError(
                f"The {connector.name} service connector type "
                "implementation is not available locally. Please "
                "check that you have installed all required Python "
                "packages and ZenML integrations and try again, or "
                "skip auto-configuration."
            )

        assert connector.connector_class is not None

        connector_instance = connector.connector_class.auto_configure(
            resource_type=resource_type,
            auth_method=auth_method,
            resource_id=resource_id,
        )
        assert connector_instance is not None
        connector_request = connector_instance.to_model(
            name=name,
            description=description or "",
            labels=labels,
        )

        if verify:
            # Prefer to verify the connector config server-side if the
            # implementation is available there, because it ensures
            # that the connector can be shared with other users or used
            # from other machines and because some auth methods rely on the
            # server-side authentication environment
            if connector.remote:
                connector_resources = (
                    self.zen_store.verify_service_connector_config(
                        connector_request,
                        list_resources=list_resources,
                    )
                )
            else:
                connector_resources = connector_instance.verify(
                    list_resources=list_resources,
                )

            if connector_resources.error:
                # Raise an exception if the connector verification failed
                raise AuthorizationException(connector_resources.error)

    else:
        if not auth_method:
            if len(connector.auth_methods) == 1:
                auth_method = connector.auth_methods[0].auth_method
            else:
                raise ValueError(
                    f"Multiple authentication methods are available for "
                    f"the {connector.name} service connector type. Please "
                    f"specify one of the following: "
                    f"{list(connector.auth_method_dict.keys())}."
                )

        connector_request = ServiceConnectorRequest(
            name=name,
            connector_type=connector_type,
            description=description,
            auth_method=auth_method,
            expiration_seconds=expiration_seconds,
            expires_at=expires_at,
            expires_skew_tolerance=expires_skew_tolerance,
            labels=labels or {},
        )
        # Validate and configure the resources
        connector_request.validate_and_configure_resources(
            connector_type=connector,
            resource_types=resource_type,
            resource_id=resource_id,
            configuration=configuration,
        )
        if verify:
            # Prefer to verify the connector config server-side if the
            # implementation is available there, because it ensures
            # that the connector can be shared with other users or used
            # from other machines and because some auth methods rely on the
            # server-side authentication environment
            if connector.remote:
                connector_resources = (
                    self.zen_store.verify_service_connector_config(
                        connector_request,
                        list_resources=list_resources,
                    )
                )
            else:
                connector_instance = (
                    service_connector_registry.instantiate_connector(
                        model=connector_request
                    )
                )
                connector_resources = connector_instance.verify(
                    list_resources=list_resources,
                )

            if connector_resources.error:
                # Raise an exception if the connector verification failed
                raise AuthorizationException(connector_resources.error)

            # For resource types that don't support multi-instances, it's
            # better to save the default resource ID in the connector, if
            # available. Otherwise, we'll need to instantiate the connector
            # again to get the default resource ID.
            connector_request.resource_id = (
                connector_request.resource_id
                or connector_resources.get_default_resource_id()
            )

    if not register:
        return connector_request, connector_resources

    # Register the new model
    connector_response = self.zen_store.create_service_connector(
        service_connector=connector_request
    )

    if connector_resources:
        connector_resources.id = connector_response.id
        connector_resources.name = connector_response.name
        connector_resources.connector_type = (
            connector_response.connector_type
        )

    return connector_response, connector_resources
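
As an illustration only, a sketch of registering an AWS S3 connector; the connector type, auth method, resource type, and configuration keys below are assumptions that depend on the integrations installed in your environment:

from zenml.client import Client

client = Client()

connector, resources = client.create_service_connector(
    name="s3-datasets",
    connector_type="aws",          # assumed connector type identifier
    resource_type="s3-bucket",     # assumed resource type identifier
    auth_method="secret-key",      # assumed auth method identifier
    configuration={
        "aws_access_key_id": "<key-id>",
        "aws_secret_access_key": "<secret>",
        "region": "eu-central-1",
    },
    verify=True,
)
if connector is not None:
    print(connector.name)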

create_stack(name, components, stack_spec_file=None, labels=None)

Registers a stack and its components.

Parameters:

Name Type Description Default
name str

The name of the stack to register.

required
components Mapping[StackComponentType, Union[str, UUID]]

dictionary which maps component types to component names

required
stack_spec_file Optional[str]

path to the stack spec file

None
labels Optional[Dict[str, Any]]

The labels of the stack.

None

Returns:

Type Description
StackResponse

The model of the registered stack.

Source code in src/zenml/client.py, lines 1180-1221
def create_stack(
    self,
    name: str,
    components: Mapping[StackComponentType, Union[str, UUID]],
    stack_spec_file: Optional[str] = None,
    labels: Optional[Dict[str, Any]] = None,
) -> StackResponse:
    """Registers a stack and its components.

    Args:
        name: The name of the stack to register.
        components: dictionary which maps component types to component names
        stack_spec_file: path to the stack spec file
        labels: The labels of the stack.

    Returns:
        The model of the registered stack.
    """
    stack_components = {}

    for c_type, c_identifier in components.items():
        # Skip non-existent components.
        if not c_identifier:
            continue

        # Get the component.
        component = self.get_stack_component(
            name_id_or_prefix=c_identifier,
            component_type=c_type,
        )
        stack_components[c_type] = [component.id]

    stack = StackRequest(
        name=name,
        components=stack_components,
        stack_spec_path=stack_spec_file,
        labels=labels,
    )

    self._validate_stack_configuration(stack=stack)

    return self.zen_store.create_stack(stack=stack)
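
A usage sketch that wires the default local components into a new stack; the component names must already be registered:

from zenml.client import Client
from zenml.enums import StackComponentType

client = Client()

stack = client.create_stack(
    name="local-dev",
    components={
        StackComponentType.ORCHESTRATOR: "default",
        StackComponentType.ARTIFACT_STORE: "default",
    },
)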

create_stack_component(name, flavor, component_type, configuration, labels=None)

Registers a stack component.

Parameters:

Name Type Description Default
name str

The name of the stack component.

required
flavor str

The flavor of the stack component.

required
component_type StackComponentType

The type of the stack component.

required
configuration Dict[str, str]

The configuration of the stack component.

required
labels Optional[Dict[str, Any]]

The labels of the stack component.

None

Returns:

Type Description
ComponentResponse

The model of the registered component.

Source code in src/zenml/client.py, lines 1967-2017
def create_stack_component(
    self,
    name: str,
    flavor: str,
    component_type: StackComponentType,
    configuration: Dict[str, str],
    labels: Optional[Dict[str, Any]] = None,
) -> "ComponentResponse":
    """Registers a stack component.

    Args:
        name: The name of the stack component.
        flavor: The flavor of the stack component.
        component_type: The type of the stack component.
        configuration: The configuration of the stack component.
        labels: The labels of the stack component.

    Returns:
        The model of the registered component.
    """
    from zenml.stack.utils import (
        validate_stack_component_config,
        warn_if_config_server_mismatch,
    )

    validated_config = validate_stack_component_config(
        configuration_dict=configuration,
        flavor=flavor,
        component_type=component_type,
        # Always enforce validation of custom flavors
        validate_custom_flavors=True,
    )
    # Guaranteed to not be None by setting
    # `validate_custom_flavors=True` above
    assert validated_config is not None
    warn_if_config_server_mismatch(validated_config)

    create_component_model = ComponentRequest(
        name=name,
        type=component_type,
        flavor=flavor,
        configuration=validated_config.model_dump(
            mode="json", exclude_unset=True
        ),
        labels=labels,
    )

    # Register the new model
    return self.zen_store.create_stack_component(
        component=create_component_model
    )
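
A usage sketch registering a local artifact store; the path is an illustrative value:

from zenml.client import Client
from zenml.enums import StackComponentType

client = Client()

component = client.create_stack_component(
    name="my-artifact-store",
    flavor="local",
    component_type=StackComponentType.ARTIFACT_STORE,
    configuration={"path": "/tmp/zenml-artifacts"},
)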

create_tag(name, exclusive=False, color=None)

Creates a new tag.

Parameters:

Name Type Description Default
name str

the name of the tag.

required
exclusive bool

The boolean to decide whether the tag is an exclusive tag. An exclusive tag can exist only for a single pipeline run within the scope of a pipeline, a single artifact version within the scope of an artifact, and a single run template.

False
color Optional[Union[str, ColorVariants]]

the color of the tag

None

Returns:

Type Description
TagResponse

The newly created tag.

Source code in src/zenml/client.py, lines 7684-7709
def create_tag(
    self,
    name: str,
    exclusive: bool = False,
    color: Optional[Union[str, ColorVariants]] = None,
) -> TagResponse:
    """Creates a new tag.

    Args:
        name: the name of the tag.
        exclusive: the boolean to decide whether the tag is an exclusive tag.
            An exclusive tag means that the tag can exist only for a single:
                - pipeline run within the scope of a pipeline
                - artifact version within the scope of an artifact
                - run template
        color: the color of the tag

    Returns:
        The newly created tag.
    """
    request_model = TagRequest(name=name, exclusive=exclusive)

    if color is not None:
        request_model.color = ColorVariants(color)

    return self.zen_store.create_tag(tag=request_model)
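
A usage sketch with illustrative values; the color string must be a valid ColorVariants value:

from zenml.client import Client

client = Client()

tag = client.create_tag(name="production", exclusive=True, color="green")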

create_trigger(name, event_source_id, event_filter, action_id, description='')

Registers a trigger.

Parameters:

Name Type Description Default
name str

The name of the trigger to create.

required
event_source_id UUID

The ID of the event source.

required
event_filter Dict[str, Any]

The event filter.

required
action_id UUID

The ID of the action that should be triggered.

required
description str

The description of the trigger.

''

Returns:

Type Description
TriggerResponse

The created trigger.

Source code in src/zenml/client.py, lines 3145-3175
@_fail_for_sql_zen_store
def create_trigger(
    self,
    name: str,
    event_source_id: UUID,
    event_filter: Dict[str, Any],
    action_id: UUID,
    description: str = "",
) -> TriggerResponse:
    """Registers a trigger.

    Args:
        name: The name of the trigger to create.
        event_source_id: The ID of the event source.
        event_filter: The event filter.
        action_id: The ID of the action that should be triggered.
        description: The description of the trigger.

    Returns:
        The created trigger.
    """
    trigger = TriggerRequest(
        name=name,
        description=description,
        event_source_id=event_source_id,
        event_filter=event_filter,
        action_id=action_id,
        project=self.active_project.id,
    )

    return self.zen_store.create_trigger(trigger=trigger)

create_user(name, password=None, is_admin=False)

Create a new user.

Parameters:

Name Type Description Default
name str

The name of the user.

required
password Optional[str]

The password of the user. If not provided, the user will be created with an empty password.

None
is_admin bool

Whether the user should be an admin.

False

Returns:

Type Description
UserResponse

The model of the created user.

Source code in src/zenml/client.py, lines 758-783
def create_user(
    self,
    name: str,
    password: Optional[str] = None,
    is_admin: bool = False,
) -> UserResponse:
    """Create a new user.

    Args:
        name: The name of the user.
        password: The password of the user. If not provided, the user will
            be created with an empty password.
        is_admin: Whether the user should be an admin.

    Returns:
        The model of the created user.
    """
    user = UserRequest(
        name=name, password=password or None, is_admin=is_admin
    )
    user.active = (
        password != "" if self.zen_store.type != StoreType.REST else True
    )
    created_user = self.zen_store.create_user(user=user)

    return created_user
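
A usage sketch with illustrative values:

from zenml.client import Client

client = Client()

user = client.create_user(name="maria", password="change-me", is_admin=False)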

deactivate_user(name_id_or_prefix)

Deactivate a user and generate an activation token.

Parameters:

Name Type Description Default
name_id_or_prefix str

The name or ID of the user to reset.

required

Returns:

Type Description
UserResponse

The deactivated user.

Source code in src/zenml/client.py, lines 940-954
@_fail_for_sql_zen_store
def deactivate_user(self, name_id_or_prefix: str) -> "UserResponse":
    """Deactivate a user and generate an activation token.

    Args:
        name_id_or_prefix: The name or ID of the user to reset.

    Returns:
        The deactivated user.
    """
    from zenml.zen_stores.rest_zen_store import RestZenStore

    user = self.get_user(name_id_or_prefix, allow_name_prefix_match=False)
    assert isinstance(self.zen_store, RestZenStore)
    return self.zen_store.deactivate_user(user_name_or_id=user.name)

delete_action(name_id_or_prefix, project=None)

Delete an action.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, id or prefix id of the action to delete.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py, lines 3121-3141
@_fail_for_sql_zen_store
def delete_action(
    self,
    name_id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Delete an action.

    Args:
        name_id_or_prefix: The name, id or prefix id of the action
            to delete.
        project: The project name/ID to filter by.
    """
    action = self.get_action(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
        project=project,
    )

    self.zen_store.delete_action(action_id=action.id)
    logger.info("Deleted action with name '%s'.", action.name)


delete_all_model_version_artifact_links(model_version_id, only_links)

Delete all model version to artifact links in Model Control Plane.

Parameters:

Name Type Description Default
model_version_id UUID

The id of the model version holding the link.

required
only_links bool

If true, only delete the link to the artifact.

required
Source code in src/zenml/client.py, lines 6732-6743
def delete_all_model_version_artifact_links(
    self, model_version_id: UUID, only_links: bool
) -> None:
    """Delete all model version to artifact links in Model Control Plane.

    Args:
        model_version_id: The id of the model version holding the link.
        only_links: If true, only delete the link to the artifact.
    """
    self.zen_store.delete_all_model_version_artifact_links(
        model_version_id, only_links
    )

delete_api_key(service_account_name_id_or_prefix, name_id_or_prefix)

Delete an API key.

Parameters:

Name Type Description Default
service_account_name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the service account to delete the API key for.

required
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the API key.

required
Source code in src/zenml/client.py, lines 7661-7681
def delete_api_key(
    self,
    service_account_name_id_or_prefix: Union[str, UUID],
    name_id_or_prefix: Union[str, UUID],
) -> None:
    """Delete an API key.

    Args:
        service_account_name_id_or_prefix: The name, ID or prefix of the
            service account to delete the API key for.
        name_id_or_prefix: The name, ID or prefix of the API key.
    """
    api_key = self.get_api_key(
        service_account_name_id_or_prefix=service_account_name_id_or_prefix,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
    )
    self.zen_store.delete_api_key(
        service_account_id=api_key.service_account.id,
        api_key_name_or_id=api_key.id,
    )

delete_artifact(name_id_or_prefix, project=None)

Delete an artifact.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the artifact to delete.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py, lines 4236-4252
def delete_artifact(
    self,
    name_id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Delete an artifact.

    Args:
        name_id_or_prefix: The name, ID or prefix of the artifact to delete.
        project: The project name/ID to filter by.
    """
    artifact = self.get_artifact(
        name_id_or_prefix=name_id_or_prefix,
        project=project,
    )
    self.zen_store.delete_artifact(artifact_id=artifact.id)
    logger.info(f"Deleted artifact '{artifact.name}'.")

delete_artifact_version(name_id_or_prefix, version=None, delete_metadata=True, delete_from_artifact_store=False, project=None)

Delete an artifact version.

By default, this will delete only the metadata of the artifact from the database, not the actual object stored in the artifact store.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The ID of the artifact version, or the name or prefix of the artifact, to delete.

required
version Optional[str]

The version of the artifact to delete.

None
delete_metadata bool

If True, delete the metadata of the artifact version from the database.

True
delete_from_artifact_store bool

If True, delete the artifact object itself from the artifact store.

False
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py, lines 4477-4510
def delete_artifact_version(
    self,
    name_id_or_prefix: Union[str, UUID],
    version: Optional[str] = None,
    delete_metadata: bool = True,
    delete_from_artifact_store: bool = False,
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Delete an artifact version.

    By default, this will delete only the metadata of the artifact from the
    database, not the actual object stored in the artifact store.

    Args:
        name_id_or_prefix: The ID of the artifact version, or the name or
            prefix of the artifact, to delete.
        version: The version of the artifact to delete.
        delete_metadata: If True, delete the metadata of the artifact
            version from the database.
        delete_from_artifact_store: If True, delete the artifact object
                itself from the artifact store.
        project: The project name/ID to filter by.
    """
    artifact_version = self.get_artifact_version(
        name_id_or_prefix=name_id_or_prefix,
        version=version,
        project=project,
    )
    if delete_from_artifact_store:
        self._delete_artifact_from_artifact_store(
            artifact_version=artifact_version
        )
    if delete_metadata:
        self._delete_artifact_version(artifact_version=artifact_version)

delete_authorized_device(id_or_prefix)

Delete an authorized device.

Parameters:

Name Type Description Default
id_or_prefix Union[str, UUID]

The ID or ID prefix of the authorized device.

required
Source code in src/zenml/client.py, lines 6930-6943
def delete_authorized_device(
    self,
    id_or_prefix: Union[str, UUID],
) -> None:
    """Delete an authorized device.

    Args:
        id_or_prefix: The ID or ID prefix of the authorized device.
    """
    device = self.get_authorized_device(
        id_or_prefix=id_or_prefix,
        allow_id_prefix_match=False,
    )
    self.zen_store.delete_authorized_device(device.id)

delete_build(id_or_prefix, project=None)

Delete a build.

Parameters:

Name Type Description Default
id_or_prefix str

The id or id prefix of the build.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py, lines 2736-2746
def delete_build(
    self, id_or_prefix: str, project: Optional[Union[str, UUID]] = None
) -> None:
    """Delete a build.

    Args:
        id_or_prefix: The id or id prefix of the build.
        project: The project name/ID to filter by.
    """
    build = self.get_build(id_or_prefix=id_or_prefix, project=project)
    self.zen_store.delete_build(build_id=build.id)

delete_code_repository(name_id_or_prefix, project=None)

Delete a code repository.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the code repository.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py, lines 5260-5276
def delete_code_repository(
    self,
    name_id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Delete a code repository.

    Args:
        name_id_or_prefix: The name, ID or prefix of the code repository.
        project: The project name/ID to filter by.
    """
    repo = self.get_code_repository(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
        project=project,
    )
    self.zen_store.delete_code_repository(code_repository_id=repo.id)

delete_deployment(id_or_prefix, project=None)

Delete a deployment.

Parameters:

Name Type Description Default
id_or_prefix str

The id or id prefix of the deployment.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py, lines 3477-3493
def delete_deployment(
    self,
    id_or_prefix: str,
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Delete a deployment.

    Args:
        id_or_prefix: The id or id prefix of the deployment.
        project: The project name/ID to filter by.
    """
    deployment = self.get_deployment(
        id_or_prefix=id_or_prefix,
        project=project,
        hydrate=False,
    )
    self.zen_store.delete_deployment(deployment_id=deployment.id)

delete_event_source(name_id_or_prefix, project=None)

Deletes an event_source.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, id or prefix id of the event_source to deregister.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py, lines 2926-2946
@_fail_for_sql_zen_store
def delete_event_source(
    self,
    name_id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Deletes an event_source.

    Args:
        name_id_or_prefix: The name, id or prefix id of the event_source
            to deregister.
        project: The project name/ID to filter by.
    """
    event_source = self.get_event_source(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
        project=project,
    )

    self.zen_store.delete_event_source(event_source_id=event_source.id)
    logger.info("Deleted event_source with name '%s'.", event_source.name)

delete_flavor(name_id_or_prefix)

Deletes a flavor.

Parameters:

Name Type Description Default
name_id_or_prefix str

The name, id or prefix of the id for the flavor to delete.

required
Source code in src/zenml/client.py, lines 2272-2284
def delete_flavor(self, name_id_or_prefix: str) -> None:
    """Deletes a flavor.

    Args:
        name_id_or_prefix: The name, id or prefix of the id for the
            flavor to delete.
    """
    flavor = self.get_flavor(
        name_id_or_prefix, allow_name_prefix_match=False
    )
    self.zen_store.delete_flavor(flavor_id=flavor.id)

    logger.info(f"Deleted flavor '{flavor.name}' of type '{flavor.type}'.")

delete_model(model_name_or_id, project=None)

Deletes a model from Model Control Plane.

Parameters:

Name Type Description Default
model_name_or_id Union[str, UUID]

name or id of the model to be deleted.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py, lines 6165-6179
def delete_model(
    self,
    model_name_or_id: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Deletes a model from Model Control Plane.

    Args:
        model_name_or_id: name or id of the model to be deleted.
        project: The project name/ID to filter by.
    """
    model = self.get_model(
        model_name_or_id=model_name_or_id, project=project
    )
    self.zen_store.delete_model(model_id=model.id)

delete_model_version(model_version_id)

Deletes a model version from Model Control Plane.

Parameters:

Name Type Description Default
model_version_id UUID

Id of the model version to be deleted.

required
Source code in src/zenml/client.py, lines 6368-6379
def delete_model_version(
    self,
    model_version_id: UUID,
) -> None:
    """Deletes a model version from Model Control Plane.

    Args:
        model_version_id: Id of the model version to be deleted.
    """
    self.zen_store.delete_model_version(
        model_version_id=model_version_id,
    )


delete_model_version_artifact_link(model_version_id, artifact_version_id)

Delete model version to artifact link in Model Control Plane.

Parameters:

Name Type Description Default
model_version_id UUID

The id of the model version holding the link.

required
artifact_version_id UUID

The id of the artifact version to be deleted.

required

Raises:

Type Description
RuntimeError

If more than one artifact link is found for given filters.

Source code in src/zenml/client.py, lines 6700-6730
def delete_model_version_artifact_link(
    self, model_version_id: UUID, artifact_version_id: UUID
) -> None:
    """Delete model version to artifact link in Model Control Plane.

    Args:
        model_version_id: The id of the model version holding the link.
        artifact_version_id: The id of the artifact version to be deleted.

    Raises:
        RuntimeError: If more than one artifact link is found for given filters.
    """
    artifact_links = self.list_model_version_artifact_links(
        model_version_id=model_version_id,
        artifact_version_id=artifact_version_id,
    )
    if artifact_links.items:
        if artifact_links.total > 1:
            raise RuntimeError(
                "More than one artifact link found for give model version "
                f"`{model_version_id}` and artifact version "
                f"`{artifact_version_id}`. This should not be happening and "
                "might indicate a corrupted state of your ZenML database. "
                "Please seek support via Community Slack."
            )
        self.zen_store.delete_model_version_artifact_link(
            model_version_id=model_version_id,
            model_version_artifact_link_name_or_id=artifact_links.items[
                0
            ].id,
        )

delete_pipeline(name_id_or_prefix, project=None)

Delete a pipeline.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or ID prefix of the pipeline.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py, lines 2426-2440
def delete_pipeline(
    self,
    name_id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Delete a pipeline.

    Args:
        name_id_or_prefix: The name, ID or ID prefix of the pipeline.
        project: The project name/ID to filter by.
    """
    pipeline = self.get_pipeline(
        name_id_or_prefix=name_id_or_prefix, project=project
    )
    self.zen_store.delete_pipeline(pipeline_id=pipeline.id)
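
A minimal sketch, assuming a pipeline named `training_pipeline` exists:

from zenml.client import Client

client = Client()

# The pipeline name, ID, or a unique ID prefix all work here.
client.delete_pipeline("training_pipeline")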

delete_pipeline_run(name_id_or_prefix, project=None)

Deletes a pipeline run.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

Name, ID, or prefix of the pipeline run.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py
3991
3992
3993
3994
3995
3996
3997
3998
3999
4000
4001
4002
4003
4004
4005
4006
4007
def delete_pipeline_run(
    self,
    name_id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Deletes a pipeline run.

    Args:
        name_id_or_prefix: Name, ID, or prefix of the pipeline run.
        project: The project name/ID to filter by.
    """
    run = self.get_pipeline_run(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
        project=project,
    )
    self.zen_store.delete_run(run_id=run.id)
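
A minimal sketch; the run identifier below is a hypothetical ID prefix (name prefix matching is disabled for deletion):

from zenml.client import Client

client = Client()

# Delete a run by full name, ID, or ID prefix.
client.delete_pipeline_run("3cb62daa")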

delete_project(name_id_or_prefix)

Delete a project.

Parameters:

Name Type Description Default
name_id_or_prefix str

The name or ID of the project to delete.

required

Raises:

Type Description
IllegalOperationError

If the project to delete is the active project.

Source code in src/zenml/client.py
1107
1108
1109
1110
1111
1112
1113
1114
1115
1116
1117
1118
1119
1120
1121
1122
1123
1124
1125
1126
def delete_project(self, name_id_or_prefix: str) -> None:
    """Delete a project.

    Args:
        name_id_or_prefix: The name or ID of the project to delete.

    Raises:
        IllegalOperationError: If the project to delete is the active
            project.
    """
    project = self.get_project(
        name_id_or_prefix, allow_name_prefix_match=False
    )
    if self.active_project.id == project.id:
        raise IllegalOperationError(
            f"Project '{name_id_or_prefix}' cannot be deleted since "
            "it is currently active. Please set another project as "
            "active first."
        )
    self.zen_store.delete_project(project_name_or_id=project.id)
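
Because the active project cannot be deleted, a sketch like the following switches projects first (the project names are hypothetical, and `set_active_project` is assumed to be the way to change the active project):

from zenml.client import Client

client = Client()

# Make sure the project to delete is not the active one.
client.set_active_project("default")
client.delete_project("old_project")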

delete_run_template(name_id_or_prefix, project=None)

Delete a run template.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

Name/ID/ID prefix of the template to delete.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py
3676
3677
3678
3679
3680
3681
3682
3683
3684
3685
3686
3687
3688
3689
3690
3691
3692
3693
3694
3695
3696
3697
3698
3699
3700
def delete_run_template(
    self,
    name_id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Delete a run template.

    Args:
        name_id_or_prefix: Name/ID/ID prefix of the template to delete.
        project: The project name/ID to filter by.
    """
    if is_valid_uuid(name_id_or_prefix):
        template_id = (
            UUID(name_id_or_prefix)
            if isinstance(name_id_or_prefix, str)
            else name_id_or_prefix
        )
    else:
        template_id = self.get_run_template(
            name_id_or_prefix,
            project=project,
            hydrate=False,
        ).id

    self.zen_store.delete_run_template(template_id=template_id)

delete_schedule(name_id_or_prefix, project=None)

Delete a schedule.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, id or prefix id of the schedule to delete.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py
3809
3810
3811
3812
3813
3814
3815
3816
3817
3818
3819
3820
3821
3822
3823
3824
3825
3826
3827
3828
3829
3830
3831
def delete_schedule(
    self,
    name_id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Delete a schedule.

    Args:
        name_id_or_prefix: The name, id or prefix id of the schedule
            to delete.
        project: The project name/ID to filter by.
    """
    schedule = self.get_schedule(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
        project=project,
    )
    logger.warning(
        f"Deleting schedule '{name_id_or_prefix}'... This will only delete "
        "the reference of the schedule from ZenML. Please make sure to "
        "manually stop/delete this schedule in your orchestrator as well!"
    )
    self.zen_store.delete_schedule(schedule_id=schedule.id)
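
A minimal sketch with a hypothetical schedule name; as the warning above notes, the schedule must also be stopped or removed in the orchestrator itself:

from zenml.client import Client

client = Client()

# Removes only the ZenML reference to the schedule.
client.delete_schedule("daily_training_schedule")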

delete_secret(name_id_or_prefix, private=None)

Deletes a secret.

Parameters:

Name Type Description Default
name_id_or_prefix str

The name or ID of the secret.

required
private Optional[bool]

The private status of the secret to delete.

None
Source code in src/zenml/client.py
4933
4934
4935
4936
4937
4938
4939
4940
4941
4942
4943
4944
4945
4946
4947
4948
4949
4950
def delete_secret(
    self, name_id_or_prefix: str, private: Optional[bool] = None
) -> None:
    """Deletes a secret.

    Args:
        name_id_or_prefix: The name or ID of the secret.
        private: The private status of the secret to delete.
    """
    secret = self.get_secret(
        name_id_or_prefix=name_id_or_prefix,
        private=private,
        # Don't allow partial name matches, but allow partial ID matches
        allow_partial_name_match=False,
        allow_partial_id_match=True,
    )

    self.zen_store.delete_secret(secret_id=secret.id)
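
A minimal sketch with a hypothetical secret name; note that partial ID matches are allowed, but partial name matches are not:

from zenml.client import Client

client = Client()

client.delete_secret("aws_credentials")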

delete_service(name_id_or_prefix, project=None)

Delete a service.

Parameters:

Name Type Description Default
name_id_or_prefix UUID

The name or ID of the service to delete.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py
1817
1818
1819
1820
1821
1822
1823
1824
1825
1826
1827
1828
1829
1830
1831
1832
1833
def delete_service(
    self,
    name_id_or_prefix: UUID,
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Delete a service.

    Args:
        name_id_or_prefix: The name or ID of the service to delete.
        project: The project name/ID to filter by.
    """
    service = self.get_service(
        name_id_or_prefix,
        allow_name_prefix_match=False,
        project=project,
    )
    self.zen_store.delete_service(service_id=service.id)

delete_service_account(name_id_or_prefix)

Delete a service account.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name or ID of the service account to delete.

required
Source code in src/zenml/client.py
7387
7388
7389
7390
7391
7392
7393
7394
7395
7396
7397
7398
7399
7400
7401
def delete_service_account(
    self,
    name_id_or_prefix: Union[str, UUID],
) -> None:
    """Delete a service account.

    Args:
        name_id_or_prefix: The name or ID of the service account to delete.
    """
    service_account = self.get_service_account(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )
    self.zen_store.delete_service_account(
        service_account_name_or_id=service_account.id
    )

delete_service_connector(name_id_or_prefix)

Deletes a registered service connector.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The ID or name of the service connector to delete.

required
Source code in src/zenml/client.py
5836
5837
5838
5839
5840
5841
5842
5843
5844
5845
5846
5847
5848
5849
5850
5851
5852
5853
5854
5855
5856
5857
def delete_service_connector(
    self,
    name_id_or_prefix: Union[str, UUID],
) -> None:
    """Deletes a registered service connector.

    Args:
        name_id_or_prefix: The ID or name of the service connector to delete.
    """
    service_connector = self.get_service_connector(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
    )

    self.zen_store.delete_service_connector(
        service_connector_id=service_connector.id
    )
    logger.info(
        "Removed service connector (type: %s) with name '%s'.",
        service_connector.type,
        service_connector.name,
    )

delete_stack(name_id_or_prefix, recursive=False)

Deregisters a stack.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, id or prefix id of the stack to deregister.

required
recursive bool

If True, all components of the stack which are not associated with any other stack will also be deleted.

False

Raises:

Type Description
ValueError

If the stack is the currently active stack for this client.

Source code in src/zenml/client.py
1395
1396
1397
1398
1399
1400
1401
1402
1403
1404
1405
1406
1407
1408
1409
1410
1411
1412
1413
1414
1415
1416
1417
1418
1419
1420
1421
1422
1423
1424
1425
1426
1427
1428
1429
1430
1431
1432
1433
1434
1435
1436
1437
1438
1439
1440
1441
1442
1443
1444
1445
1446
1447
1448
1449
1450
1451
1452
1453
1454
1455
1456
1457
1458
1459
1460
1461
def delete_stack(
    self, name_id_or_prefix: Union[str, UUID], recursive: bool = False
) -> None:
    """Deregisters a stack.

    Args:
        name_id_or_prefix: The name, id or prefix id of the stack
            to deregister.
        recursive: If `True`, all components of the stack which are not
            associated with any other stack will also be deleted.

    Raises:
        ValueError: If the stack is the currently active stack for this
            client.
    """
    stack = self.get_stack(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )

    if stack.id == self.active_stack_model.id:
        raise ValueError(
            f"Unable to deregister active stack '{stack.name}'. Make "
            f"sure to designate a new active stack before deleting this "
            f"one."
        )

    cfg = GlobalConfiguration()
    if stack.id == cfg.active_stack_id:
        raise ValueError(
            f"Unable to deregister '{stack.name}' as it is the active "
            f"stack within your global configuration. Make "
            f"sure to designate a new active stack before deleting this "
            f"one."
        )

    if recursive:
        stack_components_free_for_deletion = []

        # Get all stack components associated with this stack
        for component_type, component_model in stack.components.items():
            # Get stack associated with the stack component

            stacks = self.list_stacks(
                component_id=component_model[0].id, size=2, page=1
            )

            # Check if the stack component is part of another stack
            if len(stacks) == 1 and stack.id == stacks[0].id:
                stack_components_free_for_deletion.append(
                    (component_type, component_model)
                )

        self.delete_stack(stack.id)

        for (
            stack_component_type,
            stack_component_model,
        ) in stack_components_free_for_deletion:
            self.delete_stack_component(
                stack_component_model[0].name, stack_component_type
            )

        logger.info("Deregistered stack with name '%s'.", stack.name)
        return

    self.zen_store.delete_stack(stack_id=stack.id)
    logger.info("Deregistered stack with name '%s'.", stack.name)
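
A minimal sketch with a hypothetical stack name; with `recursive=True`, components that belong to no other stack are deregistered together with the stack:

from zenml.client import Client

client = Client()

# The active stack cannot be deleted, so switch stacks first if needed.
client.delete_stack("old_stack", recursive=True)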

delete_stack_component(name_id_or_prefix, component_type)

Deletes a registered stack component.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or ID prefix of the component to delete.

required
component_type StackComponentType

The type of the component to delete.

required
Source code in src/zenml/client.py
2134
2135
2136
2137
2138
2139
2140
2141
2142
2143
2144
2145
2146
2147
2148
2149
2150
2151
2152
2153
2154
2155
2156
def delete_stack_component(
    self,
    name_id_or_prefix: Union[str, UUID],
    component_type: StackComponentType,
) -> None:
    """Deletes a registered stack component.

    Args:
        name_id_or_prefix: The name, ID or ID prefix of the component to delete.
        component_type: The type of the component to delete.
    """
    component = self.get_stack_component(
        name_id_or_prefix=name_id_or_prefix,
        component_type=component_type,
        allow_name_prefix_match=False,
    )

    self.zen_store.delete_stack_component(component_id=component.id)
    logger.info(
        "Deregistered stack component (type: %s) with name '%s'.",
        component.type,
        component.name,
    )
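
A minimal sketch, assuming an artifact store named `old_artifact_store` is registered and `StackComponentType` is importable from `zenml.enums`:

from zenml.client import Client
from zenml.enums import StackComponentType

client = Client()

client.delete_stack_component(
    "old_artifact_store", component_type=StackComponentType.ARTIFACT_STORE
)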

delete_tag(tag_name_or_id)

Deletes a tag.

Parameters:

Name Type Description Default
tag_name_or_id Union[str, UUID]

name or id of the tag to be deleted.

required
Source code in src/zenml/client.py
7711
7712
7713
7714
7715
7716
7717
7718
7719
7720
7721
7722
def delete_tag(
    self,
    tag_name_or_id: Union[str, UUID],
) -> None:
    """Deletes a tag.

    Args:
        tag_name_or_id: name or id of the tag to be deleted.
    """
    self.zen_store.delete_tag(
        tag_name_or_id=tag_name_or_id,
    )
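
A minimal sketch with a hypothetical tag name; a tag ID works as well:

from zenml.client import Client

client = Client()

client.delete_tag("deprecated")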

delete_trigger(name_id_or_prefix, project=None)

Deletes a trigger.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, id or prefix id of the trigger to deregister.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py
3330
3331
3332
3333
3334
3335
3336
3337
3338
3339
3340
3341
3342
3343
3344
3345
3346
3347
3348
3349
3350
@_fail_for_sql_zen_store
def delete_trigger(
    self,
    name_id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Deletes an trigger.

    Args:
        name_id_or_prefix: The name, id or prefix id of the trigger
            to deregister.
        project: The project name/ID to filter by.
    """
    trigger = self.get_trigger(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
        project=project,
    )

    self.zen_store.delete_trigger(trigger_id=trigger.id)
    logger.info("Deleted trigger with name '%s'.", trigger.name)

delete_trigger_execution(trigger_execution_id)

Delete a trigger execution.

Parameters:

Name Type Description Default
trigger_execution_id UUID

The ID of the trigger execution to delete.

required
Source code in src/zenml/client.py
7006
7007
7008
7009
7010
7011
7012
7013
7014
def delete_trigger_execution(self, trigger_execution_id: UUID) -> None:
    """Delete a trigger execution.

    Args:
        trigger_execution_id: The ID of the trigger execution to delete.
    """
    self.zen_store.delete_trigger_execution(
        trigger_execution_id=trigger_execution_id
    )

delete_user(name_id_or_prefix)

Delete a user.

Parameters:

Name Type Description Default
name_id_or_prefix str

The name or ID of the user to delete.

required
Source code in src/zenml/client.py
956
957
958
959
960
961
962
963
def delete_user(self, name_id_or_prefix: str) -> None:
    """Delete a user.

    Args:
        name_id_or_prefix: The name or ID of the user to delete.
    """
    user = self.get_user(name_id_or_prefix, allow_name_prefix_match=False)
    self.zen_store.delete_user(user_name_or_id=user.name)

detach_tag(tag_name_or_id, resources)

Detach a tag from resources.

Parameters:

Name Type Description Default
tag_name_or_id Union[str, UUID]

name or id of the tag to be detached.

required
resources List[TagResource]

the resources to detach the tag from.

required
Source code in src/zenml/client.py
7870
7871
7872
7873
7874
7875
7876
7877
7878
7879
7880
7881
7882
7883
7884
7885
7886
7887
7888
7889
7890
7891
7892
def detach_tag(
    self,
    tag_name_or_id: Union[str, UUID],
    resources: List[TagResource],
) -> None:
    """Detach a tag from resources.

    Args:
        tag_name_or_id: name or id of the tag to be detached.
        resources: the resources to detach the tag from.
    """
    tag_model = self.get_tag(tag_name_or_id)

    self.zen_store.batch_delete_tag_resource(
        tag_resources=[
            TagResourceRequest(
                tag_id=tag_model.id,
                resource_id=resource.id,
                resource_type=resource.type,
            )
            for resource in resources
        ]
    )

find_repository(path=None, enable_warnings=False) staticmethod

Search for a ZenML repository directory.

Parameters:

Name Type Description Default
path Optional[Path]

Optional path to look for the repository. If no path is given, this function tries to find the repository using the environment variable ZENML_REPOSITORY_PATH (if set) and recursively searching in the parent directories of the current working directory.

None
enable_warnings bool

If True, warnings are printed if the repository root cannot be found.

False

Returns:

Type Description
Optional[Path]

Absolute path to a ZenML repository directory or None if no repository directory was found.

Source code in src/zenml/client.py
551
552
553
554
555
556
557
558
559
560
561
562
563
564
565
566
567
568
569
570
571
572
573
574
575
576
577
578
579
580
581
582
583
584
585
586
587
588
589
590
591
592
593
594
595
596
597
598
599
600
601
602
603
604
605
606
607
608
609
610
611
612
613
614
615
616
617
618
619
620
621
622
623
624
@staticmethod
def find_repository(
    path: Optional[Path] = None, enable_warnings: bool = False
) -> Optional[Path]:
    """Search for a ZenML repository directory.

    Args:
        path: Optional path to look for the repository. If no path is
            given, this function tries to find the repository using the
            environment variable `ZENML_REPOSITORY_PATH` (if set) and
            recursively searching in the parent directories of the current
            working directory.
        enable_warnings: If `True`, warnings are printed if the repository
            root cannot be found.

    Returns:
        Absolute path to a ZenML repository directory or None if no
        repository directory was found.
    """
    if not path:
        # try to get path from the environment variable
        env_var_path = os.getenv(ENV_ZENML_REPOSITORY_PATH)
        if env_var_path:
            path = Path(env_var_path)

    if path:
        # explicit path via parameter or environment variable, don't search
        # parent directories
        search_parent_directories = False
        warning_message = (
            f"Unable to find ZenML repository at path '{path}'. Make sure "
            f"to create a ZenML repository by calling `zenml init` when "
            f"specifying an explicit repository path in code or via the "
            f"environment variable '{ENV_ZENML_REPOSITORY_PATH}'."
        )
    else:
        # try to find the repository in the parent directories of the
        # current working directory
        path = Path.cwd()
        search_parent_directories = True
        warning_message = (
            f"Unable to find ZenML repository in your current working "
            f"directory ({path}) or any parent directories. If you "
            f"want to use an existing repository which is in a different "
            f"location, set the environment variable "
            f"'{ENV_ZENML_REPOSITORY_PATH}'. If you want to create a new "
            f"repository, run `zenml init`."
        )

    def _find_repository_helper(path_: Path) -> Optional[Path]:
        """Recursively search parent directories for a ZenML repository.

        Args:
            path_: The path to search.

        Returns:
            Absolute path to a ZenML repository directory or None if no
            repository directory was found.
        """
        if Client.is_repository_directory(path_):
            return path_

        if not search_parent_directories or io_utils.is_root(str(path_)):
            return None

        return _find_repository_helper(path_.parent)

    repository_path = _find_repository_helper(path)

    if repository_path:
        return repository_path.resolve()
    if enable_warnings:
        logger.warning(warning_message)
    return None
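
A minimal sketch; the path below is a hypothetical project directory:

from pathlib import Path

from zenml.client import Client

repo_root = Client.find_repository(Path("/workspace/my_project"), enable_warnings=True)
if repo_root is None:
    print("No ZenML repository found.")
else:
    print(f"Repository root: {repo_root}")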

get_action(name_id_or_prefix, allow_name_prefix_match=True, project=None, hydrate=True)

Get an action by name, ID or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, ID or prefix of the action.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ActionResponse

The action.

Source code in src/zenml/client.py
2991
2992
2993
2994
2995
2996
2997
2998
2999
3000
3001
3002
3003
3004
3005
3006
3007
3008
3009
3010
3011
3012
3013
3014
3015
3016
3017
3018
@_fail_for_sql_zen_store
def get_action(
    self,
    name_id_or_prefix: Union[UUID, str],
    allow_name_prefix_match: bool = True,
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
) -> ActionResponse:
    """Get an action by name, ID or prefix.

    Args:
        name_id_or_prefix: The name, ID or prefix of the action.
        allow_name_prefix_match: If True, allow matching by name prefix.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The action.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_action,
        list_method=self.list_actions,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        project=project,
        hydrate=hydrate,
    )

get_api_key(service_account_name_id_or_prefix, name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Get an API key by name, id or prefix.

Parameters:

Name Type Description Default
service_account_name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the service account to get the API key for.

required
name_id_or_prefix Union[str, UUID]

The name, ID or ID prefix of the API key.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
APIKeyResponse

The API key.

Source code in src/zenml/client.py
7535
7536
7537
7538
7539
7540
7541
7542
7543
7544
7545
7546
7547
7548
7549
7550
7551
7552
7553
7554
7555
7556
7557
7558
7559
7560
7561
7562
7563
7564
7565
7566
7567
7568
7569
7570
7571
7572
7573
7574
7575
7576
7577
7578
7579
7580
7581
7582
7583
7584
7585
def get_api_key(
    self,
    service_account_name_id_or_prefix: Union[str, UUID],
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> APIKeyResponse:
    """Get an API key by name, id or prefix.

    Args:
        service_account_name_id_or_prefix: The name, ID or prefix of the
            service account to get the API key for.
        name_id_or_prefix: The name, ID or ID prefix of the API key.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The API key.
    """
    service_account = self.get_service_account(
        name_id_or_prefix=service_account_name_id_or_prefix,
        allow_name_prefix_match=False,
    )

    def get_api_key_method(
        api_key_name_or_id: str, hydrate: bool = True
    ) -> APIKeyResponse:
        return self.zen_store.get_api_key(
            service_account_id=service_account.id,
            api_key_name_or_id=api_key_name_or_id,
            hydrate=hydrate,
        )

    def list_api_keys_method(
        hydrate: bool = True,
        **filter_args: Any,
    ) -> Page[APIKeyResponse]:
        return self.list_api_keys(
            service_account_name_id_or_prefix=service_account.id,
            hydrate=hydrate,
            **filter_args,
        )

    return self._get_entity_by_id_or_name_or_prefix(
        get_method=get_api_key_method,
        list_method=list_api_keys_method,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

get_artifact(name_id_or_prefix, project=None, hydrate=False)

Get an artifact by name, id or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the artifact to get.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
ArtifactResponse

The artifact.

Source code in src/zenml/client.py
4116
4117
4118
4119
4120
4121
4122
4123
4124
4125
4126
4127
4128
4129
4130
4131
4132
4133
4134
4135
4136
4137
4138
4139
def get_artifact(
    self,
    name_id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = False,
) -> ArtifactResponse:
    """Get an artifact by name, id or prefix.

    Args:
        name_id_or_prefix: The name, ID or prefix of the artifact to get.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The artifact.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_artifact,
        list_method=self.list_artifacts,
        name_id_or_prefix=name_id_or_prefix,
        project=project,
        hydrate=hydrate,
    )

get_artifact_version(name_id_or_prefix, version=None, project=None, hydrate=True)

Get an artifact version by ID or artifact name.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

Either the ID of the artifact version or the name of the artifact.

required
version Optional[str]

The version of the artifact to get. Only used if name_id_or_prefix is the name of the artifact. If not specified, the latest version is returned.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ArtifactVersionResponse

The artifact version.

Source code in src/zenml/client.py
4287
4288
4289
4290
4291
4292
4293
4294
4295
4296
4297
4298
4299
4300
4301
4302
4303
4304
4305
4306
4307
4308
4309
4310
4311
4312
4313
4314
4315
4316
4317
4318
4319
4320
4321
4322
4323
4324
4325
4326
4327
4328
4329
4330
4331
4332
4333
4334
4335
4336
4337
4338
4339
def get_artifact_version(
    self,
    name_id_or_prefix: Union[str, UUID],
    version: Optional[str] = None,
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
) -> ArtifactVersionResponse:
    """Get an artifact version by ID or artifact name.

    Args:
        name_id_or_prefix: Either the ID of the artifact version or the
            name of the artifact.
        version: The version of the artifact to get. Only used if
            `name_id_or_prefix` is the name of the artifact. If not
            specified, the latest version is returned.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The artifact version.
    """
    from zenml import get_step_context

    if cll := client_lazy_loader(
        method_name="get_artifact_version",
        name_id_or_prefix=name_id_or_prefix,
        version=version,
        project=project,
        hydrate=hydrate,
    ):
        return cll  # type: ignore[return-value]

    artifact = self._get_entity_version_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_artifact_version,
        list_method=self.list_artifact_versions,
        name_id_or_prefix=name_id_or_prefix,
        version=version,
        project=project,
        hydrate=hydrate,
    )
    try:
        step_run = get_step_context().step_run
        client = Client()
        client.zen_store.update_run_step(
            step_run_id=step_run.id,
            step_run_update=StepRunUpdate(
                loaded_artifact_versions={artifact.name: artifact.id}
            ),
        )
    except RuntimeError:
        pass  # Cannot link to step run if called outside a step
    return artifact
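
A minimal sketch with a hypothetical artifact name, assuming the artifact version's `load()` helper materializes the stored object via its registered materializer:

from zenml.client import Client

client = Client()

# Latest version of the artifact, or pin a specific version explicitly.
artifact_version = client.get_artifact_version("my_dataset")
pinned = client.get_artifact_version("my_dataset", version="42")

data = artifact_version.load()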

get_authorized_device(id_or_prefix, allow_id_prefix_match=True, hydrate=True)

Get an authorized device by id or prefix.

Parameters:

Name Type Description Default
id_or_prefix Union[UUID, str]

The ID or ID prefix of the authorized device.

required
allow_id_prefix_match bool

If True, allow matching by ID prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
OAuthDeviceResponse

The requested authorized device.

Raises:

Type Description
KeyError

If no authorized device is found with the given ID or prefix.

Source code in src/zenml/client.py
6864
6865
6866
6867
6868
6869
6870
6871
6872
6873
6874
6875
6876
6877
6878
6879
6880
6881
6882
6883
6884
6885
6886
6887
6888
6889
6890
6891
6892
6893
6894
6895
6896
6897
6898
6899
6900
6901
6902
6903
6904
def get_authorized_device(
    self,
    id_or_prefix: Union[UUID, str],
    allow_id_prefix_match: bool = True,
    hydrate: bool = True,
) -> OAuthDeviceResponse:
    """Get an authorized device by id or prefix.

    Args:
        id_or_prefix: The ID or ID prefix of the authorized device.
        allow_id_prefix_match: If True, allow matching by ID prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The requested authorized device.

    Raises:
        KeyError: If no authorized device is found with the given ID or
            prefix.
    """
    if isinstance(id_or_prefix, str):
        try:
            id_or_prefix = UUID(id_or_prefix)
        except ValueError:
            if not allow_id_prefix_match:
                raise KeyError(
                    f"No authorized device found with id or prefix "
                    f"'{id_or_prefix}'."
                )
    if isinstance(id_or_prefix, UUID):
        return self.zen_store.get_authorized_device(
            id_or_prefix, hydrate=hydrate
        )
    return self._get_entity_by_prefix(
        get_method=self.zen_store.get_authorized_device,
        list_method=self.list_authorized_devices,
        partial_id_or_name=id_or_prefix,
        allow_name_prefix_match=False,
        hydrate=hydrate,
    )

get_build(id_or_prefix, project=None, hydrate=True)

Get a build by id or prefix.

Parameters:

Name Type Description Default
id_or_prefix Union[str, UUID]

The id or id prefix of the build.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
PipelineBuildResponse

The build.

Raises:

Type Description
KeyError

If no build was found for the given id or prefix.

ZenKeyError

If multiple builds were found that match the given id or prefix.

Source code in src/zenml/client.py
2593
2594
2595
2596
2597
2598
2599
2600
2601
2602
2603
2604
2605
2606
2607
2608
2609
2610
2611
2612
2613
2614
2615
2616
2617
2618
2619
2620
2621
2622
2623
2624
2625
2626
2627
2628
2629
2630
2631
2632
2633
2634
2635
2636
2637
2638
2639
2640
2641
2642
2643
2644
2645
2646
2647
2648
2649
2650
2651
2652
2653
2654
2655
2656
def get_build(
    self,
    id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
) -> PipelineBuildResponse:
    """Get a build by id or prefix.

    Args:
        id_or_prefix: The id or id prefix of the build.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The build.

    Raises:
        KeyError: If no build was found for the given id or prefix.
        ZenKeyError: If multiple builds were found that match the given
            id or prefix.
    """
    from zenml.utils.uuid_utils import is_valid_uuid

    # First interpret as full UUID
    if is_valid_uuid(id_or_prefix):
        if not isinstance(id_or_prefix, UUID):
            id_or_prefix = UUID(id_or_prefix, version=4)

        return self.zen_store.get_build(
            id_or_prefix,
            hydrate=hydrate,
        )

    list_kwargs: Dict[str, Any] = dict(
        id=f"startswith:{id_or_prefix}",
        hydrate=hydrate,
    )
    scope = ""
    if project:
        list_kwargs["project"] = project
        scope = f" in project {project}"

    entity = self.list_builds(**list_kwargs)

    # If only a single entity is found, return it.
    if entity.total == 1:
        return entity.items[0]

    # If no entity is found, raise an error.
    if entity.total == 0:
        raise KeyError(
            f"No builds have been found that have either an id or prefix "
            f"that matches the provided string '{id_or_prefix}'{scope}."
        )

    raise ZenKeyError(
        f"{entity.total} builds have been found{scope} that have "
        f"an ID that matches the provided "
        f"string '{id_or_prefix}':\n"
        f"{[entity.items]}.\n"
        f"Please use the id to uniquely identify "
        f"only one of the builds."
    )

get_code_repository(name_id_or_prefix, allow_name_prefix_match=True, project=None, hydrate=True)

Get a code repository by name, id or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or ID prefix of the code repository.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
CodeRepositoryResponse

The code repository.

Source code in src/zenml/client.py
5132
5133
5134
5135
5136
5137
5138
5139
5140
5141
5142
5143
5144
5145
5146
5147
5148
5149
5150
5151
5152
5153
5154
5155
5156
5157
5158
def get_code_repository(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
) -> CodeRepositoryResponse:
    """Get a code repository by name, id or prefix.

    Args:
        name_id_or_prefix: The name, ID or ID prefix of the code repository.
        allow_name_prefix_match: If True, allow matching by name prefix.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The code repository.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_code_repository,
        list_method=self.list_code_repositories,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
        project=project,
    )

get_deployment(id_or_prefix, project=None, hydrate=True)

Get a deployment by id or prefix.

Parameters:

Name Type Description Default
id_or_prefix Union[str, UUID]

The id or id prefix of the deployment.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
PipelineDeploymentResponse

The deployment.

Raises:

Type Description
KeyError

If no deployment was found for the given id or prefix.

ZenKeyError

If multiple deployments were found that match the given id or prefix.

Source code in src/zenml/client.py
3354
3355
3356
3357
3358
3359
3360
3361
3362
3363
3364
3365
3366
3367
3368
3369
3370
3371
3372
3373
3374
3375
3376
3377
3378
3379
3380
3381
3382
3383
3384
3385
3386
3387
3388
3389
3390
3391
3392
3393
3394
3395
3396
3397
3398
3399
3400
3401
3402
3403
3404
3405
3406
3407
3408
3409
3410
3411
3412
3413
3414
3415
3416
def get_deployment(
    self,
    id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
) -> PipelineDeploymentResponse:
    """Get a deployment by id or prefix.

    Args:
        id_or_prefix: The id or id prefix of the deployment.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The deployment.

    Raises:
        KeyError: If no deployment was found for the given id or prefix.
        ZenKeyError: If multiple deployments were found that match the given
            id or prefix.
    """
    from zenml.utils.uuid_utils import is_valid_uuid

    # First interpret as full UUID
    if is_valid_uuid(id_or_prefix):
        id_ = (
            UUID(id_or_prefix)
            if isinstance(id_or_prefix, str)
            else id_or_prefix
        )
        return self.zen_store.get_deployment(id_, hydrate=hydrate)

    list_kwargs: Dict[str, Any] = dict(
        id=f"startswith:{id_or_prefix}",
        hydrate=hydrate,
    )
    scope = ""
    if project:
        list_kwargs["project"] = project
        scope = f" in project {project}"

    entity = self.list_deployments(**list_kwargs)

    # If only a single entity is found, return it.
    if entity.total == 1:
        return entity.items[0]

    # If no entity is found, raise an error.
    if entity.total == 0:
        raise KeyError(
            f"No deployment have been found that have either an id or "
            f"prefix that matches the provided string '{id_or_prefix}'{scope}."
        )

    raise ZenKeyError(
        f"{entity.total} deployments have been found{scope} that have "
        f"an ID that matches the provided "
        f"string '{id_or_prefix}':\n"
        f"{[entity.items]}.\n"
        f"Please use the id to uniquely identify "
        f"only one of the deployments."
    )

get_event_source(name_id_or_prefix, allow_name_prefix_match=True, project=None, hydrate=True)

Get an event source by name, ID or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, ID or prefix of the event source.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
EventSourceResponse

The event_source.

Source code in src/zenml/client.py
2783
2784
2785
2786
2787
2788
2789
2790
2791
2792
2793
2794
2795
2796
2797
2798
2799
2800
2801
2802
2803
2804
2805
2806
2807
2808
2809
2810
@_fail_for_sql_zen_store
def get_event_source(
    self,
    name_id_or_prefix: Union[UUID, str],
    allow_name_prefix_match: bool = True,
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
) -> EventSourceResponse:
    """Get an event source by name, ID or prefix.

    Args:
        name_id_or_prefix: The name, ID or prefix of the event source.
        allow_name_prefix_match: If True, allow matching by name prefix.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The event_source.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_event_source,
        list_method=self.list_event_sources,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        project=project,
        hydrate=hydrate,
    )

get_flavor(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Get a stack component flavor.

Parameters:

Name Type Description Default
name_id_or_prefix str

The name, ID or ID prefix of the flavor to get.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
FlavorResponse

The stack component flavor.

Source code in src/zenml/client.py
2194
2195
2196
2197
2198
2199
2200
2201
2202
2203
2204
2205
2206
2207
2208
2209
2210
2211
2212
2213
2214
2215
2216
2217
2218
def get_flavor(
    self,
    name_id_or_prefix: str,
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> FlavorResponse:
    """Get a stack component flavor.

    Args:
        name_id_or_prefix: The name, ID or ID prefix of the flavor
            to get.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The stack component flavor.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_flavor,
        list_method=self.list_flavors,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

get_flavor_by_name_and_type(name, component_type)

Fetches a registered flavor.

Parameters:

Name Type Description Default
component_type StackComponentType

The type of the component to fetch.

required
name str

The name of the flavor to fetch.

required

Returns:

Type Description
FlavorResponse

The registered flavor.

Raises:

Type Description
KeyError

If no flavor exists for the given type and name.

Source code in src/zenml/client.py
2303
2304
2305
2306
2307
2308
2309
2310
2311
2312
2313
2314
2315
2316
2317
2318
2319
2320
2321
2322
2323
2324
2325
2326
2327
2328
2329
2330
2331
2332
2333
2334
2335
2336
2337
def get_flavor_by_name_and_type(
    self, name: str, component_type: "StackComponentType"
) -> FlavorResponse:
    """Fetches a registered flavor.

    Args:
        component_type: The type of the component to fetch.
        name: The name of the flavor to fetch.

    Returns:
        The registered flavor.

    Raises:
        KeyError: If no flavor exists for the given type and name.
    """
    logger.debug(
        f"Fetching the flavor of type {component_type} with name {name}."
    )

    if not (
        flavors := self.list_flavors(
            type=component_type, name=name, hydrate=True
        ).items
    ):
        raise KeyError(
            f"No flavor with name '{name}' and type '{component_type}' "
            "exists."
        )
    if len(flavors) > 1:
        raise KeyError(
            f"More than one flavor with name {name} and type "
            f"{component_type} exists."
        )

    return flavors[0]

get_flavors_by_type(component_type)

Fetches the list of flavors for a stack component type.

Parameters:

Name Type Description Default
component_type StackComponentType

The type of the component to fetch.

required

Returns:

Type Description
Page[FlavorResponse]

The list of flavors.

Source code in src/zenml/client.py
2286
2287
2288
2289
2290
2291
2292
2293
2294
2295
2296
2297
2298
2299
2300
2301
def get_flavors_by_type(
    self, component_type: "StackComponentType"
) -> Page[FlavorResponse]:
    """Fetches the list of flavor for a stack component type.

    Args:
        component_type: The type of the component to fetch.

    Returns:
        The list of flavors.
    """
    logger.debug(f"Fetching the flavors of type {component_type}.")

    return self.list_flavors(
        type=component_type,
    )

get_instance() classmethod

Return the Client singleton instance.

Returns:

Type Description
Optional[Client]

The Client singleton instance or None, if the Client hasn't been initialized yet.

Source code in src/zenml/client.py
387
388
389
390
391
392
393
394
395
@classmethod
def get_instance(cls) -> Optional["Client"]:
    """Return the Client singleton instance.

    Returns:
        The Client singleton instance or None, if the Client hasn't
        been initialized yet.
    """
    return cls._global_client

get_model(model_name_or_id, project=None, hydrate=True, bypass_lazy_loader=False)

Get an existing model from Model Control Plane.

Parameters:

Name Type Description Default
model_name_or_id Union[str, UUID]

name or id of the model to be retrieved.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True
bypass_lazy_loader bool

Whether to bypass the lazy loader.

False

Returns:

Type Description
ModelResponse

The model of interest.

Source code in src/zenml/client.py
6238
6239
6240
6241
6242
6243
6244
6245
6246
6247
6248
6249
6250
6251
6252
6253
6254
6255
6256
6257
6258
6259
6260
6261
6262
6263
6264
6265
6266
6267
6268
6269
6270
6271
6272
def get_model(
    self,
    model_name_or_id: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
    bypass_lazy_loader: bool = False,
) -> ModelResponse:
    """Get an existing model from Model Control Plane.

    Args:
        model_name_or_id: name or id of the model to be retrieved.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        bypass_lazy_loader: Whether to bypass the lazy loader.

    Returns:
        The model of interest.
    """
    if not bypass_lazy_loader:
        if cll := client_lazy_loader(
            "get_model",
            model_name_or_id=model_name_or_id,
            hydrate=hydrate,
            project=project,
        ):
            return cll  # type: ignore[return-value]

    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_model,
        list_method=self.list_models,
        name_id_or_prefix=model_name_or_id,
        project=project,
        hydrate=hydrate,
    )

get_model_version(model_name_or_id=None, model_version_name_or_number_or_id=None, project=None, hydrate=True)

Get an existing model version from Model Control Plane.

Parameters:

Name Type Description Default
model_name_or_id Optional[Union[str, UUID]]

name or id of the model containing the model version.

None
model_version_name_or_number_or_id Optional[Union[str, int, ModelStages, UUID]]

name, id, stage or number of the model version to be retrieved. If skipped - latest version is retrieved.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ModelVersionResponse

The model version of interest.

Raises:

Type Description
RuntimeError

In case method inputs don't adhere to restrictions.

KeyError

In case no model version with the identifiers exists.

ValueError

In case retrieval is attempted using non UUID model version identifier and no model identifier provided.

Source code in src/zenml/client.py
6381
6382
6383
6384
6385
6386
6387
6388
6389
6390
6391
6392
6393
6394
6395
6396
6397
6398
6399
6400
6401
6402
6403
6404
6405
6406
6407
6408
6409
6410
6411
6412
6413
6414
6415
6416
6417
6418
6419
6420
6421
6422
6423
6424
6425
6426
6427
6428
6429
6430
6431
6432
6433
6434
6435
6436
6437
6438
6439
6440
6441
6442
6443
6444
6445
6446
6447
6448
6449
6450
6451
6452
6453
6454
6455
6456
6457
6458
6459
6460
6461
6462
6463
6464
6465
6466
6467
6468
6469
6470
6471
6472
6473
6474
6475
6476
6477
6478
6479
6480
6481
6482
6483
6484
6485
6486
6487
6488
6489
6490
6491
6492
6493
6494
6495
6496
6497
6498
def get_model_version(
    self,
    model_name_or_id: Optional[Union[str, UUID]] = None,
    model_version_name_or_number_or_id: Optional[
        Union[str, int, ModelStages, UUID]
    ] = None,
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
) -> ModelVersionResponse:
    """Get an existing model version from Model Control Plane.

    Args:
        model_name_or_id: name or id of the model containing the model
            version.
        model_version_name_or_number_or_id: name, id, stage or number of
            the model version to be retrieved. If skipped - latest version
            is retrieved.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The model version of interest.

    Raises:
        RuntimeError: In case method inputs don't adhere to restrictions.
        KeyError: In case no model version with the identifiers exists.
        ValueError: In case retrieval is attempted using non UUID model version
            identifier and no model identifier provided.
    """
    if (
        not is_valid_uuid(model_version_name_or_number_or_id)
        and model_name_or_id is None
    ):
        raise ValueError(
            "No model identifier provided and model version identifier "
            f"`{model_version_name_or_number_or_id}` is not a valid UUID."
        )
    if cll := client_lazy_loader(
        "get_model_version",
        model_name_or_id=model_name_or_id,
        model_version_name_or_number_or_id=model_version_name_or_number_or_id,
        project=project,
        hydrate=hydrate,
    ):
        return cll  # type: ignore[return-value]

    if model_version_name_or_number_or_id is None:
        model_version_name_or_number_or_id = ModelStages.LATEST

    if isinstance(model_version_name_or_number_or_id, UUID):
        return self.zen_store.get_model_version(
            model_version_id=model_version_name_or_number_or_id,
            hydrate=hydrate,
        )
    elif isinstance(model_version_name_or_number_or_id, int):
        model_versions = self.zen_store.list_model_versions(
            model_version_filter_model=ModelVersionFilter(
                model=model_name_or_id,
                number=model_version_name_or_number_or_id,
                project=project or self.active_project.id,
            ),
            hydrate=hydrate,
        ).items
    elif isinstance(model_version_name_or_number_or_id, str):
        if model_version_name_or_number_or_id == ModelStages.LATEST:
            model_versions = self.zen_store.list_model_versions(
                model_version_filter_model=ModelVersionFilter(
                    model=model_name_or_id,
                    sort_by=f"{SorterOps.DESCENDING}:number",
                    project=project or self.active_project.id,
                ),
                hydrate=hydrate,
            ).items

            if len(model_versions) > 0:
                model_versions = [model_versions[0]]
            else:
                model_versions = []
        elif model_version_name_or_number_or_id in ModelStages.values():
            model_versions = self.zen_store.list_model_versions(
                model_version_filter_model=ModelVersionFilter(
                    model=model_name_or_id,
                    stage=model_version_name_or_number_or_id,
                    project=project or self.active_project.id,
                ),
                hydrate=hydrate,
            ).items
        else:
            model_versions = self.zen_store.list_model_versions(
                model_version_filter_model=ModelVersionFilter(
                    model=model_name_or_id,
                    name=model_version_name_or_number_or_id,
                    project=project or self.active_project.id,
                ),
                hydrate=hydrate,
            ).items
    else:
        raise RuntimeError(
            f"The model version identifier "
            f"`{model_version_name_or_number_or_id}` is not"
            f"of the correct type."
        )

    if len(model_versions) == 1:
        return model_versions[0]
    elif len(model_versions) == 0:
        raise KeyError(
            f"No model version found for model "
            f"`{model_name_or_id}` with version identifier "
            f"`{model_version_name_or_number_or_id}`."
        )
    else:
        raise RuntimeError(
            f"The model version identifier "
            f"`{model_version_name_or_number_or_id}` is not"
            f"unique for model `{model_name_or_id}`."
        )
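
A minimal sketch with a hypothetical model name, showing the different identifier types the method accepts:

from zenml.client import Client

client = Client()

# Latest version (the default when no version identifier is given).
latest = client.get_model_version("churn_classifier")

# A specific version by number, and a version by stage.
v3 = client.get_model_version("churn_classifier", 3)
prod = client.get_model_version("churn_classifier", "production")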

get_pipeline(name_id_or_prefix, project=None, hydrate=True)

Get a pipeline by name, id or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or ID prefix of the pipeline.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
PipelineResponse

The pipeline.

Source code in src/zenml/client.py
2401
2402
2403
2404
2405
2406
2407
2408
2409
2410
2411
2412
2413
2414
2415
2416
2417
2418
2419
2420
2421
2422
2423
2424
def get_pipeline(
    self,
    name_id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
) -> PipelineResponse:
    """Get a pipeline by name, id or prefix.

    Args:
        name_id_or_prefix: The name, ID or ID prefix of the pipeline.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The pipeline.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_pipeline,
        list_method=self.list_pipelines,
        name_id_or_prefix=name_id_or_prefix,
        project=project,
        hydrate=hydrate,
    )

get_pipeline_run(name_id_or_prefix, allow_name_prefix_match=True, project=None, hydrate=True, include_full_metadata=False)

Gets a pipeline run by name, ID, or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

Name, ID, or prefix of the pipeline run.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True
include_full_metadata bool

If True, include metadata of all steps in the response.

False

Returns:

Type Description
PipelineRunResponse

The pipeline run.

Source code in src/zenml/client.py
3835
3836
3837
3838
3839
3840
3841
3842
3843
3844
3845
3846
3847
3848
3849
3850
3851
3852
3853
3854
3855
3856
3857
3858
3859
3860
3861
3862
3863
3864
3865
def get_pipeline_run(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
    include_full_metadata: bool = False,
) -> PipelineRunResponse:
    """Gets a pipeline run by name, ID, or prefix.

    Args:
        name_id_or_prefix: Name, ID, or prefix of the pipeline run.
        allow_name_prefix_match: If True, allow matching by name prefix.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        include_full_metadata: If True, include metadata of all steps in
            the response.

    Returns:
        The pipeline run.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_run,
        list_method=self.list_pipeline_runs,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        project=project,
        hydrate=hydrate,
        include_full_metadata=include_full_metadata,
    )
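
A minimal sketch; the run name below is a hypothetical example, and an ID or unique ID prefix works as well:

from zenml.client import Client

client = Client()

run = client.get_pipeline_run("training_pipeline-2024_01_01-12_00_00_000000")
print(run.status)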

get_project(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Gets a project.

Parameters:

Name Type Description Default
name_id_or_prefix Optional[Union[UUID, str]]

The name or ID of the project.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ProjectResponse

The project

Source code in src/zenml/client.py (lines 1002-1027)
def get_project(
    self,
    name_id_or_prefix: Optional[Union[UUID, str]],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> ProjectResponse:
    """Gets a project.

    Args:
        name_id_or_prefix: The name or ID of the project.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The project
    """
    if not name_id_or_prefix:
        return self.active_project
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_project,
        list_method=self.list_projects,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )
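
A minimal sketch: passing None falls back to the active project, while "default" is a placeholder project name.

from zenml.client import Client

client = Client()
active = client.get_project(None)        # falls back to the active project
by_name = client.get_project("default")  # placeholder project name
print(active.name, by_name.id)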

get_run_step(step_run_id, hydrate=True)

Get a step run by ID.

Parameters:

Name Type Description Default
step_run_id UUID

The ID of the step run to get.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
StepRunResponse

The step run.

Source code in src/zenml/client.py (lines 4011-4029)
def get_run_step(
    self,
    step_run_id: UUID,
    hydrate: bool = True,
) -> StepRunResponse:
    """Get a step run by ID.

    Args:
        step_run_id: The ID of the step run to get.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The step run.
    """
    return self.zen_store.get_run_step(
        step_run_id,
        hydrate=hydrate,
    )

get_run_template(name_id_or_prefix, project=None, hydrate=True)

Get a run template.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

Name/ID/ID prefix of the template to get.

required
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
RunTemplateResponse

The run template.

Source code in src/zenml/client.py (lines 3526-3552)
def get_run_template(
    self,
    name_id_or_prefix: Union[str, UUID],
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
) -> RunTemplateResponse:
    """Get a run template.

    Args:
        name_id_or_prefix: Name/ID/ID prefix of the template to get.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The run template.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_run_template,
        list_method=functools.partial(
            self.list_run_templates, hidden=None
        ),
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
        project=project,
        hydrate=hydrate,
    )
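
A small sketch; "nightly-training" is a placeholder template name. Note that name prefix matching is disabled for templates, so only an exact name, an ID or an ID prefix resolves.

from zenml.client import Client

client = Client()
# Exact template name, ID or ID prefix; name prefixes are not matched.
template = client.get_run_template("nightly-training")
print(template.id)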

get_schedule(name_id_or_prefix, allow_name_prefix_match=True, project=None, hydrate=True)

Get a schedule by name, id or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, id or prefix of the schedule.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ScheduleResponse

The schedule.

Source code in src/zenml/client.py (lines 3704-3730)
def get_schedule(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
) -> ScheduleResponse:
    """Get a schedule by name, id or prefix.

    Args:
        name_id_or_prefix: The name, id or prefix of the schedule.
        allow_name_prefix_match: If True, allow matching by name prefix.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The schedule.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_schedule,
        list_method=self.list_schedules,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        project=project,
        hydrate=hydrate,
    )

get_secret(name_id_or_prefix, private=None, allow_partial_name_match=True, allow_partial_id_match=True, hydrate=True)

Get a secret.

Get a secret identified by a name, ID or prefix of the name or ID and optionally a scope.

If a private status is not provided, privately scoped secrets will be searched for first, followed by publicly scoped secrets. When a name or prefix is used instead of a UUID value, each scope is first searched for an exact match, then for an ID prefix or name substring match before moving on to the next scope.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the name or ID of the secret to get.

required
private Optional[bool]

Whether the secret is private. If not set, all secrets will be searched for, prioritizing privately scoped secrets.

None
allow_partial_name_match bool

If True, allow partial name matches.

True
allow_partial_id_match bool

If True, allow partial ID matches.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
SecretResponse

The secret.

Raises:

Type Description
KeyError

If no secret is found.

ZenKeyError

If multiple secrets are found.

NotImplementedError

If centralized secrets management is not enabled.

Source code in src/zenml/client.py (lines 4671-4798)
def get_secret(
    self,
    name_id_or_prefix: Union[str, UUID],
    private: Optional[bool] = None,
    allow_partial_name_match: bool = True,
    allow_partial_id_match: bool = True,
    hydrate: bool = True,
) -> SecretResponse:
    """Get a secret.

    Get a secret identified by a name, ID or prefix of the name or ID and
    optionally a scope.

    If a private status is not provided, privately scoped secrets will be
    searched for first, followed by publicly scoped secrets. When a name or
    prefix is used instead of a UUID value, each scope is first searched for
    an exact match, then for an ID prefix or name substring match before
    moving on to the next scope.

    Args:
        name_id_or_prefix: The name, ID or prefix to the id of the secret
            to get.
        private: Whether the secret is private. If not set, all secrets will
            be searched for, prioritizing privately scoped secrets.
        allow_partial_name_match: If True, allow partial name matches.
        allow_partial_id_match: If True, allow partial ID matches.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The secret.

    Raises:
        KeyError: If no secret is found.
        ZenKeyError: If multiple secrets are found.
        NotImplementedError: If centralized secrets management is not
            enabled.
    """
    from zenml.utils.uuid_utils import is_valid_uuid

    try:
        # First interpret as full UUID
        if is_valid_uuid(name_id_or_prefix):
            # Fetch by ID; filter by scope if provided
            secret = self.zen_store.get_secret(
                secret_id=UUID(name_id_or_prefix)
                if isinstance(name_id_or_prefix, str)
                else name_id_or_prefix,
                hydrate=hydrate,
            )
            if private is not None and secret.private != private:
                raise KeyError(
                    f"No secret found with ID {str(name_id_or_prefix)}"
                )

            return secret
    except NotImplementedError:
        raise NotImplementedError(
            "centralized secrets management is not supported or explicitly "
            "disabled in the target ZenML deployment."
        )

    # If not a UUID, try to find by name and then by prefix
    assert not isinstance(name_id_or_prefix, UUID)

    # Private statuses to search in order of priority
    search_private_statuses = (
        [False, True] if private is None else [private]
    )

    secrets = self.list_secrets(
        logical_operator=LogicalOperators.OR,
        name=f"contains:{name_id_or_prefix}"
        if allow_partial_name_match
        else f"equals:{name_id_or_prefix}",
        id=f"startswith:{name_id_or_prefix}"
        if allow_partial_id_match
        else None,
        hydrate=hydrate,
    )

    for search_private_status in search_private_statuses:
        partial_matches: List[SecretResponse] = []
        for secret in secrets.items:
            if secret.private != search_private_status:
                continue
            # Exact match
            if secret.name == name_id_or_prefix:
                # Need to fetch the secret again to get the secret values
                return self.zen_store.get_secret(
                    secret_id=secret.id,
                    hydrate=hydrate,
                )
            # Partial match
            partial_matches.append(secret)

        if len(partial_matches) > 1:
            match_summary = "\n".join(
                [
                    f"[{secret.id}]: name = {secret.name}"
                    for secret in partial_matches
                ]
            )
            raise ZenKeyError(
                f"{len(partial_matches)} secrets have been found that have "
                f"a name or ID that matches the provided "
                f"string '{name_id_or_prefix}':\n"
                f"{match_summary}.\n"
                f"Please use the id to uniquely identify "
                f"only one of the secrets."
            )

        # If only a single secret is found, return it
        if len(partial_matches) == 1:
            # Need to fetch the secret again to get the secret values
            return self.zen_store.get_secret(
                secret_id=partial_matches[0].id,
                hydrate=hydrate,
            )
    private_status = ""
    if private is not None:
        private_status = "private " if private else "public "
    msg = (
        f"No {private_status}secret found with name, ID or prefix "
        f"'{name_id_or_prefix}'"
    )

    raise KeyError(msg)
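
A usage sketch that follows the lookup rules above; the secret name "db_credentials" and the "password" key are placeholders.

from zenml.client import Client

client = Client()
try:
    # Both private and public secrets are searched when `private` is not set.
    secret = client.get_secret("db_credentials")
    print(secret.values.get("password"))  # "password" is a hypothetical key
except KeyError:
    print("No matching secret found")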

get_secret_by_name_and_private_status(name, private=None, hydrate=True)

Fetches a registered secret with a given name and optional private status.

This is a version of get_secret that restricts the search to a given name and an optional private status, without doing any prefix or UUID matching.

If no private status is provided, the search will be done first for private secrets, then for public secrets.

Parameters:

Name Type Description Default
name str

The name of the secret to get.

required
private Optional[bool]

The private status of the secret to get.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
SecretResponse

The registered secret.

Raises:

Type Description
KeyError

If no secret exists for the given name in the given scope.

Source code in src/zenml/client.py (lines 4952-5008)
def get_secret_by_name_and_private_status(
    self,
    name: str,
    private: Optional[bool] = None,
    hydrate: bool = True,
) -> SecretResponse:
    """Fetches a registered secret with a given name and optional private status.

    This is a version of get_secret that restricts the search to a given
    name and an optional private status, without doing any prefix or UUID
    matching.

    If no private status is provided, the search will be done first for
    private secrets, then for public secrets.

    Args:
        name: The name of the secret to get.
        private: The private status of the secret to get.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The registered secret.

    Raises:
        KeyError: If no secret exists for the given name in the given scope.
    """
    logger.debug(
        f"Fetching the secret with name '{name}' and private status "
        f"'{private}'."
    )

    # Private statuses to search in order of priority
    search_private_statuses = (
        [False, True] if private is None else [private]
    )

    for search_private_status in search_private_statuses:
        secrets = self.list_secrets(
            logical_operator=LogicalOperators.AND,
            name=f"equals:{name}",
            private=search_private_status,
            hydrate=hydrate,
        )

        if len(secrets.items) >= 1:
            # Need to fetch the secret again to get the secret values
            return self.zen_store.get_secret(
                secret_id=secrets.items[0].id, hydrate=hydrate
            )

    private_status = ""
    if private is not None:
        private_status = "private " if private else "public "
    msg = f"No {private_status}secret with name '{name}' was found"

    raise KeyError(msg)
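
A sketch of the exact-name variant; "db_credentials" remains a placeholder secret name.

from zenml.client import Client

client = Client()
# Exact name match only; no prefix or UUID resolution is attempted.
secret = client.get_secret_by_name_and_private_status(
    name="db_credentials",
    private=True,
)
print(sorted(secret.values))  # print the secret keys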

get_service(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True, type=None, project=None)

Gets a service.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name or ID of the service.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True
type Optional[str]

The type of the service.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Returns:

Type Description
ServiceResponse

The Service

Source code in src/zenml/client.py (lines 1637-1688)
def get_service(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
    type: Optional[str] = None,
    project: Optional[Union[str, UUID]] = None,
) -> ServiceResponse:
    """Gets a service.

    Args:
        name_id_or_prefix: The name or ID of the service.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        type: The type of the service.
        project: The project name/ID to filter by.

    Returns:
        The Service
    """

    def type_scoped_list_method(
        hydrate: bool = True,
        **kwargs: Any,
    ) -> Page[ServiceResponse]:
        """Call `zen_store.list_services` with type scoping.

        Args:
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            **kwargs: Keyword arguments to pass to `ServiceFilterModel`.

        Returns:
            The type-scoped list of services.
        """
        service_filter_model = ServiceFilter(**kwargs)
        if type:
            service_filter_model.set_type(type=type)
        return self.zen_store.list_services(
            filter_model=service_filter_model,
            hydrate=hydrate,
        )

    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_service,
        list_method=type_scoped_list_method,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        project=project,
        hydrate=hydrate,
    )
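
A minimal sketch; the service name "sklearn-deployment" and the type string "model-serving" are placeholders for whatever services exist in your deployment.

from zenml.client import Client

client = Client()
# Optionally narrow the lookup to a service type before matching the name.
service = client.get_service(
    "sklearn-deployment",
    type="model-serving",  # hypothetical service type string
)
print(service.id)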

get_service_account(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Gets a service account.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name or ID of the service account.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ServiceAccountResponse

The ServiceAccount

Source code in src/zenml/client.py (lines 7281-7304)
def get_service_account(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> ServiceAccountResponse:
    """Gets a service account.

    Args:
        name_id_or_prefix: The name or ID of the service account.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The ServiceAccount
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_service_account,
        list_method=self.list_service_accounts,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

get_service_connector(name_id_or_prefix, allow_name_prefix_match=True, load_secrets=False, hydrate=True)

Fetches a registered service connector.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the service connector to fetch.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
load_secrets bool

If True, load the secrets for the service connector.

False
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ServiceConnectorResponse

The registered service connector.

Source code in src/zenml/client.py (lines 5499-5543)
def get_service_connector(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    load_secrets: bool = False,
    hydrate: bool = True,
) -> ServiceConnectorResponse:
    """Fetches a registered service connector.

    Args:
        name_id_or_prefix: The id of the service connector to fetch.
        allow_name_prefix_match: If True, allow matching by name prefix.
        load_secrets: If True, load the secrets for the service connector.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The registered service connector.
    """
    connector = self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_service_connector,
        list_method=self.list_service_connectors,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

    if load_secrets and connector.secret_id:
        client = Client()
        try:
            secret = client.get_secret(
                name_id_or_prefix=connector.secret_id,
                allow_partial_id_match=False,
                allow_partial_name_match=False,
            )
        except KeyError as err:
            logger.error(
                "Unable to retrieve secret values associated with "
                f"service connector '{connector.name}': {err}"
            )
        else:
            # Add secret values to connector configuration
            connector.secrets.update(secret.values)

    return connector
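
A short sketch; "aws-connector" is a placeholder connector name, and load_secrets=True pulls the associated secret values into the returned configuration as described above.

from zenml.client import Client

client = Client()
connector = client.get_service_connector(
    "aws-connector",
    load_secrets=True,
)
print(connector.name, list(connector.secrets))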

get_service_connector_client(name_id_or_prefix, resource_type=None, resource_id=None, verify=False)

Get the client side of a service connector instance to use with a local client.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, id or prefix of the service connector to use.

required
resource_type Optional[str]

The type of the resource to connect to. If not provided, the resource type from the service connector configuration will be used.

None
resource_id Optional[str]

The ID of a particular resource instance to configure the local client to connect to. If the connector instance is already configured with a resource ID that is not the same or equivalent to the one requested, a ValueError exception is raised. May be omitted for connectors and resource types that do not support multiple resource instances.

None
verify bool

Whether to verify that the service connector configuration and credentials can be used to gain access to the resource.

False

Returns:

Type Description
ServiceConnector

The client side of the indicated service connector instance that can be used to connect to the resource locally.

Source code in src/zenml/client.py (lines 5972-6049)
def get_service_connector_client(
    self,
    name_id_or_prefix: Union[UUID, str],
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
    verify: bool = False,
) -> "ServiceConnector":
    """Get the client side of a service connector instance to use with a local client.

    Args:
        name_id_or_prefix: The name, id or prefix of the service connector
            to use.
        resource_type: The type of the resource to connect to. If not
            provided, the resource type from the service connector
            configuration will be used.
        resource_id: The ID of a particular resource instance to configure
            the local client to connect to. If the connector instance is
            already configured with a resource ID that is not the same or
            equivalent to the one requested, a `ValueError` exception is
            raised. May be omitted for connectors and resource types that do
            not support multiple resource instances.
        verify: Whether to verify that the service connector configuration
            and credentials can be used to gain access to the resource.

    Returns:
        The client side of the indicated service connector instance that can
        be used to connect to the resource locally.
    """
    from zenml.service_connectors.service_connector_registry import (
        service_connector_registry,
    )

    # Get the service connector model
    service_connector = self.get_service_connector(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
    )

    connector_type = self.get_service_connector_type(
        service_connector.type
    )

    # Prefer to fetch the connector client from the server if the
    # implementation is available there, because some auth methods rely on
    # the server-side authentication environment
    if connector_type.remote:
        connector_client_model = (
            self.zen_store.get_service_connector_client(
                service_connector_id=service_connector.id,
                resource_type=resource_type,
                resource_id=resource_id,
            )
        )

        connector_client = (
            service_connector_registry.instantiate_connector(
                model=connector_client_model
            )
        )

        if verify:
            # Verify the connector client on the local machine, because the
            # server-side implementation may not be able to do so
            connector_client.verify()
    else:
        connector_instance = (
            service_connector_registry.instantiate_connector(
                model=service_connector
            )
        )

        # Fetch the connector client
        connector_client = connector_instance.get_connector_client(
            resource_type=resource_type,
            resource_id=resource_id,
        )

    return connector_client
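
A sketch of fetching a locally usable connector client; the connector name and the "s3-bucket" resource type are placeholders.

from zenml.client import Client

client = Client()
connector_client = client.get_service_connector_client(
    name_id_or_prefix="aws-connector",
    resource_type="s3-bucket",   # hypothetical resource type identifier
    verify=True,                 # also check that credentials grant access
)
# The returned ServiceConnector can now be used to access the resource
# from the local environment.
print(type(connector_client).__name__)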

get_service_connector_type(connector_type)

Returns the requested service connector type.

Parameters:

Name Type Description Default
connector_type str

the service connector type identifier.

required

Returns:

Type Description
ServiceConnectorTypeModel

The requested service connector type.

Source code in src/zenml/client.py (lines 6098-6112)
def get_service_connector_type(
    self,
    connector_type: str,
) -> ServiceConnectorTypeModel:
    """Returns the requested service connector type.

    Args:
        connector_type: the service connector type identifier.

    Returns:
        The requested service connector type.
    """
    return self.zen_store.get_service_connector_type(
        connector_type=connector_type,
    )

get_settings(hydrate=True)

Get the server settings.

Parameters:

Name Type Description Default
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ServerSettingsResponse

The server settings.

Source code in src/zenml/client.py (lines 709-719)
def get_settings(self, hydrate: bool = True) -> ServerSettingsResponse:
    """Get the server settings.

    Args:
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The server settings.
    """
    return self.zen_store.get_server_settings(hydrate=hydrate)

get_stack(name_id_or_prefix=None, allow_name_prefix_match=True, hydrate=True)

Get a stack by name, ID or prefix.

If no name, ID or prefix is provided, the active stack is returned.

Parameters:

Name Type Description Default
name_id_or_prefix Optional[Union[UUID, str]]

The name, ID or prefix of the stack.

None
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
StackResponse

The stack.

Source code in src/zenml/client.py (lines 1223-1251)
def get_stack(
    self,
    name_id_or_prefix: Optional[Union[UUID, str]] = None,
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> StackResponse:
    """Get a stack by name, ID or prefix.

    If no name, ID or prefix is provided, the active stack is returned.

    Args:
        name_id_or_prefix: The name, ID or prefix of the stack.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The stack.
    """
    if name_id_or_prefix is not None:
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_stack,
            list_method=self.list_stacks,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )
    else:
        return self.active_stack_model
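
A minimal sketch: with no argument the active stack is returned, while "local-stack" is a placeholder stack name.

from zenml.client import Client

client = Client()
active_stack = client.get_stack()               # the active stack
named_stack = client.get_stack("local-stack")   # placeholder stack name
print(active_stack.name, named_stack.id)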

get_stack_component(component_type, name_id_or_prefix=None, allow_name_prefix_match=True, hydrate=True)

Fetches a registered stack component.

If the name_id_or_prefix is provided, it will try to fetch the component with the corresponding identifier. If not, it will try to fetch the active component of the given type.

Parameters:

Name Type Description Default
component_type StackComponentType

The type of the component to fetch

required
name_id_or_prefix Optional[Union[str, UUID]]

The name, ID or prefix of the component to fetch.

None
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ComponentResponse

The registered stack component.

Raises:

Type Description
KeyError

If no name_id_or_prefix is provided and no such component is part of the active stack.

Source code in src/zenml/client.py (lines 1837-1906)
def get_stack_component(
    self,
    component_type: StackComponentType,
    name_id_or_prefix: Optional[Union[str, UUID]] = None,
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> ComponentResponse:
    """Fetches a registered stack component.

    If the name_id_or_prefix is provided, it will try to fetch the component
    with the corresponding identifier. If not, it will try to fetch the
    active component of the given type.

    Args:
        component_type: The type of the component to fetch
        name_id_or_prefix: The id of the component to fetch.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The registered stack component.

    Raises:
        KeyError: If no name_id_or_prefix is provided and no such component
            is part of the active stack.
    """
    # If no `name_id_or_prefix` provided, try to get the active component.
    if not name_id_or_prefix:
        components = self.active_stack_model.components.get(
            component_type, None
        )
        if components:
            return components[0]
        raise KeyError(
            "No name_id_or_prefix provided and there is no active "
            f"{component_type} in the current active stack."
        )

    # Else, try to fetch the component with an explicit type filter
    def type_scoped_list_method(
        hydrate: bool = False,
        **kwargs: Any,
    ) -> Page[ComponentResponse]:
        """Call `zen_store.list_stack_components` with type scoping.

        Args:
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            **kwargs: Keyword arguments to pass to `ComponentFilterModel`.

        Returns:
            The type-scoped list of components.
        """
        component_filter_model = ComponentFilter(**kwargs)
        component_filter_model.set_scope_type(
            component_type=component_type
        )
        return self.zen_store.list_stack_components(
            component_filter_model=component_filter_model,
            hydrate=hydrate,
        )

    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_stack_component,
        list_method=type_scoped_list_method,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )
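
A sketch using the StackComponentType enum: omitting name_id_or_prefix returns the active component of that type, while "s3-store" is a placeholder component name.

from zenml.client import Client
from zenml.enums import StackComponentType

client = Client()
# Active artifact store of the current stack.
active_store = client.get_stack_component(StackComponentType.ARTIFACT_STORE)
# A specific component by (placeholder) name.
named_store = client.get_stack_component(
    StackComponentType.ARTIFACT_STORE,
    name_id_or_prefix="s3-store",
)
print(active_store.name, named_store.id)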

get_tag(tag_name_or_id, hydrate=True)

Get an existing tag.

Parameters:

Name Type Description Default
tag_name_or_id Union[str, UUID]

name or id of the tag to be retrieved.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
TagResponse

The tag of interest.

Source code in src/zenml/client.py (lines 7765-7783)
def get_tag(
    self,
    tag_name_or_id: Union[str, UUID],
    hydrate: bool = True,
) -> TagResponse:
    """Get an existing tag.

    Args:
        tag_name_or_id: name or id of the tag to be retrieved.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The tag of interest.
    """
    return self.zen_store.get_tag(
        tag_name_or_id=tag_name_or_id,
        hydrate=hydrate,
    )

get_trigger(name_id_or_prefix, allow_name_prefix_match=True, project=None, hydrate=True)

Get a trigger by name, ID or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, ID or prefix of the trigger.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
TriggerResponse

The trigger.

Source code in src/zenml/client.py (lines 3177-3204)
@_fail_for_sql_zen_store
def get_trigger(
    self,
    name_id_or_prefix: Union[UUID, str],
    allow_name_prefix_match: bool = True,
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = True,
) -> TriggerResponse:
    """Get a trigger by name, ID or prefix.

    Args:
        name_id_or_prefix: The name, ID or prefix of the trigger.
        allow_name_prefix_match: If True, allow matching by name prefix.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The trigger.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_trigger,
        list_method=self.list_triggers,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        project=project,
        hydrate=hydrate,
    )

get_trigger_execution(trigger_execution_id, hydrate=True)

Get a trigger execution by ID.

Parameters:

Name Type Description Default
trigger_execution_id UUID

The ID of the trigger execution to get.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
TriggerExecutionResponse

The trigger execution.

Source code in src/zenml/client.py (lines 6947-6964)
def get_trigger_execution(
    self,
    trigger_execution_id: UUID,
    hydrate: bool = True,
) -> TriggerExecutionResponse:
    """Get a trigger execution by ID.

    Args:
        trigger_execution_id: The ID of the trigger execution to get.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The trigger execution.
    """
    return self.zen_store.get_trigger_execution(
        trigger_execution_id=trigger_execution_id, hydrate=hydrate
    )

get_user(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Gets a user.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name or ID of the user.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
UserResponse

The User

Source code in src/zenml/client.py (lines 785-808)
def get_user(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> UserResponse:
    """Gets a user.

    Args:
        name_id_or_prefix: The name or ID of the user.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The User
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_user,
        list_method=self.list_users,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

initialize(root=None) staticmethod

Initializes a new ZenML repository at the given path.

Parameters:

Name Type Description Default
root Optional[Path]

The root directory where the repository should be created. If None, the current working directory is used.

None

Raises:

Type Description
InitializationException

If the root directory already contains a ZenML repository.

Source code in src/zenml/client.py (lines 501-525)
@staticmethod
def initialize(
    root: Optional[Path] = None,
) -> None:
    """Initializes a new ZenML repository at the given path.

    Args:
        root: The root directory where the repository should be created.
            If None, the current working directory is used.

    Raises:
        InitializationException: If the root directory already contains a
            ZenML repository.
    """
    root = root or Path.cwd()
    logger.debug("Initializing new repository at path %s.", root)
    if Client.is_repository_directory(root):
        raise InitializationException(
            f"Found existing ZenML repository at path '{root}'."
        )

    config_directory = str(root / REPOSITORY_DIRECTORY_NAME)
    io_utils.create_dir_recursive_if_not_exists(config_directory)
    # Initialize the repository configuration at the custom path
    Client(root=root)
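
A small sketch; the target directory is a placeholder path, and the call raises InitializationException if that directory already hosts a ZenML repository.

from pathlib import Path

from zenml.client import Client

# Creates the repository configuration directory under the given root;
# the current working directory is used when root is None.
Client.initialize(root=Path("/tmp/zenml_demo"))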

is_inside_repository(file_path) staticmethod

Returns whether a file is inside the active ZenML repository.

Parameters:

Name Type Description Default
file_path str

A file path.

required

Returns:

Type Description
bool

True if the file is inside the active ZenML repository, False otherwise.

Source code in src/zenml/client.py (lines 626-639)
@staticmethod
def is_inside_repository(file_path: str) -> bool:
    """Returns whether a file is inside the active ZenML repository.

    Args:
        file_path: A file path.

    Returns:
        True if the file is inside the active ZenML repository, False
        otherwise.
    """
    if repo_path := Client.find_repository():
        return repo_path in Path(file_path).resolve().parents
    return False

is_repository_directory(path) staticmethod

Checks whether a ZenML client exists at the given path.

Parameters:

Name Type Description Default
path Path

The path to check.

required

Returns:

Type Description
bool

True if a ZenML client exists at the given path, False otherwise.

Source code in src/zenml/client.py (lines 537-549)
@staticmethod
def is_repository_directory(path: Path) -> bool:
    """Checks whether a ZenML client exists at the given path.

    Args:
        path: The path to check.

    Returns:
        True if a ZenML client exists at the given path,
        False otherwise.
    """
    config_dir = path / REPOSITORY_DIRECTORY_NAME
    return fileio.isdir(str(config_dir))

list_actions(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, flavor=None, action_type=None, project=None, user=None, hydrate=False)

List actions.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of the action to filter by.

None
created Optional[datetime]

Use to filter by time of creation

None
updated Optional[datetime]

Use the last updated date for filtering

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
name Optional[str]

The name of the action to filter by.

None
flavor Optional[str]

The flavor of the action to filter by.

None
action_type Optional[str]

The type of the action to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[ActionResponse]

A page of actions.

Source code in src/zenml/client.py (lines 3020-3072)
@_fail_for_sql_zen_store
def list_actions(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    flavor: Optional[str] = None,
    action_type: Optional[str] = None,
    project: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[ActionResponse]:
    """List actions.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of the action to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        project: The project name/ID to filter by.
        user: Filter by user name/ID.
        name: The name of the action to filter by.
        flavor: The flavor of the action to filter by.
        action_type: The type of the action to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of actions.
    """
    filter_model = ActionFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        project=project or self.active_project.id,
        user=user,
        name=name,
        id=id,
        flavor=flavor,
        plugin_subtype=action_type,
        created=created,
        updated=updated,
    )
    return self.zen_store.list_actions(filter_model, hydrate=hydrate)

list_api_keys(service_account_name_id_or_prefix, sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, description=None, active=None, last_login=None, last_rotated=None, hydrate=False)

List all API keys.

Parameters:

Name Type Description Default
service_account_name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the service account to list the API keys for.

required
sort_by str

The column to sort by.

'created'
page int

The page of items.

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages.

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or].

AND
id Optional[Union[UUID, str]]

Use the id of the API key to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation.

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering.

None
name Optional[str]

The name of the API key to filter by.

None
description Optional[str]

The description of the API key to filter by.

None
active Optional[bool]

Whether the API key is active or not.

None
last_login Optional[Union[datetime, str]]

The last time the API key was used.

None
last_rotated Optional[Union[datetime, str]]

The last time the API key was rotated.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[APIKeyResponse]

A page of API keys matching the filter description.

Source code in src/zenml/client.py (lines 7471-7533)
def list_api_keys(
    self,
    service_account_name_id_or_prefix: Union[str, UUID],
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    description: Optional[str] = None,
    active: Optional[bool] = None,
    last_login: Optional[Union[datetime, str]] = None,
    last_rotated: Optional[Union[datetime, str]] = None,
    hydrate: bool = False,
) -> Page[APIKeyResponse]:
    """List all API keys.

    Args:
        service_account_name_id_or_prefix: The name, ID or prefix of the
            service account to list the API keys for.
        sort_by: The column to sort by.
        page: The page of items.
        size: The maximum size of all pages.
        logical_operator: Which logical operator to use [and, or].
        id: Use the id of the API key to filter by.
        created: Use to filter by time of creation.
        updated: Use the last updated date for filtering.
        name: The name of the API key to filter by.
        description: The description of the API key to filter by.
        active: Whether the API key is active or not.
        last_login: The last time the API key was used.
        last_rotated: The last time the API key was rotated.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of API keys matching the filter description.
    """
    service_account = self.get_service_account(
        name_id_or_prefix=service_account_name_id_or_prefix,
        allow_name_prefix_match=False,
    )
    filter_model = APIKeyFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        name=name,
        description=description,
        active=active,
        last_login=last_login,
        last_rotated=last_rotated,
    )
    return self.zen_store.list_api_keys(
        service_account_id=service_account.id,
        filter_model=filter_model,
        hydrate=hydrate,
    )
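
A sketch; "ci-bot" is a placeholder service account name.

from zenml.client import Client

client = Client()
api_keys = client.list_api_keys(
    service_account_name_id_or_prefix="ci-bot",
    active=True,
)
for api_key in api_keys.items:
    print(api_key.name)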

list_artifact_versions(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, artifact=None, name=None, version=None, version_number=None, artifact_store_id=None, type=None, data_type=None, uri=None, materializer=None, project=None, model_version_id=None, only_unused=False, has_custom_name=None, user=None, model=None, pipeline_run=None, run_metadata=None, tag=None, tags=None, hydrate=False)

Get a list of artifact versions.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of artifact version to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
artifact Optional[Union[str, UUID]]

The name or ID of the artifact to filter by.

None
name Optional[str]

The name of the artifact to filter by.

None
version Optional[Union[str, int]]

The version of the artifact to filter by.

None
version_number Optional[int]

The version number of the artifact to filter by.

None
artifact_store_id Optional[Union[str, UUID]]

The id of the artifact store to filter by.

None
type Optional[ArtifactType]

The type of the artifact to filter by.

None
data_type Optional[str]

The data type of the artifact to filter by.

None
uri Optional[str]

The uri of the artifact to filter by.

None
materializer Optional[str]

The materializer of the artifact to filter by.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
model_version_id Optional[Union[str, UUID]]

Filter by model version ID.

None
only_unused Optional[bool]

Only return artifact versions that are not used in any pipeline runs.

False
has_custom_name Optional[bool]

Filter artifacts with/without custom names.

None
tag Optional[str]

A tag to filter by.

None
tags Optional[List[str]]

Tags to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name or ID.

None
model Optional[Union[UUID, str]]

Filter by model name or ID.

None
pipeline_run Optional[Union[UUID, str]]

Filter by pipeline run name or ID.

None
run_metadata Optional[List[str]]

Filter by run metadata.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[ArtifactVersionResponse]

A list of artifact versions.

Source code in src/zenml/client.py (lines 4341-4440)
def list_artifact_versions(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    artifact: Optional[Union[str, UUID]] = None,
    name: Optional[str] = None,
    version: Optional[Union[str, int]] = None,
    version_number: Optional[int] = None,
    artifact_store_id: Optional[Union[str, UUID]] = None,
    type: Optional[ArtifactType] = None,
    data_type: Optional[str] = None,
    uri: Optional[str] = None,
    materializer: Optional[str] = None,
    project: Optional[Union[str, UUID]] = None,
    model_version_id: Optional[Union[str, UUID]] = None,
    only_unused: Optional[bool] = False,
    has_custom_name: Optional[bool] = None,
    user: Optional[Union[UUID, str]] = None,
    model: Optional[Union[UUID, str]] = None,
    pipeline_run: Optional[Union[UUID, str]] = None,
    run_metadata: Optional[List[str]] = None,
    tag: Optional[str] = None,
    tags: Optional[List[str]] = None,
    hydrate: bool = False,
) -> Page[ArtifactVersionResponse]:
    """Get a list of artifact versions.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of artifact version to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        artifact: The name or ID of the artifact to filter by.
        name: The name of the artifact to filter by.
        version: The version of the artifact to filter by.
        version_number: The version number of the artifact to filter by.
        artifact_store_id: The id of the artifact store to filter by.
        type: The type of the artifact to filter by.
        data_type: The data type of the artifact to filter by.
        uri: The uri of the artifact to filter by.
        materializer: The materializer of the artifact to filter by.
        project: The project name/ID to filter by.
        model_version_id: Filter by model version ID.
        only_unused: Only return artifact versions that are not used in
            any pipeline runs.
        has_custom_name: Filter artifacts with/without custom names.
        tag: A tag to filter by.
        tags: Tags to filter by.
        user: Filter by user name or ID.
        model: Filter by model name or ID.
        pipeline_run: Filter by pipeline run name or ID.
        run_metadata: Filter by run metadata.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A list of artifact versions.
    """
    if name:
        artifact = name

    artifact_version_filter_model = ArtifactVersionFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        artifact=artifact,
        version=str(version) if version else None,
        version_number=version_number,
        artifact_store_id=artifact_store_id,
        type=type,
        data_type=data_type,
        uri=uri,
        materializer=materializer,
        project=project or self.active_project.id,
        model_version_id=model_version_id,
        only_unused=only_unused,
        has_custom_name=has_custom_name,
        tag=tag,
        tags=tags,
        user=user,
        model=model,
        pipeline_run=pipeline_run,
        run_metadata=run_metadata,
    )
    return self.zen_store.list_artifact_versions(
        artifact_version_filter_model,
        hydrate=hydrate,
    )
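
A filtering sketch; the artifact name "my_dataset" and the tag "production" are placeholders.

from zenml.client import Client

client = Client()
versions = client.list_artifact_versions(
    name="my_dataset",      # placeholder artifact name
    tag="production",       # placeholder tag
    size=20,
    sort_by="created",
)
for version in versions.items:
    print(version.id, version.version)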

list_artifacts(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, has_custom_name=None, user=None, project=None, hydrate=False, tag=None, tags=None)

Get a list of artifacts.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of artifact to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

The name of the artifact to filter by.

None
has_custom_name Optional[bool]

Filter artifacts with/without custom names.

None
user Optional[Union[UUID, str]]

Filter by user name or ID.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False
tag Optional[str]

Filter artifacts by tag.

None
tags Optional[List[str]]

Tags to filter by.

None

Returns:

Type Description
Page[ArtifactResponse]

A list of artifacts.

Source code in src/zenml/client.py
def list_artifacts(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    has_custom_name: Optional[bool] = None,
    user: Optional[Union[UUID, str]] = None,
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = False,
    tag: Optional[str] = None,
    tags: Optional[List[str]] = None,
) -> Page[ArtifactResponse]:
    """Get a list of artifacts.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of artifact to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: The name of the artifact to filter by.
        has_custom_name: Filter artifacts with/without custom names.
        user: Filter by user name or ID.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        tag: Filter artifacts by tag.
        tags: Tags to filter by.

    Returns:
        A list of artifacts.
    """
    artifact_filter_model = ArtifactFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        name=name,
        has_custom_name=has_custom_name,
        tag=tag,
        tags=tags,
        user=user,
        project=project or self.active_project.id,
    )
    return self.zen_store.list_artifacts(
        artifact_filter_model,
        hydrate=hydrate,
    )
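
A minimal sketch of filtering artifacts by custom name, assuming the standard from zenml.client import Client entry point; the .total and .items fields on the returned page are assumptions about the Page model.

from zenml.client import Client

client = Client()
# Artifacts with a user-assigned (custom) name in the active project
artifacts = client.list_artifacts(has_custom_name=True, size=20)
print(f"{artifacts.total} matching artifacts")
for artifact in artifacts.items:
    print(artifact.name)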

list_authorized_devices(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, expires=None, client_id=None, status=None, trusted_device=None, user=None, failed_auth_attempts=None, last_login=None, hydrate=False)

List all authorized devices.

Parameters:

Name Type Description Default
sort_by str

The column to sort by.

'created'
page int

The page of items.

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages.

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or].

AND
id Optional[Union[UUID, str]]

Use the id of the device to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation.

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering.

None
expires Optional[Union[datetime, str]]

Use the expiration date for filtering.

None
client_id Union[UUID, str, None]

Use the client id for filtering.

None
status Union[OAuthDeviceStatus, str, None]

Use the status for filtering.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
trusted_device Union[bool, str, None]

Use the trusted device flag for filtering.

None
failed_auth_attempts Union[int, str, None]

Use the failed auth attempts for filtering.

None
last_login Optional[Union[datetime, str, None]]

Use the last login date for filtering.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[OAuthDeviceResponse]

A page of authorized devices matching the filter.

Source code in src/zenml/client.py
def list_authorized_devices(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    expires: Optional[Union[datetime, str]] = None,
    client_id: Union[UUID, str, None] = None,
    status: Union[OAuthDeviceStatus, str, None] = None,
    trusted_device: Union[bool, str, None] = None,
    user: Optional[Union[UUID, str]] = None,
    failed_auth_attempts: Union[int, str, None] = None,
    last_login: Optional[Union[datetime, str, None]] = None,
    hydrate: bool = False,
) -> Page[OAuthDeviceResponse]:
    """List all authorized devices.

    Args:
        sort_by: The column to sort by.
        page: The page of items.
        size: The maximum size of all pages.
        logical_operator: Which logical operator to use [and, or].
        id: Use the id of the device to filter by.
        created: Use to filter by time of creation.
        updated: Use the last updated date for filtering.
        expires: Use the expiration date for filtering.
        client_id: Use the client id for filtering.
        status: Use the status for filtering.
        user: Filter by user name/ID.
        trusted_device: Use the trusted device flag for filtering.
        failed_auth_attempts: Use the failed auth attempts for filtering.
        last_login: Use the last login date for filtering.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of authorized devices matching the filter.
    """
    filter_model = OAuthDeviceFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        expires=expires,
        client_id=client_id,
        user=user,
        status=status,
        trusted_device=trusted_device,
        failed_auth_attempts=failed_auth_attempts,
        last_login=last_login,
    )
    return self.zen_store.list_authorized_devices(
        filter_model=filter_model,
        hydrate=hydrate,
    )
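
For illustration, a sketch that narrows the device listing to trusted devices; the Client entry point and the .items page field are assumptions, while trusted_device is the documented filter above.

from zenml.client import Client

client = Client()
# Only devices that were marked as trusted
devices = client.list_authorized_devices(trusted_device=True)
for device in devices.items:
    print(device.id)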

list_builds(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, project=None, user=None, pipeline_id=None, stack_id=None, container_registry_id=None, is_local=None, contains_code=None, zenml_version=None, python_version=None, checksum=None, stack_checksum=None, duration=None, hydrate=False)

List all builds.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of build to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
pipeline_id Optional[Union[str, UUID]]

The id of the pipeline to filter by.

None
stack_id Optional[Union[str, UUID]]

The id of the stack to filter by.

None
container_registry_id Optional[Union[UUID, str]]

The id of the container registry to filter by.

None
is_local Optional[bool]

Use to filter local builds.

None
contains_code Optional[bool]

Use to filter builds that contain code.

None
zenml_version Optional[str]

The version of ZenML to filter by.

None
python_version Optional[str]

The Python version to filter by.

None
checksum Optional[str]

The build checksum to filter by.

None
stack_checksum Optional[str]

The stack checksum to filter by.

None
duration Optional[Union[int, str]]

The duration of the build in seconds to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[PipelineBuildResponse]

A page with builds fitting the filter description

Source code in src/zenml/client.py
def list_builds(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    project: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    pipeline_id: Optional[Union[str, UUID]] = None,
    stack_id: Optional[Union[str, UUID]] = None,
    container_registry_id: Optional[Union[UUID, str]] = None,
    is_local: Optional[bool] = None,
    contains_code: Optional[bool] = None,
    zenml_version: Optional[str] = None,
    python_version: Optional[str] = None,
    checksum: Optional[str] = None,
    stack_checksum: Optional[str] = None,
    duration: Optional[Union[int, str]] = None,
    hydrate: bool = False,
) -> Page[PipelineBuildResponse]:
    """List all builds.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of build to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        project: The project name/ID to filter by.
        user: Filter by user name/ID.
        pipeline_id: The id of the pipeline to filter by.
        stack_id: The id of the stack to filter by.
        container_registry_id: The id of the container registry to
            filter by.
        is_local: Use to filter local builds.
        contains_code: Use to filter builds that contain code.
        zenml_version: The version of ZenML to filter by.
        python_version: The Python version to filter by.
        checksum: The build checksum to filter by.
        stack_checksum: The stack checksum to filter by.
        duration: The duration of the build in seconds to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page with builds fitting the filter description
    """
    build_filter_model = PipelineBuildFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        project=project or self.active_project.id,
        user=user,
        pipeline_id=pipeline_id,
        stack_id=stack_id,
        container_registry_id=container_registry_id,
        is_local=is_local,
        contains_code=contains_code,
        zenml_version=zenml_version,
        python_version=python_version,
        checksum=checksum,
        stack_checksum=stack_checksum,
        duration=duration,
    )
    return self.zen_store.list_builds(
        build_filter_model=build_filter_model,
        hydrate=hydrate,
    )
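
A hedged example of the build listing, assuming the from zenml.client import Client entry point; the filter values are placeholders and .items is an assumed field of the returned page.

from zenml.client import Client

client = Client()
# Remote (non-local) builds that include code, scoped to the active project
builds = client.list_builds(is_local=False, contains_code=True)
for build in builds.items:
    print(build.id)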

list_code_repositories(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, project=None, user=None, hydrate=False)

List all code repositories.

Parameters:

Name Type Description Default
sort_by str

The column to sort by.

'created'
page int

The page of items.

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages.

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or].

AND
id Optional[Union[UUID, str]]

Use the id of the code repository to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation.

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering.

None
name Optional[str]

The name of the code repository to filter by.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[CodeRepositoryResponse]

A page of code repositories matching the filter description.

Source code in src/zenml/client.py
def list_code_repositories(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    project: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[CodeRepositoryResponse]:
    """List all code repositories.

    Args:
        sort_by: The column to sort by.
        page: The page of items.
        size: The maximum size of all pages.
        logical_operator: Which logical operator to use [and, or].
        id: Use the id of the code repository to filter by.
        created: Use to filter by time of creation.
        updated: Use the last updated date for filtering.
        name: The name of the code repository to filter by.
        project: The project name/ID to filter by.
        user: Filter by user name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of code repositories matching the filter description.
    """
    filter_model = CodeRepositoryFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        name=name,
        project=project or self.active_project.id,
        user=user,
    )
    return self.zen_store.list_code_repositories(
        filter_model=filter_model,
        hydrate=hydrate,
    )
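
A short sketch for looking up a code repository by name; the repository name "my-repo" is hypothetical, and .total/.items on the returned page are assumptions.

from zenml.client import Client

client = Client()
repos = client.list_code_repositories(name="my-repo")
if repos.total:
    print(repos.items[0].id)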

list_deployments(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, project=None, user=None, pipeline_id=None, stack_id=None, build_id=None, template_id=None, hydrate=False)

List all deployments.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of the deployment to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
pipeline_id Optional[Union[str, UUID]]

The id of the pipeline to filter by.

None
stack_id Optional[Union[str, UUID]]

The id of the stack to filter by.

None
build_id Optional[Union[str, UUID]]

The id of the build to filter by.

None
template_id Optional[Union[str, UUID]]

The ID of the template to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[PipelineDeploymentResponse]

A page with deployments fitting the filter description

Source code in src/zenml/client.py
def list_deployments(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    project: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    pipeline_id: Optional[Union[str, UUID]] = None,
    stack_id: Optional[Union[str, UUID]] = None,
    build_id: Optional[Union[str, UUID]] = None,
    template_id: Optional[Union[str, UUID]] = None,
    hydrate: bool = False,
) -> Page[PipelineDeploymentResponse]:
    """List all deployments.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of the deployment to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        project: The project name/ID to filter by.
        user: Filter by user name/ID.
        pipeline_id: The id of the pipeline to filter by.
        stack_id: The id of the stack to filter by.
        build_id: The id of the build to filter by.
        template_id: The ID of the template to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page with deployments fitting the filter description
    """
    deployment_filter_model = PipelineDeploymentFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        project=project or self.active_project.id,
        user=user,
        pipeline_id=pipeline_id,
        stack_id=stack_id,
        build_id=build_id,
        template_id=template_id,
    )
    return self.zen_store.list_deployments(
        deployment_filter_model=deployment_filter_model,
        hydrate=hydrate,
    )
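
An illustrative call that filters deployments by stack; the UUID is a placeholder, and the Client entry point and .items page field are assumptions.

from zenml.client import Client

client = Client()
# Newest deployments created for one stack (placeholder UUID)
deployments = client.list_deployments(
    stack_id="00000000-0000-0000-0000-000000000000",
    sort_by="desc:created",
)
for deployment in deployments.items:
    print(deployment.id)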

list_event_sources(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, flavor=None, event_source_type=None, project=None, user=None, hydrate=False)

Lists all event_sources.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of event_sources to filter by.

None
created Optional[datetime]

Use to filter by time of creation

None
updated Optional[datetime]

Use the last updated date for filtering

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
name Optional[str]

The name of the event_source to filter by.

None
flavor Optional[str]

The flavor of the event_source to filter by.

None
event_source_type Optional[str]

The subtype of the event_source to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[EventSourceResponse]

A page of event_sources.

Source code in src/zenml/client.py
def list_event_sources(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    flavor: Optional[str] = None,
    event_source_type: Optional[str] = None,
    project: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[EventSourceResponse]:
    """Lists all event_sources.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of event_sources to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        project: The project name/ID to filter by.
        user: Filter by user name/ID.
        name: The name of the event_source to filter by.
        flavor: The flavor of the event_source to filter by.
        event_source_type: The subtype of the event_source to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of event_sources.
    """
    event_source_filter_model = EventSourceFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        project=project or self.active_project.id,
        user=user,
        name=name,
        flavor=flavor,
        plugin_subtype=event_source_type,
        id=id,
        created=created,
        updated=updated,
    )
    return self.zen_store.list_event_sources(
        event_source_filter_model, hydrate=hydrate
    )
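
A minimal sketch of the event source listing; the flavor value "github" is a placeholder, and the Client entry point and .items page field are assumptions.

from zenml.client import Client

client = Client()
# Event sources of one flavor in the active project
event_sources = client.list_event_sources(flavor="github")
for event_source in event_sources.items:
    print(event_source.name)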

list_flavors(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, type=None, integration=None, user=None, hydrate=False)

Fetches all the flavor models.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of flavors to filter by.

None
created Optional[datetime]

Use to filter flavors by time of creation

None
updated Optional[datetime]

Use the last updated date for filtering

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
name Optional[str]

The name of the flavor to filter by.

None
type Optional[str]

The type of the flavor to filter by.

None
integration Optional[str]

The integration of the flavor to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[FlavorResponse]

A list of all the flavor models.

Source code in src/zenml/client.py
def list_flavors(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    type: Optional[str] = None,
    integration: Optional[str] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[FlavorResponse]:
    """Fetches all the flavor models.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of flavors to filter by.
        created: Use to filter flavors by time of creation
        updated: Use the last updated date for filtering
        user: Filter by user name/ID.
        name: The name of the flavor to filter by.
        type: The type of the flavor to filter by.
        integration: The integration of the flavor to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A list of all the flavor models.
    """
    flavor_filter_model = FlavorFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        user=user,
        name=name,
        type=type,
        integration=integration,
        id=id,
        created=created,
        updated=updated,
    )
    return self.zen_store.list_flavors(
        flavor_filter_model=flavor_filter_model, hydrate=hydrate
    )
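
For illustration, listing flavors of a single component type; the type string "orchestrator" is an assumption about accepted values, as are the Client entry point and the .items page field.

from zenml.client import Client

client = Client()
flavors = client.list_flavors(type="orchestrator")
for flavor in flavors.items:
    print(flavor.name)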

list_model_version_artifact_links(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, created=None, updated=None, model_version_id=None, artifact_version_id=None, artifact_name=None, only_data_artifacts=None, only_model_artifacts=None, only_deployment_artifacts=None, has_custom_name=None, user=None, hydrate=False)

Get model version to artifact links by filter in Model Control Plane.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
model_version_id Optional[Union[UUID, str]]

Use the model version id for filtering

None
artifact_version_id Optional[Union[UUID, str]]

Use the artifact version id for filtering

None
artifact_name Optional[str]

Use the artifact name for filtering

None
only_data_artifacts Optional[bool]

Use to filter by data artifacts

None
only_model_artifacts Optional[bool]

Use to filter by model artifacts

None
only_deployment_artifacts Optional[bool]

Use to filter by deployment artifacts

None
has_custom_name Optional[bool]

Filter artifacts with/without custom names.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[ModelVersionArtifactResponse]

A page of all model version to artifact links.

Source code in src/zenml/client.py
def list_model_version_artifact_links(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    model_version_id: Optional[Union[UUID, str]] = None,
    artifact_version_id: Optional[Union[UUID, str]] = None,
    artifact_name: Optional[str] = None,
    only_data_artifacts: Optional[bool] = None,
    only_model_artifacts: Optional[bool] = None,
    only_deployment_artifacts: Optional[bool] = None,
    has_custom_name: Optional[bool] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[ModelVersionArtifactResponse]:
    """Get model version to artifact links by filter in Model Control Plane.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        model_version_id: Use the model version id for filtering
        artifact_version_id: Use the artifact version id for filtering
        artifact_name: Use the artifact name for filtering
        only_data_artifacts: Use to filter by data artifacts
        only_model_artifacts: Use to filter by model artifacts
        only_deployment_artifacts: Use to filter by deployment artifacts
        has_custom_name: Filter artifacts with/without custom names.
        user: Filter by user name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of all model version to artifact links.
    """
    return self.zen_store.list_model_version_artifact_links(
        ModelVersionArtifactFilter(
            sort_by=sort_by,
            logical_operator=logical_operator,
            page=page,
            size=size,
            created=created,
            updated=updated,
            model_version_id=model_version_id,
            artifact_version_id=artifact_version_id,
            artifact_name=artifact_name,
            only_data_artifacts=only_data_artifacts,
            only_model_artifacts=only_model_artifacts,
            only_deployment_artifacts=only_deployment_artifacts,
            has_custom_name=has_custom_name,
            user=user,
        ),
        hydrate=hydrate,
    )
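
A hedged sketch that restricts the links to data artifacts of one model version; the UUID is a placeholder and .total is an assumed field on the returned page.

from zenml.client import Client

client = Client()
links = client.list_model_version_artifact_links(
    model_version_id="00000000-0000-0000-0000-000000000000",
    only_data_artifacts=True,
)
print(links.total)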

list_model_version_pipeline_run_links(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, created=None, updated=None, model_version_id=None, pipeline_run_id=None, pipeline_run_name=None, user=None, hydrate=False)

Get all model version to pipeline run links by filter.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
model_version_id Optional[Union[UUID, str]]

Use the model version id for filtering

None
pipeline_run_id Optional[Union[UUID, str]]

Use the pipeline run id for filtering

None
pipeline_run_name Optional[str]

Use the pipeline run name for filtering

None
user Optional[Union[UUID, str]]

Filter by user name or ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response

False

Returns:

Type Description
Page[ModelVersionPipelineRunResponse]

A page of all model version to pipeline run links.

Source code in src/zenml/client.py
def list_model_version_pipeline_run_links(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    model_version_id: Optional[Union[UUID, str]] = None,
    pipeline_run_id: Optional[Union[UUID, str]] = None,
    pipeline_run_name: Optional[str] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[ModelVersionPipelineRunResponse]:
    """Get all model version to pipeline run links by filter.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        model_version_id: Use the model version id for filtering
        pipeline_run_id: Use the pipeline run id for filtering
        pipeline_run_name: Use the pipeline run name for filtering
        user: Filter by user name or ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response

    Returns:
        A page of all model version to pipeline run links.
    """
    return self.zen_store.list_model_version_pipeline_run_links(
        ModelVersionPipelineRunFilter(
            sort_by=sort_by,
            logical_operator=logical_operator,
            page=page,
            size=size,
            created=created,
            updated=updated,
            model_version_id=model_version_id,
            pipeline_run_id=pipeline_run_id,
            pipeline_run_name=pipeline_run_name,
            user=user,
        ),
        hydrate=hydrate,
    )
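
Similarly, a sketch for the pipeline run links of one model version; the UUID is a placeholder, and the Client entry point and .total field are assumptions.

from zenml.client import Client

client = Client()
run_links = client.list_model_version_pipeline_run_links(
    model_version_id="00000000-0000-0000-0000-000000000000",
)
print(run_links.total)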

list_model_versions(model=None, model_name_or_id=None, sort_by='number', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, created=None, updated=None, name=None, id=None, number=None, stage=None, run_metadata=None, user=None, hydrate=False, tag=None, tags=None, project=None)

Get model versions by filter from Model Control Plane.

Parameters:

Name Type Description Default
model Optional[Union[str, UUID]]

The model to filter by.

None
model_name_or_id Optional[Union[str, UUID]]

name or id of the model containing the model version.

None
sort_by str

The column to sort by

'number'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

name or id of the model version.

None
id Optional[Union[UUID, str]]

id of the model version.

None
number Optional[int]

number of the model version.

None
stage Optional[Union[str, ModelStages]]

stage of the model version.

None
run_metadata Optional[List[str]]

run metadata of the model version.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False
tag Optional[str]

The tag to filter by.

None
tags Optional[List[str]]

Tags to filter by.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Returns:

Type Description
Page[ModelVersionResponse]

A page object with all model versions.

Source code in src/zenml/client.py
def list_model_versions(
    self,
    model: Optional[Union[str, UUID]] = None,
    model_name_or_id: Optional[Union[str, UUID]] = None,
    sort_by: str = "number",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    id: Optional[Union[UUID, str]] = None,
    number: Optional[int] = None,
    stage: Optional[Union[str, ModelStages]] = None,
    run_metadata: Optional[List[str]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
    tag: Optional[str] = None,
    tags: Optional[List[str]] = None,
    project: Optional[Union[str, UUID]] = None,
) -> Page[ModelVersionResponse]:
    """Get model versions by filter from Model Control Plane.

    Args:
        model: The model to filter by.
        model_name_or_id: name or id of the model containing the model
            version.
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: name or id of the model version.
        id: id of the model version.
        number: number of the model version.
        stage: stage of the model version.
        run_metadata: run metadata of the model version.
        user: Filter by user name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        tag: The tag to filter by.
        tags: Tags to filter by.
        project: The project name/ID to filter by.

    Returns:
        A page object with all model versions.
    """
    if model_name_or_id:
        logger.warning(
            "The `model_name_or_id` argument is deprecated. "
            "Please use the `model` argument instead."
        )
        if model is None:
            model = model_name_or_id
        else:
            logger.warning(
                "Ignoring `model_name_or_id` argument as `model` argument "
                "was also provided."
            )

    model_version_filter_model = ModelVersionFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        created=created,
        updated=updated,
        name=name,
        id=id,
        number=number,
        stage=stage,
        run_metadata=run_metadata,
        tag=tag,
        tags=tags,
        user=user,
        model=model,
        project=project or self.active_project.id,
    )

    return self.zen_store.list_model_versions(
        model_version_filter_model=model_version_filter_model,
        hydrate=hydrate,
    )
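
A usage sketch for the Model Control Plane version listing; the model name "churn_model" and the stage string "production" are placeholders, and the .items page field is an assumption.

from zenml.client import Client

client = Client()
versions = client.list_model_versions(
    model="churn_model",
    stage="production",
)
for version in versions.items:
    print(version.number, version.name)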

list_models(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, created=None, updated=None, name=None, id=None, user=None, project=None, hydrate=False, tag=None, tags=None)

Get models by filter from Model Control Plane.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

The name of the model to filter by.

None
id Optional[Union[UUID, str]]

The id of the model to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False
tag Optional[str]

The tag of the model to filter by.

None
tags Optional[List[str]]

Tags to filter by.

None

Returns:

Type Description
Page[ModelResponse]

A page object with all models.

Source code in src/zenml/client.py
def list_models(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    id: Optional[Union[UUID, str]] = None,
    user: Optional[Union[UUID, str]] = None,
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = False,
    tag: Optional[str] = None,
    tags: Optional[List[str]] = None,
) -> Page[ModelResponse]:
    """Get models by filter from Model Control Plane.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: The name of the model to filter by.
        id: The id of the model to filter by.
        user: Filter by user name/ID.
        project: The project name/ID to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        tag: The tag of the model to filter by.
        tags: Tags to filter by.

    Returns:
        A page object with all models.
    """
    filter = ModelFilter(
        name=name,
        id=id,
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        created=created,
        updated=updated,
        tag=tag,
        tags=tags,
        user=user,
        project=project or self.active_project.id,
    )

    return self.zen_store.list_models(
        model_filter_model=filter, hydrate=hydrate
    )
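
A minimal sketch of filtering models by tag, assuming the usual Client entry point; the tag value "nlp" is a placeholder and .items is an assumed page field.

from zenml.client import Client

client = Client()
models = client.list_models(tag="nlp")
for model in models.items:
    print(model.name)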

list_pipeline_runs(sort_by='desc:created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, project=None, pipeline_id=None, pipeline_name=None, stack_id=None, schedule_id=None, build_id=None, deployment_id=None, code_repository_id=None, template_id=None, model_version_id=None, orchestrator_run_id=None, status=None, start_time=None, end_time=None, unlisted=None, templatable=None, tag=None, tags=None, user=None, run_metadata=None, pipeline=None, code_repository=None, model=None, stack=None, stack_component=None, hydrate=False, include_full_metadata=False)

List all pipeline runs.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'desc:created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

The id of the runs to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
pipeline_id Optional[Union[str, UUID]]

The id of the pipeline to filter by.

None
pipeline_name Optional[str]

DEPRECATED. Use pipeline instead to filter by pipeline name.

None
stack_id Optional[Union[str, UUID]]

The id of the stack to filter by.

None
schedule_id Optional[Union[str, UUID]]

The id of the schedule to filter by.

None
build_id Optional[Union[str, UUID]]

The id of the build to filter by.

None
deployment_id Optional[Union[str, UUID]]

The id of the deployment to filter by.

None
code_repository_id Optional[Union[str, UUID]]

The id of the code repository to filter by.

None
template_id Optional[Union[str, UUID]]

The ID of the template to filter by.

None
model_version_id Optional[Union[str, UUID]]

The ID of the model version to filter by.

None
orchestrator_run_id Optional[str]

The run id of the orchestrator to filter by.

None
name Optional[str]

The name of the run to filter by.

None
status Optional[str]

The status of the pipeline run

None
start_time Optional[Union[datetime, str]]

The start_time for the pipeline run

None
end_time Optional[Union[datetime, str]]

The end_time for the pipeline run

None
unlisted Optional[bool]

If the runs should be unlisted or not.

None
templatable Optional[bool]

If the runs should be templatable or not.

None
tag Optional[str]

Tag to filter by.

None
tags Optional[List[str]]

Tags to filter by.

None
user Optional[Union[UUID, str]]

The name/ID of the user to filter by.

None
run_metadata Optional[List[str]]

The run_metadata of the run to filter by.

None
pipeline Optional[Union[UUID, str]]

The name/ID of the pipeline to filter by.

None
code_repository Optional[Union[UUID, str]]

Filter by code repository name/ID.

None
model Optional[Union[UUID, str]]

Filter by model name/ID.

None
stack Optional[Union[UUID, str]]

Filter by stack name/ID.

None
stack_component Optional[Union[UUID, str]]

Filter by stack component name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False
include_full_metadata bool

If True, include metadata of all steps in the response.

False

Returns:

Type Description
Page[PipelineRunResponse]

A page with Pipeline Runs fitting the filter description

Source code in src/zenml/client.py
def list_pipeline_runs(
    self,
    sort_by: str = "desc:created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    project: Optional[Union[str, UUID]] = None,
    pipeline_id: Optional[Union[str, UUID]] = None,
    pipeline_name: Optional[str] = None,
    stack_id: Optional[Union[str, UUID]] = None,
    schedule_id: Optional[Union[str, UUID]] = None,
    build_id: Optional[Union[str, UUID]] = None,
    deployment_id: Optional[Union[str, UUID]] = None,
    code_repository_id: Optional[Union[str, UUID]] = None,
    template_id: Optional[Union[str, UUID]] = None,
    model_version_id: Optional[Union[str, UUID]] = None,
    orchestrator_run_id: Optional[str] = None,
    status: Optional[str] = None,
    start_time: Optional[Union[datetime, str]] = None,
    end_time: Optional[Union[datetime, str]] = None,
    unlisted: Optional[bool] = None,
    templatable: Optional[bool] = None,
    tag: Optional[str] = None,
    tags: Optional[List[str]] = None,
    user: Optional[Union[UUID, str]] = None,
    run_metadata: Optional[List[str]] = None,
    pipeline: Optional[Union[UUID, str]] = None,
    code_repository: Optional[Union[UUID, str]] = None,
    model: Optional[Union[UUID, str]] = None,
    stack: Optional[Union[UUID, str]] = None,
    stack_component: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
    include_full_metadata: bool = False,
) -> Page[PipelineRunResponse]:
    """List all pipeline runs.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: The id of the runs to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        project: The project name/ID to filter by.
        pipeline_id: The id of the pipeline to filter by.
        pipeline_name: DEPRECATED. Use `pipeline` instead to filter by
            pipeline name.
        stack_id: The id of the stack to filter by.
        schedule_id: The id of the schedule to filter by.
        build_id: The id of the build to filter by.
        deployment_id: The id of the deployment to filter by.
        code_repository_id: The id of the code repository to filter by.
        template_id: The ID of the template to filter by.
        model_version_id: The ID of the model version to filter by.
        orchestrator_run_id: The run id of the orchestrator to filter by.
        name: The name of the run to filter by.
        status: The status of the pipeline run
        start_time: The start_time for the pipeline run
        end_time: The end_time for the pipeline run
        unlisted: If the runs should be unlisted or not.
        templatable: If the runs should be templatable or not.
        tag: Tag to filter by.
        tags: Tags to filter by.
        user: The name/ID of the user to filter by.
        run_metadata: The run_metadata of the run to filter by.
        pipeline: The name/ID of the pipeline to filter by.
        code_repository: Filter by code repository name/ID.
        model: Filter by model name/ID.
        stack: Filter by stack name/ID.
        stack_component: Filter by stack component name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        include_full_metadata: If True, include metadata of all steps in
            the response.

    Returns:
        A page with Pipeline Runs fitting the filter description
    """
    runs_filter_model = PipelineRunFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        name=name,
        project=project or self.active_project.id,
        pipeline_id=pipeline_id,
        pipeline_name=pipeline_name,
        schedule_id=schedule_id,
        build_id=build_id,
        deployment_id=deployment_id,
        code_repository_id=code_repository_id,
        template_id=template_id,
        model_version_id=model_version_id,
        orchestrator_run_id=orchestrator_run_id,
        stack_id=stack_id,
        status=status,
        start_time=start_time,
        end_time=end_time,
        tag=tag,
        tags=tags,
        unlisted=unlisted,
        user=user,
        run_metadata=run_metadata,
        pipeline=pipeline,
        code_repository=code_repository,
        stack=stack,
        model=model,
        stack_component=stack_component,
        templatable=templatable,
    )
    return self.zen_store.list_runs(
        runs_filter_model=runs_filter_model,
        hydrate=hydrate,
        include_full_metadata=include_full_metadata,
    )
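
As an illustration of run filtering, assuming the standard from zenml.client import Client entry point; the pipeline name and the status string "failed" are placeholders, and .items is an assumed page field.

from zenml.client import Client

client = Client()
# The five most recent failed runs of a (hypothetical) pipeline
runs = client.list_pipeline_runs(
    pipeline="training_pipeline",
    status="failed",
    size=5,
)
for run in runs.items:
    print(run.name, run.status)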

list_pipelines(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, latest_run_status=None, project=None, user=None, tag=None, tags=None, hydrate=False)

List all pipelines.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of pipeline to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

The name of the pipeline to filter by.

None
latest_run_status Optional[str]

Filter by the status of the latest run of a pipeline.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
user Optional[Union[UUID, str]]

The name/ID of the user to filter by.

None
tag Optional[str]

Tag to filter by.

None
tags Optional[List[str]]

Tags to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[PipelineResponse]

A page with Pipeline fitting the filter description

Source code in src/zenml/client.py
def list_pipelines(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    latest_run_status: Optional[str] = None,
    project: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    tag: Optional[str] = None,
    tags: Optional[List[str]] = None,
    hydrate: bool = False,
) -> Page[PipelineResponse]:
    """List all pipelines.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of pipeline to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: The name of the pipeline to filter by.
        latest_run_status: Filter by the status of the latest run of a
            pipeline.
        project: The project name/ID to filter by.
        user: The name/ID of the user to filter by.
        tag: Tag to filter by.
        tags: Tags to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page with Pipeline fitting the filter description
    """
    pipeline_filter_model = PipelineFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        name=name,
        latest_run_status=latest_run_status,
        project=project or self.active_project.id,
        user=user,
        tag=tag,
        tags=tags,
    )
    return self.zen_store.list_pipelines(
        pipeline_filter_model=pipeline_filter_model,
        hydrate=hydrate,
    )
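
A short sketch that keeps only pipelines whose latest run finished successfully; the status string "completed" is an assumption about the accepted values, as are the Client entry point and the .items page field.

from zenml.client import Client

client = Client()
pipelines = client.list_pipelines(latest_run_status="completed")
for pipeline in pipelines.items:
    print(pipeline.name)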

list_projects(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, display_name=None, hydrate=False)

List all projects.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the project ID to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

Use the project name for filtering

None
display_name Optional[str]

Use the project display name for filtering

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[ProjectResponse]

Page of projects

Source code in src/zenml/client.py
def list_projects(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    display_name: Optional[str] = None,
    hydrate: bool = False,
) -> Page[ProjectResponse]:
    """List all projects.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the project ID to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: Use the project name for filtering
        display_name: Use the project display name for filtering
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        Page of projects
    """
    return self.zen_store.list_projects(
        ProjectFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            display_name=display_name,
        ),
        hydrate=hydrate,
    )
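
For example, a short sketch of filtering projects by display name (the value is illustrative):

```python
from zenml.client import Client

# List projects whose display name matches "demo" (illustrative filter),
# hydrating the responses to include metadata fields.
projects = Client().list_projects(
    display_name="demo",
    hydrate=True,
)
```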

list_run_steps(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, cache_key=None, code_hash=None, status=None, start_time=None, end_time=None, pipeline_run_id=None, deployment_id=None, original_step_run_id=None, project=None, user=None, model_version_id=None, model=None, run_metadata=None, hydrate=False)

List all step runs.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of runs to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
start_time Optional[Union[datetime, str]]

Use to filter by the time when the step started running

None
end_time Optional[Union[datetime, str]]

Use to filter by the time when the step finished running

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
pipeline_run_id Optional[Union[str, UUID]]

The id of the pipeline run to filter by.

None
deployment_id Optional[Union[str, UUID]]

The id of the deployment to filter by.

None
original_step_run_id Optional[Union[str, UUID]]

The id of the original step run to filter by.

None
model_version_id Optional[Union[str, UUID]]

The ID of the model version to filter by.

None
model Optional[Union[UUID, str]]

Filter by model name/ID.

None
name Optional[str]

The name of the step run to filter by.

None
cache_key Optional[str]

The cache key of the step run to filter by.

None
code_hash Optional[str]

The code hash of the step run to filter by.

None
status Optional[str]

The status of the step run to filter by.

None
run_metadata Optional[List[str]]

Filter by run metadata.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[StepRunResponse]

A page of step runs matching the filter description

Source code in src/zenml/client.py
def list_run_steps(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    cache_key: Optional[str] = None,
    code_hash: Optional[str] = None,
    status: Optional[str] = None,
    start_time: Optional[Union[datetime, str]] = None,
    end_time: Optional[Union[datetime, str]] = None,
    pipeline_run_id: Optional[Union[str, UUID]] = None,
    deployment_id: Optional[Union[str, UUID]] = None,
    original_step_run_id: Optional[Union[str, UUID]] = None,
    project: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    model_version_id: Optional[Union[str, UUID]] = None,
    model: Optional[Union[UUID, str]] = None,
    run_metadata: Optional[List[str]] = None,
    hydrate: bool = False,
) -> Page[StepRunResponse]:
    """List all pipelines.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of runs to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        start_time: Use to filter by the time when the step started running
        end_time: Use to filter by the time when the step finished running
        project: The project name/ID to filter by.
        user: Filter by user name/ID.
        pipeline_run_id: The id of the pipeline run to filter by.
        deployment_id: The id of the deployment to filter by.
        original_step_run_id: The id of the original step run to filter by.
        model_version_id: The ID of the model version to filter by.
        model: Filter by model name/ID.
        name: The name of the step run to filter by.
        cache_key: The cache key of the step run to filter by.
        code_hash: The code hash of the step run to filter by.
        status: The status of the step run to filter by.
        run_metadata: Filter by run metadata.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of step runs matching the filter description
    """
    step_run_filter_model = StepRunFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        cache_key=cache_key,
        code_hash=code_hash,
        pipeline_run_id=pipeline_run_id,
        deployment_id=deployment_id,
        original_step_run_id=original_step_run_id,
        status=status,
        created=created,
        updated=updated,
        start_time=start_time,
        end_time=end_time,
        name=name,
        project=project or self.active_project.id,
        user=user,
        model_version_id=model_version_id,
        model=model,
        run_metadata=run_metadata,
    )
    return self.zen_store.list_run_steps(
        step_run_filter_model=step_run_filter_model,
        hydrate=hydrate,
    )
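
A small illustrative sketch, assuming a known pipeline run ID (the UUID below is a placeholder, and "failed" is an example status value):

```python
from uuid import UUID
from zenml.client import Client

# List the failed step runs of one pipeline run.
failed_steps = Client().list_run_steps(
    pipeline_run_id=UUID("12345678-1234-5678-1234-567812345678"),
    status="failed",
)
```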

list_run_templates(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, created=None, updated=None, id=None, name=None, hidden=False, tag=None, project=None, pipeline_id=None, build_id=None, stack_id=None, code_repository_id=None, user=None, pipeline=None, stack=None, hydrate=False)

Get a page of run templates.

Parameters:

Name Type Description Default
sort_by str

The column to sort by.

'created'
page int

The page of items.

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages.

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or].

AND
created Optional[Union[datetime, str]]

Filter by the creation date.

None
updated Optional[Union[datetime, str]]

Filter by the last updated date.

None
id Optional[Union[UUID, str]]

Filter by run template ID.

None
name Optional[str]

Filter by run template name.

None
hidden Optional[bool]

Filter by run template hidden status.

False
tag Optional[str]

Filter by run template tags.

None
project Optional[Union[str, UUID]]

Filter by project name/ID.

None
pipeline_id Optional[Union[str, UUID]]

Filter by pipeline ID.

None
build_id Optional[Union[str, UUID]]

Filter by build ID.

None
stack_id Optional[Union[str, UUID]]

Filter by stack ID.

None
code_repository_id Optional[Union[str, UUID]]

Filter by code repository ID.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
pipeline Optional[Union[UUID, str]]

Filter by pipeline name/ID.

None
stack Optional[Union[UUID, str]]

Filter by stack name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[RunTemplateResponse]

A page of run templates.

Source code in src/zenml/client.py
def list_run_templates(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    id: Optional[Union[UUID, str]] = None,
    name: Optional[str] = None,
    hidden: Optional[bool] = False,
    tag: Optional[str] = None,
    project: Optional[Union[str, UUID]] = None,
    pipeline_id: Optional[Union[str, UUID]] = None,
    build_id: Optional[Union[str, UUID]] = None,
    stack_id: Optional[Union[str, UUID]] = None,
    code_repository_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    pipeline: Optional[Union[UUID, str]] = None,
    stack: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[RunTemplateResponse]:
    """Get a page of run templates.

    Args:
        sort_by: The column to sort by.
        page: The page of items.
        size: The maximum size of all pages.
        logical_operator: Which logical operator to use [and, or].
        created: Filter by the creation date.
        updated: Filter by the last updated date.
        id: Filter by run template ID.
        name: Filter by run template name.
        hidden: Filter by run template hidden status.
        tag: Filter by run template tags.
        project: Filter by project name/ID.
        pipeline_id: Filter by pipeline ID.
        build_id: Filter by build ID.
        stack_id: Filter by stack ID.
        code_repository_id: Filter by code repository ID.
        user: Filter by user name/ID.
        pipeline: Filter by pipeline name/ID.
        stack: Filter by stack name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of run templates.
    """
    filter = RunTemplateFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        created=created,
        updated=updated,
        id=id,
        name=name,
        hidden=hidden,
        tag=tag,
        project=project or self.active_project.id,
        pipeline_id=pipeline_id,
        build_id=build_id,
        stack_id=stack_id,
        code_repository_id=code_repository_id,
        user=user,
        pipeline=pipeline,
        stack=stack,
    )

    return self.zen_store.list_run_templates(
        template_filter_model=filter, hydrate=hydrate
    )
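
For instance, one might list the visible run templates of a single pipeline; the pipeline name is illustrative:

```python
from zenml.client import Client

# Page through run templates for a given pipeline, excluding hidden ones.
templates = Client().list_run_templates(
    pipeline="training_pipeline",
    hidden=False,
)
```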

list_schedules(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, project=None, user=None, pipeline_id=None, orchestrator_id=None, active=None, cron_expression=None, start_time=None, end_time=None, interval_second=None, catchup=None, hydrate=False, run_once_start_time=None)

List schedules.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of schedules to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

The name of the schedule to filter by.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
pipeline_id Optional[Union[str, UUID]]

The id of the pipeline to filter by.

None
orchestrator_id Optional[Union[str, UUID]]

The id of the orchestrator to filter by.

None
active Optional[Union[str, bool]]

Use to filter by active status.

None
cron_expression Optional[str]

Use to filter by cron expression.

None
start_time Optional[Union[datetime, str]]

Use to filter by start time.

None
end_time Optional[Union[datetime, str]]

Use to filter by end time.

None
interval_second Optional[int]

Use to filter by interval second.

None
catchup Optional[Union[str, bool]]

Use to filter by catchup.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False
run_once_start_time Optional[Union[datetime, str]]

Use to filter by run once start time.

None

Returns:

Type Description
Page[ScheduleResponse]

A list of schedules.

Source code in src/zenml/client.py
def list_schedules(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    project: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    pipeline_id: Optional[Union[str, UUID]] = None,
    orchestrator_id: Optional[Union[str, UUID]] = None,
    active: Optional[Union[str, bool]] = None,
    cron_expression: Optional[str] = None,
    start_time: Optional[Union[datetime, str]] = None,
    end_time: Optional[Union[datetime, str]] = None,
    interval_second: Optional[int] = None,
    catchup: Optional[Union[str, bool]] = None,
    hydrate: bool = False,
    run_once_start_time: Optional[Union[datetime, str]] = None,
) -> Page[ScheduleResponse]:
    """List schedules.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of schedules to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: The name of the schedule to filter by.
        project: The project name/ID to filter by.
        user: Filter by user name/ID.
        pipeline_id: The id of the pipeline to filter by.
        orchestrator_id: The id of the orchestrator to filter by.
        active: Use to filter by active status.
        cron_expression: Use to filter by cron expression.
        start_time: Use to filter by start time.
        end_time: Use to filter by end time.
        interval_second: Use to filter by interval second.
        catchup: Use to filter by catchup.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        run_once_start_time: Use to filter by run once start time.

    Returns:
        A list of schedules.
    """
    schedule_filter_model = ScheduleFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        name=name,
        project=project or self.active_project.id,
        user=user,
        pipeline_id=pipeline_id,
        orchestrator_id=orchestrator_id,
        active=active,
        cron_expression=cron_expression,
        start_time=start_time,
        end_time=end_time,
        interval_second=interval_second,
        catchup=catchup,
        run_once_start_time=run_once_start_time,
    )
    return self.zen_store.list_schedules(
        schedule_filter_model=schedule_filter_model,
        hydrate=hydrate,
    )
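
A brief sketch of filtering schedules by activity and cron expression (both filter values are illustrative):

```python
from zenml.client import Client

# List active schedules that run at the top of every hour.
schedules = Client().list_schedules(
    active=True,
    cron_expression="0 * * * *",
)
```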

list_secrets(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, private=None, user=None, hydrate=False)

Fetches all the secret models.

The returned secrets do not contain the secret values. To get the secret values, use get_secret individually for each secret.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of secrets to filter by.

None
created Optional[datetime]

Use to filter secrets by time of creation

None
updated Optional[datetime]

Use the last updated date for filtering

None
name Optional[str]

The name of the secret to filter by.

None
private Optional[bool]

The private status of the secret to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[SecretResponse]

A list of all the secret models without the secret values.

Raises:

Type Description
NotImplementedError

If centralized secrets management is not enabled.

Source code in src/zenml/client.py
def list_secrets(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    private: Optional[bool] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[SecretResponse]:
    """Fetches all the secret models.

    The returned secrets do not contain the secret values. To get the
    secret values, use `get_secret` individually for each secret.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of secrets to filter by.
        created: Use to filter secrets by time of creation
        updated: Use the last updated date for filtering
        name: The name of the secret to filter by.
        private: The private status of the secret to filter by.
        user: Filter by user name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A list of all the secret models without the secret values.

    Raises:
        NotImplementedError: If centralized secrets management is not
            enabled.
    """
    secret_filter_model = SecretFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        user=user,
        name=name,
        private=private,
        id=id,
        created=created,
        updated=updated,
    )
    try:
        return self.zen_store.list_secrets(
            secret_filter_model=secret_filter_model,
            hydrate=hydrate,
        )
    except NotImplementedError:
        raise NotImplementedError(
            "centralized secrets management is not supported or explicitly "
            "disabled in the target ZenML deployment."
        )
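
The sketch below illustrates the pattern described above of listing secrets and then fetching their values individually; the name filter, the page's `items` attribute, and passing the secret ID to `get_secret` are assumptions for illustration:

```python
from zenml.client import Client

client = Client()

# Listed secrets carry no values; fetch each one via `get_secret` to read them.
for secret in client.list_secrets(name="aws_credentials").items:
    full_secret = client.get_secret(secret.id)
```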

list_secrets_by_private_status(private, hydrate=False)

Fetches the list of secrets with a given private status.

The returned secrets do not contain the secret values. To get the secret values, use get_secret individually for each secret.

Parameters:

Name Type Description Default
private bool

The private status of the secrets to search for.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[SecretResponse]

The list of secrets in the given scope without the secret values.

Source code in src/zenml/client.py
def list_secrets_by_private_status(
    self,
    private: bool,
    hydrate: bool = False,
) -> Page[SecretResponse]:
    """Fetches the list of secrets with a given private status.

    The returned secrets do not contain the secret values. To get the
    secret values, use `get_secret` individually for each secret.

    Args:
        private: The private status of the secrets to search for.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The list of secrets in the given scope without the secret values.
    """
    logger.debug(f"Fetching the secrets with private status '{private}'.")

    return self.list_secrets(private=private, hydrate=hydrate)
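
A one-line usage sketch:

```python
from zenml.client import Client

# Fetch only the secrets that are private to the active user.
private_secrets = Client().list_secrets_by_private_status(private=True)
```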

list_service_accounts(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, description=None, active=None, hydrate=False)

List all service accounts.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of service accounts to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

Use the service account name for filtering

None
description Optional[str]

Use the service account description for filtering

None
active Optional[bool]

Use the service account active status for filtering

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[ServiceAccountResponse]

The list of service accounts matching the filter description.

Source code in src/zenml/client.py
def list_service_accounts(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    description: Optional[str] = None,
    active: Optional[bool] = None,
    hydrate: bool = False,
) -> Page[ServiceAccountResponse]:
    """List all service accounts.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of service accounts to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: Use the service account name for filtering
        description: Use the service account description for filtering
        active: Use the service account active status for filtering
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The list of service accounts matching the filter description.
    """
    return self.zen_store.list_service_accounts(
        ServiceAccountFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            description=description,
            active=active,
        ),
        hydrate=hydrate,
    )
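
An illustrative call that narrows the listing to active accounts:

```python
from zenml.client import Client

# List only active service accounts, sorted by name.
accounts = Client().list_service_accounts(active=True, sort_by="name")
```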

list_service_connector_resources(connector_type=None, resource_type=None, resource_id=None)

List resources that can be accessed by service connectors.

Parameters:

Name Type Description Default
connector_type Optional[str]

The type of service connector to filter by.

None
resource_type Optional[str]

The type of resource to filter by.

None
resource_id Optional[str]

The ID of a particular resource instance to filter by.

None

Returns:

Type Description
List[ServiceConnectorResourcesModel]

The matching list of resources that available service connectors have access to.

Source code in src/zenml/client.py
def list_service_connector_resources(
    self,
    connector_type: Optional[str] = None,
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
) -> List[ServiceConnectorResourcesModel]:
    """List resources that can be accessed by service connectors.

    Args:
        connector_type: The type of service connector to filter by.
        resource_type: The type of resource to filter by.
        resource_id: The ID of a particular resource instance to filter by.

    Returns:
        The matching list of resources that available service
        connectors have access to.
    """
    return self.zen_store.list_service_connector_resources(
        ServiceConnectorFilter(
            connector_type=connector_type,
            resource_type=resource_type,
            resource_id=resource_id,
        )
    )
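
For example, discovering the reachable resources for one connector and resource type (the type strings are illustrative):

```python
from zenml.client import Client

# Discover which S3 buckets the registered AWS connectors can reach.
resources = Client().list_service_connector_resources(
    connector_type="aws",
    resource_type="s3-bucket",
)
```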

list_service_connector_types(connector_type=None, resource_type=None, auth_method=None)

Get a list of service connector types.

Parameters:

Name Type Description Default
connector_type Optional[str]

Filter by connector type.

None
resource_type Optional[str]

Filter by resource type.

None
auth_method Optional[str]

Filter by authentication method.

None

Returns:

Type Description
List[ServiceConnectorTypeModel]

List of service connector types.

Source code in src/zenml/client.py
def list_service_connector_types(
    self,
    connector_type: Optional[str] = None,
    resource_type: Optional[str] = None,
    auth_method: Optional[str] = None,
) -> List[ServiceConnectorTypeModel]:
    """Get a list of service connector types.

    Args:
        connector_type: Filter by connector type.
        resource_type: Filter by resource type.
        auth_method: Filter by authentication method.

    Returns:
        List of service connector types.
    """
    return self.zen_store.list_service_connector_types(
        connector_type=connector_type,
        resource_type=resource_type,
        auth_method=auth_method,
    )
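
A short sketch, with an illustrative authentication method name:

```python
from zenml.client import Client

# Show which connector types support a particular authentication method.
connector_types = Client().list_service_connector_types(
    auth_method="service-account",
)
```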

list_service_connectors(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, connector_type=None, auth_method=None, resource_type=None, resource_id=None, user=None, labels=None, secret_id=None, hydrate=False)

Lists all registered service connectors.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

The id of the service connector to filter by.

None
created Optional[datetime]

Filter service connectors by time of creation

None
updated Optional[datetime]

Use the last updated date for filtering

None
connector_type Optional[str]

Use the service connector type for filtering

None
auth_method Optional[str]

Use the service connector auth method for filtering

None
resource_type Optional[str]

Filter service connectors by the resource type that they can give access to.

None
resource_id Optional[str]

Filter service connectors by the resource id that they can give access to.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
name Optional[str]

The name of the service connector to filter by.

None
labels Optional[Dict[str, Optional[str]]]

The labels of the service connector to filter by.

None
secret_id Optional[Union[str, UUID]]

Filter by the id of the secret that is referenced by the service connector.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[ServiceConnectorResponse]

A page of service connectors.

Source code in src/zenml/client.py
def list_service_connectors(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    connector_type: Optional[str] = None,
    auth_method: Optional[str] = None,
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
    user: Optional[Union[UUID, str]] = None,
    labels: Optional[Dict[str, Optional[str]]] = None,
    secret_id: Optional[Union[str, UUID]] = None,
    hydrate: bool = False,
) -> Page[ServiceConnectorResponse]:
    """Lists all registered service connectors.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: The id of the service connector to filter by.
        created: Filter service connectors by time of creation
        updated: Use the last updated date for filtering
        connector_type: Use the service connector type for filtering
        auth_method: Use the service connector auth method for filtering
        resource_type: Filter service connectors by the resource type that
            they can give access to.
        resource_id: Filter service connectors by the resource id that
            they can give access to.
        user: Filter by user name/ID.
        name: The name of the service connector to filter by.
        labels: The labels of the service connector to filter by.
        secret_id: Filter by the id of the secret that is referenced by the
            service connector.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of service connectors.
    """
    connector_filter_model = ServiceConnectorFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        user=user,
        name=name,
        connector_type=connector_type,
        auth_method=auth_method,
        resource_type=resource_type,
        resource_id=resource_id,
        id=id,
        created=created,
        updated=updated,
        labels=labels,
        secret_id=secret_id,
    )
    return self.zen_store.list_service_connectors(
        filter_model=connector_filter_model,
        hydrate=hydrate,
    )
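
An illustrative sketch filtering connectors by type and label (both values are placeholders):

```python
from zenml.client import Client

# List registered connectors of a given type that carry a specific label.
connectors = Client().list_service_connectors(
    connector_type="gcp",
    labels={"team": "ml-platform"},
)
```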

list_services(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, type=None, flavor=None, user=None, project=None, hydrate=False, running=None, service_name=None, pipeline_name=None, pipeline_run_id=None, pipeline_step_name=None, model_version_id=None, config=None)

List all services.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of services to filter by.

None
created Optional[datetime]

Use to filter by time of creation

None
updated Optional[datetime]

Use the last updated date for filtering

None
type Optional[str]

Use the service type for filtering

None
flavor Optional[str]

Use the service flavor for filtering

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False
running Optional[bool]

Use the running status for filtering

None
pipeline_name Optional[str]

Use the pipeline name for filtering

None
service_name Optional[str]

Use the service name or model name for filtering

None
pipeline_step_name Optional[str]

Use the pipeline step name for filtering

None
model_version_id Optional[Union[str, UUID]]

Use the model version id for filtering

None
config Optional[Dict[str, Any]]

Use the config for filtering

None
pipeline_run_id Optional[str]

Use the pipeline run id for filtering

None

Returns:

Type Description
Page[ServiceResponse]

The Service response page.

Source code in src/zenml/client.py
def list_services(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    type: Optional[str] = None,
    flavor: Optional[str] = None,
    user: Optional[Union[UUID, str]] = None,
    project: Optional[Union[str, UUID]] = None,
    hydrate: bool = False,
    running: Optional[bool] = None,
    service_name: Optional[str] = None,
    pipeline_name: Optional[str] = None,
    pipeline_run_id: Optional[str] = None,
    pipeline_step_name: Optional[str] = None,
    model_version_id: Optional[Union[str, UUID]] = None,
    config: Optional[Dict[str, Any]] = None,
) -> Page[ServiceResponse]:
    """List all services.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of services to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        type: Use the service type for filtering
        flavor: Use the service flavor for filtering
        project: The project name/ID to filter by.
        user: Filter by user name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        running: Use the running status for filtering
        pipeline_name: Use the pipeline name for filtering
        service_name: Use the service name or model name
            for filtering
        pipeline_step_name: Use the pipeline step name for filtering
        model_version_id: Use the model version id for filtering
        config: Use the config for filtering
        pipeline_run_id: Use the pipeline run id for filtering

    Returns:
        The Service response page.
    """
    service_filter_model = ServiceFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        type=type,
        flavor=flavor,
        project=project or self.active_project.id,
        user=user,
        running=running,
        name=service_name,
        pipeline_name=pipeline_name,
        pipeline_step_name=pipeline_step_name,
        model_version_id=model_version_id,
        pipeline_run_id=pipeline_run_id,
        config=dict_to_bytes(config) if config else None,
    )
    return self.zen_store.list_services(
        filter_model=service_filter_model, hydrate=hydrate
    )
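
For example, narrowing the listing to running services of one pipeline (the pipeline name is illustrative):

```python
from zenml.client import Client

# List the currently running services started by a specific pipeline.
services = Client().list_services(
    running=True,
    pipeline_name="inference_pipeline",
)
```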

list_stack_components(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, flavor=None, type=None, connector_id=None, stack_id=None, user=None, hydrate=False)

Lists all registered stack components.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of component to filter by.

None
created Optional[datetime]

Use to filter components by time of creation

None
updated Optional[datetime]

Use the last updated date for filtering

None
flavor Optional[str]

Use the component flavor for filtering

None
type Optional[str]

Use the component type for filtering

None
connector_id Optional[Union[str, UUID]]

The id of the connector to filter by.

None
stack_id Optional[Union[str, UUID]]

The id of the stack to filter by.

None
name Optional[str]

The name of the component to filter by.

None
user Optional[Union[UUID, str]]

The ID or name of the user to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[ComponentResponse]

A page of stack components.

Source code in src/zenml/client.py
def list_stack_components(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    flavor: Optional[str] = None,
    type: Optional[str] = None,
    connector_id: Optional[Union[str, UUID]] = None,
    stack_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[ComponentResponse]:
    """Lists all registered stack components.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of component to filter by.
        created: Use to filter components by time of creation
        updated: Use the last updated date for filtering
        flavor: Use the component flavor for filtering
        type: Use the component type for filtering
        connector_id: The id of the connector to filter by.
        stack_id: The id of the stack to filter by.
        name: The name of the component to filter by.
        user: The ID or name of the user to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of stack components.
    """
    component_filter_model = ComponentFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        connector_id=connector_id,
        stack_id=stack_id,
        name=name,
        flavor=flavor,
        type=type,
        id=id,
        created=created,
        updated=updated,
        user=user,
    )

    return self.zen_store.list_stack_components(
        component_filter_model=component_filter_model, hydrate=hydrate
    )
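
A brief sketch that lists components of one type and flavor (both strings are illustrative):

```python
from zenml.client import Client

# List all registered orchestrators of the "kubernetes" flavor.
orchestrators = Client().list_stack_components(
    type="orchestrator",
    flavor="kubernetes",
)
```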

list_stacks(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, description=None, component_id=None, user=None, component=None, hydrate=False)

Lists all stacks.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of stacks to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
description Optional[str]

Use the stack description for filtering

None
component_id Optional[Union[str, UUID]]

The id of the component to filter by.

None
user Optional[Union[UUID, str]]

The name/ID of the user to filter by.

None
component Optional[Union[UUID, str]]

The name/ID of the component to filter by.

None
name Optional[str]

The name of the stack to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[StackResponse]

A page of stacks.

Source code in src/zenml/client.py
def list_stacks(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    description: Optional[str] = None,
    component_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    component: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[StackResponse]:
    """Lists all stacks.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of stacks to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        description: Use the stack description for filtering
        component_id: The id of the component to filter by.
        user: The name/ID of the user to filter by.
        component: The name/ID of the component to filter by.
        name: The name of the stack to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of stacks.
    """
    stack_filter_model = StackFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        component_id=component_id,
        user=user,
        component=component,
        name=name,
        description=description,
        id=id,
        created=created,
        updated=updated,
    )
    return self.zen_store.list_stacks(stack_filter_model, hydrate=hydrate)
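
For instance, finding the stacks that contain a given component by name (the component name is illustrative):

```python
from zenml.client import Client

# Find all stacks that include the component named "my_artifact_store".
stacks = Client().list_stacks(component="my_artifact_store")
```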

list_tags(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, user=None, created=None, updated=None, name=None, color=None, exclusive=None, resource_type=None, hydrate=False)

Get tags by filter.

Parameters:

Name Type Description Default
sort_by str

The column to sort by.

'created'
page int

The page of items.

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or].

AND
id Optional[Union[UUID, str]]

Use the id of tags to filter by.

None
user Optional[Union[UUID, str]]

Use the user to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation.

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering.

None
name Optional[str]

The name of the tag.

None
color Optional[Union[str, ColorVariants]]

The color of the tag.

None
exclusive Optional[bool]

Flag indicating whether the tag is exclusive.

None
resource_type Optional[Union[str, TaggableResourceTypes]]

Filter tags associated with a specific resource type.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[TagResponse]

A page of all tags.

Source code in src/zenml/client.py
def list_tags(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    user: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    color: Optional[Union[str, ColorVariants]] = None,
    exclusive: Optional[bool] = None,
    resource_type: Optional[Union[str, TaggableResourceTypes]] = None,
    hydrate: bool = False,
) -> Page[TagResponse]:
    """Get tags by filter.

    Args:
        sort_by: The column to sort by.
        page: The page of items.
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or].
        id: Use the id of tags to filter by.
        user: Use the user to filter by.
        created: Use to filter by time of creation.
        updated: Use the last updated date for filtering.
        name: The name of the tag.
        color: The color of the tag.
        exclusive: Flag indicating whether the tag is exclusive.
        resource_type: Filter tags associated with a specific resource type.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of all tags.
    """
    return self.zen_store.list_tags(
        tag_filter_model=TagFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            user=user,
            created=created,
            updated=updated,
            name=name,
            color=color,
            exclusive=exclusive,
            resource_type=resource_type,
        ),
        hydrate=hydrate,
    )
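
An illustrative sketch filtering tags by exclusivity and resource type (the resource type string is a placeholder):

```python
from zenml.client import Client

# List exclusive tags that can be attached to pipeline runs.
tags = Client().list_tags(exclusive=True, resource_type="pipeline_run")
```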

list_trigger_executions(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, trigger_id=None, user=None, project=None, hydrate=False)

List all trigger executions matching the given filter criteria.

Parameters:

Name Type Description Default
sort_by str

The column to sort by.

'created'
page int

The page of items.

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages.

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or].

AND
trigger_id Optional[UUID]

ID of the trigger to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
project Optional[Union[UUID, str]]

Filter by project name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[TriggerExecutionResponse]

A list of all trigger executions matching the filter criteria.

Source code in src/zenml/client.py
def list_trigger_executions(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    trigger_id: Optional[UUID] = None,
    user: Optional[Union[UUID, str]] = None,
    project: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[TriggerExecutionResponse]:
    """List all trigger executions matching the given filter criteria.

    Args:
        sort_by: The column to sort by.
        page: The page of items.
        size: The maximum size of all pages.
        logical_operator: Which logical operator to use [and, or].
        trigger_id: ID of the trigger to filter by.
        user: Filter by user name/ID.
        project: Filter by project name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A list of all trigger executions matching the filter criteria.
    """
    filter_model = TriggerExecutionFilter(
        trigger_id=trigger_id,
        sort_by=sort_by,
        page=page,
        size=size,
        user=user,
        logical_operator=logical_operator,
        project=project or self.active_project.id,
    )
    return self.zen_store.list_trigger_executions(
        trigger_execution_filter_model=filter_model, hydrate=hydrate
    )
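
A minimal sketch, assuming a known trigger ID (the UUID is a placeholder):

```python
from uuid import UUID
from zenml.client import Client

# Inspect the executions recorded for one trigger.
executions = Client().list_trigger_executions(
    trigger_id=UUID("12345678-1234-5678-1234-567812345678"),
)
```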

list_triggers(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, event_source_id=None, action_id=None, event_source_flavor=None, event_source_subtype=None, action_flavor=None, action_subtype=None, project=None, user=None, hydrate=False)

Lists all triggers.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of triggers to filter by.

None
created Optional[datetime]

Use to filter by time of creation

None
updated Optional[datetime]

Use the last updated date for filtering

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
name Optional[str]

The name of the trigger to filter by.

None
event_source_id Optional[UUID]

The event source associated with the trigger.

None
action_id Optional[UUID]

The action associated with the trigger.

None
event_source_flavor Optional[str]

Flavor of the event source associated with the trigger.

None
event_source_subtype Optional[str]

Type of the event source associated with the trigger.

None
action_flavor Optional[str]

Flavor of the action associated with the trigger.

None
action_subtype Optional[str]

Type of the action associated with the trigger.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[TriggerResponse]

A page of triggers.

Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def list_triggers(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    event_source_id: Optional[UUID] = None,
    action_id: Optional[UUID] = None,
    event_source_flavor: Optional[str] = None,
    event_source_subtype: Optional[str] = None,
    action_flavor: Optional[str] = None,
    action_subtype: Optional[str] = None,
    project: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[TriggerResponse]:
    """Lists all triggers.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of triggers to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        project: The project name/ID to filter by.
        user: Filter by user name/ID.
        name: The name of the trigger to filter by.
        event_source_id: The event source associated with the trigger.
        action_id: The action associated with the trigger.
        event_source_flavor: Flavor of the event source associated with the
            trigger.
        event_source_subtype: Type of the event source associated with the
            trigger.
        action_flavor: Flavor of the action associated with the trigger.
        action_subtype: Type of the action associated with the trigger.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of triggers.
    """
    trigger_filter_model = TriggerFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        project=project or self.active_project.id,
        user=user,
        name=name,
        event_source_id=event_source_id,
        action_id=action_id,
        event_source_flavor=event_source_flavor,
        event_source_subtype=event_source_subtype,
        action_flavor=action_flavor,
        action_subtype=action_subtype,
        id=id,
        created=created,
        updated=updated,
    )
    return self.zen_store.list_triggers(
        trigger_filter_model, hydrate=hydrate
    )
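
Example usage, as a minimal sketch (the event source flavor value below is purely illustrative):

    from zenml.client import Client

    # List triggers for a hypothetical "github" event source flavor,
    # hydrated so that metadata fields are included in the response.
    triggers = Client().list_triggers(
        event_source_flavor="github",
        hydrate=True,
    )
    for trigger in triggers.items:
        print(trigger.name)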

list_users(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, external_user_id=None, created=None, updated=None, name=None, full_name=None, email=None, active=None, email_opted_in=None, hydrate=False)

List all users.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of users to filter by.

None
external_user_id Optional[str]

Use the external user id for filtering.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

Use the username for filtering

None
full_name Optional[str]

Use the user full name for filtering

None
email Optional[str]

Use the user email for filtering

None
active Optional[bool]

Use the user active status for filtering

None
email_opted_in Optional[bool]

Use the user opt in status for filtering

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[UserResponse]

A page of users.

Source code in src/zenml/client.py
810
811
812
813
814
815
816
817
818
819
820
821
822
823
824
825
826
827
828
829
830
831
832
833
834
835
836
837
838
839
840
841
842
843
844
845
846
847
848
849
850
851
852
853
854
855
856
857
858
859
860
861
862
863
864
865
866
def list_users(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    external_user_id: Optional[str] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    full_name: Optional[str] = None,
    email: Optional[str] = None,
    active: Optional[bool] = None,
    email_opted_in: Optional[bool] = None,
    hydrate: bool = False,
) -> Page[UserResponse]:
    """List all users.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of users to filter by.
        external_user_id: Use the external user id for filtering.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: Use the username for filtering
        full_name: Use the user full name for filtering
        email: Use the user email for filtering
        active: Use the user active status for filtering
        email_opted_in: Use the user opt in status for filtering
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of users.
    """
    return self.zen_store.list_users(
        UserFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            external_user_id=external_user_id,
            created=created,
            updated=updated,
            name=name,
            full_name=full_name,
            email=email,
            active=active,
            email_opted_in=email_opted_in,
        ),
        hydrate=hydrate,
    )
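
Example usage, as a minimal sketch:

    from zenml.client import Client

    # Page through active users who have opted in to email updates.
    users = Client().list_users(active=True, email_opted_in=True, size=20)
    for user in users.items:
        print(user.name, user.email)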

login_service_connector(name_id_or_prefix, resource_type=None, resource_id=None, **kwargs)

Use a service connector to authenticate a local client/SDK.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, id or prefix of the service connector to use.

required
resource_type Optional[str]

The type of the resource to connect to. If not provided, the resource type from the service connector configuration will be used.

None
resource_id Optional[str]

The ID of a particular resource instance to configure the local client to connect to. If the connector instance is already configured with a resource ID that is not the same or equivalent to the one requested, a ValueError exception is raised. May be omitted for connectors and resource types that do not support multiple resource instances.

None
kwargs Any

Additional implementation specific keyword arguments to use to configure the client.

{}

Returns:

Type Description
ServiceConnector

The service connector client instance that was used to configure the local client.

Source code in src/zenml/client.py
5931
5932
5933
5934
5935
5936
5937
5938
5939
5940
5941
5942
5943
5944
5945
5946
5947
5948
5949
5950
5951
5952
5953
5954
5955
5956
5957
5958
5959
5960
5961
5962
5963
5964
5965
5966
5967
5968
5969
5970
def login_service_connector(
    self,
    name_id_or_prefix: Union[UUID, str],
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
    **kwargs: Any,
) -> "ServiceConnector":
    """Use a service connector to authenticate a local client/SDK.

    Args:
        name_id_or_prefix: The name, id or prefix of the service connector
            to use.
        resource_type: The type of the resource to connect to. If not
            provided, the resource type from the service connector
            configuration will be used.
        resource_id: The ID of a particular resource instance to configure
            the local client to connect to. If the connector instance is
            already configured with a resource ID that is not the same or
            equivalent to the one requested, a `ValueError` exception is
            raised. May be omitted for connectors and resource types that do
            not support multiple resource instances.
        kwargs: Additional implementation specific keyword arguments to use
            to configure the client.

    Returns:
        The service connector client instance that was used to configure the
        local client.
    """
    connector_client = self.get_service_connector_client(
        name_id_or_prefix=name_id_or_prefix,
        resource_type=resource_type,
        resource_id=resource_id,
        verify=False,
    )

    connector_client.configure_local_client(
        **kwargs,
    )

    return connector_client
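
Example usage, as a minimal sketch (the connector name and resource type are placeholders):

    from zenml.client import Client

    # Configure the local client (e.g. a local cloud SDK/CLI setup) using
    # an existing service connector.
    connector = Client().login_service_connector(
        name_id_or_prefix="my-aws-connector",
        resource_type="s3-bucket",
    )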

prune_artifacts(only_versions=True, delete_from_artifact_store=False, project=None)

Delete all unused artifacts and artifact versions.

Parameters:

Name Type Description Default
only_versions bool

Only delete artifact versions, keeping artifacts

True
delete_from_artifact_store bool

Whether to also delete the artifact data from the artifact store

False
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None
Source code in src/zenml/client.py
4254
4255
4256
4257
4258
4259
4260
4261
4262
4263
4264
4265
4266
4267
4268
4269
4270
4271
4272
4273
4274
4275
4276
4277
4278
4279
4280
4281
4282
4283
def prune_artifacts(
    self,
    only_versions: bool = True,
    delete_from_artifact_store: bool = False,
    project: Optional[Union[str, UUID]] = None,
) -> None:
    """Delete all unused artifacts and artifact versions.

    Args:
        only_versions: Only delete artifact versions, keeping artifacts
        delete_from_artifact_store: Whether to also delete the artifact
            data from the artifact store
        project: The project name/ID to filter by.
    """
    if delete_from_artifact_store:
        unused_artifact_versions = depaginate(
            self.list_artifact_versions,
            only_unused=True,
            project=project,
        )
        for unused_artifact_version in unused_artifact_versions:
            self._delete_artifact_from_artifact_store(
                unused_artifact_version
            )

    project = project or self.active_project.id

    self.zen_store.prune_artifact_versions(
        project_name_or_id=project, only_versions=only_versions
    )
    logger.info("All unused artifacts and artifact versions deleted.")
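
Example usage, as a minimal sketch:

    from zenml.client import Client

    # Delete unused artifact versions and also remove their underlying
    # data from the artifact store for the active project.
    Client().prune_artifacts(
        only_versions=True,
        delete_from_artifact_store=True,
    )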

restore_secrets(ignore_errors=False, delete_secrets=False)

Restore all secrets from the configured backup secrets store.

Parameters:

Name Type Description Default
ignore_errors bool

Whether to ignore individual errors during the restore process and attempt to restore all secrets.

False
delete_secrets bool

Whether to delete the secrets that have been successfully restored from the backup secrets store. Setting this flag effectively moves all secrets from the backup secrets store to the primary secrets store.

False
Source code in src/zenml/client.py
5051
5052
5053
5054
5055
5056
5057
5058
5059
5060
5061
5062
5063
5064
5065
5066
5067
5068
def restore_secrets(
    self,
    ignore_errors: bool = False,
    delete_secrets: bool = False,
) -> None:
    """Restore all secrets from the configured backup secrets store.

    Args:
        ignore_errors: Whether to ignore individual errors during the
            restore process and attempt to restore all secrets.
        delete_secrets: Whether to delete the secrets that have been
            successfully restored from the backup secrets store. Setting
            this flag effectively moves all secrets from the backup secrets
            store to the primary secrets store.
    """
    self.zen_store.restore_secrets(
        ignore_errors=ignore_errors, delete_secrets=delete_secrets
    )
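
Example usage, as a minimal sketch:

    from zenml.client import Client

    # Restore all secrets from the backup secrets store, skipping
    # individual failures, and delete the restored copies from the backup
    # (effectively moving them to the primary secrets store).
    Client().restore_secrets(ignore_errors=True, delete_secrets=True)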

rotate_api_key(service_account_name_id_or_prefix, name_id_or_prefix, retain_period_minutes=0, set_key=False)

Rotate an API key.

Parameters:

Name Type Description Default
service_account_name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the service account to rotate the API key for.

required
name_id_or_prefix Union[UUID, str]

Name, ID or prefix of the API key to update.

required
retain_period_minutes int

The number of minutes to retain the old API key for. If set to 0, the old API key will be invalidated.

0
set_key bool

Whether to set the rotated API key as the active API key.

False

Returns:

Type Description
APIKeyResponse

The updated API key.

Source code in src/zenml/client.py
7622
7623
7624
7625
7626
7627
7628
7629
7630
7631
7632
7633
7634
7635
7636
7637
7638
7639
7640
7641
7642
7643
7644
7645
7646
7647
7648
7649
7650
7651
7652
7653
7654
7655
7656
7657
7658
7659
def rotate_api_key(
    self,
    service_account_name_id_or_prefix: Union[str, UUID],
    name_id_or_prefix: Union[UUID, str],
    retain_period_minutes: int = 0,
    set_key: bool = False,
) -> APIKeyResponse:
    """Rotate an API key.

    Args:
        service_account_name_id_or_prefix: The name, ID or prefix of the
            service account to rotate the API key for.
        name_id_or_prefix: Name, ID or prefix of the API key to update.
        retain_period_minutes: The number of minutes to retain the old API
            key for. If set to 0, the old API key will be invalidated.
        set_key: Whether to set the rotated API key as the active API key.

    Returns:
        The updated API key.
    """
    api_key = self.get_api_key(
        service_account_name_id_or_prefix=service_account_name_id_or_prefix,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
    )
    rotate_request = APIKeyRotateRequest(
        retain_period_minutes=retain_period_minutes
    )
    new_key = self.zen_store.rotate_api_key(
        service_account_id=api_key.service_account.id,
        api_key_name_or_id=api_key.id,
        rotate_request=rotate_request,
    )
    assert new_key.key is not None
    if set_key:
        self.set_api_key(key=new_key.key)

    return new_key
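
Example usage, as a minimal sketch (the service account and key names are placeholders):

    from zenml.client import Client

    # Rotate an API key, keep the old key valid for another hour and
    # switch the local client to the new key right away.
    new_key = Client().rotate_api_key(
        service_account_name_id_or_prefix="ci-bot",
        name_id_or_prefix="deploy-key",
        retain_period_minutes=60,
        set_key=True,
    )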

set_active_project(project_name_or_id)

Set the project for the local client.

Parameters:

Name Type Description Default
project_name_or_id Union[str, UUID]

The name or ID of the project to set active.

required

Returns:

Type Description
ProjectResponse

The model of the active project.

Source code in src/zenml/client.py
682
683
684
685
686
687
688
689
690
691
692
693
694
695
696
697
698
699
700
701
702
703
704
705
def set_active_project(
    self, project_name_or_id: Union[str, UUID]
) -> "ProjectResponse":
    """Set the project for the local client.

    Args:
        project_name_or_id: The name or ID of the project to set active.

    Returns:
        The model of the active project.
    """
    project = self.zen_store.get_project(
        project_name_or_id=project_name_or_id
    )  # raises KeyError
    if self._config:
        self._config.set_active_project(project)
        # Sanitize the client configuration to reflect the current
        # settings
        self._sanitize_config()
    else:
        # set the active project globally only if the client doesn't use
        # a local configuration
        GlobalConfiguration().set_active_project(project)
    return project
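
Example usage, as a minimal sketch (the project name is a placeholder):

    from zenml.client import Client

    # Make "default" the active project for subsequent client calls.
    project = Client().set_active_project("default")
    print(project.id)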

set_api_key(key)

Configure the client with an API key.

Parameters:

Name Type Description Default
key str

The API key to use.

required

Raises:

Type Description
NotImplementedError

If the client is not connected to a ZenML server.

Source code in src/zenml/client.py
7442
7443
7444
7445
7446
7447
7448
7449
7450
7451
7452
7453
7454
7455
7456
7457
7458
7459
7460
7461
7462
7463
7464
7465
7466
7467
7468
7469
def set_api_key(self, key: str) -> None:
    """Configure the client with an API key.

    Args:
        key: The API key to use.

    Raises:
        NotImplementedError: If the client is not connected to a ZenML
            server.
    """
    from zenml.login.credentials_store import get_credentials_store
    from zenml.zen_stores.rest_zen_store import RestZenStore

    zen_store = self.zen_store
    if not zen_store.TYPE == StoreType.REST:
        raise NotImplementedError(
            "API key configuration is only supported if connected to a "
            "ZenML server."
        )

    credentials_store = get_credentials_store()
    assert isinstance(zen_store, RestZenStore)

    credentials_store.set_api_key(server_url=zen_store.url, api_key=key)

    # Force a re-authentication to start using the new API key
    # right away.
    zen_store.authenticate(force=True)
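
Example usage, as a minimal sketch (the environment variable name is a placeholder; this only works when the client is connected to a ZenML server):

    import os

    from zenml.client import Client

    # Configure the client with an API key read from the environment.
    Client().set_api_key(key=os.environ["ZENML_API_KEY"])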

trigger_pipeline(pipeline_name_or_id=None, run_configuration=None, config_path=None, template_id=None, stack_name_or_id=None, synchronous=False, project=None)

Trigger a pipeline from the server.

Usage examples:

* Run the latest runnable template for a pipeline:

  Client().trigger_pipeline(pipeline_name_or_id=<NAME>)

* Run the latest runnable template for a pipeline on a specific stack:

  Client().trigger_pipeline(
      pipeline_name_or_id=<NAME>,
      stack_name_or_id=<STACK_NAME_OR_ID>
  )

* Run a specific template:

  Client().trigger_pipeline(template_id=<ID>)

Parameters:

Name Type Description Default
pipeline_name_or_id Union[str, UUID, None]

Name or ID of the pipeline. If this is specified, the latest runnable template for this pipeline will be used for the run (Runnable here means that the build associated with the template is for a remote stack without any custom flavor stack components). If not given, a template ID that should be run needs to be specified.

None
run_configuration Union[PipelineRunConfiguration, Dict[str, Any], None]

Configuration for the run. Either this or a path to a config file can be specified.

None
config_path Optional[str]

Path to a YAML configuration file. This file will be parsed as a PipelineRunConfiguration object. Either this or the configuration in code can be specified.

None
template_id Optional[UUID]

ID of the template to run. Either this or a pipeline can be specified.

None
stack_name_or_id Union[str, UUID, None]

Name or ID of the stack on which to run the pipeline. If not specified, this method will try to find a runnable template on any stack.

None
synchronous bool

If True, this method will wait until the triggered run is finished.

False
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Raises:

Type Description
RuntimeError

If triggering the pipeline failed.

Returns:

Type Description
PipelineRunResponse

Model of the pipeline run.

Source code in src/zenml/client.py
2442
2443
2444
2445
2446
2447
2448
2449
2450
2451
2452
2453
2454
2455
2456
2457
2458
2459
2460
2461
2462
2463
2464
2465
2466
2467
2468
2469
2470
2471
2472
2473
2474
2475
2476
2477
2478
2479
2480
2481
2482
2483
2484
2485
2486
2487
2488
2489
2490
2491
2492
2493
2494
2495
2496
2497
2498
2499
2500
2501
2502
2503
2504
2505
2506
2507
2508
2509
2510
2511
2512
2513
2514
2515
2516
2517
2518
2519
2520
2521
2522
2523
2524
2525
2526
2527
2528
2529
2530
2531
2532
2533
2534
2535
2536
2537
2538
2539
2540
2541
2542
2543
2544
2545
2546
2547
2548
2549
2550
2551
2552
2553
2554
2555
2556
2557
2558
2559
2560
2561
2562
2563
2564
2565
2566
2567
2568
2569
2570
2571
2572
2573
2574
2575
2576
2577
2578
2579
2580
2581
2582
2583
2584
2585
2586
2587
2588
2589
@_fail_for_sql_zen_store
def trigger_pipeline(
    self,
    pipeline_name_or_id: Union[str, UUID, None] = None,
    run_configuration: Union[
        PipelineRunConfiguration, Dict[str, Any], None
    ] = None,
    config_path: Optional[str] = None,
    template_id: Optional[UUID] = None,
    stack_name_or_id: Union[str, UUID, None] = None,
    synchronous: bool = False,
    project: Optional[Union[str, UUID]] = None,
) -> PipelineRunResponse:
    """Trigger a pipeline from the server.

    Usage examples:
    * Run the latest runnable template for a pipeline:
    ```python
    Client().trigger_pipeline(pipeline_name_or_id=<NAME>)
    ```
    * Run the latest runnable template for a pipeline on a specific stack:
    ```python
    Client().trigger_pipeline(
        pipeline_name_or_id=<NAME>,
        stack_name_or_id=<STACK_NAME_OR_ID>
    )
    ```
    * Run a specific template:
    ```python
    Client().trigger_pipeline(template_id=<ID>)
    ```

    Args:
        pipeline_name_or_id: Name or ID of the pipeline. If this is
            specified, the latest runnable template for this pipeline will
            be used for the run (Runnable here means that the build
            associated with the template is for a remote stack without any
            custom flavor stack components). If not given, a template ID
            that should be run needs to be specified.
        run_configuration: Configuration for the run. Either this or a
            path to a config file can be specified.
        config_path: Path to a YAML configuration file. This file will be
            parsed as a `PipelineRunConfiguration` object. Either this or
            the configuration in code can be specified.
        template_id: ID of the template to run. Either this or a pipeline
            can be specified.
        stack_name_or_id: Name or ID of the stack on which to run the
            pipeline. If not specified, this method will try to find a
            runnable template on any stack.
        synchronous: If `True`, this method will wait until the triggered
            run is finished.
        project: The project name/ID to filter by.

    Raises:
        RuntimeError: If triggering the pipeline failed.

    Returns:
        Model of the pipeline run.
    """
    from zenml.pipelines.run_utils import (
        validate_run_config_is_runnable_from_server,
        validate_stack_is_runnable_from_server,
        wait_for_pipeline_run_to_finish,
    )

    if Counter([template_id, pipeline_name_or_id])[None] != 1:
        raise RuntimeError(
            "You need to specify exactly one of pipeline or template "
            "to trigger."
        )

    if run_configuration and config_path:
        raise RuntimeError(
            "Only config path or runtime configuration can be specified."
        )

    if config_path:
        run_configuration = PipelineRunConfiguration.from_yaml(config_path)

    if isinstance(run_configuration, Dict):
        run_configuration = PipelineRunConfiguration.model_validate(
            run_configuration
        )

    if run_configuration:
        validate_run_config_is_runnable_from_server(run_configuration)

    if template_id:
        if stack_name_or_id:
            logger.warning(
                "Template ID and stack specified, ignoring the stack and "
                "using stack associated with the template instead."
            )

        run = self.zen_store.run_template(
            template_id=template_id,
            run_configuration=run_configuration,
        )
    else:
        assert pipeline_name_or_id
        pipeline = self.get_pipeline(name_id_or_prefix=pipeline_name_or_id)

        stack = None
        if stack_name_or_id:
            stack = self.get_stack(
                stack_name_or_id, allow_name_prefix_match=False
            )
            validate_stack_is_runnable_from_server(
                zen_store=self.zen_store, stack=stack
            )

        templates = depaginate(
            self.list_run_templates,
            pipeline_id=pipeline.id,
            stack_id=stack.id if stack else None,
            project=project or pipeline.project_id,
        )

        for template in templates:
            if not template.build:
                continue

            stack = template.build.stack
            if not stack:
                continue

            try:
                validate_stack_is_runnable_from_server(
                    zen_store=self.zen_store, stack=stack
                )
            except ValueError:
                continue

            run = self.zen_store.run_template(
                template_id=template.id,
                run_configuration=run_configuration,
            )
            break
        else:
            raise RuntimeError(
                "Unable to find a runnable template for the given stack "
                "and pipeline."
            )

    if synchronous:
        run = wait_for_pipeline_run_to_finish(run_id=run.id)

    return run
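
Example usage, as a minimal sketch (the pipeline name and run configuration values are placeholders):

    from zenml.client import Client

    # Trigger the latest runnable template for a pipeline with a custom
    # run name and block until the run has finished.
    run = Client().trigger_pipeline(
        pipeline_name_or_id="training_pipeline",
        run_configuration={"run_name": "manual_trigger"},
        synchronous=True,
    )
    print(run.status)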

update_action(name_id_or_prefix, name=None, description=None, configuration=None, service_account_id=None, auth_window=None, project=None)

Update an action.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, id or prefix of the action to update.

required
name Optional[str]

The new name of the action.

None
description Optional[str]

The new description of the action.

None
configuration Optional[Dict[str, Any]]

The new configuration of the action.

None
service_account_id Optional[UUID]

The new service account that is used to execute the action.

None
auth_window Optional[int]

The new time window in minutes for which the service account is authorized to execute the action. Set this to 0 to authorize the service account indefinitely (not recommended).

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Returns:

Type Description
ActionResponse

The updated action.

Source code in src/zenml/client.py
3074
3075
3076
3077
3078
3079
3080
3081
3082
3083
3084
3085
3086
3087
3088
3089
3090
3091
3092
3093
3094
3095
3096
3097
3098
3099
3100
3101
3102
3103
3104
3105
3106
3107
3108
3109
3110
3111
3112
3113
3114
3115
3116
3117
3118
3119
@_fail_for_sql_zen_store
def update_action(
    self,
    name_id_or_prefix: Union[UUID, str],
    name: Optional[str] = None,
    description: Optional[str] = None,
    configuration: Optional[Dict[str, Any]] = None,
    service_account_id: Optional[UUID] = None,
    auth_window: Optional[int] = None,
    project: Optional[Union[str, UUID]] = None,
) -> ActionResponse:
    """Update an action.

    Args:
        name_id_or_prefix: The name, id or prefix of the action to update.
        name: The new name of the action.
        description: The new description of the action.
        configuration: The new configuration of the action.
        service_account_id: The new service account that is used to execute
            the action.
        auth_window: The new time window in minutes for which the service
            account is authorized to execute the action. Set this to 0 to
            authorize the service account indefinitely (not recommended).
        project: The project name/ID to filter by.

    Returns:
        The updated action.
    """
    action = self.get_action(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
        project=project,
    )

    update_model = ActionUpdate(
        name=name,
        description=description,
        configuration=configuration,
        service_account_id=service_account_id,
        auth_window=auth_window,
    )

    return self.zen_store.update_action(
        action_id=action.id,
        action_update=update_model,
    )
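
Example usage, as a minimal sketch (the action name and values are placeholders):

    from zenml.client import Client

    # Rename an action and shorten the service account's auth window
    # to 30 minutes.
    action = Client().update_action(
        name_id_or_prefix="retrain-on-drift",
        name="retrain-on-data-drift",
        auth_window=30,
    )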

update_api_key(service_account_name_id_or_prefix, name_id_or_prefix, name=None, description=None, active=None)

Update an API key.

Parameters:

Name Type Description Default
service_account_name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the service account to update the API key for.

required
name_id_or_prefix Union[UUID, str]

Name, ID or prefix of the API key to update.

required
name Optional[str]

New name of the API key.

None
description Optional[str]

New description of the API key.

None
active Optional[bool]

Whether the API key is active or not.

None

Returns:

Type Description
APIKeyResponse

The updated API key.

Source code in src/zenml/client.py
7587
7588
7589
7590
7591
7592
7593
7594
7595
7596
7597
7598
7599
7600
7601
7602
7603
7604
7605
7606
7607
7608
7609
7610
7611
7612
7613
7614
7615
7616
7617
7618
7619
7620
def update_api_key(
    self,
    service_account_name_id_or_prefix: Union[str, UUID],
    name_id_or_prefix: Union[UUID, str],
    name: Optional[str] = None,
    description: Optional[str] = None,
    active: Optional[bool] = None,
) -> APIKeyResponse:
    """Update an API key.

    Args:
        service_account_name_id_or_prefix: The name, ID or prefix of the
            service account to update the API key for.
        name_id_or_prefix: Name, ID or prefix of the API key to update.
        name: New name of the API key.
        description: New description of the API key.
        active: Whether the API key is active or not.

    Returns:
        The updated API key.
    """
    api_key = self.get_api_key(
        service_account_name_id_or_prefix=service_account_name_id_or_prefix,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
    )
    update = APIKeyUpdate(
        name=name, description=description, active=active
    )
    return self.zen_store.update_api_key(
        service_account_id=api_key.service_account.id,
        api_key_name_or_id=api_key.id,
        api_key_update=update,
    )

update_artifact(name_id_or_prefix, new_name=None, add_tags=None, remove_tags=None, has_custom_name=None, project=None)

Update an artifact.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the artifact to update.

required
new_name Optional[str]

The new name of the artifact.

None
add_tags Optional[List[str]]

Tags to add to the artifact.

None
remove_tags Optional[List[str]]

Tags to remove from the artifact.

None
has_custom_name Optional[bool]

Whether the artifact has a custom name.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Returns:

Type Description
ArtifactResponse

The updated artifact.

Source code in src/zenml/client.py
4200
4201
4202
4203
4204
4205
4206
4207
4208
4209
4210
4211
4212
4213
4214
4215
4216
4217
4218
4219
4220
4221
4222
4223
4224
4225
4226
4227
4228
4229
4230
4231
4232
4233
4234
def update_artifact(
    self,
    name_id_or_prefix: Union[str, UUID],
    new_name: Optional[str] = None,
    add_tags: Optional[List[str]] = None,
    remove_tags: Optional[List[str]] = None,
    has_custom_name: Optional[bool] = None,
    project: Optional[Union[str, UUID]] = None,
) -> ArtifactResponse:
    """Update an artifact.

    Args:
        name_id_or_prefix: The name, ID or prefix of the artifact to update.
        new_name: The new name of the artifact.
        add_tags: Tags to add to the artifact.
        remove_tags: Tags to remove from the artifact.
        has_custom_name: Whether the artifact has a custom name.
        project: The project name/ID to filter by.

    Returns:
        The updated artifact.
    """
    artifact = self.get_artifact(
        name_id_or_prefix=name_id_or_prefix,
        project=project,
    )
    artifact_update = ArtifactUpdate(
        name=new_name,
        add_tags=add_tags,
        remove_tags=remove_tags,
        has_custom_name=has_custom_name,
    )
    return self.zen_store.update_artifact(
        artifact_id=artifact.id, artifact_update=artifact_update
    )

update_artifact_version(name_id_or_prefix, version=None, add_tags=None, remove_tags=None, project=None)

Update an artifact version.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the artifact to update.

required
version Optional[str]

The version of the artifact to update. Only used if name_id_or_prefix is the name of the artifact. If not specified, the latest version is updated.

None
add_tags Optional[List[str]]

Tags to add to the artifact version.

None
remove_tags Optional[List[str]]

Tags to remove from the artifact version.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Returns:

Type Description
ArtifactVersionResponse

The updated artifact version.

Source code in src/zenml/client.py
4442
4443
4444
4445
4446
4447
4448
4449
4450
4451
4452
4453
4454
4455
4456
4457
4458
4459
4460
4461
4462
4463
4464
4465
4466
4467
4468
4469
4470
4471
4472
4473
4474
4475
def update_artifact_version(
    self,
    name_id_or_prefix: Union[str, UUID],
    version: Optional[str] = None,
    add_tags: Optional[List[str]] = None,
    remove_tags: Optional[List[str]] = None,
    project: Optional[Union[str, UUID]] = None,
) -> ArtifactVersionResponse:
    """Update an artifact version.

    Args:
        name_id_or_prefix: The name, ID or prefix of the artifact to update.
        version: The version of the artifact to update. Only used if
            `name_id_or_prefix` is the name of the artifact. If not
            specified, the latest version is updated.
        add_tags: Tags to add to the artifact version.
        remove_tags: Tags to remove from the artifact version.
        project: The project name/ID to filter by.

    Returns:
        The updated artifact version.
    """
    artifact_version = self.get_artifact_version(
        name_id_or_prefix=name_id_or_prefix,
        version=version,
        project=project,
    )
    artifact_version_update = ArtifactVersionUpdate(
        add_tags=add_tags, remove_tags=remove_tags
    )
    return self.zen_store.update_artifact_version(
        artifact_version_id=artifact_version.id,
        artifact_version_update=artifact_version_update,
    )
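
Example usage, as a minimal sketch (the artifact name and tag values are placeholders):

    from zenml.client import Client

    # Re-tag the latest version of the "dataset" artifact.
    version = Client().update_artifact_version(
        name_id_or_prefix="dataset",
        add_tags=["validated"],
        remove_tags=["draft"],
    )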

update_authorized_device(id_or_prefix, locked=None)

Update an authorized device.

Parameters:

Name Type Description Default
id_or_prefix Union[UUID, str]

The ID or ID prefix of the authorized device.

required
locked Optional[bool]

Whether to lock or unlock the authorized device.

None

Returns:

Type Description
OAuthDeviceResponse

The updated authorized device.

Source code in src/zenml/client.py
6906
6907
6908
6909
6910
6911
6912
6913
6914
6915
6916
6917
6918
6919
6920
6921
6922
6923
6924
6925
6926
6927
6928
def update_authorized_device(
    self,
    id_or_prefix: Union[UUID, str],
    locked: Optional[bool] = None,
) -> OAuthDeviceResponse:
    """Update an authorized device.

    Args:
        id_or_prefix: The ID or ID prefix of the authorized device.
        locked: Whether to lock or unlock the authorized device.

    Returns:
        The updated authorized device.
    """
    device = self.get_authorized_device(
        id_or_prefix=id_or_prefix, allow_id_prefix_match=False
    )
    return self.zen_store.update_authorized_device(
        device_id=device.id,
        update=OAuthDeviceUpdate(
            locked=locked,
        ),
    )

update_code_repository(name_id_or_prefix, name=None, description=None, logo_url=None, config=None, project=None)

Update a code repository.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

Name, ID or prefix of the code repository to update.

required
name Optional[str]

New name of the code repository.

None
description Optional[str]

New description of the code repository.

None
logo_url Optional[str]

New logo URL of the code repository.

None
config Optional[Dict[str, Any]]

New configuration options for the code repository. Will be used to update the existing configuration values. To remove values from the existing configuration, set the value for that key to None.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Returns:

Type Description
CodeRepositoryResponse

The updated code repository.

Source code in src/zenml/client.py
5210
5211
5212
5213
5214
5215
5216
5217
5218
5219
5220
5221
5222
5223
5224
5225
5226
5227
5228
5229
5230
5231
5232
5233
5234
5235
5236
5237
5238
5239
5240
5241
5242
5243
5244
5245
5246
5247
5248
5249
5250
5251
5252
5253
5254
5255
5256
5257
5258
def update_code_repository(
    self,
    name_id_or_prefix: Union[UUID, str],
    name: Optional[str] = None,
    description: Optional[str] = None,
    logo_url: Optional[str] = None,
    config: Optional[Dict[str, Any]] = None,
    project: Optional[Union[str, UUID]] = None,
) -> CodeRepositoryResponse:
    """Update a code repository.

    Args:
        name_id_or_prefix: Name, ID or prefix of the code repository to
            update.
        name: New name of the code repository.
        description: New description of the code repository.
        logo_url: New logo URL of the code repository.
        config: New configuration options for the code repository. Will
            be used to update the existing configuration values. To remove
            values from the existing configuration, set the value for that
            key to `None`.
        project: The project name/ID to filter by.

    Returns:
        The updated code repository.
    """
    repo = self.get_code_repository(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
        project=project,
    )
    update = CodeRepositoryUpdate(
        name=name, description=description, logo_url=logo_url
    )
    if config is not None:
        combined_config = repo.config
        combined_config.update(config)
        combined_config = {
            k: v for k, v in combined_config.items() if v is not None
        }

        self._validate_code_repository_config(
            source=repo.source, config=combined_config
        )
        update.config = combined_config

    return self.zen_store.update_code_repository(
        code_repository_id=repo.id, update=update
    )
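
Example usage, as a minimal sketch (the repository name and configuration keys are placeholders):

    from zenml.client import Client

    # Update the description, overwrite one config value and remove
    # another by setting it to None.
    repo = Client().update_code_repository(
        name_id_or_prefix="my-repo",
        description="Main application repository",
        config={"token": "<NEW_TOKEN>", "obsolete_option": None},
    )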

update_event_source(name_id_or_prefix, name=None, description=None, configuration=None, rotate_secret=None, is_active=None, project=None)

Updates an event_source.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, id or prefix of the event_source to update.

required
name Optional[str]

the new name of the event_source.

None
description Optional[str]

the new description of the event_source.

None
configuration Optional[Dict[str, Any]]

The event source configuration.

None
rotate_secret Optional[bool]

Allows rotating the secret; if True, the response will contain the new secret value

None
is_active Optional[bool]

Allows activating or deactivating the event source

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Returns:

Type Description
EventSourceResponse

The model of the updated event_source.

Raises:

Type Description
EntityExistsError

If the event_source name is already taken.

Source code in src/zenml/client.py
2867
2868
2869
2870
2871
2872
2873
2874
2875
2876
2877
2878
2879
2880
2881
2882
2883
2884
2885
2886
2887
2888
2889
2890
2891
2892
2893
2894
2895
2896
2897
2898
2899
2900
2901
2902
2903
2904
2905
2906
2907
2908
2909
2910
2911
2912
2913
2914
2915
2916
2917
2918
2919
2920
2921
2922
2923
2924
@_fail_for_sql_zen_store
def update_event_source(
    self,
    name_id_or_prefix: Union[UUID, str],
    name: Optional[str] = None,
    description: Optional[str] = None,
    configuration: Optional[Dict[str, Any]] = None,
    rotate_secret: Optional[bool] = None,
    is_active: Optional[bool] = None,
    project: Optional[Union[str, UUID]] = None,
) -> EventSourceResponse:
    """Updates an event_source.

    Args:
        name_id_or_prefix: The name, id or prefix of the event_source to update.
        name: the new name of the event_source.
        description: the new description of the event_source.
        configuration: The event source configuration.
        rotate_secret: Allows rotating the secret; if True, the response
            will contain the new secret value
        is_active: Allows activating or deactivating the event source
        project: The project name/ID to filter by.

    Returns:
        The model of the updated event_source.

    Raises:
        EntityExistsError: If the event_source name is already taken.
    """
    # First, get the existing event source
    event_source = self.get_event_source(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
        project=project,
    )

    # Create the update model
    update_model = EventSourceUpdate(
        name=name,
        description=description,
        configuration=configuration,
        rotate_secret=rotate_secret,
        is_active=is_active,
    )

    if name:
        if self.list_event_sources(name=name):
            raise EntityExistsError(
                "There are already existing event_sources with the name "
                f"'{name}'."
            )

    updated_event_source = self.zen_store.update_event_source(
        event_source_id=event_source.id,
        event_source_update=update_model,
    )
    return updated_event_source

update_model(model_name_or_id, name=None, license=None, description=None, audience=None, use_cases=None, limitations=None, trade_offs=None, ethics=None, add_tags=None, remove_tags=None, save_models_to_registry=None, project=None)

Updates an existing model in Model Control Plane.

Parameters:

Name Type Description Default
model_name_or_id Union[str, UUID]

Name or ID of the model to be updated.

required
name Optional[str]

The name of the model.

None
license Optional[str]

The license under which the model is created.

None
description Optional[str]

The description of the model.

None
audience Optional[str]

The target audience of the model.

None
use_cases Optional[str]

The use cases of the model.

None
limitations Optional[str]

The known limitations of the model.

None
trade_offs Optional[str]

The tradeoffs of the model.

None
ethics Optional[str]

The ethical implications of the model.

None
add_tags Optional[List[str]]

Tags to add to the model.

None
remove_tags Optional[List[str]]

Tags to remove from the model.

None
save_models_to_registry Optional[bool]

Whether to save the model to the registry.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Returns:

Type Description
ModelResponse

The updated model.

Source code in src/zenml/client.py
6181
6182
6183
6184
6185
6186
6187
6188
6189
6190
6191
6192
6193
6194
6195
6196
6197
6198
6199
6200
6201
6202
6203
6204
6205
6206
6207
6208
6209
6210
6211
6212
6213
6214
6215
6216
6217
6218
6219
6220
6221
6222
6223
6224
6225
6226
6227
6228
6229
6230
6231
6232
6233
6234
6235
6236
def update_model(
    self,
    model_name_or_id: Union[str, UUID],
    name: Optional[str] = None,
    license: Optional[str] = None,
    description: Optional[str] = None,
    audience: Optional[str] = None,
    use_cases: Optional[str] = None,
    limitations: Optional[str] = None,
    trade_offs: Optional[str] = None,
    ethics: Optional[str] = None,
    add_tags: Optional[List[str]] = None,
    remove_tags: Optional[List[str]] = None,
    save_models_to_registry: Optional[bool] = None,
    project: Optional[Union[str, UUID]] = None,
) -> ModelResponse:
    """Updates an existing model in Model Control Plane.

    Args:
        model_name_or_id: Name or ID of the model to be updated.
        name: The name of the model.
        license: The license under which the model is created.
        description: The description of the model.
        audience: The target audience of the model.
        use_cases: The use cases of the model.
        limitations: The known limitations of the model.
        trade_offs: The tradeoffs of the model.
        ethics: The ethical implications of the model.
        add_tags: Tags to add to the model.
        remove_tags: Tags to remove from the model.
        save_models_to_registry: Whether to save the model to the
            registry.
        project: The project name/ID to filter by.

    Returns:
        The updated model.
    """
    model = self.get_model(
        model_name_or_id=model_name_or_id, project=project
    )
    return self.zen_store.update_model(
        model_id=model.id,
        model_update=ModelUpdate(
            name=name,
            license=license,
            description=description,
            audience=audience,
            use_cases=use_cases,
            limitations=limitations,
            trade_offs=trade_offs,
            ethics=ethics,
            add_tags=add_tags,
            remove_tags=remove_tags,
            save_models_to_registry=save_models_to_registry,
        ),
    )
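
Example usage, as a minimal sketch (the model name and tag values are placeholders):

    from zenml.client import Client

    # Update a model's description and tags in the Model Control Plane.
    model = Client().update_model(
        model_name_or_id="churn_predictor",
        description="Predicts customer churn from usage data.",
        add_tags=["production-candidate"],
        remove_tags=["experimental"],
    )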

update_model_version(model_name_or_id, version_name_or_id, stage=None, force=False, name=None, description=None, add_tags=None, remove_tags=None, project=None)

Update a model version.

Parameters:

Name Type Description Default
model_name_or_id Union[str, UUID]

The name or ID of the model containing model version.

required
version_name_or_id Union[str, UUID]

The name or ID of model version to be updated.

required
stage Optional[Union[str, ModelStages]]

Target model version stage to be set.

None
force bool

Whether existing model version in target stage should be silently archived or an error should be raised.

False
name Optional[str]

Target model version name to be set.

None
description Optional[str]

Target model version description to be set.

None
add_tags Optional[List[str]]

Tags to add to the model version.

None
remove_tags Optional[List[str]]

Tags to remove from the model version.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Returns:

Type Description
ModelVersionResponse

An updated model version.

Source code in src/zenml/client.py
6585
6586
6587
6588
6589
6590
6591
6592
6593
6594
6595
6596
6597
6598
6599
6600
6601
6602
6603
6604
6605
6606
6607
6608
6609
6610
6611
6612
6613
6614
6615
6616
6617
6618
6619
6620
6621
6622
6623
6624
6625
6626
6627
6628
6629
6630
6631
6632
6633
def update_model_version(
    self,
    model_name_or_id: Union[str, UUID],
    version_name_or_id: Union[str, UUID],
    stage: Optional[Union[str, ModelStages]] = None,
    force: bool = False,
    name: Optional[str] = None,
    description: Optional[str] = None,
    add_tags: Optional[List[str]] = None,
    remove_tags: Optional[List[str]] = None,
    project: Optional[Union[str, UUID]] = None,
) -> ModelVersionResponse:
    """Get all model versions by filter.

    Args:
        model_name_or_id: The name or ID of the model containing model version.
        version_name_or_id: The name or ID of model version to be updated.
        stage: Target model version stage to be set.
        force: Whether existing model version in target stage should be
            silently archived or an error should be raised.
        name: Target model version name to be set.
        description: Target model version description to be set.
        add_tags: Tags to add to the model version.
        remove_tags: Tags to remove from the model version.
        project: The project name/ID to filter by.

    Returns:
        An updated model version.
    """
    if not is_valid_uuid(model_name_or_id):
        model = self.get_model(model_name_or_id, project=project)
        model_name_or_id = model.id
        project = project or model.project_id
    if not is_valid_uuid(version_name_or_id):
        version_name_or_id = self.get_model_version(
            model_name_or_id, version_name_or_id, project=project
        ).id

    return self.zen_store.update_model_version(
        model_version_id=version_name_or_id,  # type:ignore[arg-type]
        model_version_update_model=ModelVersionUpdate(
            stage=stage,
            force=force,
            name=name,
            description=description,
            add_tags=add_tags,
            remove_tags=remove_tags,
        ),
    )
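
Example usage, as a minimal sketch (the model and version identifiers are placeholders):

    from zenml.client import Client

    # Promote a model version to the production stage, silently archiving
    # any version that currently holds that stage.
    version = Client().update_model_version(
        model_name_or_id="churn_predictor",
        version_name_or_id="42",
        stage="production",
        force=True,
    )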

update_project(name_id_or_prefix, new_name=None, new_display_name=None, new_description=None)

Update a project.

Parameters:

Name Type Description Default
name_id_or_prefix Optional[Union[UUID, str]]

Name, ID or prefix of the project to update.

required
new_name Optional[str]

New name of the project.

None
new_display_name Optional[str]

New display name of the project.

None
new_description Optional[str]

New description of the project.

None

Returns:

Type Description
ProjectResponse

The updated project.

Source code in src/zenml/client.py
1075
1076
1077
1078
1079
1080
1081
1082
1083
1084
1085
1086
1087
1088
1089
1090
1091
1092
1093
1094
1095
1096
1097
1098
1099
1100
1101
1102
1103
1104
1105
def update_project(
    self,
    name_id_or_prefix: Optional[Union[UUID, str]],
    new_name: Optional[str] = None,
    new_display_name: Optional[str] = None,
    new_description: Optional[str] = None,
) -> ProjectResponse:
    """Update a project.

    Args:
        name_id_or_prefix: Name, ID or prefix of the project to update.
        new_name: New name of the project.
        new_display_name: New display name of the project.
        new_description: New description of the project.

    Returns:
        The updated project.
    """
    project = self.get_project(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )
    project_update = ProjectUpdate(
        name=new_name or project.name,
        display_name=new_display_name or project.display_name,
    )
    if new_description:
        project_update.description = new_description
    return self.zen_store.update_project(
        project_id=project.id,
        project_update=project_update,
    )

update_run_template(name_id_or_prefix, name=None, description=None, hidden=None, add_tags=None, remove_tags=None, project=None)

Update a run template.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

Name/ID/ID prefix of the template to update.

required
name Optional[str]

The new name of the run template.

None
description Optional[str]

The new description of the run template.

None
hidden Optional[bool]

The new hidden status of the run template.

None
add_tags Optional[List[str]]

Tags to add to the run template.

None
remove_tags Optional[List[str]]

Tags to remove from the run template.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Returns:

Type Description
RunTemplateResponse

The updated run template.

Source code in src/zenml/client.py
3628
3629
3630
3631
3632
3633
3634
3635
3636
3637
3638
3639
3640
3641
3642
3643
3644
3645
3646
3647
3648
3649
3650
3651
3652
3653
3654
3655
3656
3657
3658
3659
3660
3661
3662
3663
3664
3665
3666
3667
3668
3669
3670
3671
3672
3673
3674
def update_run_template(
    self,
    name_id_or_prefix: Union[str, UUID],
    name: Optional[str] = None,
    description: Optional[str] = None,
    hidden: Optional[bool] = None,
    add_tags: Optional[List[str]] = None,
    remove_tags: Optional[List[str]] = None,
    project: Optional[Union[str, UUID]] = None,
) -> RunTemplateResponse:
    """Update a run template.

    Args:
        name_id_or_prefix: Name/ID/ID prefix of the template to update.
        name: The new name of the run template.
        description: The new description of the run template.
        hidden: The new hidden status of the run template.
        add_tags: Tags to add to the run template.
        remove_tags: Tags to remove from the run template.
        project: The project name/ID to filter by.

    Returns:
        The updated run template.
    """
    if is_valid_uuid(name_id_or_prefix):
        template_id = (
            UUID(name_id_or_prefix)
            if isinstance(name_id_or_prefix, str)
            else name_id_or_prefix
        )
    else:
        template_id = self.get_run_template(
            name_id_or_prefix,
            project=project,
            hydrate=False,
        ).id

    return self.zen_store.update_run_template(
        template_id=template_id,
        template_update=RunTemplateUpdate(
            name=name,
            description=description,
            hidden=hidden,
            add_tags=add_tags,
            remove_tags=remove_tags,
        ),
    )

update_secret(name_id_or_prefix, private=None, new_name=None, update_private=None, add_or_update_values=None, remove_values=None)

Updates a secret.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID, or ID prefix of the secret to update.

required
private Optional[bool]

The private status of the secret to update.

None
new_name Optional[str]

The new name of the secret.

None
update_private Optional[bool]

New value used to update the private status of the secret.

None
add_or_update_values Optional[Dict[str, str]]

The values to add or update.

None
remove_values Optional[List[str]]

The values to remove.

None

Returns:

Type Description
SecretResponse

The updated secret.

Raises:

Type Description
KeyError

If trying to remove a value that doesn't exist.

ValueError

If a key is provided in both add_or_update_values and remove_values.

Source code in src/zenml/client.py
4863
4864
4865
4866
4867
4868
4869
4870
4871
4872
4873
4874
4875
4876
4877
4878
4879
4880
4881
4882
4883
4884
4885
4886
4887
4888
4889
4890
4891
4892
4893
4894
4895
4896
4897
4898
4899
4900
4901
4902
4903
4904
4905
4906
4907
4908
4909
4910
4911
4912
4913
4914
4915
4916
4917
4918
4919
4920
4921
4922
4923
4924
4925
4926
4927
4928
4929
4930
4931
def update_secret(
    self,
    name_id_or_prefix: Union[str, UUID],
    private: Optional[bool] = None,
    new_name: Optional[str] = None,
    update_private: Optional[bool] = None,
    add_or_update_values: Optional[Dict[str, str]] = None,
    remove_values: Optional[List[str]] = None,
) -> SecretResponse:
    """Updates a secret.

    Args:
        name_id_or_prefix: The name, ID, or ID prefix of the secret to
            update.
        private: The private status of the secret to update.
        new_name: The new name of the secret.
        update_private: New value used to update the private status of the
            secret.
        add_or_update_values: The values to add or update.
        remove_values: The values to remove.

    Returns:
        The updated secret.

    Raises:
        KeyError: If trying to remove a value that doesn't exist.
        ValueError: If a key is provided in both add_or_update_values and
            remove_values.
    """
    secret = self.get_secret(
        name_id_or_prefix=name_id_or_prefix,
        private=private,
        # Don't allow partial name matches, but allow partial ID matches
        allow_partial_name_match=False,
        allow_partial_id_match=True,
        hydrate=True,
    )

    secret_update = SecretUpdate(name=new_name or secret.name)

    if update_private:
        secret_update.private = update_private
    values: Dict[str, Optional[SecretStr]] = {}
    if add_or_update_values:
        values.update(
            {
                key: SecretStr(value)
                for key, value in add_or_update_values.items()
            }
        )
    if remove_values:
        for key in remove_values:
            if key not in secret.values:
                raise KeyError(
                    f"Cannot remove value '{key}' from secret "
                    f"'{secret.name}' because it does not exist."
                )
            if key in values:
                raise ValueError(
                    f"Key '{key}' is supplied both in the values to add or "
                    f"update and the values to be removed."
                )
            values[key] = None
    if values:
        secret_update.values = values

    return Client().zen_store.update_secret(
        secret_id=secret.id, secret_update=secret_update
    )
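
Example usage, as a minimal sketch (the secret name and keys are placeholders):

    from zenml.client import Client

    # Add or overwrite one value and remove another on an existing secret.
    secret = Client().update_secret(
        name_id_or_prefix="aws_credentials",
        add_or_update_values={"aws_secret_access_key": "<NEW_VALUE>"},
        remove_values=["session_token"],
    )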

update_server_settings(updated_name=None, updated_logo_url=None, updated_enable_analytics=None, updated_enable_announcements=None, updated_enable_updates=None, updated_onboarding_state=None)

Update the server settings.

Parameters:

Name Type Description Default
updated_name Optional[str]

Updated name for the server.

None
updated_logo_url Optional[str]

Updated logo URL for the server.

None
updated_enable_analytics Optional[bool]

Updated value whether to enable analytics for the server.

None
updated_enable_announcements Optional[bool]

Updated value whether to display announcements about ZenML.

None
updated_enable_updates Optional[bool]

Updated value whether to display updates about ZenML.

None
updated_onboarding_state Optional[Dict[str, Any]]

Updated onboarding state for the server.

None

Returns:

Type Description
ServerSettingsResponse

The updated server settings.

Source code in src/zenml/client.py
721
722
723
724
725
726
727
728
729
730
731
732
733
734
735
736
737
738
739
740
741
742
743
744
745
746
747
748
749
750
751
752
753
754
def update_server_settings(
    self,
    updated_name: Optional[str] = None,
    updated_logo_url: Optional[str] = None,
    updated_enable_analytics: Optional[bool] = None,
    updated_enable_announcements: Optional[bool] = None,
    updated_enable_updates: Optional[bool] = None,
    updated_onboarding_state: Optional[Dict[str, Any]] = None,
) -> ServerSettingsResponse:
    """Update the server settings.

    Args:
        updated_name: Updated name for the server.
        updated_logo_url: Updated logo URL for the server.
        updated_enable_analytics: Updated value whether to enable
            analytics for the server.
        updated_enable_announcements: Updated value whether to display
            announcements about ZenML.
        updated_enable_updates: Updated value whether to display updates
            about ZenML.
        updated_onboarding_state: Updated onboarding state for the server.

    Returns:
        The updated server settings.
    """
    update_model = ServerSettingsUpdate(
        server_name=updated_name,
        logo_url=updated_logo_url,
        enable_analytics=updated_enable_analytics,
        display_announcements=updated_enable_announcements,
        display_updates=updated_enable_updates,
        onboarding_state=updated_onboarding_state,
    )
    return self.zen_store.update_server_settings(update_model)

update_service(id, name=None, service_source=None, admin_state=None, status=None, endpoint=None, labels=None, prediction_url=None, health_check_url=None, model_version_id=None)

Update a service.

Parameters:

Name Type Description Default
id UUID

The ID of the service to update.

required
name Optional[str]

The new name of the service.

None
admin_state Optional[ServiceState]

The new admin state of the service.

None
status Optional[Dict[str, Any]]

The new status of the service.

None
endpoint Optional[Dict[str, Any]]

The new endpoint of the service.

None
service_source Optional[str]

The new service source of the service.

None
labels Optional[Dict[str, str]]

The new labels of the service.

None
prediction_url Optional[str]

The new prediction url of the service.

None
health_check_url Optional[str]

The new health check url of the service.

None
model_version_id Optional[UUID]

The new model version id of the service.

None

Returns:

Type Description
ServiceResponse

The updated service.

Source code in src/zenml/client.py
def update_service(
    self,
    id: UUID,
    name: Optional[str] = None,
    service_source: Optional[str] = None,
    admin_state: Optional[ServiceState] = None,
    status: Optional[Dict[str, Any]] = None,
    endpoint: Optional[Dict[str, Any]] = None,
    labels: Optional[Dict[str, str]] = None,
    prediction_url: Optional[str] = None,
    health_check_url: Optional[str] = None,
    model_version_id: Optional[UUID] = None,
) -> ServiceResponse:
    """Update a service.

    Args:
        id: The ID of the service to update.
        name: The new name of the service.
        admin_state: The new admin state of the service.
        status: The new status of the service.
        endpoint: The new endpoint of the service.
        service_source: The new service source of the service.
        labels: The new labels of the service.
        prediction_url: The new prediction url of the service.
        health_check_url: The new health check url of the service.
        model_version_id: The new model version id of the service.

    Returns:
        The updated service.
    """
    service_update = ServiceUpdate()
    if name:
        service_update.name = name
    if service_source:
        service_update.service_source = service_source
    if admin_state:
        service_update.admin_state = admin_state
    if status:
        service_update.status = status
    if endpoint:
        service_update.endpoint = endpoint
    if labels:
        service_update.labels = labels
    if prediction_url:
        service_update.prediction_url = prediction_url
    if health_check_url:
        service_update.health_check_url = health_check_url
    if model_version_id:
        service_update.model_version_id = model_version_id
    return self.zen_store.update_service(
        service_id=id, update=service_update
    )
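
A minimal usage sketch; the UUID, label, and URL values are placeholders for an existing service in your deployment:

from uuid import UUID

from zenml.client import Client

service = Client().update_service(
    id=UUID("12345678-1234-5678-1234-567812345678"),  # placeholder ID
    labels={"team": "ml-platform"},
    prediction_url="https://models.example.com/predict",
)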

update_service_account(name_id_or_prefix, updated_name=None, description=None, active=None)

Update a service account.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name or ID of the service account to update.

required
updated_name Optional[str]

The new name of the service account.

None
description Optional[str]

The new description of the service account.

None
active Optional[bool]

The new active status of the service account.

None

Returns:

Type Description
ServiceAccountResponse

The updated service account.

Source code in src/zenml/client.py
def update_service_account(
    self,
    name_id_or_prefix: Union[str, UUID],
    updated_name: Optional[str] = None,
    description: Optional[str] = None,
    active: Optional[bool] = None,
) -> ServiceAccountResponse:
    """Update a service account.

    Args:
        name_id_or_prefix: The name or ID of the service account to update.
        updated_name: The new name of the service account.
        description: The new description of the service account.
        active: The new active status of the service account.

    Returns:
        The updated service account.
    """
    service_account = self.get_service_account(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )
    service_account_update = ServiceAccountUpdate(
        name=updated_name,
        description=description,
        active=active,
    )

    return self.zen_store.update_service_account(
        service_account_name_or_id=service_account.id,
        service_account_update=service_account_update,
    )
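
A minimal usage sketch; the account names are placeholders:

from zenml.client import Client

# Rename a service account and deactivate it in one call.
account = Client().update_service_account(
    name_id_or_prefix="ci-bot",
    updated_name="ci-bot-legacy",
    active=False,
)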

update_service_connector(name_id_or_prefix, name=None, auth_method=None, resource_type=None, configuration=None, resource_id=None, description=None, expires_at=None, expires_skew_tolerance=None, expiration_seconds=None, labels=None, verify=True, list_resources=True, update=True)

Validate and/or register an updated service connector.

If the resource_type, resource_id and expiration_seconds parameters are set to their "empty" values (empty string for resource type and resource ID, 0 for expiration seconds), the existing values will be removed from the service connector. Setting them to None or omitting them will not affect the existing values.

If supplied, the configuration parameter is a full replacement of the existing configuration rather than a partial update.

Labels can be updated or removed by setting the label value to None.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, id or prefix of the service connector to update.

required
name Optional[str]

The new name of the service connector.

None
auth_method Optional[str]

The new authentication method of the service connector.

None
resource_type Optional[str]

The new resource type for the service connector. If set to the empty string, the existing resource type will be removed.

None
configuration Optional[Dict[str, str]]

The new configuration of the service connector. If set, this needs to be a full replacement of the existing configuration rather than a partial update.

None
resource_id Optional[str]

The new resource id of the service connector. If set to the empty string, the existing resource ID will be removed.

None
description Optional[str]

The description of the service connector.

None
expires_at Optional[datetime]

The new UTC expiration time of the service connector.

None
expires_skew_tolerance Optional[int]

The allowed expiration skew for the service connector credentials.

None
expiration_seconds Optional[int]

The expiration time of the service connector. If set to 0, the existing expiration time will be removed.

None
labels Optional[Dict[str, Optional[str]]]

The labels to update or remove. If a label value is set to None, the label will be removed.

None
verify bool

Whether to verify that the service connector configuration and credentials can be used to gain access to the resource.

True
list_resources bool

Whether to also list the resources that the service connector can give access to (if verify is True).

True
update bool

Whether to update the service connector or not.

True

Returns:

Type Description
Optional[Union[ServiceConnectorResponse, ServiceConnectorUpdate]]

The model of the registered service connector and the resources

Optional[ServiceConnectorResourcesModel]

that the service connector can give access to (if verify is True).

Raises:

Type Description
AuthorizationException

If the service connector verification fails due to invalid credentials or insufficient permissions.

Source code in src/zenml/client.py
def update_service_connector(
    self,
    name_id_or_prefix: Union[UUID, str],
    name: Optional[str] = None,
    auth_method: Optional[str] = None,
    resource_type: Optional[str] = None,
    configuration: Optional[Dict[str, str]] = None,
    resource_id: Optional[str] = None,
    description: Optional[str] = None,
    expires_at: Optional[datetime] = None,
    expires_skew_tolerance: Optional[int] = None,
    expiration_seconds: Optional[int] = None,
    labels: Optional[Dict[str, Optional[str]]] = None,
    verify: bool = True,
    list_resources: bool = True,
    update: bool = True,
) -> Tuple[
    Optional[
        Union[
            ServiceConnectorResponse,
            ServiceConnectorUpdate,
        ]
    ],
    Optional[ServiceConnectorResourcesModel],
]:
    """Validate and/or register an updated service connector.

    If the `resource_type`, `resource_id` and `expiration_seconds`
    parameters are set to their "empty" values (empty string for resource
    type and resource ID, 0 for expiration seconds), the existing values
    will be removed from the service connector. Setting them to None or
    omitting them will not affect the existing values.

    If supplied, the `configuration` parameter is a full replacement of the
    existing configuration rather than a partial update.

    Labels can be updated or removed by setting the label value to None.

    Args:
        name_id_or_prefix: The name, id or prefix of the service connector
            to update.
        name: The new name of the service connector.
        auth_method: The new authentication method of the service connector.
        resource_type: The new resource type for the service connector.
            If set to the empty string, the existing resource type will be
            removed.
        configuration: The new configuration of the service connector. If
            set, this needs to be a full replacement of the existing
            configuration rather than a partial update.
        resource_id: The new resource id of the service connector.
            If set to the empty string, the existing resource ID will be
            removed.
        description: The description of the service connector.
        expires_at: The new UTC expiration time of the service connector.
        expires_skew_tolerance: The allowed expiration skew for the service
            connector credentials.
        expiration_seconds: The expiration time of the service connector.
            If set to 0, the existing expiration time will be removed.
        labels: The labels to update or remove. If a label value
            is set to None, the label will be removed.
        verify: Whether to verify that the service connector configuration
            and credentials can be used to gain access to the resource.
        list_resources: Whether to also list the resources that the service
            connector can give access to (if verify is True).
        update: Whether to update the service connector or not.

    Returns:
        The model of the registered service connector and the resources
        that the service connector can give access to (if verify is True).

    Raises:
        AuthorizationException: If the service connector verification
            fails due to invalid credentials or insufficient permissions.
    """
    from zenml.service_connectors.service_connector_registry import (
        service_connector_registry,
    )

    connector_model = self.get_service_connector(
        name_id_or_prefix,
        allow_name_prefix_match=False,
        load_secrets=True,
    )

    connector_instance: Optional[ServiceConnector] = None
    connector_resources: Optional[ServiceConnectorResourcesModel] = None

    if isinstance(connector_model.connector_type, str):
        connector = self.get_service_connector_type(
            connector_model.connector_type
        )
    else:
        connector = connector_model.connector_type

    resource_types: Optional[Union[str, List[str]]] = None
    if resource_type == "":
        resource_types = None
    elif resource_type is None:
        resource_types = connector_model.resource_types
    else:
        resource_types = resource_type

    if not resource_type and len(connector.resource_types) == 1:
        resource_types = connector.resource_types[0].resource_type

    if resource_id == "":
        resource_id = None
    elif resource_id is None:
        resource_id = connector_model.resource_id

    if expiration_seconds == 0:
        expiration_seconds = None
    elif expiration_seconds is None:
        expiration_seconds = connector_model.expiration_seconds

    connector_update = ServiceConnectorUpdate(
        name=name or connector_model.name,
        connector_type=connector.connector_type,
        description=description or connector_model.description,
        auth_method=auth_method or connector_model.auth_method,
        expires_at=expires_at,
        expires_skew_tolerance=expires_skew_tolerance,
        expiration_seconds=expiration_seconds,
    )

    # Validate and configure the resources
    if configuration is not None:
        # The supplied configuration is a drop-in replacement for the
        # existing configuration and secrets
        connector_update.validate_and_configure_resources(
            connector_type=connector,
            resource_types=resource_types,
            resource_id=resource_id,
            configuration=configuration,
        )
    else:
        connector_update.validate_and_configure_resources(
            connector_type=connector,
            resource_types=resource_types,
            resource_id=resource_id,
            configuration=connector_model.configuration,
            secrets=connector_model.secrets,
        )

    # Add the labels
    if labels is not None:
        # Apply the new label values, but don't keep any labels that
        # have been set to None in the update
        connector_update.labels = {
            **{
                label: value
                for label, value in connector_model.labels.items()
                if label not in labels
            },
            **{
                label: value
                for label, value in labels.items()
                if value is not None
            },
        }
    else:
        connector_update.labels = connector_model.labels

    if verify:
        # Prefer to verify the connector config server-side if the
        # implementation is available there, because it ensures
        # that the connector can be shared with other users or used
        # from other machines and because some auth methods rely on the
        # server-side authentication environment

        # Convert the update model to a request model for validation
        connector_request_dict = connector_update.model_dump()
        connector_request = ServiceConnectorRequest.model_validate(
            connector_request_dict
        )

        if connector.remote:
            connector_resources = (
                self.zen_store.verify_service_connector_config(
                    service_connector=connector_request,
                    list_resources=list_resources,
                )
            )
        else:
            connector_instance = (
                service_connector_registry.instantiate_connector(
                    model=connector_request,
                )
            )
            connector_resources = connector_instance.verify(
                list_resources=list_resources
            )

        if connector_resources.error:
            raise AuthorizationException(connector_resources.error)

        # For resource types that don't support multi-instances, it's
        # better to save the default resource ID in the connector, if
        # available. Otherwise, we'll need to instantiate the connector
        # again to get the default resource ID.
        connector_update.resource_id = (
            connector_update.resource_id
            or connector_resources.get_default_resource_id()
        )

    if not update:
        return connector_update, connector_resources

    # Update the model
    connector_response = self.zen_store.update_service_connector(
        service_connector_id=connector_model.id,
        update=connector_update,
    )

    if connector_resources:
        connector_resources.id = connector_response.id
        connector_resources.name = connector_response.name
        connector_resources.connector_type = (
            connector_response.connector_type
        )

    return connector_response, connector_resources
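
A minimal usage sketch that verifies a new configuration without persisting it (update=False); the connector name and configuration keys are placeholders for whatever your connector type actually expects:

from zenml.client import Client

connector_update, resources = Client().update_service_connector(
    name_id_or_prefix="my-aws-connector",
    configuration={
        "aws_access_key_id": "<ACCESS_KEY_ID>",
        "aws_secret_access_key": "<SECRET_ACCESS_KEY>",
        "region": "eu-west-1",
    },
    verify=True,
    update=False,  # dry-run: validate and verify only
)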

update_stack(name_id_or_prefix=None, name=None, stack_spec_file=None, labels=None, description=None, component_updates=None)

Updates a stack and its components.

Parameters:

Name Type Description Default
name_id_or_prefix Optional[Union[UUID, str]]

The name, id or prefix of the stack to update.

None
name Optional[str]

the new name of the stack.

None
stack_spec_file Optional[str]

path to the stack spec file.

None
labels Optional[Dict[str, Any]]

The new labels of the stack component.

None
description Optional[str]

the new description of the stack.

None
component_updates Optional[Dict[StackComponentType, List[Union[UUID, str]]]]

dictionary which maps stack component types to lists of new stack component names or ids.

None

Returns:

Type Description
StackResponse

The model of the updated stack.

Raises:

Type Description
EntityExistsError

If the stack name is already taken.

Source code in src/zenml/client.py
def update_stack(
    self,
    name_id_or_prefix: Optional[Union[UUID, str]] = None,
    name: Optional[str] = None,
    stack_spec_file: Optional[str] = None,
    labels: Optional[Dict[str, Any]] = None,
    description: Optional[str] = None,
    component_updates: Optional[
        Dict[StackComponentType, List[Union[UUID, str]]]
    ] = None,
) -> StackResponse:
    """Updates a stack and its components.

    Args:
        name_id_or_prefix: The name, id or prefix of the stack to update.
        name: the new name of the stack.
        stack_spec_file: path to the stack spec file.
        labels: The new labels of the stack component.
        description: the new description of the stack.
        component_updates: dictionary which maps stack component types to
            lists of new stack component names or ids.

    Returns:
        The model of the updated stack.

    Raises:
        EntityExistsError: If the stack name is already taken.
    """
    # First, get the stack
    stack = self.get_stack(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )

    # Create the update model
    update_model = StackUpdate(
        stack_spec_path=stack_spec_file,
    )

    if name:
        if self.list_stacks(name=name):
            raise EntityExistsError(
                "There are already existing stacks with the name "
                f"'{name}'."
            )

        update_model.name = name

    if description:
        update_model.description = description

    # Get the current components
    if component_updates:
        components_dict = stack.components.copy()

        for component_type, component_id_list in component_updates.items():
            if component_id_list is not None:
                components_dict[component_type] = [
                    self.get_stack_component(
                        name_id_or_prefix=component_id,
                        component_type=component_type,
                    )
                    for component_id in component_id_list
                ]

        update_model.components = {
            c_type: [c.id for c in c_list]
            for c_type, c_list in components_dict.items()
        }

    if labels is not None:
        existing_labels = stack.labels or {}
        existing_labels.update(labels)

        existing_labels = {
            k: v for k, v in existing_labels.items() if v is not None
        }
        update_model.labels = existing_labels

    updated_stack = self.zen_store.update_stack(
        stack_id=stack.id,
        stack_update=update_model,
    )
    if updated_stack.id == self.active_stack_model.id:
        if self._config:
            self._config.set_active_stack(updated_stack)
        else:
            GlobalConfiguration().set_active_stack(updated_stack)
    return updated_stack
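
A minimal usage sketch; the stack and component names are placeholders for a stack and components that are already registered:

from zenml.client import Client
from zenml.enums import StackComponentType

stack = Client().update_stack(
    name_id_or_prefix="local-stack",
    name="dev-stack",
    component_updates={
        StackComponentType.ORCHESTRATOR: ["kubeflow-orchestrator"],
    },
)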

update_stack_component(name_id_or_prefix, component_type, name=None, configuration=None, labels=None, disconnect=None, connector_id=None, connector_resource_id=None)

Updates a stack component.

Parameters:

Name Type Description Default
name_id_or_prefix Optional[Union[UUID, str]]

The name, id or prefix of the stack component to update.

required
component_type StackComponentType

The type of the stack component to update.

required
name Optional[str]

The new name of the stack component.

None
configuration Optional[Dict[str, Any]]

The new configuration of the stack component.

None
labels Optional[Dict[str, Any]]

The new labels of the stack component.

None
disconnect Optional[bool]

Whether to disconnect the stack component from its service connector.

None
connector_id Optional[UUID]

The new connector id of the stack component.

None
connector_resource_id Optional[str]

The new connector resource id of the stack component.

None

Returns:

Type Description
ComponentResponse

The updated stack component.

Raises:

Type Description
EntityExistsError

If the new name is already taken.

Source code in src/zenml/client.py
def update_stack_component(
    self,
    name_id_or_prefix: Optional[Union[UUID, str]],
    component_type: StackComponentType,
    name: Optional[str] = None,
    configuration: Optional[Dict[str, Any]] = None,
    labels: Optional[Dict[str, Any]] = None,
    disconnect: Optional[bool] = None,
    connector_id: Optional[UUID] = None,
    connector_resource_id: Optional[str] = None,
) -> ComponentResponse:
    """Updates a stack component.

    Args:
        name_id_or_prefix: The name, id or prefix of the stack component to
            update.
        component_type: The type of the stack component to update.
        name: The new name of the stack component.
        configuration: The new configuration of the stack component.
        labels: The new labels of the stack component.
        disconnect: Whether to disconnect the stack component from its
            service connector.
        connector_id: The new connector id of the stack component.
        connector_resource_id: The new connector resource id of the
            stack component.

    Returns:
        The updated stack component.

    Raises:
        EntityExistsError: If the new name is already taken.
    """
    # Get the existing component model
    component = self.get_stack_component(
        name_id_or_prefix=name_id_or_prefix,
        component_type=component_type,
        allow_name_prefix_match=False,
    )

    update_model = ComponentUpdate()

    if name is not None:
        existing_components = self.list_stack_components(
            name=name,
            type=component_type,
        )
        if existing_components.total > 0:
            raise EntityExistsError(
                f"There are already existing components with the "
                f"name '{name}'."
            )
        update_model.name = name

    if configuration is not None:
        existing_configuration = component.configuration
        existing_configuration.update(configuration)
        existing_configuration = {
            k: v
            for k, v in existing_configuration.items()
            if v is not None
        }

        from zenml.stack.utils import (
            validate_stack_component_config,
            warn_if_config_server_mismatch,
        )

        validated_config = validate_stack_component_config(
            configuration_dict=existing_configuration,
            flavor=component.flavor,
            component_type=component.type,
            # Always enforce validation of custom flavors
            validate_custom_flavors=True,
        )
        # Guaranteed to not be None by setting
        # `validate_custom_flavors=True` above
        assert validated_config is not None
        warn_if_config_server_mismatch(validated_config)

        update_model.configuration = validated_config.model_dump(
            mode="json", exclude_unset=True
        )

    if labels is not None:
        existing_labels = component.labels or {}
        existing_labels.update(labels)

        existing_labels = {
            k: v for k, v in existing_labels.items() if v is not None
        }
        update_model.labels = existing_labels

    if disconnect:
        update_model.connector = None
        update_model.connector_resource_id = None
    else:
        existing_component = self.get_stack_component(
            name_id_or_prefix=name_id_or_prefix,
            component_type=component_type,
            allow_name_prefix_match=False,
        )
        update_model.connector = connector_id
        update_model.connector_resource_id = connector_resource_id
        if connector_id is None and existing_component.connector:
            update_model.connector = existing_component.connector.id
            update_model.connector_resource_id = (
                existing_component.connector_resource_id
            )

    # Send the updated component to the ZenStore
    return self.zen_store.update_stack_component(
        component_id=component.id,
        component_update=update_model,
    )
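
A minimal usage sketch; the component name and configuration keys are placeholders, and the supplied configuration is merged into the existing one:

from zenml.client import Client
from zenml.enums import StackComponentType

component = Client().update_stack_component(
    name_id_or_prefix="s3-artifact-store",
    component_type=StackComponentType.ARTIFACT_STORE,
    configuration={"path": "s3://my-new-bucket"},
    labels={"owner": "ml-team"},
)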

update_tag(tag_name_or_id, name=None, exclusive=None, color=None)

Updates an existing tag.

Parameters:

Name Type Description Default
tag_name_or_id Union[str, UUID]

name or UUID of the tag to be updated.

required
name Optional[str]

the name of the tag.

None
exclusive Optional[bool]

the boolean to decide whether the tag is an exclusive tag. An exclusive tag means that the tag can exist only for a single:

  • pipeline run within the scope of a pipeline
  • artifact version within the scope of an artifact
  • run template

None
color Optional[Union[str, ColorVariants]]

the color of the tag

None

Returns:

Type Description
TagResponse

The updated tag.

Source code in src/zenml/client.py
def update_tag(
    self,
    tag_name_or_id: Union[str, UUID],
    name: Optional[str] = None,
    exclusive: Optional[bool] = None,
    color: Optional[Union[str, ColorVariants]] = None,
) -> TagResponse:
    """Updates an existing tag.

    Args:
        tag_name_or_id: name or UUID of the tag to be updated.
        name: the name of the tag.
        exclusive: the boolean to decide whether the tag is an exclusive tag.
            An exclusive tag means that the tag can exist only for a single:
                - pipeline run within the scope of a pipeline
                - artifact version within the scope of an artifact
                - run template
        color: the color of the tag

    Returns:
        The updated tag.
    """
    update_model = TagUpdate()

    if name is not None:
        update_model.name = name

    if exclusive is not None:
        update_model.exclusive = exclusive

    if color is not None:
        if isinstance(color, str):
            update_model.color = ColorVariants(color)
        else:
            update_model.color = color

    return self.zen_store.update_tag(
        tag_name_or_id=tag_name_or_id,
        tag_update_model=update_model,
    )
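
A minimal usage sketch; the tag names are placeholders and the color string is assumed to be a valid ColorVariants value:

from zenml.client import Client

tag = Client().update_tag(
    tag_name_or_id="production",
    name="prod",
    exclusive=True,
    color="green",
)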

update_trigger(name_id_or_prefix, name=None, description=None, event_filter=None, is_active=None, project=None)

Updates a trigger.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, id or prefix of the trigger to update.

required
name Optional[str]

the new name of the trigger.

None
description Optional[str]

the new description of the trigger.

None
event_filter Optional[Dict[str, Any]]

The event filter configuration.

None
is_active Optional[bool]

Whether the trigger is active or not.

None
project Optional[Union[str, UUID]]

The project name/ID to filter by.

None

Returns:

Type Description
TriggerResponse

The model of the updated trigger.

Raises:

Type Description
EntityExistsError

If the trigger name is already taken.

Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def update_trigger(
    self,
    name_id_or_prefix: Union[UUID, str],
    name: Optional[str] = None,
    description: Optional[str] = None,
    event_filter: Optional[Dict[str, Any]] = None,
    is_active: Optional[bool] = None,
    project: Optional[Union[str, UUID]] = None,
) -> TriggerResponse:
    """Updates a trigger.

    Args:
        name_id_or_prefix: The name, id or prefix of the trigger to update.
        name: the new name of the trigger.
        description: the new description of the trigger.
        event_filter: The event filter configuration.
        is_active: Whether the trigger is active or not.
        project: The project name/ID to filter by.

    Returns:
        The model of the updated trigger.

    Raises:
        EntityExistsError: If the trigger name is already taken.
    """
    # First, get the existing trigger
    trigger = self.get_trigger(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
        project=project,
    )

    # Create the update model
    update_model = TriggerUpdate(
        name=name,
        description=description,
        event_filter=event_filter,
        is_active=is_active,
    )

    if name:
        if self.list_triggers(name=name):
            raise EntityExistsError(
                "There are already is an existing trigger with the name "
                f"'{name}'."
            )

    updated_trigger = self.zen_store.update_trigger(
        trigger_id=trigger.id,
        trigger_update=update_model,
    )
    return updated_trigger
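
A minimal usage sketch (triggers require a deployed ZenML server; the trigger name is a placeholder):

from zenml.client import Client

trigger = Client().update_trigger(
    name_id_or_prefix="nightly-retraining",
    is_active=False,
)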

update_user(name_id_or_prefix, updated_name=None, updated_full_name=None, updated_email=None, updated_email_opt_in=None, updated_password=None, old_password=None, updated_is_admin=None, updated_metadata=None, updated_default_project_id=None, active=None)

Update a user.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name or ID of the user to update.

required
updated_name Optional[str]

The new name of the user.

None
updated_full_name Optional[str]

The new full name of the user.

None
updated_email Optional[str]

The new email of the user.

None
updated_email_opt_in Optional[bool]

The new email opt-in status of the user.

None
updated_password Optional[str]

The new password of the user.

None
old_password Optional[str]

The old password of the user. Required for password update.

None
updated_is_admin Optional[bool]

Whether the user should be an admin.

None
updated_metadata Optional[Dict[str, Any]]

The new metadata for the user.

None
updated_default_project_id Optional[UUID]

The new default project ID for the user.

None
active Optional[bool]

Use to activate or deactivate the user.

None

Returns:

Type Description
UserResponse

The updated user.

Raises:

Type Description
ValidationError

If the old password is not provided when updating the password.

Source code in src/zenml/client.py
def update_user(
    self,
    name_id_or_prefix: Union[str, UUID],
    updated_name: Optional[str] = None,
    updated_full_name: Optional[str] = None,
    updated_email: Optional[str] = None,
    updated_email_opt_in: Optional[bool] = None,
    updated_password: Optional[str] = None,
    old_password: Optional[str] = None,
    updated_is_admin: Optional[bool] = None,
    updated_metadata: Optional[Dict[str, Any]] = None,
    updated_default_project_id: Optional[UUID] = None,
    active: Optional[bool] = None,
) -> UserResponse:
    """Update a user.

    Args:
        name_id_or_prefix: The name or ID of the user to update.
        updated_name: The new name of the user.
        updated_full_name: The new full name of the user.
        updated_email: The new email of the user.
        updated_email_opt_in: The new email opt-in status of the user.
        updated_password: The new password of the user.
        old_password: The old password of the user. Required for password
            update.
        updated_is_admin: Whether the user should be an admin.
        updated_metadata: The new metadata for the user.
        updated_default_project_id: The new default project ID for the user.
        active: Use to activate or deactivate the user.

    Returns:
        The updated user.

    Raises:
        ValidationError: If the old password is not provided when updating
            the password.
    """
    user = self.get_user(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )
    user_update = UserUpdate(name=updated_name or user.name)
    if updated_full_name:
        user_update.full_name = updated_full_name
    if updated_email is not None:
        user_update.email = updated_email
        user_update.email_opted_in = (
            updated_email_opt_in or user.email_opted_in
        )
    if updated_email_opt_in is not None:
        user_update.email_opted_in = updated_email_opt_in
    if updated_password is not None:
        user_update.password = updated_password
        if old_password is None:
            raise ValidationError(
                "Old password is required to update the password."
            )
        user_update.old_password = old_password
    if updated_is_admin is not None:
        user_update.is_admin = updated_is_admin
    if active is not None:
        user_update.active = active

    if updated_metadata is not None:
        user_update.user_metadata = updated_metadata

    if updated_default_project_id is not None:
        user_update.default_project_id = updated_default_project_id

    return self.zen_store.update_user(
        user_id=user.id, user_update=user_update
    )
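
A minimal usage sketch; the user name and passwords are placeholders, and the old password must be supplied when changing the password:

from zenml.client import Client

user = Client().update_user(
    name_id_or_prefix="alice",
    updated_password="new-password",
    old_password="old-password",
)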

verify_service_connector(name_id_or_prefix, resource_type=None, resource_id=None, list_resources=True)

Verifies if a service connector has access to one or more resources.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, id or prefix of the service connector to verify.

required
resource_type Optional[str]

The type of the resource for which to verify access. If not provided, the resource type from the service connector configuration will be used.

None
resource_id Optional[str]

The ID of the resource for which to verify access. If not provided, the resource ID from the service connector configuration will be used.

None
list_resources bool

Whether to list the resources that the service connector has access to.

True

Returns:

Type Description
ServiceConnectorResourcesModel

The list of resources that the service connector has access to,

ServiceConnectorResourcesModel

scoped to the supplied resource type and ID, if provided.

Raises:

Type Description
AuthorizationException

If the service connector does not have access to the resources.

Source code in src/zenml/client.py
def verify_service_connector(
    self,
    name_id_or_prefix: Union[UUID, str],
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
    list_resources: bool = True,
) -> "ServiceConnectorResourcesModel":
    """Verifies if a service connector has access to one or more resources.

    Args:
        name_id_or_prefix: The name, id or prefix of the service connector
            to verify.
        resource_type: The type of the resource for which to verify access.
            If not provided, the resource type from the service connector
            configuration will be used.
        resource_id: The ID of the resource for which to verify access. If
            not provided, the resource ID from the service connector
            configuration will be used.
        list_resources: Whether to list the resources that the service
            connector has access to.

    Returns:
        The list of resources that the service connector has access to,
        scoped to the supplied resource type and ID, if provided.

    Raises:
        AuthorizationException: If the service connector does not have
            access to the resources.
    """
    from zenml.service_connectors.service_connector_registry import (
        service_connector_registry,
    )

    # Get the service connector model
    service_connector = self.get_service_connector(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
    )

    connector_type = self.get_service_connector_type(
        service_connector.type
    )

    # Prefer to verify the connector config server-side if the
    # implementation is available there, because it ensures
    # that the connector can be shared with other users or used
    # from other machines and because some auth methods rely on the
    # server-side authentication environment
    if connector_type.remote:
        connector_resources = self.zen_store.verify_service_connector(
            service_connector_id=service_connector.id,
            resource_type=resource_type,
            resource_id=resource_id,
            list_resources=list_resources,
        )
    else:
        connector_instance = (
            service_connector_registry.instantiate_connector(
                model=service_connector
            )
        )
        connector_resources = connector_instance.verify(
            resource_type=resource_type,
            resource_id=resource_id,
            list_resources=list_resources,
        )

    if connector_resources.error:
        raise AuthorizationException(connector_resources.error)

    return connector_resources
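
A minimal usage sketch; the connector name, resource type, and resource ID are placeholders (here assuming an S3-capable connector):

from zenml.client import Client

resources = Client().verify_service_connector(
    name_id_or_prefix="my-aws-connector",
    resource_type="s3-bucket",
    resource_id="s3://my-bucket",
)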

ClientConfiguration

Bases: FileSyncModel

Pydantic object used for serializing client configuration options.

Source code in src/zenml/client.py
class ClientConfiguration(FileSyncModel):
    """Pydantic object used for serializing client configuration options."""

    _active_project: Optional["ProjectResponse"] = None
    active_project_id: Optional[UUID] = None
    active_stack_id: Optional[UUID] = None
    _active_stack: Optional["StackResponse"] = None

    @property
    def active_project(self) -> "ProjectResponse":
        """Get the active project for the local client.

        Returns:
            The active project.

        Raises:
            RuntimeError: If no active project is set.
        """
        if self._active_project:
            return self._active_project
        else:
            raise RuntimeError(
                "No active project is configured. Run "
                "`zenml project set <NAME>` to set the active "
                "project."
            )

    def set_active_project(self, project: "ProjectResponse") -> None:
        """Set the project for the local client.

        Args:
            project: The project to set active.
        """
        self._active_project = project
        self.active_project_id = project.id

    def set_active_stack(self, stack: "StackResponse") -> None:
        """Set the stack for the local client.

        Args:
            stack: The stack to set active.
        """
        self.active_stack_id = stack.id
        self._active_stack = stack

    model_config = ConfigDict(
        # Validate attributes when assigning them. We need to set this in order
        # to have a mix of mutable and immutable attributes
        validate_assignment=True,
        # Allow extra attributes from configs of previous ZenML versions to
        # permit downgrading
        extra="allow",
    )

active_project property

Get the active project for the local client.

Returns:

Type Description
ProjectResponse

The active project.

Raises:

Type Description
RuntimeError

If no active project is set.

set_active_project(project)

Set the project for the local client.

Parameters:

Name Type Description Default
project ProjectResponse

The project to set active.

required
Source code in src/zenml/client.py
242
243
244
245
246
247
248
249
def set_active_project(self, project: "ProjectResponse") -> None:
    """Set the project for the local client.

    Args:
        project: The project to set active.
    """
    self._active_project = project
    self.active_project_id = project.id

set_active_stack(stack)

Set the stack for the local client.

Parameters:

Name Type Description Default
stack StackResponse

The stack to set active.

required
Source code in src/zenml/client.py
251
252
253
254
255
256
257
258
def set_active_stack(self, stack: "StackResponse") -> None:
    """Set the stack for the local client.

    Args:
        stack: The stack to set active.
    """
    self.active_stack_id = stack.id
    self._active_stack = stack

ClientMetaClass

Bases: ABCMeta

Client singleton metaclass.

This metaclass is used to enforce a singleton instance of the Client class with the following additional properties:

  • the singleton Client instance is created on first access to reflect the global configuration and local client configuration.
  • the Client shouldn't be accessed from within pipeline steps (a warning is logged if this is attempted).
Source code in src/zenml/client.py
class ClientMetaClass(ABCMeta):
    """Client singleton metaclass.

    This metaclass is used to enforce a singleton instance of the Client
    class with the following additional properties:

    * the singleton Client instance is created on first access to reflect
    the global configuration and local client configuration.
    * the Client shouldn't be accessed from within pipeline steps (a warning
    is logged if this is attempted).
    """

    def __init__(cls, *args: Any, **kwargs: Any) -> None:
        """Initialize the Client class.

        Args:
            *args: Positional arguments.
            **kwargs: Keyword arguments.
        """
        super().__init__(*args, **kwargs)
        cls._global_client: Optional["Client"] = None

    def __call__(cls, *args: Any, **kwargs: Any) -> "Client":
        """Create or return the global Client instance.

        If the Client constructor is called with custom arguments,
        the singleton functionality of the metaclass is bypassed: a new
        Client instance is created and returned immediately and without
        saving it as the global Client singleton.

        Args:
            *args: Positional arguments.
            **kwargs: Keyword arguments.

        Returns:
            Client: The global Client instance.
        """
        if args or kwargs:
            return cast("Client", super().__call__(*args, **kwargs))

        if not cls._global_client:
            cls._global_client = cast(
                "Client", super().__call__(*args, **kwargs)
            )

        return cls._global_client
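
In practice this means that argument-less calls always resolve to the same instance, for example:

from zenml.client import Client

# Both calls return the same global singleton.
assert Client() is Client()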

__call__(*args, **kwargs)

Create or return the global Client instance.

If the Client constructor is called with custom arguments, the singleton functionality of the metaclass is bypassed: a new Client instance is created and returned immediately and without saving it as the global Client singleton.

Parameters:

Name Type Description Default
*args Any

Positional arguments.

()
**kwargs Any

Keyword arguments.

{}

Returns:

Name Type Description
Client Client

The global Client instance.

Source code in src/zenml/client.py
def __call__(cls, *args: Any, **kwargs: Any) -> "Client":
    """Create or return the global Client instance.

    If the Client constructor is called with custom arguments,
    the singleton functionality of the metaclass is bypassed: a new
    Client instance is created and returned immediately and without
    saving it as the global Client singleton.

    Args:
        *args: Positional arguments.
        **kwargs: Keyword arguments.

    Returns:
        Client: The global Client instance.
    """
    if args or kwargs:
        return cast("Client", super().__call__(*args, **kwargs))

    if not cls._global_client:
        cls._global_client = cast(
            "Client", super().__call__(*args, **kwargs)
        )

    return cls._global_client

__init__(*args, **kwargs)

Initialize the Client class.

Parameters:

Name Type Description Default
*args Any

Positional arguments.

()
**kwargs Any

Keyword arguments.

{}
Source code in src/zenml/client.py
def __init__(cls, *args: Any, **kwargs: Any) -> None:
    """Initialize the Client class.

    Args:
        *args: Positional arguments.
        **kwargs: Keyword arguments.
    """
    super().__init__(*args, **kwargs)
    cls._global_client: Optional["Client"] = None

Code Repositories

Initialization of the ZenML code repository base abstraction.

BaseCodeRepository

Bases: ABC

Base class for code repositories.

Code repositories are used to connect to a remote code repository and store information about the repository, such as the URL, the owner, the repository name, and the host. They also provide methods to download files from the repository when a pipeline is run remotely.

Source code in src/zenml/code_repositories/base_code_repository.py
class BaseCodeRepository(ABC):
    """Base class for code repositories.

    Code repositories are used to connect to a remote code repository and
    store information about the repository, such as the URL, the owner,
    the repository name, and the host. They also provide methods to
    download files from the repository when a pipeline is run remotely.
    """

    def __init__(
        self,
        id: UUID,
        name: str,
        config: Dict[str, Any],
    ) -> None:
        """Initializes a code repository.

        Args:
            id: The ID of the code repository.
            name: The name of the code repository.
            config: The config of the code repository.
        """
        self._id = id
        self._name = name
        self._config = config
        self.login()

    @property
    def config(self) -> "BaseCodeRepositoryConfig":
        """Config class for Code Repository.

        Returns:
            The config class.
        """
        return BaseCodeRepositoryConfig(**self._config)

    @classmethod
    def from_model(cls, model: CodeRepositoryResponse) -> "BaseCodeRepository":
        """Loads a code repository from a model.

        Args:
            model: The CodeRepositoryResponseModel to load from.

        Returns:
            The loaded code repository object.
        """
        class_: Type[BaseCodeRepository] = (
            source_utils.load_and_validate_class(
                source=model.source, expected_class=BaseCodeRepository
            )
        )
        return class_(id=model.id, name=model.name, config=model.config)

    @classmethod
    def validate_config(cls, config: Dict[str, Any]) -> None:
        """Validate the code repository config.

        This method should check that the config/credentials are valid and
        the configured repository exists.

        Args:
            config: The configuration.
        """
        # The initialization calls the login to verify the credentials
        code_repo = cls(id=uuid4(), name="", config=config)

        # Explicitly access the config for pydantic validation
        _ = code_repo.config

    @property
    def id(self) -> UUID:
        """ID of the code repository.

        Returns:
            The ID of the code repository.
        """
        return self._id

    @property
    def name(self) -> str:
        """Name of the code repository.

        Returns:
            The name of the code repository.
        """
        return self._name

    @property
    def requirements(self) -> Set[str]:
        """Set of PyPI requirements for the repository.

        Returns:
            A set of PyPI requirements for the repository.
        """
        from zenml.integrations.utils import get_requirements_for_module

        return set(get_requirements_for_module(self.__module__))

    @abstractmethod
    def login(self) -> None:
        """Logs into the code repository.

        This method is called when the code repository is initialized.
        It should be used to authenticate with the code repository.

        Raises:
            RuntimeError: If the login fails.
        """
        pass

    @abstractmethod
    def download_files(
        self, commit: str, directory: str, repo_sub_directory: Optional[str]
    ) -> None:
        """Downloads files from the code repository to a local directory.

        Args:
            commit: The commit hash to download files from.
            directory: The directory to download files to.
            repo_sub_directory: The subdirectory in the repository to
                download files from.

        Raises:
            RuntimeError: If the download fails.
        """
        pass

    @abstractmethod
    def get_local_context(
        self, path: str
    ) -> Optional["LocalRepositoryContext"]:
        """Gets a local repository context from a path.

        Args:
            path: The path to the local repository.

        Returns:
            The local repository context object.
        """
        pass
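
A minimal subclass sketch; the class name is hypothetical and the import path assumes both base classes are exported from zenml.code_repositories, as the source paths in this section suggest. A real implementation fills in the bodies:

from typing import Optional

from zenml.code_repositories import BaseCodeRepository, LocalRepositoryContext


class MyCodeRepository(BaseCodeRepository):
    """Hypothetical repository backed by an internal Git server."""

    def login(self) -> None:
        # Authenticate with the credentials from self.config and raise
        # RuntimeError if the credentials are invalid.
        ...

    def download_files(
        self, commit: str, directory: str, repo_sub_directory: Optional[str]
    ) -> None:
        # Fetch the files at `commit` (optionally only `repo_sub_directory`)
        # into the local `directory`.
        ...

    def get_local_context(self, path: str) -> Optional[LocalRepositoryContext]:
        # Return a LocalRepositoryContext if `path` lies inside a local
        # checkout of this repository, otherwise None.
        return None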

config property

Config class for Code Repository.

Returns:

Type Description
BaseCodeRepositoryConfig

The config class.

id property

ID of the code repository.

Returns:

Type Description
UUID

The ID of the code repository.

name property

Name of the code repository.

Returns:

Type Description
str

The name of the code repository.

requirements property

Set of PyPI requirements for the repository.

Returns:

Type Description
Set[str]

A set of PyPI requirements for the repository.

__init__(id, name, config)

Initializes a code repository.

Parameters:

Name Type Description Default
id UUID

The ID of the code repository.

required
name str

The name of the code repository.

required
config Dict[str, Any]

The config of the code repository.

required
Source code in src/zenml/code_repositories/base_code_repository.py
def __init__(
    self,
    id: UUID,
    name: str,
    config: Dict[str, Any],
) -> None:
    """Initializes a code repository.

    Args:
        id: The ID of the code repository.
        name: The name of the code repository.
        config: The config of the code repository.
    """
    self._id = id
    self._name = name
    self._config = config
    self.login()

download_files(commit, directory, repo_sub_directory) abstractmethod

Downloads files from the code repository to a local directory.

Parameters:

Name Type Description Default
commit str

The commit hash to download files from.

required
directory str

The directory to download files to.

required
repo_sub_directory Optional[str]

The subdirectory in the repository to download files from.

required

Raises:

Type Description
RuntimeError

If the download fails.

Source code in src/zenml/code_repositories/base_code_repository.py
@abstractmethod
def download_files(
    self, commit: str, directory: str, repo_sub_directory: Optional[str]
) -> None:
    """Downloads files from the code repository to a local directory.

    Args:
        commit: The commit hash to download files from.
        directory: The directory to download files to.
        repo_sub_directory: The subdirectory in the repository to
            download files from.

    Raises:
        RuntimeError: If the download fails.
    """
    pass

from_model(model) classmethod

Loads a code repository from a model.

Parameters:

Name Type Description Default
model CodeRepositoryResponse

The CodeRepositoryResponseModel to load from.

required

Returns:

Type Description
BaseCodeRepository

The loaded code repository object.

Source code in src/zenml/code_repositories/base_code_repository.py
@classmethod
def from_model(cls, model: CodeRepositoryResponse) -> "BaseCodeRepository":
    """Loads a code repository from a model.

    Args:
        model: The CodeRepositoryResponseModel to load from.

    Returns:
        The loaded code repository object.
    """
    class_: Type[BaseCodeRepository] = (
        source_utils.load_and_validate_class(
            source=model.source, expected_class=BaseCodeRepository
        )
    )
    return class_(id=model.id, name=model.name, config=model.config)

get_local_context(path) abstractmethod

Gets a local repository context from a path.

Parameters:

Name Type Description Default
path str

The path to the local repository.

required

Returns:

Type Description
Optional[LocalRepositoryContext]

The local repository context object.

Source code in src/zenml/code_repositories/base_code_repository.py
@abstractmethod
def get_local_context(
    self, path: str
) -> Optional["LocalRepositoryContext"]:
    """Gets a local repository context from a path.

    Args:
        path: The path to the local repository.

    Returns:
        The local repository context object.
    """
    pass

login() abstractmethod

Logs into the code repository.

This method is called when the code repository is initialized. It should be used to authenticate with the code repository.

Raises:

Type Description
RuntimeError

If the login fails.

Source code in src/zenml/code_repositories/base_code_repository.py
133
134
135
136
137
138
139
140
141
142
143
@abstractmethod
def login(self) -> None:
    """Logs into the code repository.

    This method is called when the code repository is initialized.
    It should be used to authenticate with the code repository.

    Raises:
        RuntimeError: If the login fails.
    """
    pass

validate_config(config) classmethod

Validate the code repository config.

This method should check that the config/credentials are valid and the configured repository exists.

Parameters:

Name Type Description Default
config Dict[str, Any]

The configuration.

required
Source code in src/zenml/code_repositories/base_code_repository.py, lines 88-102
@classmethod
def validate_config(cls, config: Dict[str, Any]) -> None:
    """Validate the code repository config.

    This method should check that the config/credentials are valid and
    the configured repository exists.

    Args:
        config: The configuration.
    """
    # The initialization calls the login to verify the credentials
    code_repo = cls(id=uuid4(), name="", config=config)

    # Explicitly access the config for pydantic validation
    _ = code_repo.config
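
For illustration, the following minimal sketch shows how a custom code repository might subclass BaseCodeRepository and implement the abstract methods documented above. The class name and the method bodies are hypothetical placeholders, not part of ZenML.

from typing import Optional

from zenml.code_repositories import BaseCodeRepository, LocalRepositoryContext


class MyCodeRepository(BaseCodeRepository):
    """Hypothetical code repository talking to an internal Git server."""

    def login(self) -> None:
        # Authenticate with the repository; raise RuntimeError on failure.
        ...

    def download_files(
        self, commit: str, directory: str, repo_sub_directory: Optional[str]
    ) -> None:
        # Fetch the files at `commit` (optionally only `repo_sub_directory`)
        # into the local `directory`; raise RuntimeError if the download fails.
        ...

    def get_local_context(self, path: str) -> Optional[LocalRepositoryContext]:
        # Return a LocalRepositoryContext for a local checkout at `path`,
        # or None if `path` is not a checkout of this repository.
        return None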

LocalRepositoryContext

Bases: ABC

Base class for local repository contexts.

This class is used to represent a local repository. It is used to track the current state of the repository and to provide information about the repository, such as the root path, the current commit, and whether the repository is dirty.

Source code in src/zenml/code_repositories/local_repository_context.py, lines 27-97
class LocalRepositoryContext(ABC):
    """Base class for local repository contexts.

    This class is used to represent a local repository. It is used
    to track the current state of the repository and to provide
    information about the repository, such as the root path, the current
    commit, and whether the repository is dirty.
    """

    def __init__(self, code_repository: "BaseCodeRepository") -> None:
        """Initializes a local repository context.

        Args:
            code_repository: The code repository.
        """
        self._code_repository = code_repository

    @property
    def code_repository(self) -> "BaseCodeRepository":
        """Returns the code repository.

        Returns:
            The code repository.
        """
        return self._code_repository

    @property
    @abstractmethod
    def root(self) -> str:
        """Returns the root path of the local repository.

        Returns:
            The root path of the local repository.
        """
        pass

    @property
    @abstractmethod
    def is_dirty(self) -> bool:
        """Returns whether the local repository is dirty.

        A repository counts as dirty if it has any untracked or uncommitted
        changes.

        Returns:
            Whether the local repository is dirty.
        """
        pass

    @property
    @abstractmethod
    def has_local_changes(self) -> bool:
        """Returns whether the local repository has local changes.

        A repository has local changes if it is dirty or there are some commits
        which have not been pushed yet.

        Returns:
            Whether the local repository has local changes.
        """
        pass

    @property
    @abstractmethod
    def current_commit(self) -> str:
        """Returns the current commit of the local repository.

        Returns:
            The current commit of the local repository.
        """
        pass

code_repository property

Returns the code repository.

Returns:

Type Description
BaseCodeRepository

The code repository.

current_commit abstractmethod property

Returns the current commit of the local repository.

Returns:

Type Description
str

The current commit of the local repository.

has_local_changes abstractmethod property

Returns whether the local repository has local changes.

A repository has local changes if it is dirty or there are some commits which have not been pushed yet.

Returns:

Type Description
bool

Whether the local repository has local changes.

is_dirty abstractmethod property

Returns whether the local repository is dirty.

A repository counts as dirty if it has any untracked or uncommitted changes.

Returns:

Type Description
bool

Whether the local repository is dirty.

root abstractmethod property

Returns the root path of the local repository.

Returns:

Type Description
str

The root path of the local repository.

__init__(code_repository)

Initializes a local repository context.

Parameters:

Name Type Description Default
code_repository BaseCodeRepository

The code repository.

required
Source code in src/zenml/code_repositories/local_repository_context.py, lines 36-42
def __init__(self, code_repository: "BaseCodeRepository") -> None:
    """Initializes a local repository context.

    Args:
        code_repository: The code repository.
    """
    self._code_repository = code_repository
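
As a companion to the code repository sketch above, this hypothetical LocalRepositoryContext subclass shows how the abstract properties could be backed by a simple local checkout. The class name and the extra constructor arguments are illustrative only.

from zenml.code_repositories import BaseCodeRepository, LocalRepositoryContext


class MyLocalContext(LocalRepositoryContext):
    """Hypothetical context describing a local checkout."""

    def __init__(
        self, code_repository: BaseCodeRepository, root: str, commit: str
    ) -> None:
        super().__init__(code_repository=code_repository)
        self._root = root
        self._commit = commit

    @property
    def root(self) -> str:
        return self._root

    @property
    def is_dirty(self) -> bool:
        # A real implementation would check for untracked or uncommitted changes.
        return False

    @property
    def has_local_changes(self) -> bool:
        # Dirty working tree or commits that have not been pushed yet.
        return self.is_dirty

    @property
    def current_commit(self) -> str:
        return self._commit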

Config

The config module contains classes and functions that manage user-specific configuration.

ZenML's configuration is stored in a file called config.yaml, located in the user's configuration directory. (The exact location differs from operating system to operating system.)

The GlobalConfiguration class is the main class in this module. It provides a Pydantic configuration object that is used to store and retrieve configuration. This GlobalConfiguration object handles the serialization and deserialization of the configuration options that are stored in the file in order to persist the configuration across sessions.
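
A minimal usage sketch, assuming the GlobalConfiguration singleton lives in zenml.config.global_config; the attribute accessed below is only an example:

from zenml.config.global_config import GlobalConfiguration

config = GlobalConfiguration()  # singleton backed by the user's config.yaml
print(config.version)           # ZenML version the configuration was written with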

DockerSettings

Bases: BaseSettings

Settings for building Docker images to run ZenML pipelines.

Build process:

  • No dockerfile specified: If any of the options regarding requirements, environment variables or copying files require us to build an image, ZenML will build this image. Otherwise, the parent_image will be used to run the pipeline.
  • dockerfile specified: ZenML will first build an image based on the specified Dockerfile. If any of the options regarding requirements, environment variables or copying files require an additional image built on top of that, ZenML will build a second image. If not, the image build from the specified Dockerfile will be used to run the pipeline.

Requirements installation order:

Depending on the configuration of this object, requirements will be installed in the following order (each step optional):

  • The packages installed in your local Python environment (extracted using pip freeze)
  • The packages required by the stack, unless this is disabled by setting install_stack_requirements=False
  • The packages specified via the required_integrations
  • The packages defined inside a pyproject.toml file given by the pyproject_path attribute
  • The packages specified via the requirements attribute

If none of replicate_local_python_environment, pyproject_path, or requirements is specified, ZenML will try to automatically find a requirements.txt or pyproject.toml file in your current source root and install packages from the first one it finds. You can disable this behavior by setting disable_automatic_requirements_detection=True.

Attributes:

Name Type Description
parent_image Optional[str]

Full name of the Docker image that should be used as the parent for the image that will be built. Defaults to a ZenML image built for the active Python and ZenML version.

Additional notes:

  • If you specify a custom image here, you need to make sure it has ZenML installed.
  • If this is a non-local image, the environment which is running the pipeline and building the Docker image needs to be able to pull this image.
  • If a custom dockerfile is specified for this settings object, this parent image will be ignored.

dockerfile Optional[str]

Path to a custom Dockerfile that should be built. Depending on the other values you specify in this object, the resulting image will be used directly to run your pipeline or ZenML will use it as a parent image to build on top of. See the general docstring of this class for more information.

Additional notes:

  • If you specify this, the parent_image attribute will be ignored.
  • If you specify this, the image built from this Dockerfile needs to have ZenML installed.

build_context_root Optional[str]

Build context root for the Docker build, only used when the dockerfile attribute is set. If this is left empty, the build context will only contain the Dockerfile.

parent_image_build_config Optional[DockerBuildConfig]

Configuration for the parent image build.

skip_build bool

If set to True, the parent image will be used directly to run the steps of your pipeline.

prevent_build_reuse bool

Prevent the reuse of an existing build.

target_repository Optional[str]

Name of the Docker repository to which the image should be pushed. This repository will be appended to the registry URI of the container registry of your stack and should therefore not include any registry. If not specified, the default repository name configured in the container registry stack component settings will be used.

python_package_installer PythonPackageInstaller

The package installer to use for python packages.

python_package_installer_args Dict[str, Any]

Arguments to pass to the python package installer.

disable_automatic_requirements_detection bool

If set to True, ZenML will not automatically detect requirements.txt files or pyproject.toml files in your source root.

replicate_local_python_environment Optional[Union[List[str], PythonEnvironmentExportMethod, bool]]

If set to True, ZenML will run pip freeze to gather the requirements of the local Python environment and then install them in the Docker image.

pyproject_path Optional[str]

Path to a pyproject.toml file. If given, the dependencies will be exported to a requirements.txt formatted file using the pyproject_export_command and then installed inside the Docker image.

pyproject_export_command Optional[List[str]]

Command to export the dependencies inside a pyproject.toml file to a requirements.txt formatted file. If not given and ZenML needs to export the requirements anyway, uv export and poetry export will be tried to see if one of them works. This command can contain a {directory} placeholder which will be replaced with the directory in which the pyproject.toml file is stored. Note: This command will be run before any code files are copied into the image. It is therefore not possible to install a local project using this command. This command should exclude any local projects, and you can specify a local_project_install_command instead which will be run after the code files are copied into the image.

requirements Union[None, str, List[str]]

Path to a requirements file or a list of required pip packages. During the image build, these requirements will be installed using pip. If you need to use a different tool to resolve and/or install your packages, please use a custom parent image or specify a custom dockerfile.

required_integrations List[str]

List of ZenML integrations that should be installed. All requirements for the specified integrations will be installed inside the Docker image.

install_stack_requirements bool

If True, ZenML will automatically detect if components of your active stack are part of a ZenML integration and install the corresponding requirements and apt packages. If you set this to False or use custom components in your stack, you need to make sure these get installed by specifying them in the requirements and apt_packages attributes.

local_project_install_command Optional[str]

Command to install a local project in the Docker image. This is run after the code files are copied into the image, and it is therefore only possible when code is included in the image, not downloaded at runtime.

apt_packages List[str]

APT packages to install inside the Docker image.

environment Dict[str, Any]

Dictionary of environment variables to set inside the Docker image.

build_config Optional[DockerBuildConfig]

Configuration for the main image build.

user Optional[str]

If not None, will set the user, make it owner of the /app directory which contains all the user code and run the container entrypoint as this user.

allow_including_files_in_images bool

If True, code can be included in the Docker images if code download from a code repository or artifact store is disabled or not possible.

allow_download_from_code_repository bool

If True, code can be downloaded from a code repository if possible.

allow_download_from_artifact_store bool

If True, code can be downloaded from the artifact store.

Source code in src/zenml/config/docker_settings.py, lines 78-432
class DockerSettings(BaseSettings):
    """Settings for building Docker images to run ZenML pipelines.

    Build process:
    --------------
    * No `dockerfile` specified: If any of the options regarding
    requirements, environment variables or copying files require us to build an
    image, ZenML will build this image. Otherwise, the `parent_image` will be
    used to run the pipeline.
    * `dockerfile` specified: ZenML will first build an image based on the
    specified Dockerfile. If any of the options regarding
    requirements, environment variables or copying files require an additional
    image built on top of that, ZenML will build a second image. If not, the
    image build from the specified Dockerfile will be used to run the pipeline.

    Requirements installation order:
    --------------------------------
    Depending on the configuration of this object, requirements will be
    installed in the following order (each step optional):
    - The packages installed in your local python environment (extracted using
      `pip freeze`)
    - The packages required by the stack unless this is disabled by setting
      `install_stack_requirements=False`
    - The packages specified via the `required_integrations`
    - The packages defined inside a pyproject.toml file given by the
      `pyproject_path` attribute.
    - The packages specified via the `requirements` attribute

    If neither `replicate_local_python_environment`, `pyproject_path` or
    `requirements` are specified, ZenML will try to automatically find a
    requirements.txt or pyproject.toml file in your current source root
    and installs packages from the first one it finds. You can disable this
    behavior by setting `disable_automatic_requirements_detection=True`.

    Attributes:
        parent_image: Full name of the Docker image that should be
            used as the parent for the image that will be built. Defaults to
            a ZenML image built for the active Python and ZenML version.

            Additional notes:
            * If you specify a custom image here, you need to make sure it has
            ZenML installed.
            * If this is a non-local image, the environment which is running
            the pipeline and building the Docker image needs to be able to pull
            this image.
            * If a custom `dockerfile` is specified for this settings
            object, this parent image will be ignored.
        dockerfile: Path to a custom Dockerfile that should be built. Depending
            on the other values you specify in this object, the resulting
            image will be used directly to run your pipeline or ZenML will use
            it as a parent image to build on top of. See the general docstring
            of this class for more information.

            Additional notes:
            * If you specify this, the `parent_image` attribute will be ignored.
            * If you specify this, the image built from this Dockerfile needs
            to have ZenML installed.
        build_context_root: Build context root for the Docker build, only used
            when the `dockerfile` attribute is set. If this is left empty, the
            build context will only contain the Dockerfile.
        parent_image_build_config: Configuration for the parent image build.
        skip_build: If set to `True`, the parent image will be used directly to
            run the steps of your pipeline.
        prevent_build_reuse: Prevent the reuse of an existing build.
        target_repository: Name of the Docker repository to which the
            image should be pushed. This repository will be appended to the
            registry URI of the container registry of your stack and should
            therefore **not** include any registry. If not specified, the
            default repository name configured in the container registry
            stack component settings will be used.
        python_package_installer: The package installer to use for python
            packages.
        python_package_installer_args: Arguments to pass to the python package
            installer.
        disable_automatic_requirements_detection: If set to True, ZenML will
            not automatically detect requirements.txt files or pyproject.toml
            files in your source root.
        replicate_local_python_environment: If set to True, ZenML will run
            `pip freeze` to gather the requirements of the local Python
            environment and then install them in the Docker image.
        pyproject_path: Path to a pyproject.toml file. If given, the
            dependencies will be exported to a requirements.txt
            formatted file using the `pyproject_export_command` and then
            installed inside the Docker image.
        pyproject_export_command: Command to export the dependencies inside a
            pyproject.toml file to a requirements.txt formatted file. If not
            given and ZenML needs to export the requirements anyway, `uv export`
            and `poetry export` will be tried to see if one of them works. This
            command can contain a `{directory}` placeholder which will be
            replaced with the directory in which the pyproject.toml file is
            stored.
            **Note**: This command will be run before any code files are copied
            into the image. It is therefore not possible to install a local
            project using this command. This command should exclude any local
            projects, and you can specify a `local_project_install_command`
            instead which will be run after the code files are copied into the
            image.
        requirements: Path to a requirements file or a list of required pip
            packages. During the image build, these requirements will be
            installed using pip. If you need to use a different tool to
            resolve and/or install your packages, please use a custom parent
            image or specify a custom `dockerfile`.
        required_integrations: List of ZenML integrations that should be
            installed. All requirements for the specified integrations will
            be installed inside the Docker image.
        install_stack_requirements: If `True`, ZenML will automatically detect
            if components of your active stack are part of a ZenML integration
            and install the corresponding requirements and apt packages.
            If you set this to `False` or use custom components in your stack,
            you need to make sure these get installed by specifying them in
            the `requirements` and `apt_packages` attributes.
        local_project_install_command: Command to install a local project in
            the Docker image. This is run after the code files are copied into
            the image, and it is therefore only possible when code is included
            in the image, not downloaded at runtime.
        apt_packages: APT packages to install inside the Docker image.
        environment: Dictionary of environment variables to set inside the
            Docker image.
        build_config: Configuration for the main image build.
        user: If not `None`, will set the user, make it owner of the `/app`
            directory which contains all the user code and run the container
            entrypoint as this user.
        allow_including_files_in_images: If `True`, code can be included in the
            Docker images if code download from a code repository or artifact
            store is disabled or not possible.
        allow_download_from_code_repository: If `True`, code can be downloaded
            from a code repository if possible.
        allow_download_from_artifact_store: If `True`, code can be downloaded
            from the artifact store.
    """

    parent_image: Optional[str] = None
    dockerfile: Optional[str] = None
    build_context_root: Optional[str] = None
    parent_image_build_config: Optional[DockerBuildConfig] = None
    skip_build: bool = False
    prevent_build_reuse: bool = False
    target_repository: Optional[str] = None
    python_package_installer: PythonPackageInstaller = (
        PythonPackageInstaller.PIP
    )
    python_package_installer_args: Dict[str, Any] = {}
    disable_automatic_requirements_detection: bool = True
    replicate_local_python_environment: Optional[
        Union[List[str], PythonEnvironmentExportMethod, bool]
    ] = Field(default=None, union_mode="left_to_right")
    pyproject_path: Optional[str] = None
    pyproject_export_command: Optional[List[str]] = None
    requirements: Union[None, str, List[str]] = Field(
        default=None, union_mode="left_to_right"
    )
    required_integrations: List[str] = []
    install_stack_requirements: bool = True
    local_project_install_command: Optional[str] = None
    apt_packages: List[str] = []
    environment: Dict[str, Any] = {}
    user: Optional[str] = None
    build_config: Optional[DockerBuildConfig] = None

    allow_including_files_in_images: bool = True
    allow_download_from_code_repository: bool = True
    allow_download_from_artifact_store: bool = True

    # Deprecated attributes
    build_options: Dict[str, Any] = {}
    dockerignore: Optional[str] = None
    copy_files: bool = True
    copy_global_config: bool = True
    source_files: Optional[str] = None
    required_hub_plugins: List[str] = []

    _deprecation_validator = deprecation_utils.deprecate_pydantic_attributes(
        "copy_files",
        "copy_global_config",
        "source_files",
        "required_hub_plugins",
        "build_options",
        "dockerignore",
    )

    @model_validator(mode="before")
    @classmethod
    @before_validator_handler
    def _migrate_source_files(cls, data: Dict[str, Any]) -> Dict[str, Any]:
        """Migrate old source_files values.

        Args:
            data: The model data.

        Raises:
            ValueError: If an invalid source file mode is specified.

        Returns:
            The migrated data.
        """
        source_files = data.get("source_files", None)

        if source_files is None:
            return data

        replacement_attributes = [
            "allow_including_files_in_images",
            "allow_download_from_code_repository",
            "allow_download_from_artifact_store",
        ]
        if any(v in data for v in replacement_attributes):
            logger.warning(
                "Both `source_files` and one of %s specified for the "
                "DockerSettings, ignoring the `source_files` value.",
                replacement_attributes,
            )
            return data

        allow_including_files_in_images = False
        allow_download_from_code_repository = False
        allow_download_from_artifact_store = False

        if source_files == "download":
            allow_download_from_code_repository = True
        elif source_files == "include":
            allow_including_files_in_images = True
        elif source_files == "download_or_include":
            allow_including_files_in_images = True
            allow_download_from_code_repository = True
        elif source_files == "ignore":
            pass
        else:
            raise ValueError(f"Invalid source file mode `{source_files}`.")

        data["allow_including_files_in_images"] = (
            allow_including_files_in_images
        )
        data["allow_download_from_code_repository"] = (
            allow_download_from_code_repository
        )
        data["allow_download_from_artifact_store"] = (
            allow_download_from_artifact_store
        )

        return data

    @model_validator(mode="after")
    def _validate_skip_build(self) -> "DockerSettings":
        """Ensures that a parent image is passed when trying to skip the build.

        Returns:
            The validated settings values.

        Raises:
            ValueError: If the build should be skipped but no parent image
                was specified.
        """
        if self.skip_build and not self.parent_image:
            raise ValueError(
                "Docker settings that specify `skip_build=True` must always "
                "contain a `parent_image`. This parent image will be used "
                "to run the steps of your pipeline directly without additional "
                "Docker builds on top of it."
            )

        return self

    @model_validator(mode="after")
    def _validate_code_files_included_if_installing_local_project(
        self,
    ) -> "DockerSettings":
        """Ensures that files are included when installing a local package.

        Raises:
            ValueError: If files are not included in the Docker image
                when trying to install a local package.

        Returns:
            The validated settings values.
        """
        if (
            self.local_project_install_command
            and not self.allow_including_files_in_images
        ):
            raise ValueError(
                "Files must be included in the Docker image when trying to "
                "install a local python package. You can do so by setting "
                "the `allow_including_files_in_images` attribute of your "
                "DockerSettings to `True`."
            )

        return self

    @model_validator(mode="after")
    def _deprecate_replicate_local_environment_commands(
        self,
    ) -> "DockerSettings":
        """Deprecates some values for `replicate_local_python_environment`.

        Returns:
            The validated settings values.
        """
        if isinstance(
            self.replicate_local_python_environment,
            (str, list, PythonEnvironmentExportMethod),
        ) and (
            "replicate_local_python_environment"
            not in _docker_settings_warnings_logged
        ):
            logger.warning(
                "Specifying a command (`%s`) for "
                "`DockerSettings.replicate_local_python_environment` is "
                "deprecated. If you want to replicate your exact local "
                "environment using `pip freeze`, set "
                "`DockerSettings.replicate_local_python_environment=True`. "
                "If you want to export requirements from a pyproject.toml "
                "file, use `DockerSettings.pyproject_path` and "
                "`DockerSettings.pyproject_export_command` instead."
            )
            _docker_settings_warnings_logged.append(
                "replicate_local_python_environment"
            )
        return self

    @model_validator(mode="before")
    @classmethod
    @before_validator_handler
    def _warn_about_future_default_installer(
        cls, data: Dict[str, Any]
    ) -> Dict[str, Any]:
        """Warns about the future change of default package installer from pip to uv.

        Args:
            data: The model data.

        Returns:
            The validated settings values.
        """
        if (
            "python_package_installer" not in data
            and "python_package_installer"
            not in _docker_settings_warnings_logged
        ):
            logger.warning(
                "In a future release, the default Python package installer "
                "used by ZenML to build container images for your "
                "containerized pipelines will change from 'pip' to 'uv'. "
                "To maintain current behavior, you can explicitly set "
                "`python_package_installer=PythonPackageInstaller.PIP` "
                "in your DockerSettings."
            )
            _docker_settings_warnings_logged.append("python_package_installer")
        return data

    model_config = ConfigDict(
        # public attributes are immutable
        frozen=True,
        # prevent extra attributes during model initialization
        extra="ignore",
    )
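
A short usage sketch for these settings; the pipeline name and the packages listed are illustrative only:

from zenml import pipeline
from zenml.config import DockerSettings

docker_settings = DockerSettings(
    requirements=["scikit-learn"],        # installed with pip inside the image
    apt_packages=["git"],                 # installed via apt
    environment={"MY_ENV_VAR": "value"},  # set inside the image
)


@pipeline(settings={"docker": docker_settings})
def my_pipeline() -> None:
    ...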

ResourceSettings

Bases: BaseSettings

Hardware resource settings.

Attributes:

Name Type Description
cpu_count Optional[PositiveFloat]

The amount of CPU cores that should be configured.

gpu_count Optional[NonNegativeInt]

The amount of GPUs that should be configured.

memory Optional[str]

The amount of memory that should be configured.

Source code in src/zenml/config/resource_settings.py, lines 63-123
class ResourceSettings(BaseSettings):
    """Hardware resource settings.

    Attributes:
        cpu_count: The amount of CPU cores that should be configured.
        gpu_count: The amount of GPUs that should be configured.
        memory: The amount of memory that should be configured.
    """

    cpu_count: Optional[PositiveFloat] = None
    gpu_count: Optional[NonNegativeInt] = None
    memory: Optional[str] = Field(pattern=MEMORY_REGEX, default=None)

    @property
    def empty(self) -> bool:
        """Returns if this object is "empty" (=no values configured) or not.

        Returns:
            `True` if no values were configured, `False` otherwise.
        """
        # To detect whether this config is empty (= no values specified), we
        # check if there are any attributes which are explicitly set to any
        # value other than `None`.
        return len(self.model_dump(exclude_unset=True, exclude_none=True)) == 0

    def get_memory(
        self, unit: Union[str, ByteUnit] = ByteUnit.GB
    ) -> Optional[float]:
        """Gets the memory configuration in a specific unit.

        Args:
            unit: The unit to which the memory should be converted.

        Raises:
            ValueError: If the memory string is invalid.

        Returns:
            The memory configuration converted to the requested unit, or None
            if no memory was configured.
        """
        if not self.memory:
            return None

        if isinstance(unit, str):
            unit = ByteUnit(unit)

        memory = self.memory
        for memory_unit in ByteUnit:
            if memory.endswith(memory_unit.value):
                memory_value = int(memory[: -len(memory_unit.value)])
                return memory_value * memory_unit.byte_value / unit.byte_value
        else:
            # Should never happen due to the regex validation
            raise ValueError(f"Unable to parse memory unit from '{memory}'.")

    model_config = SettingsConfigDict(
        # public attributes are immutable
        frozen=True,
        # prevent extra attributes during model initialization
        extra="ignore",
    )

empty property

Returns if this object is "empty" (=no values configured) or not.

Returns:

Type Description
bool

True if no values were configured, False otherwise.

get_memory(unit=ByteUnit.GB)

Gets the memory configuration in a specific unit.

Parameters:

Name Type Description Default
unit Union[str, ByteUnit]

The unit to which the memory should be converted.

GB

Raises:

Type Description
ValueError

If the memory string is invalid.

Returns:

Type Description
Optional[float]

The memory configuration converted to the requested unit, or None if no memory was configured.

Source code in src/zenml/config/resource_settings.py, lines 88-116
def get_memory(
    self, unit: Union[str, ByteUnit] = ByteUnit.GB
) -> Optional[float]:
    """Gets the memory configuration in a specific unit.

    Args:
        unit: The unit to which the memory should be converted.

    Raises:
        ValueError: If the memory string is invalid.

    Returns:
        The memory configuration converted to the requested unit, or None
        if no memory was configured.
    """
    if not self.memory:
        return None

    if isinstance(unit, str):
        unit = ByteUnit(unit)

    memory = self.memory
    for memory_unit in ByteUnit:
        if memory.endswith(memory_unit.value):
            memory_value = int(memory[: -len(memory_unit.value)])
            return memory_value * memory_unit.byte_value / unit.byte_value
    else:
        # Should never happen due to the regex validation
        raise ValueError(f"Unable to parse memory unit from '{memory}'.")
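
A brief usage sketch; the step name and the concrete resource values are illustrative:

from zenml import step
from zenml.config import ResourceSettings

resource_settings = ResourceSettings(cpu_count=4, gpu_count=1, memory="8GB")


@step(settings={"resources": resource_settings})
def train_model() -> None:
    ...


resource_settings.get_memory("MB")  # configured memory converted to megabytes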

StepRetryConfig

Bases: StrictBaseModel

Retry configuration for a step.

Delay is an integer (specified in seconds).

Source code in src/zenml/config/retry_config.py, lines 19-27
class StepRetryConfig(StrictBaseModel):
    """Retry configuration for a step.

    Delay is an integer (specified in seconds).
    """

    max_retries: int = 1
    delay: int = 0  # in seconds
    backoff: int = 0
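
For illustration, a step could opt into retries as in the hypothetical example below, assuming the retry parameter of the step decorator; the step name is a placeholder:

from zenml import step
from zenml.config.retry_config import StepRetryConfig


@step(retry=StepRetryConfig(max_retries=3, delay=10, backoff=2))
def flaky_step() -> None:
    # Retried up to 3 times, with a 10-second delay that grows by the backoff factor.
    ...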

Console

ZenML console implementation.

Constants

ZenML constants.

handle_bool_env_var(var, default=False)

Converts normal env var to boolean.

Parameters:

Name Type Description Default
var str

The environment variable to convert.

required
default bool

The default value to return if the env var is not set.

False

Returns:

Type Description
bool

The converted value.

Source code in src/zenml/constants.py, lines 96-111
def handle_bool_env_var(var: str, default: bool = False) -> bool:
    """Converts normal env var to boolean.

    Args:
        var: The environment variable to convert.
        default: The default value to return if the env var is not set.

    Returns:
        The converted value.
    """
    value = os.getenv(var)
    if is_true_string_value(value):
        return True
    elif is_false_string_value(value):
        return False
    return default
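
A small usage sketch; the environment variable names are hypothetical:

import os

from zenml.constants import handle_bool_env_var

os.environ["MY_FEATURE_FLAG"] = "yes"                   # "yes" counts as a true value
assert handle_bool_env_var("MY_FEATURE_FLAG") is True
assert handle_bool_env_var("UNSET_FLAG", default=True)  # unset variables fall back to the default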

handle_int_env_var(var, default=0)

Converts normal env var to int.

Parameters:

Name Type Description Default
var str

The environment variable to convert.

required
default int

The default value to return if the env var is not set.

0

Returns:

Type Description
int

The converted value.

Source code in src/zenml/constants.py, lines 114-128
def handle_int_env_var(var: str, default: int = 0) -> int:
    """Converts normal env var to int.

    Args:
        var: The environment variable to convert.
        default: The default value to return if the env var is not set.

    Returns:
        The converted value.
    """
    value = os.getenv(var, "")
    try:
        return int(value)
    except (ValueError, TypeError):
        return default

handle_json_env_var(var, expected_type, default=None)

Converts a json env var into a Python object.

Parameters:

Name Type Description Default
var str

The environment variable to convert.

required
default Optional[List[str]]

The default value to return if the env var is not set.

None
expected_type Type[T]

The type of the expected Python object.

required

Returns:

Type Description
Any

The converted list value.

Raises:

Type Description
TypeError

In case the value of the environment variable is not of a valid type.

Source code in src/zenml/constants.py, lines 26-69
def handle_json_env_var(
    var: str,
    expected_type: Type[T],
    default: Optional[List[str]] = None,
) -> Any:
    """Converts a json env var into a Python object.

    Args:
        var:  The environment variable to convert.
        default: The default value to return if the env var is not set.
        expected_type: The type of the expected Python object.

    Returns:
        The converted list value.

    Raises:
        TypeError: In case the value of the environment variable is not of a
                   valid type.

    """
    # this needs to be here to avoid mutable defaults
    if default is None:
        default = []

    value = os.getenv(var)
    if value:
        try:
            loaded_value = json.loads(value)
            # check if loaded value is of correct type
            if expected_type is None or isinstance(
                loaded_value, expected_type
            ):
                return loaded_value
            else:
                raise TypeError  # if not correct type, raise TypeError
        except (TypeError, json.JSONDecodeError):
            # Use raw logging to avoid cyclic dependency
            logging.warning(
                f"Environment Variable {var} could not be loaded, into type "
                f"{expected_type}, defaulting to: {default}."
            )
            return default
    else:
        return default
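
A usage sketch for the JSON helper; the variable name and value are hypothetical:

import os

from zenml.constants import handle_json_env_var

os.environ["MY_JSON_LIST"] = '["alpha", "beta"]'
values = handle_json_env_var("MY_JSON_LIST", expected_type=list)
# values == ["alpha", "beta"]; invalid JSON or a mismatched type falls back to the default ([])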

is_false_string_value(value)

Checks if the given value is a string representation of 'False'.

Parameters:

Name Type Description Default
value Any

the value to check.

required

Returns:

Type Description
bool

Whether the input value represents a string version of 'False'.

Source code in src/zenml/constants.py, lines 84-93
def is_false_string_value(value: Any) -> bool:
    """Checks if the given value is a string representation of 'False'.

    Args:
        value: the value to check.

    Returns:
        Whether the input value represents a string version of 'False'.
    """
    return value in ["0", "n", "no", "False", "false"]

is_true_string_value(value)

Checks if the given value is a string representation of 'True'.

Parameters:

Name Type Description Default
value Any

the value to check.

required

Returns:

Type Description
bool

Whether the input value represents a string version of 'True'.

Source code in src/zenml/constants.py, lines 72-81
def is_true_string_value(value: Any) -> bool:
    """Checks if the given value is a string representation of 'True'.

    Args:
        value: the value to check.

    Returns:
        Whether the input value represents a string version of 'True'.
    """
    return value in ["1", "y", "yes", "True", "true"]

Container Registries

Initialization for ZenML's container registries module.

A container registry is a store for (Docker) containers. A ZenML workflow involving a container registry would automatically containerize your code to be transported across stacks running remotely. As part of the deployment to the cluster, the ZenML base image would be downloaded (from a cloud container registry) and used as the basis for the deployed 'run'.

For instance, when you are running a local container-based stack, you would therefore have a local container registry which stores the container images you create that bundle up your pipeline code. You could also use a remote container registry like the Elastic Container Registry at AWS in a more production setting.

AzureContainerRegistryFlavor

Bases: BaseContainerRegistryFlavor

Class for Azure Container Registry.

Source code in src/zenml/container_registries/azure_container_registry.py, lines 26-82
class AzureContainerRegistryFlavor(BaseContainerRegistryFlavor):
    """Class for Azure Container Registry."""

    @property
    def name(self) -> str:
        """Name of the flavor.

        Returns:
            The name of the flavor.
        """
        return ContainerRegistryFlavor.AZURE.value

    @property
    def service_connector_requirements(
        self,
    ) -> Optional[ServiceConnectorRequirements]:
        """Service connector resource requirements for service connectors.

        Specifies resource requirements that are used to filter the available
        service connector types that are compatible with this flavor.

        Returns:
            Requirements for compatible service connectors, if a service
            connector is required for this flavor.
        """
        return ServiceConnectorRequirements(
            connector_type="azure",
            resource_type=DOCKER_REGISTRY_RESOURCE_TYPE,
            resource_id_attr="uri",
        )

    @property
    def docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_docs_url()

    @property
    def sdk_docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_sdk_docs_url()

    @property
    def logo_url(self) -> str:
        """A url to represent the flavor in the dashboard.

        Returns:
            The flavor logo.
        """
        return "https://public-flavor-logos.s3.eu-central-1.amazonaws.com/container_registry/azure.png"

docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

logo_url property

A url to represent the flavor in the dashboard.

Returns:

Type Description
str

The flavor logo.

name property

Name of the flavor.

Returns:

Type Description
str

The name of the flavor.

sdk_docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

service_connector_requirements property

Service connector resource requirements for service connectors.

Specifies resource requirements that are used to filter the available service connector types that are compatible with this flavor.

Returns:

Type Description
Optional[ServiceConnectorRequirements]

Requirements for compatible service connectors, if a service connector is required for this flavor.

BaseContainerRegistry

Bases: AuthenticationMixin

Base class for all ZenML container registries.

Source code in src/zenml/container_registries/base_container_registry.py, lines 69-233
class BaseContainerRegistry(AuthenticationMixin):
    """Base class for all ZenML container registries."""

    _docker_client: Optional["DockerClient"] = None

    @property
    def config(self) -> BaseContainerRegistryConfig:
        """Returns the `BaseContainerRegistryConfig` config.

        Returns:
            The configuration.
        """
        return cast(BaseContainerRegistryConfig, self._config)

    @property
    def requires_authentication(self) -> bool:
        """Returns whether the container registry requires authentication.

        Returns:
            `True` if the container registry requires authentication,
            `False` otherwise.
        """
        return bool(self.config.authentication_secret)

    @property
    def credentials(self) -> Optional[Tuple[str, str]]:
        """Username and password to authenticate with this container registry.

        Returns:
            Tuple with username and password if this container registry
            requires authentication, `None` otherwise.
        """
        secret = self.get_typed_authentication_secret(
            expected_schema_type=BasicAuthSecretSchema
        )
        if secret:
            return secret.username, secret.password

        connector = self.get_connector()
        if connector:
            from zenml.service_connectors.docker_service_connector import (
                DockerServiceConnector,
            )

            if isinstance(connector, DockerServiceConnector):
                return (
                    connector.config.username.get_secret_value(),
                    connector.config.password.get_secret_value(),
                )

        return None

    @property
    def docker_client(self) -> "DockerClient":
        """Returns a Docker client for this container registry.

        Returns:
            The Docker client.

        Raises:
            RuntimeError: If the connector does not return a Docker client.
        """
        from docker.client import DockerClient

        # Refresh the client also if the connector has expired
        if self._docker_client and not self.connector_has_expired():
            return self._docker_client

        connector = self.get_connector()
        if connector:
            client = connector.connect()
            if not isinstance(client, DockerClient):
                raise RuntimeError(
                    f"Expected a DockerClient while trying to use the "
                    f"linked connector, but got {type(client)}."
                )
            self._docker_client = client
        else:
            self._docker_client = (
                docker_utils._try_get_docker_client_from_env()
            )

            credentials = self.credentials
            if credentials:
                username, password = credentials
                self._docker_client.login(
                    username=username,
                    password=password,
                    registry=self.config.uri,
                    reauth=True,
                )

        return self._docker_client

    def is_valid_image_name_for_registry(self, image_name: str) -> bool:
        """Check if the image name is valid for the container registry.

        Args:
            image_name: The name of the image.

        Returns:
            `True` if the image name is valid for the container registry,
            `False` otherwise.
        """
        # Remove prefixes to make sure this logic also works for DockerHub
        image_name = image_name.removeprefix("index.docker.io/")
        image_name = image_name.removeprefix("docker.io/")

        registry_uri = self.config.uri.removeprefix("index.docker.io/")
        registry_uri = registry_uri.removeprefix("docker.io/")

        return image_name.startswith(registry_uri)

    def prepare_image_push(self, image_name: str) -> None:
        """Preparation before an image gets pushed.

        Subclasses can overwrite this to do any necessary checks or
        preparations before an image gets pushed.

        Args:
            image_name: Name of the docker image that will be pushed.
        """

    def push_image(self, image_name: str) -> str:
        """Pushes a docker image.

        Args:
            image_name: Name of the docker image that will be pushed.

        Returns:
            The Docker repository digest of the pushed image.

        Raises:
            ValueError: If the image name is not associated with this
                container registry.
        """
        if not self.is_valid_image_name_for_registry(image_name):
            raise ValueError(
                f"Docker image `{image_name}` does not belong to container "
                f"registry `{self.config.uri}`."
            )

        self.prepare_image_push(image_name)
        return docker_utils.push_image(
            image_name, docker_client=self.docker_client
        )

    def get_image_repo_digest(self, image_name: str) -> Optional[str]:
        """Get the repository digest of an image.

        Args:
            image_name: The name of the image.

        Returns:
            The repository digest of the image.
        """
        if not self.is_valid_image_name_for_registry(image_name):
            return None

        try:
            metadata = self.docker_client.images.get_registry_data(image_name)
        except Exception:
            return None

        return cast(str, metadata.id.split(":")[-1])

config property

Returns the BaseContainerRegistryConfig config.

Returns:

Type Description
BaseContainerRegistryConfig

The configuration.

credentials property

Username and password to authenticate with this container registry.

Returns:

Type Description
Optional[Tuple[str, str]]

Tuple with username and password if this container registry requires authentication, None otherwise.

docker_client property

Returns a Docker client for this container registry.

Returns:

Type Description
DockerClient

The Docker client.

Raises:

Type Description
RuntimeError

If the connector does not return a Docker client.

requires_authentication property

Returns whether the container registry requires authentication.

Returns:

Type Description
bool

True if the container registry requires authentication, False otherwise.

get_image_repo_digest(image_name)

Get the repository digest of an image.

Parameters:

Name Type Description Default
image_name str

The name of the image.

required

Returns:

Type Description
Optional[str]

The repository digest of the image.

Source code in src/zenml/container_registries/base_container_registry.py, lines 216-233
def get_image_repo_digest(self, image_name: str) -> Optional[str]:
    """Get the repository digest of an image.

    Args:
        image_name: The name of the image.

    Returns:
        The repository digest of the image.
    """
    if not self.is_valid_image_name_for_registry(image_name):
        return None

    try:
        metadata = self.docker_client.images.get_registry_data(image_name)
    except Exception:
        return None

    return cast(str, metadata.id.split(":")[-1])

is_valid_image_name_for_registry(image_name)

Check if the image name is valid for the container registry.

Parameters:

Name Type Description Default
image_name str

The name of the image.

required

Returns:

Type Description
bool

True if the image name is valid for the container registry, False otherwise.

Source code in src/zenml/container_registries/base_container_registry.py, lines 163-180
def is_valid_image_name_for_registry(self, image_name: str) -> bool:
    """Check if the image name is valid for the container registry.

    Args:
        image_name: The name of the image.

    Returns:
        `True` if the image name is valid for the container registry,
        `False` otherwise.
    """
    # Remove prefixes to make sure this logic also works for DockerHub
    image_name = image_name.removeprefix("index.docker.io/")
    image_name = image_name.removeprefix("docker.io/")

    registry_uri = self.config.uri.removeprefix("index.docker.io/")
    registry_uri = registry_uri.removeprefix("docker.io/")

    return image_name.startswith(registry_uri)

prepare_image_push(image_name)

Preparation before an image gets pushed.

Subclasses can overwrite this to do any necessary checks or preparations before an image gets pushed.

Parameters:

Name Type Description Default
image_name str

Name of the docker image that will be pushed.

required
Source code in src/zenml/container_registries/base_container_registry.py, lines 182-190
def prepare_image_push(self, image_name: str) -> None:
    """Preparation before an image gets pushed.

    Subclasses can overwrite this to do any necessary checks or
    preparations before an image gets pushed.

    Args:
        image_name: Name of the docker image that will be pushed.
    """

push_image(image_name)

Pushes a docker image.

Parameters:

Name Type Description Default
image_name str

Name of the docker image that will be pushed.

required

Returns:

Type Description
str

The Docker repository digest of the pushed image.

Raises:

Type Description
ValueError

If the image name is not associated with this container registry.

Source code in src/zenml/container_registries/base_container_registry.py, lines 192-214
def push_image(self, image_name: str) -> str:
    """Pushes a docker image.

    Args:
        image_name: Name of the docker image that will be pushed.

    Returns:
        The Docker repository digest of the pushed image.

    Raises:
        ValueError: If the image name is not associated with this
            container registry.
    """
    if not self.is_valid_image_name_for_registry(image_name):
        raise ValueError(
            f"Docker image `{image_name}` does not belong to container "
            f"registry `{self.config.uri}`."
        )

    self.prepare_image_push(image_name)
    return docker_utils.push_image(
        image_name, docker_client=self.docker_client
    )
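
For illustration, the container registry configured in the active stack could be used to push an image as sketched below; the image name is a placeholder and must start with the registry URI:

from zenml.client import Client

registry = Client().active_stack.container_registry
if registry is not None:
    digest = registry.push_image("myregistry.example.com/zenml/my-image:latest")
    print(digest)  # repository digest of the pushed image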

DefaultContainerRegistryFlavor

Bases: BaseContainerRegistryFlavor

Class for default ZenML container registries.

Source code in src/zenml/container_registries/default_container_registry.py, lines 24-61
class DefaultContainerRegistryFlavor(BaseContainerRegistryFlavor):
    """Class for default ZenML container registries."""

    @property
    def name(self) -> str:
        """Name of the flavor.

        Returns:
            The name of the flavor.
        """
        return ContainerRegistryFlavor.DEFAULT.value

    @property
    def docs_url(self) -> Optional[str]:
        """A URL to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_docs_url()

    @property
    def sdk_docs_url(self) -> Optional[str]:
        """A URL to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_sdk_docs_url()

    @property
    def logo_url(self) -> str:
        """A URL to represent the flavor in the dashboard.

        Returns:
            The flavor logo.
        """
        return "https://public-flavor-logos.s3.eu-central-1.amazonaws.com/container_registry/local.svg"

docs_url property

A URL to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

logo_url property

A URL to represent the flavor in the dashboard.

Returns:

Type Description
str

The flavor logo.

name property

Name of the flavor.

Returns:

Type Description
str

The name of the flavor.

sdk_docs_url property

A URL to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

DockerHubContainerRegistryFlavor

Bases: BaseContainerRegistryFlavor

Class for DockerHub Container Registry.

Source code in src/zenml/container_registries/dockerhub_container_registry.py, lines 26-82
class DockerHubContainerRegistryFlavor(BaseContainerRegistryFlavor):
    """Class for DockerHub Container Registry."""

    @property
    def name(self) -> str:
        """Name of the flavor.

        Returns:
            The name of the flavor.
        """
        return ContainerRegistryFlavor.DOCKERHUB.value

    @property
    def service_connector_requirements(
        self,
    ) -> Optional[ServiceConnectorRequirements]:
        """Service connector resource requirements for service connectors.

        Specifies resource requirements that are used to filter the available
        service connector types that are compatible with this flavor.

        Returns:
            Requirements for compatible service connectors, if a service
            connector is required for this flavor.
        """
        return ServiceConnectorRequirements(
            connector_type="docker",
            resource_type=DOCKER_REGISTRY_RESOURCE_TYPE,
            resource_id_attr="uri",
        )

    @property
    def docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_docs_url()

    @property
    def sdk_docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_sdk_docs_url()

    @property
    def logo_url(self) -> str:
        """A url to represent the flavor in the dashboard.

        Returns:
            The flavor logo.
        """
        return "https://public-flavor-logos.s3.eu-central-1.amazonaws.com/container_registry/docker.png"

docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

logo_url property

A url to represent the flavor in the dashboard.

Returns:

Type Description
str

The flavor logo.

name property

Name of the flavor.

Returns:

Type Description
str

The name of the flavor.

sdk_docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

service_connector_requirements property

Service connector resource requirements for service connectors.

Specifies resource requirements that are used to filter the available service connector types that are compatible with this flavor.

Returns:

Type Description
Optional[ServiceConnectorRequirements]

Requirements for compatible service connectors, if a service connector is required for this flavor.

GCPContainerRegistryFlavor

Bases: BaseContainerRegistryFlavor

Class for GCP Container Registry.

Source code in src/zenml/container_registries/gcp_container_registry.py (lines 26-82)
class GCPContainerRegistryFlavor(BaseContainerRegistryFlavor):
    """Class for GCP Container Registry."""

    @property
    def name(self) -> str:
        """Name of the flavor.

        Returns:
            The name of the flavor.
        """
        return ContainerRegistryFlavor.GCP.value

    @property
    def service_connector_requirements(
        self,
    ) -> Optional[ServiceConnectorRequirements]:
        """Service connector resource requirements for service connectors.

        Specifies resource requirements that are used to filter the available
        service connector types that are compatible with this flavor.

        Returns:
            Requirements for compatible service connectors, if a service
            connector is required for this flavor.
        """
        return ServiceConnectorRequirements(
            connector_type="gcp",
            resource_type=DOCKER_REGISTRY_RESOURCE_TYPE,
            resource_id_attr="uri",
        )

    @property
    def docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_docs_url()

    @property
    def sdk_docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_sdk_docs_url()

    @property
    def logo_url(self) -> str:
        """A url to represent the flavor in the dashboard.

        Returns:
            The flavor logo.
        """
        return "https://public-flavor-logos.s3.eu-central-1.amazonaws.com/container_registry/gcp.png"

docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

logo_url property

A url to represent the flavor in the dashboard.

Returns:

Type Description
str

The flavor logo.

name property

Name of the flavor.

Returns:

Type Description
str

The name of the flavor.

sdk_docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

service_connector_requirements property

Service connector resource requirements for service connectors.

Specifies resource requirements that are used to filter the available service connector types that are compatible with this flavor.

Returns:

Type Description
Optional[ServiceConnectorRequirements]

Requirements for compatible service connectors, if a service connector is required for this flavor.

GitHubContainerRegistryFlavor

Bases: BaseContainerRegistryFlavor

Class for GitHub Container Registry.

Source code in src/zenml/container_registries/github_container_registry.py (lines 29-66)
class GitHubContainerRegistryFlavor(BaseContainerRegistryFlavor):
    """Class for GitHub Container Registry."""

    @property
    def name(self) -> str:
        """Name of the flavor.

        Returns:
            The name of the flavor.
        """
        return ContainerRegistryFlavor.GITHUB

    @property
    def docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_docs_url()

    @property
    def sdk_docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_sdk_docs_url()

    @property
    def logo_url(self) -> str:
        """A url to represent the flavor in the dashboard.

        Returns:
            The flavor logo.
        """
        return "https://public-flavor-logos.s3.eu-central-1.amazonaws.com/container_registry/github.png"

docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

logo_url property

A url to represent the flavor in the dashboard.

Returns:

Type Description
str

The flavor logo.

name property

Name of the flavor.

Returns:

Type Description
str

The name of the flavor.

sdk_docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

Data Validators

Data validators are stack components responsible for data profiling and validation.

BaseDataValidator

Bases: StackComponent

Base class for all ZenML data validators.

Source code in src/zenml/data_validators/base_data_validator.py (lines 28-214)
class BaseDataValidator(StackComponent):
    """Base class for all ZenML data validators."""

    NAME: ClassVar[str]
    FLAVOR: ClassVar[Type["BaseDataValidatorFlavor"]]

    @property
    def config(self) -> BaseDataValidatorConfig:
        """Returns the config of this data validator.

        Returns:
            The config of this data validator.
        """
        return cast(BaseDataValidatorConfig, self._config)

    @classmethod
    def get_active_data_validator(cls) -> "BaseDataValidator":
        """Get the data validator registered in the active stack.

        Returns:
            The data validator registered in the active stack.

        Raises:
            TypeError: if a data validator is not part of the
                active stack.
        """
        flavor: BaseDataValidatorFlavor = cls.FLAVOR()
        client = Client()
        data_validator = client.active_stack.data_validator
        if not data_validator or not isinstance(data_validator, cls):
            raise TypeError(
                f"The active stack needs to have a {cls.NAME} data "
                f"validator component registered to be able to run data validation "
                f"actions with {cls.NAME}. You can create a new stack with "
                f"a {cls.NAME} data validator component or update your "
                f"active stack to add this component, e.g.:\n\n"
                f"  `zenml data-validator register {flavor.name} "
                f"--flavor={flavor.name} ...`\n"
                f"  `zenml stack register <STACK-NAME> -dv {flavor.name} ...`\n"
                f"  or:\n"
                f"  `zenml stack update -dv {flavor.name}`\n\n"
            )

        return data_validator

    def data_profiling(
        self,
        dataset: Any,
        comparison_dataset: Optional[Any] = None,
        profile_list: Optional[Sequence[Any]] = None,
        **kwargs: Any,
    ) -> Any:
        """Analyze one or more datasets and generate a data profile.

        This method should be implemented by data validators that support
        analyzing a dataset and generating a data profile (e.g. schema,
        statistical summary, data distribution profile, validation
        rules, data drift reports etc.).
        The method should return a data profile object.

        This method also accepts an optional second dataset argument to
        accommodate different categories of data profiling, e.g.:

        * profiles generated from a single dataset: schema inference, validation
        rules inference, statistical profiles, data integrity reports
        * differential profiles that need a second dataset for comparison:
        differential statistical profiles, data drift reports

        Data validators that support generating multiple categories of data
        profiles should also take in a `profile_list` argument that lists the
        subset of profiles to be generated. If not supplied, the behavior is
        implementation specific, but it is recommended to provide a good default
        (e.g. a single default data profile type may be generated and returned,
        or all available data profiles may be generated and returned as a single
        result).

        Args:
            dataset: Target dataset to be profiled.
            comparison_dataset: Optional second dataset to be used for data
                comparison profiles (e.g data drift reports).
            profile_list: Optional list identifying the categories of data
                profiles to be generated.
            **kwargs: Implementation specific keyword arguments.

        Raises:
            NotImplementedError: if data profiling is not supported by this
                data validator.
        """
        raise NotImplementedError(
            f"Data profiling is not supported by the {self.__class__} data "
            f"validator."
        )

    def data_validation(
        self,
        dataset: Any,
        comparison_dataset: Optional[Any] = None,
        check_list: Optional[Sequence[Any]] = None,
        **kwargs: Any,
    ) -> Any:
        """Run data validation checks on a dataset.

        This method should be implemented by data validators that support
        running data quality checks an input dataset (e.g. data integrity
        checks, data drift checks).

        This method also accepts an optional second dataset argument to
        accommodate different categories of data validation tests, e.g.:

        * single dataset checks: data integrity checks (e.g. missing
        values, conflicting labels, mixed data types etc.)
        * checks that compare two datasets: data drift checks (e.g. new labels,
        feature drift, label drift etc.)

        Data validators that support running multiple categories of data
        integrity checks should also take in a `check_list` argument that
        lists the subset of checks to be performed. If not supplied, the
        behavior is implementation specific, but it is recommended to provide a
        good default (e.g. a single default validation check may be performed,
        or all available validation checks may be performed and their results
        returned as a list of objects).

        Args:
            dataset: Target dataset to be validated.
            comparison_dataset: Optional second dataset to be used for data
                comparison checks (e.g data drift checks).
            check_list: Optional list identifying the data checks to
                be performed.
            **kwargs: Implementation specific keyword arguments.

        Raises:
            NotImplementedError: if data validation is not
                supported by this data validator.
        """
        raise NotImplementedError(
            f"Data validation not implemented for {self}."
        )

    def model_validation(
        self,
        dataset: Any,
        model: Any,
        comparison_dataset: Optional[Any] = None,
        check_list: Optional[Sequence[Any]] = None,
        **kwargs: Any,
    ) -> Any:
        """Run model validation checks.

        This method should be implemented by data validators that support
        running model validation checks (e.g. confusion matrix validation,
        performance reports, model error analyzes, etc).

        Unlike `data_validation`, model validation checks require that a model
        be present as an active component during the validation process.

        This method also accepts an optional second dataset argument to
        accommodate different categories of data validation tests, e.g.:

        * single dataset tests: confusion matrix validation,
        performance reports, model error analyzes, etc
        * model comparison tests: tests that identify changes in a model
        behavior by comparing how it performs on two different datasets.

        Data validators that support running multiple categories of model
        validation checks should also take in a `check_list` argument that
        lists the subset of checks to be performed. If not supplied, the
        behavior is implementation specific, but it is recommended to provide a
        good default (e.g. a single default validation check may be performed,
        or all available validation checks may be performed and their results
        returned as a list of objects).

        Args:
            dataset: Target dataset to be validated.
            model: Target model to be validated.
            comparison_dataset: Optional second dataset to be used for model
                comparison checks (e.g model performance comparison checks).
            check_list: Optional list identifying the model validation checks to
                be performed.
            **kwargs: Implementation specific keyword arguments.

        Raises:
            NotImplementedError: if model validation is not supported by this
                data validator.
        """
        raise NotImplementedError(
            f"Model validation not implemented for {self}."
        )

config property

Returns the config of this data validator.

Returns:

Type Description
BaseDataValidatorConfig

The config of this data validator.

data_profiling(dataset, comparison_dataset=None, profile_list=None, **kwargs)

Analyze one or more datasets and generate a data profile.

This method should be implemented by data validators that support analyzing a dataset and generating a data profile (e.g. schema, statistical summary, data distribution profile, validation rules, data drift reports etc.). The method should return a data profile object.

This method also accepts an optional second dataset argument to accommodate different categories of data profiling, e.g.:

  • profiles generated from a single dataset: schema inference, validation rules inference, statistical profiles, data integrity reports
  • differential profiles that need a second dataset for comparison: differential statistical profiles, data drift reports

Data validators that support generating multiple categories of data profiles should also take in a profile_list argument that lists the subset of profiles to be generated. If not supplied, the behavior is implementation specific, but it is recommended to provide a good default (e.g. a single default data profile type may be generated and returned, or all available data profiles may be generated and returned as a single result).

Parameters:

Name Type Description Default
dataset Any

Target dataset to be profiled.

required
comparison_dataset Optional[Any]

Optional second dataset to be used for data comparison profiles (e.g. data drift reports).

None
profile_list Optional[Sequence[Any]]

Optional list identifying the categories of data profiles to be generated.

None
**kwargs Any

Implementation specific keyword arguments.

{}

Raises:

Type Description
NotImplementedError

if data profiling is not supported by this data validator.

Source code in src/zenml/data_validators/base_data_validator.py (lines 73-119)
def data_profiling(
    self,
    dataset: Any,
    comparison_dataset: Optional[Any] = None,
    profile_list: Optional[Sequence[Any]] = None,
    **kwargs: Any,
) -> Any:
    """Analyze one or more datasets and generate a data profile.

    This method should be implemented by data validators that support
    analyzing a dataset and generating a data profile (e.g. schema,
    statistical summary, data distribution profile, validation
    rules, data drift reports etc.).
    The method should return a data profile object.

    This method also accepts an optional second dataset argument to
    accommodate different categories of data profiling, e.g.:

    * profiles generated from a single dataset: schema inference, validation
    rules inference, statistical profiles, data integrity reports
    * differential profiles that need a second dataset for comparison:
    differential statistical profiles, data drift reports

    Data validators that support generating multiple categories of data
    profiles should also take in a `profile_list` argument that lists the
    subset of profiles to be generated. If not supplied, the behavior is
    implementation specific, but it is recommended to provide a good default
    (e.g. a single default data profile type may be generated and returned,
    or all available data profiles may be generated and returned as a single
    result).

    Args:
        dataset: Target dataset to be profiled.
        comparison_dataset: Optional second dataset to be used for data
            comparison profiles (e.g data drift reports).
        profile_list: Optional list identifying the categories of data
            profiles to be generated.
        **kwargs: Implementation specific keyword arguments.

    Raises:
        NotImplementedError: if data profiling is not supported by this
            data validator.
    """
    raise NotImplementedError(
        f"Data profiling is not supported by the {self.__class__} data "
        f"validator."
    )

data_validation(dataset, comparison_dataset=None, check_list=None, **kwargs)

Run data validation checks on a dataset.

This method should be implemented by data validators that support running data quality checks on an input dataset (e.g. data integrity checks, data drift checks).

This method also accepts an optional second dataset argument to accommodate different categories of data validation tests, e.g.:

  • single dataset checks: data integrity checks (e.g. missing values, conflicting labels, mixed data types etc.)
  • checks that compare two datasets: data drift checks (e.g. new labels, feature drift, label drift etc.)

Data validators that support running multiple categories of data integrity checks should also take in a check_list argument that lists the subset of checks to be performed. If not supplied, the behavior is implementation specific, but it is recommended to provide a good default (e.g. a single default validation check may be performed, or all available validation checks may be performed and their results returned as a list of objects).

Parameters:

Name Type Description Default
dataset Any

Target dataset to be validated.

required
comparison_dataset Optional[Any]

Optional second dataset to be used for data comparison checks (e.g. data drift checks).

None
check_list Optional[Sequence[Any]]

Optional list identifying the data checks to be performed.

None
**kwargs Any

Implementation specific keyword arguments.

{}

Raises:

Type Description
NotImplementedError

if data validation is not supported by this data validator.

Source code in src/zenml/data_validators/base_data_validator.py (lines 121-164)
def data_validation(
    self,
    dataset: Any,
    comparison_dataset: Optional[Any] = None,
    check_list: Optional[Sequence[Any]] = None,
    **kwargs: Any,
) -> Any:
    """Run data validation checks on a dataset.

    This method should be implemented by data validators that support
    running data quality checks an input dataset (e.g. data integrity
    checks, data drift checks).

    This method also accepts an optional second dataset argument to
    accommodate different categories of data validation tests, e.g.:

    * single dataset checks: data integrity checks (e.g. missing
    values, conflicting labels, mixed data types etc.)
    * checks that compare two datasets: data drift checks (e.g. new labels,
    feature drift, label drift etc.)

    Data validators that support running multiple categories of data
    integrity checks should also take in a `check_list` argument that
    lists the subset of checks to be performed. If not supplied, the
    behavior is implementation specific, but it is recommended to provide a
    good default (e.g. a single default validation check may be performed,
    or all available validation checks may be performed and their results
    returned as a list of objects).

    Args:
        dataset: Target dataset to be validated.
        comparison_dataset: Optional second dataset to be used for data
            comparison checks (e.g data drift checks).
        check_list: Optional list identifying the data checks to
            be performed.
        **kwargs: Implementation specific keyword arguments.

    Raises:
        NotImplementedError: if data validation is not
            supported by this data validator.
    """
    raise NotImplementedError(
        f"Data validation not implemented for {self}."
    )

get_active_data_validator() classmethod

Get the data validator registered in the active stack.

Returns:

Type Description
BaseDataValidator

The data validator registered in the active stack.

Raises:

Type Description
TypeError

if a data validator is not part of the active stack.

Source code in src/zenml/data_validators/base_data_validator.py (lines 43-71)
@classmethod
def get_active_data_validator(cls) -> "BaseDataValidator":
    """Get the data validator registered in the active stack.

    Returns:
        The data validator registered in the active stack.

    Raises:
        TypeError: if a data validator is not part of the
            active stack.
    """
    flavor: BaseDataValidatorFlavor = cls.FLAVOR()
    client = Client()
    data_validator = client.active_stack.data_validator
    if not data_validator or not isinstance(data_validator, cls):
        raise TypeError(
            f"The active stack needs to have a {cls.NAME} data "
            f"validator component registered to be able to run data validation "
            f"actions with {cls.NAME}. You can create a new stack with "
            f"a {cls.NAME} data validator component or update your "
            f"active stack to add this component, e.g.:\n\n"
            f"  `zenml data-validator register {flavor.name} "
            f"--flavor={flavor.name} ...`\n"
            f"  `zenml stack register <STACK-NAME> -dv {flavor.name} ...`\n"
            f"  or:\n"
            f"  `zenml stack update -dv {flavor.name}`\n\n"
        )

    return data_validator

model_validation(dataset, model, comparison_dataset=None, check_list=None, **kwargs)

Run model validation checks.

This method should be implemented by data validators that support running model validation checks (e.g. confusion matrix validation, performance reports, model error analyses, etc.).

Unlike data_validation, model validation checks require that a model be present as an active component during the validation process.

This method also accepts an optional second dataset argument to accommodate different categories of data validation tests, e.g.:

  • single dataset tests: confusion matrix validation, performance reports, model error analyses, etc.
  • model comparison tests: tests that identify changes in a model behavior by comparing how it performs on two different datasets.

Data validators that support running multiple categories of model validation checks should also take in a check_list argument that lists the subset of checks to be performed. If not supplied, the behavior is implementation specific, but it is recommended to provide a good default (e.g. a single default validation check may be performed, or all available validation checks may be performed and their results returned as a list of objects).

Parameters:

Name Type Description Default
dataset Any

Target dataset to be validated.

required
model Any

Target model to be validated.

required
comparison_dataset Optional[Any]

Optional second dataset to be used for model comparison checks (e.g. model performance comparison checks).

None
check_list Optional[Sequence[Any]]

Optional list identifying the model validation checks to be performed.

None
**kwargs Any

Implementation specific keyword arguments.

{}

Raises:

Type Description
NotImplementedError

if model validation is not supported by this data validator.

Source code in src/zenml/data_validators/base_data_validator.py (lines 166-214)
def model_validation(
    self,
    dataset: Any,
    model: Any,
    comparison_dataset: Optional[Any] = None,
    check_list: Optional[Sequence[Any]] = None,
    **kwargs: Any,
) -> Any:
    """Run model validation checks.

    This method should be implemented by data validators that support
    running model validation checks (e.g. confusion matrix validation,
    performance reports, model error analyzes, etc).

    Unlike `data_validation`, model validation checks require that a model
    be present as an active component during the validation process.

    This method also accepts an optional second dataset argument to
    accommodate different categories of data validation tests, e.g.:

    * single dataset tests: confusion matrix validation,
    performance reports, model error analyzes, etc
    * model comparison tests: tests that identify changes in a model
    behavior by comparing how it performs on two different datasets.

    Data validators that support running multiple categories of model
    validation checks should also take in a `check_list` argument that
    lists the subset of checks to be performed. If not supplied, the
    behavior is implementation specific, but it is recommended to provide a
    good default (e.g. a single default validation check may be performed,
    or all available validation checks may be performed and their results
    returned as a list of objects).

    Args:
        dataset: Target dataset to be validated.
        model: Target model to be validated.
        comparison_dataset: Optional second dataset to be used for model
            comparison checks (e.g model performance comparison checks).
        check_list: Optional list identifying the model validation checks to
            be performed.
        **kwargs: Implementation specific keyword arguments.

    Raises:
        NotImplementedError: if model validation is not supported by this
            data validator.
    """
    raise NotImplementedError(
        f"Model validation not implemented for {self}."
    )

BaseDataValidatorFlavor

Bases: Flavor

Base class for data validator flavors.

Source code in src/zenml/data_validators/base_data_validator.py (lines 217-245)
class BaseDataValidatorFlavor(Flavor):
    """Base class for data validator flavors."""

    @property
    def type(self) -> StackComponentType:
        """The type of the component.

        Returns:
            The type of the component.
        """
        return StackComponentType.DATA_VALIDATOR

    @property
    def config_class(self) -> Type[BaseDataValidatorConfig]:
        """Config class for data validator.

        Returns:
            Config class for data validator.
        """
        return BaseDataValidatorConfig

    @property
    def implementation_class(self) -> Type[BaseDataValidator]:
        """Implementation for data validator.

        Returns:
            Implementation for data validator.
        """
        return BaseDataValidator

config_class property

Config class for data validator.

Returns:

Type Description
Type[BaseDataValidatorConfig]

Config class for data validator.

implementation_class property

Implementation for data validator.

Returns:

Type Description
Type[BaseDataValidator]

Implementation for data validator.

type property

The type of the component.

Returns:

Type Description
StackComponentType

The type of the component.

Entrypoints

Initializations for ZenML entrypoints module.

PipelineEntrypointConfiguration

Bases: BaseEntrypointConfiguration

Base class for entrypoint configurations that run an entire pipeline.

Source code in src/zenml/entrypoints/pipeline_entrypoint_configuration.py (lines 23-40)
class PipelineEntrypointConfiguration(BaseEntrypointConfiguration):
    """Base class for entrypoint configurations that run an entire pipeline."""

    def run(self) -> None:
        """Prepares the environment and runs the configured pipeline."""
        deployment = self.load_deployment()

        # Activate all the integrations. This makes sure that all materializers
        # and stack component flavors are registered.
        integration_registry.activate_integrations()

        self.download_code_if_necessary(deployment=deployment)

        orchestrator = Client().active_stack.orchestrator
        orchestrator._prepare_run(deployment=deployment)

        for step in deployment.step_configurations.values():
            orchestrator.run_step(step)

run()

Prepares the environment and runs the configured pipeline.

Source code in src/zenml/entrypoints/pipeline_entrypoint_configuration.py (lines 26-40)
def run(self) -> None:
    """Prepares the environment and runs the configured pipeline."""
    deployment = self.load_deployment()

    # Activate all the integrations. This makes sure that all materializers
    # and stack component flavors are registered.
    integration_registry.activate_integrations()

    self.download_code_if_necessary(deployment=deployment)

    orchestrator = Client().active_stack.orchestrator
    orchestrator._prepare_run(deployment=deployment)

    for step in deployment.step_configurations.values():
        orchestrator.run_step(step)
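
For context, here is a hedged sketch of how a custom orchestrator might launch this entrypoint in a separate process. The deployment_id keyword is assumed based on how load_deployment() resolves the deployment; a real orchestrator would typically run the command inside a Docker image built for the deployment rather than in a local subprocess.

```python
# Hedged sketch: launching a full pipeline run via
# PipelineEntrypointConfiguration from custom orchestrator code.
import subprocess

from zenml.entrypoints.pipeline_entrypoint_configuration import (
    PipelineEntrypointConfiguration,
)


def launch_pipeline(deployment_id: str) -> None:
    cmd = PipelineEntrypointConfiguration.get_entrypoint_command()
    # `deployment_id` is assumed to be the option that `load_deployment()`
    # consumes inside `run()` above.
    args = PipelineEntrypointConfiguration.get_entrypoint_arguments(
        deployment_id=deployment_id
    )
    # A plain subprocess keeps the sketch self-contained; real orchestrators
    # usually execute this command inside a container.
    subprocess.check_call(cmd + args)
```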

StepEntrypointConfiguration

Bases: BaseEntrypointConfiguration

Base class for entrypoint configurations that run a single step.

If an orchestrator needs to run steps in a separate process or environment (e.g. a docker container), this class can either be used directly or subclassed if custom behavior is necessary.

How to subclass:

Passing additional arguments to the entrypoint: If you need to pass additional arguments to the entrypoint, there are two methods that you need to implement:

  • get_entrypoint_options(): This method should return all the options that are required in the entrypoint. Make sure to include the result from the superclass method so the options are complete.
  • get_entrypoint_arguments(...): This method should return a list of arguments that should be passed to the entrypoint. Make sure to include the result from the superclass method so the arguments are complete.

You'll be able to access the argument values from self.entrypoint_args inside your StepEntrypointConfiguration subclass.

How to use:

After you created your StepEntrypointConfiguration subclass, you only have to run the entrypoint somewhere. To do this, you should execute the command returned by the get_entrypoint_command() method with the arguments returned by the get_entrypoint_arguments(...) method.

Example:

class MyStepEntrypointConfiguration(StepEntrypointConfiguration):
    ...

class MyOrchestrator(BaseOrchestrator):
    def prepare_or_run_pipeline(
        self,
        deployment: "PipelineDeployment",
        stack: "Stack",
        environment: Dict[str, str],
        placeholder_run: Optional["PipelineRunResponse"] = None,
    ) -> Any:
        ...

        cmd = MyStepEntrypointConfiguration.get_entrypoint_command()
        for step_name, step in pipeline.steps.items():
            ...

            args = MyStepEntrypointConfiguration.get_entrypoint_arguments(
                step_name=step_name
            )
            # Run the command and pass it the arguments. Our example
            # orchestrator here executes the entrypoint in a separate
            # process, but in a real-world scenario you would probably run
            # it inside a docker container or a different environment.
            import subprocess
            subprocess.check_call(cmd + args)
Source code in src/zenml/entrypoints/step_entrypoint_configuration.py (lines 38-214)
class StepEntrypointConfiguration(BaseEntrypointConfiguration):
    """Base class for entrypoint configurations that run a single step.

    If an orchestrator needs to run steps in a separate process or environment
    (e.g. a docker container), this class can either be used directly or
    subclassed if custom behavior is necessary.

    How to subclass:
    ----------------
    Passing additional arguments to the entrypoint:
        If you need to pass additional arguments to the entrypoint, there are
        two methods that you need to implement:
            * `get_entrypoint_options()`: This method should return all
                the options that are required in the entrypoint. Make sure to
                include the result from the superclass method so the options
                are complete.

            * `get_entrypoint_arguments(...)`: This method should return
                a list of arguments that should be passed to the entrypoint.
                Make sure to include the result from the superclass method so
                the arguments are complete.

        You'll be able to access the argument values from `self.entrypoint_args`
        inside your `StepEntrypointConfiguration` subclass.

    How to use:
    -----------
    After you created your `StepEntrypointConfiguration` subclass, you only
    have to run the entrypoint somewhere. To do this, you should execute the
    command returned by the `get_entrypoint_command()` method with the
    arguments returned by the `get_entrypoint_arguments(...)` method.

    Example:
    ```python
    class MyStepEntrypointConfiguration(StepEntrypointConfiguration):
        ...

    class MyOrchestrator(BaseOrchestrator):
        def prepare_or_run_pipeline(
            self,
            deployment: "PipelineDeployment",
            stack: "Stack",
            environment: Dict[str, str],
            placeholder_run: Optional["PipelineRunResponse"] = None,
        ) -> Any:
            ...

            cmd = MyStepEntrypointConfiguration.get_entrypoint_command()
            for step_name, step in pipeline.steps.items():
                ...

                args = MyStepEntrypointConfiguration.get_entrypoint_arguments(
                    step_name=step_name
                )
                # Run the command and pass it the arguments. Our example
                # orchestrator here executes the entrypoint in a separate
                # process, but in a real-world scenario you would probably run
                # it inside a docker container or a different environment.
                import subprocess
                subprocess.check_call(cmd + args)
    ```
    """

    def post_run(
        self,
        pipeline_name: str,
        step_name: str,
    ) -> None:
        """Does cleanup or post-processing after the step finished running.

        Subclasses should overwrite this method if they need to run any
        additional code after the step execution.

        Args:
            pipeline_name: Name of the parent pipeline of the step that was
                executed.
            step_name: Name of the step that was executed.
        """

    @classmethod
    def get_entrypoint_options(cls) -> Set[str]:
        """Gets all options required for running with this configuration.

        Returns:
            The superclass options as well as an option for the name of the
            step to run.
        """
        return super().get_entrypoint_options() | {STEP_NAME_OPTION}

    @classmethod
    def get_entrypoint_arguments(
        cls,
        **kwargs: Any,
    ) -> List[str]:
        """Gets all arguments that the entrypoint command should be called with.

        The argument list should be something that
        `argparse.ArgumentParser.parse_args(...)` can handle (e.g.
        `["--some_option", "some_value"]` or `["--some_option=some_value"]`).
        It needs to provide values for all options returned by the
        `get_entrypoint_options()` method of this class.

        Args:
            **kwargs: Kwargs, must include the step name.

        Returns:
            The superclass arguments as well as arguments for the name of the
            step to run.
        """
        return super().get_entrypoint_arguments(**kwargs) + [
            f"--{STEP_NAME_OPTION}",
            kwargs[STEP_NAME_OPTION],
        ]

    def load_deployment(self) -> "PipelineDeploymentResponse":
        """Loads the deployment.

        Returns:
            The deployment.
        """
        deployment_id = UUID(self.entrypoint_args[DEPLOYMENT_ID_OPTION])
        step_name = self.entrypoint_args[STEP_NAME_OPTION]
        return Client().zen_store.get_deployment(
            deployment_id=deployment_id, step_configuration_filter=[step_name]
        )

    def run(self) -> None:
        """Prepares the environment and runs the configured step."""
        deployment = self.load_deployment()

        # Activate all the integrations. This makes sure that all materializers
        # and stack component flavors are registered.
        integration_registry.activate_integrations()

        step_name = self.entrypoint_args[STEP_NAME_OPTION]

        # Change the working directory to make sure we're in the correct
        # directory where the files in the Docker image should be included.
        # This is necessary as some services overwrite the working directory
        # configured in the Docker image itself.
        os.makedirs("/app", exist_ok=True)
        os.chdir("/app")

        self.download_code_if_necessary(
            deployment=deployment, step_name=step_name
        )

        # If the working directory is not in the sys.path, we include it to make
        # sure user code gets correctly imported.
        cwd = os.getcwd()
        if cwd not in sys.path:
            sys.path.insert(0, cwd)

        pipeline_name = deployment.pipeline_configuration.name

        step = deployment.step_configurations[step_name]
        self._run_step(step, deployment=deployment)

        self.post_run(
            pipeline_name=pipeline_name,
            step_name=step_name,
        )

    def _run_step(
        self,
        step: "Step",
        deployment: "PipelineDeploymentResponse",
    ) -> None:
        """Runs a single step.

        Args:
            step: The step to run.
            deployment: The deployment configuration.
        """
        orchestrator = Client().active_stack.orchestrator
        orchestrator._prepare_run(deployment=deployment)
        orchestrator.run_step(step=step)

get_entrypoint_arguments(**kwargs) classmethod

Gets all arguments that the entrypoint command should be called with.

The argument list should be something that argparse.ArgumentParser.parse_args(...) can handle (e.g. ["--some_option", "some_value"] or ["--some_option=some_value"]). It needs to provide values for all options returned by the get_entrypoint_options() method of this class.

Parameters:

Name Type Description Default
**kwargs Any

Kwargs, must include the step name.

{}

Returns:

Type Description
List[str]

The superclass arguments as well as arguments for the name of the step to run.

Source code in src/zenml/entrypoints/step_entrypoint_configuration.py (lines 127-150)
@classmethod
def get_entrypoint_arguments(
    cls,
    **kwargs: Any,
) -> List[str]:
    """Gets all arguments that the entrypoint command should be called with.

    The argument list should be something that
    `argparse.ArgumentParser.parse_args(...)` can handle (e.g.
    `["--some_option", "some_value"]` or `["--some_option=some_value"]`).
    It needs to provide values for all options returned by the
    `get_entrypoint_options()` method of this class.

    Args:
        **kwargs: Kwargs, must include the step name.

    Returns:
        The superclass arguments as well as arguments for the name of the
        step to run.
    """
    return super().get_entrypoint_arguments(**kwargs) + [
        f"--{STEP_NAME_OPTION}",
        kwargs[STEP_NAME_OPTION],
    ]

get_entrypoint_options() classmethod

Gets all options required for running with this configuration.

Returns:

Type Description
Set[str]

The superclass options as well as an option for the name of the step to run.

Source code in src/zenml/entrypoints/step_entrypoint_configuration.py (lines 117-125)
@classmethod
def get_entrypoint_options(cls) -> Set[str]:
    """Gets all options required for running with this configuration.

    Returns:
        The superclass options as well as an option for the name of the
        step to run.
    """
    return super().get_entrypoint_options() | {STEP_NAME_OPTION}

load_deployment()

Loads the deployment.

Returns:

Type Description
PipelineDeploymentResponse

The deployment.

Source code in src/zenml/entrypoints/step_entrypoint_configuration.py (lines 152-162)
def load_deployment(self) -> "PipelineDeploymentResponse":
    """Loads the deployment.

    Returns:
        The deployment.
    """
    deployment_id = UUID(self.entrypoint_args[DEPLOYMENT_ID_OPTION])
    step_name = self.entrypoint_args[STEP_NAME_OPTION]
    return Client().zen_store.get_deployment(
        deployment_id=deployment_id, step_configuration_filter=[step_name]
    )

post_run(pipeline_name, step_name)

Does cleanup or post-processing after the step finished running.

Subclasses should overwrite this method if they need to run any additional code after the step execution.

Parameters:

Name Type Description Default
pipeline_name str

Name of the parent pipeline of the step that was executed.

required
step_name str

Name of the step that was executed.

required
Source code in src/zenml/entrypoints/step_entrypoint_configuration.py (lines 101-115)
def post_run(
    self,
    pipeline_name: str,
    step_name: str,
) -> None:
    """Does cleanup or post-processing after the step finished running.

    Subclasses should overwrite this method if they need to run any
    additional code after the step execution.

    Args:
        pipeline_name: Name of the parent pipeline of the step that was
            executed.
        step_name: Name of the step that was executed.
    """

run()

Prepares the environment and runs the configured step.

Source code in src/zenml/entrypoints/step_entrypoint_configuration.py (lines 164-199)
def run(self) -> None:
    """Prepares the environment and runs the configured step."""
    deployment = self.load_deployment()

    # Activate all the integrations. This makes sure that all materializers
    # and stack component flavors are registered.
    integration_registry.activate_integrations()

    step_name = self.entrypoint_args[STEP_NAME_OPTION]

    # Change the working directory to make sure we're in the correct
    # directory where the files in the Docker image should be included.
    # This is necessary as some services overwrite the working directory
    # configured in the Docker image itself.
    os.makedirs("/app", exist_ok=True)
    os.chdir("/app")

    self.download_code_if_necessary(
        deployment=deployment, step_name=step_name
    )

    # If the working directory is not in the sys.path, we include it to make
    # sure user code gets correctly imported.
    cwd = os.getcwd()
    if cwd not in sys.path:
        sys.path.insert(0, cwd)

    pipeline_name = deployment.pipeline_configuration.name

    step = deployment.step_configurations[step_name]
    self._run_step(step, deployment=deployment)

    self.post_run(
        pipeline_name=pipeline_name,
        step_name=step_name,
    )

Enums

ZenML enums.

APITokenType

Bases: StrEnum

The API token type.

Source code in src/zenml/enums.py (lines 242-246)
class APITokenType(StrEnum):
    """The API token type."""

    GENERIC = "generic"
    WORKLOAD = "workload"

AnalyticsEventSource

Bases: StrEnum

Enum to identify analytics events source.

Source code in src/zenml/enums.py (lines 207-212)
class AnalyticsEventSource(StrEnum):
    """Enum to identify analytics events source."""

    ZENML_GO = "zenml go"
    ZENML_INIT = "zenml init"
    ZENML_SERVER = "zenml server"

AnnotationTasks

Bases: StrEnum

Supported annotation tasks.

Source code in src/zenml/enums.py (lines 183-189)
class AnnotationTasks(StrEnum):
    """Supported annotation tasks."""

    IMAGE_CLASSIFICATION = "image_classification"
    OBJECT_DETECTION_BOUNDING_BOXES = "object_detection_bounding_boxes"
    OCR = "optical_character_recognition"
    TEXT_CLASSIFICATION = "text_classification"

ArtifactSaveType

Bases: StrEnum

All possible method types of how artifact versions can be saved.

Source code in src/zenml/enums.py (lines 45-53)
class ArtifactSaveType(StrEnum):
    """All possible method types of how artifact versions can be saved."""

    STEP_OUTPUT = "step_output"  # output of the current step
    MANUAL = "manual"  # manually saved via `zenml.save_artifact()`
    PREEXISTING = "preexisting"  # register via `zenml.register_artifact()`
    EXTERNAL = (
        "external"  # saved via `zenml.ExternalArtifact.upload_by_value()`
    )
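
The inline comments above point at the user-facing helpers behind each save type. As a small, hedged illustration (the artifact name and payload are made up), manually persisting an object outside of a step output yields an artifact version with the MANUAL save type:

```python
# Hedged sketch: a manual save produces ArtifactSaveType.MANUAL
# (per the enum comments above).
from zenml import save_artifact

stats = {"rows": 1000, "null_fraction": 0.02}  # any materializable object
save_artifact(stats, name="dataset_stats")  # hypothetical artifact name
```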

ArtifactType

Bases: StrEnum

All possible types an artifact can have.

Source code in src/zenml/enums.py (lines 22-31)
class ArtifactType(StrEnum):
    """All possible types an artifact can have."""

    DATA_ANALYSIS = "DataAnalysisArtifact"
    DATA = "DataArtifact"
    MODEL = "ModelArtifact"
    SCHEMA = "SchemaArtifact"  # deprecated
    SERVICE = "ServiceArtifact"
    STATISTICS = "StatisticsArtifact"  # deprecated in favor of `DATA_ANALYSIS`
    BASE = "BaseArtifact"

AuthScheme

Bases: StrEnum

The authentication scheme.

Source code in src/zenml/enums.py (lines 215-221)
class AuthScheme(StrEnum):
    """The authentication scheme."""

    NO_AUTH = "NO_AUTH"
    HTTP_BASIC = "HTTP_BASIC"
    OAUTH2_PASSWORD_BEARER = "OAUTH2_PASSWORD_BEARER"
    EXTERNAL = "EXTERNAL"

CliCategories

Bases: StrEnum

All possible categories for CLI commands.

Note: The order of the categories is important. The same order is used to sort the commands in the CLI help output.

Source code in src/zenml/enums.py (lines 167-180)
class CliCategories(StrEnum):
    """All possible categories for CLI commands.

    Note: The order of the categories is important. The same
    order is used to sort the commands in the CLI help output.
    """

    STACK_COMPONENTS = "Stack Components"
    MODEL_DEPLOYMENT = "Model Deployment"
    INTEGRATIONS = "Integrations"
    MANAGEMENT_TOOLS = "Management Tools"
    MODEL_CONTROL_PLANE = "Model Control Plane"
    IDENTITY_AND_SECURITY = "Identity and Security"
    OTHER_COMMANDS = "Other Commands"

ColorVariants

Bases: StrEnum

All possible color variants for frontend.

Source code in src/zenml/enums.py (lines 330-343)
class ColorVariants(StrEnum):
    """All possible color variants for frontend."""

    GREY = "grey"
    PURPLE = "purple"
    RED = "red"
    GREEN = "green"
    YELLOW = "yellow"
    ORANGE = "orange"
    LIME = "lime"
    TEAL = "teal"
    TURQUOISE = "turquoise"
    MAGENTA = "magenta"
    BLUE = "blue"

ContainerRegistryFlavor

Bases: StrEnum

Flavors of container registries.

Source code in src/zenml/enums.py (lines 157-164)
class ContainerRegistryFlavor(StrEnum):
    """Flavors of container registries."""

    DEFAULT = "default"
    GITHUB = "github"
    DOCKERHUB = "dockerhub"
    GCP = "gcp"
    AZURE = "azure"

DatabaseBackupStrategy

Bases: StrEnum

All available database backup strategies.

Source code in src/zenml/enums.py (lines 376-386)
class DatabaseBackupStrategy(StrEnum):
    """All available database backup strategies."""

    # Backup disabled
    DISABLED = "disabled"
    # In-memory backup
    IN_MEMORY = "in-memory"
    # Dump the database to a file
    DUMP_FILE = "dump-file"
    # Create a backup of the database in the remote database service
    DATABASE = "database"

EnvironmentType

Bases: StrEnum

Enum for environment types.

Source code in src/zenml/enums.py (lines 298-317)
class EnvironmentType(StrEnum):
    """Enum for environment types."""

    BITBUCKET_CI = "bitbucket_ci"
    CIRCLE_CI = "circle_ci"
    COLAB = "colab"
    CONTAINER = "container"
    DOCKER = "docker"
    GENERIC_CI = "generic_ci"
    GITHUB_ACTION = "github_action"
    GITLAB_CI = "gitlab_ci"
    KUBERNETES = "kubernetes"
    NATIVE = "native"
    NOTEBOOK = "notebook"
    PAPERSPACE = "paperspace"
    WSL = "wsl"
    LIGHTNING_AI_STUDIO = "lightning_ai_studio"
    GITHUB_CODESPACES = "github_codespaces"
    VSCODE_REMOTE_CONTAINER = "vscode_remote_container"
    ZENML_CODESPACE = "zenml_codespace"

ExecutionStatus

Bases: StrEnum

Enum that represents the current status of a step or pipeline run.

Source code in src/zenml/enums.py (lines 73-93)
class ExecutionStatus(StrEnum):
    """Enum that represents the current status of a step or pipeline run."""

    INITIALIZING = "initializing"
    FAILED = "failed"
    COMPLETED = "completed"
    RUNNING = "running"
    CACHED = "cached"

    @property
    def is_finished(self) -> bool:
        """Whether the execution status refers to a finished execution.

        Returns:
            Whether the execution status refers to a finished execution.
        """
        return self in {
            ExecutionStatus.FAILED,
            ExecutionStatus.COMPLETED,
            ExecutionStatus.CACHED,
        }

is_finished property

Whether the execution status refers to a finished execution.

Returns:

Type Description
bool

Whether the execution status refers to a finished execution.
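
As a usage hint, is_finished is handy when polling a run through the client; the sketch below is hedged (the run name and polling interval are arbitrary examples):

```python
# Hedged sketch: poll a pipeline run until it reaches a finished status.
import time

from zenml.client import Client

run = Client().get_pipeline_run("my_pipeline_run")  # hypothetical run name/ID
while not run.status.is_finished:
    time.sleep(10)
    run = Client().get_pipeline_run(run.id)
print(f"Run finished with status: {run.status}")
```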

GenericFilterOps

Bases: StrEnum

Ops for all filters for string values on list methods.

Source code in src/zenml/enums.py (lines 249-262)
class GenericFilterOps(StrEnum):
    """Ops for all filters for string values on list methods."""

    EQUALS = "equals"
    NOT_EQUALS = "notequals"
    CONTAINS = "contains"
    STARTSWITH = "startswith"
    ENDSWITH = "endswith"
    ONEOF = "oneof"
    GTE = "gte"
    GT = "gt"
    LTE = "lte"
    LT = "lt"
    IN = "in"

LoggingLevels

Bases: Enum

Enum for logging levels.

Source code in src/zenml/enums.py (lines 96-104)
class LoggingLevels(Enum):
    """Enum for logging levels."""

    NOTSET = logging.NOTSET
    ERROR = logging.ERROR
    WARN = logging.WARN
    INFO = logging.INFO
    DEBUG = logging.DEBUG
    CRITICAL = logging.CRITICAL

LogicalOperators

Bases: StrEnum

Logical Ops to use to combine filters on list methods.

Source code in src/zenml/enums.py (lines 272-276)
class LogicalOperators(StrEnum):
    """Logical Ops to use to combine filters on list methods."""

    OR = "or"
    AND = "and"

MetadataResourceTypes

Bases: StrEnum

All possible resource types for adding metadata.

Source code in src/zenml/enums.py (lines 366-373)
class MetadataResourceTypes(StrEnum):
    """All possible resource types for adding metadata."""

    PIPELINE_RUN = "pipeline_run"
    STEP_RUN = "step_run"
    ARTIFACT_VERSION = "artifact_version"
    MODEL_VERSION = "model_version"
    SCHEDULE = "schedule"

ModelStages

Bases: StrEnum

All possible stages of a Model Version.

Source code in src/zenml/enums.py (lines 320-327)
class ModelStages(StrEnum):
    """All possible stages of a Model Version."""

    NONE = "none"
    STAGING = "staging"
    PRODUCTION = "production"
    ARCHIVED = "archived"
    LATEST = "latest"

OAuthDeviceStatus

Bases: StrEnum

The OAuth device status.

Source code in src/zenml/enums.py (lines 233-239)
class OAuthDeviceStatus(StrEnum):
    """The OAuth device status."""

    PENDING = "pending"
    VERIFIED = "verified"
    ACTIVE = "active"
    LOCKED = "locked"

OAuthGrantTypes

Bases: StrEnum

The OAuth grant types.

Source code in src/zenml/enums.py (lines 224-230)
class OAuthGrantTypes(StrEnum):
    """The OAuth grant types."""

    OAUTH_PASSWORD = "password"
    OAUTH_DEVICE_CODE = "urn:ietf:params:oauth:grant-type:device_code"
    ZENML_EXTERNAL = "zenml-external"
    ZENML_API_KEY = "zenml-api-key"

OnboardingStep

Bases: StrEnum

All onboarding steps.

Source code in src/zenml/enums.py
class OnboardingStep(StrEnum):
    """All onboarding steps."""

    DEVICE_VERIFIED = "device_verified"
    PROJECT_CREATED = "project_created"
    PIPELINE_RUN = "pipeline_run"
    SECOND_PIPELINE_RUN = "second_pipeline_run"
    THIRD_PIPELINE_RUN = "third_pipeline_run"
    STARTER_SETUP_COMPLETED = "starter_setup_completed"
    STACK_WITH_REMOTE_ORCHESTRATOR_CREATED = (
        "stack_with_remote_orchestrator_created"
    )
    STACK_WITH_REMOTE_ARTIFACT_STORE_CREATED = (
        "stack_with_remote_artifact_store_created"
    )
    PIPELINE_RUN_WITH_REMOTE_ORCHESTRATOR = (
        "pipeline_run_with_remote_orchestrator"
    )
    PIPELINE_RUN_WITH_REMOTE_ARTIFACT_STORE = (
        "pipeline_run_with_remote_artifact_store"
    )
    PRODUCTION_SETUP_COMPLETED = "production_setup_completed"
    PRO_ONBOARDING_COMPLETED = "pro_onboarding_completed"

OperatingSystemType

Bases: StrEnum

Enum for OS types.

Source code in src/zenml/enums.py
class OperatingSystemType(StrEnum):
    """Enum for OS types."""

    LINUX = "Linux"
    WINDOWS = "Windows"
    MACOS = "Darwin"

PluginSubType

Bases: StrEnum

All possible types of Plugins.

Source code in src/zenml/enums.py
class PluginSubType(StrEnum):
    """All possible types of Plugins."""

    # Event Source Subtypes
    WEBHOOK = "webhook"
    # Action Subtypes
    PIPELINE_RUN = "pipeline_run"

PluginType

Bases: StrEnum

All possible types of Plugins.

Source code in src/zenml/enums.py
class PluginType(StrEnum):
    """All possible types of Plugins."""

    EVENT_SOURCE = "event_source"
    ACTION = "action"

ResponseUpdateStrategy

Bases: StrEnum

All available strategies to handle updated properties in the response.

Source code in src/zenml/enums.py
class ResponseUpdateStrategy(StrEnum):
    """All available strategies to handle updated properties in the response."""

    ALLOW = "allow"
    IGNORE = "ignore"
    DENY = "deny"

SecretValidationLevel

Bases: StrEnum

Secret validation levels.

Source code in src/zenml/enums.py
class SecretValidationLevel(StrEnum):
    """Secret validation levels."""

    SECRET_AND_KEY_EXISTS = "SECRET_AND_KEY_EXISTS"
    SECRET_EXISTS = "SECRET_EXISTS"
    NONE = "NONE"

SecretsStoreType

Bases: StrEnum

Secrets Store Backend Types.

Source code in src/zenml/enums.py
class SecretsStoreType(StrEnum):
    """Secrets Store Backend Types."""

    NONE = "none"  # indicates that no secrets store is used
    SQL = "sql"
    AWS = "aws"
    GCP = "gcp"
    AZURE = "azure"
    HASHICORP = "hashicorp"
    CUSTOM = "custom"  # indicates that the secrets store uses a custom backend

ServerProviderType

Bases: StrEnum

ZenML server providers.

Source code in src/zenml/enums.py
class ServerProviderType(StrEnum):
    """ZenML server providers."""

    DAEMON = "daemon"
    DOCKER = "docker"

ServiceState

Bases: StrEnum

Possible states for the service and service endpoint.

Source code in src/zenml/enums.py
class ServiceState(StrEnum):
    """Possible states for the service and service endpoint."""

    INACTIVE = "inactive"
    ACTIVE = "active"
    PENDING_STARTUP = "pending_startup"
    PENDING_SHUTDOWN = "pending_shutdown"
    ERROR = "error"
    SCALED_TO_ZERO = "scaled_to_zero"

SorterOps

Bases: StrEnum

Ops for all filters for string values on list methods.

Source code in src/zenml/enums.py
class SorterOps(StrEnum):
    """Ops for all filters for string values on list methods."""

    ASCENDING = "asc"
    DESCENDING = "desc"

SourceContextTypes

Bases: StrEnum

Enum for event source types.

Source code in src/zenml/enums.py
class SourceContextTypes(StrEnum):
    """Enum for event source types."""

    CLI = "cli"
    PYTHON = "python"
    DASHBOARD = "dashboard"
    DASHBOARD_V2 = "dashboard-v2"
    API = "api"
    UNKNOWN = "unknown"

StackComponentType

Bases: StrEnum

All possible types a StackComponent can have.

Source code in src/zenml/enums.py
class StackComponentType(StrEnum):
    """All possible types a `StackComponent` can have."""

    ALERTER = "alerter"
    ANNOTATOR = "annotator"
    ARTIFACT_STORE = "artifact_store"
    CONTAINER_REGISTRY = "container_registry"
    DATA_VALIDATOR = "data_validator"
    EXPERIMENT_TRACKER = "experiment_tracker"
    FEATURE_STORE = "feature_store"
    IMAGE_BUILDER = "image_builder"
    MODEL_DEPLOYER = "model_deployer"
    ORCHESTRATOR = "orchestrator"
    STEP_OPERATOR = "step_operator"
    MODEL_REGISTRY = "model_registry"

    @property
    def plural(self) -> str:
        """Returns the plural of the enum value.

        Returns:
            The plural of the enum value.
        """
        if self == StackComponentType.CONTAINER_REGISTRY:
            return "container_registries"
        elif self == StackComponentType.MODEL_REGISTRY:
            return "model_registries"

        return f"{self.value}s"

plural property

Returns the plural of the enum value.

Returns:

Type Description
str

The plural of the enum value.
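
A minimal sketch of the behavior described above:

from zenml.enums import StackComponentType

# Most component types simply get an "s" appended ...
assert StackComponentType.ORCHESTRATOR.plural == "orchestrators"
# ... while the two registry types are special-cased.
assert StackComponentType.CONTAINER_REGISTRY.plural == "container_registries"
assert StackComponentType.MODEL_REGISTRY.plural == "model_registries"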

StackDeploymentProvider

Bases: StrEnum

All possible stack deployment providers.

Source code in src/zenml/enums.py
class StackDeploymentProvider(StrEnum):
    """All possible stack deployment providers."""

    AWS = "aws"
    GCP = "gcp"
    AZURE = "azure"

StepRunInputArtifactType

Bases: StrEnum

All possible types of a step run input artifact.

Source code in src/zenml/enums.py
class StepRunInputArtifactType(StrEnum):
    """All possible types of a step run input artifact."""

    STEP_OUTPUT = (
        "step_output"  # input argument that is the output of a previous step
    )
    MANUAL = "manual"  # manually loaded via `zenml.load_artifact()`
    EXTERNAL = "external"  # loaded via `ExternalArtifact(value=...)`
    LAZY_LOADED = "lazy"  # loaded via various lazy methods

StoreType

Bases: StrEnum

Zen Store Backend Types.

Source code in src/zenml/enums.py
class StoreType(StrEnum):
    """Zen Store Backend Types."""

    SQL = "sql"
    REST = "rest"

TaggableResourceTypes

Bases: StrEnum

All possible resource types for tagging.

Source code in src/zenml/enums.py
class TaggableResourceTypes(StrEnum):
    """All possible resource types for tagging."""

    ARTIFACT = "artifact"
    ARTIFACT_VERSION = "artifact_version"
    MODEL = "model"
    MODEL_VERSION = "model_version"
    PIPELINE = "pipeline"
    PIPELINE_RUN = "pipeline_run"
    RUN_TEMPLATE = "run_template"

VisualizationType

Bases: StrEnum

All currently available visualization types.

Source code in src/zenml/enums.py
class VisualizationType(StrEnum):
    """All currently available visualization types."""

    CSV = "csv"
    HTML = "html"
    IMAGE = "image"
    MARKDOWN = "markdown"
    JSON = "json"

ZenMLServiceType

Bases: StrEnum

All possible types a service can have.

Source code in src/zenml/enums.py
class ZenMLServiceType(StrEnum):
    """All possible types a service can have."""

    ZEN_SERVER = "zen_server"
    MODEL_SERVING = "model-serving"

Environment

Environment implementation.

Environment

Provides environment information.

Source code in src/zenml/environment.py
class Environment(metaclass=SingletonMetaClass):
    """Provides environment information."""

    def __init__(self) -> None:
        """Initializes an Environment instance.

        Note: Environment is a singleton class, which means this method will
        only get called once. All following `Environment()` calls will return
        the previously initialized instance.
        """

    @staticmethod
    def get_system_info() -> Dict[str, str]:
        """Information about the operating system.

        Returns:
            A dictionary containing information about the operating system.
        """
        system = platform.system()

        if system == "Windows":
            release, version, csd, ptype = platform.win32_ver()

            return {
                "os": "windows",
                "windows_version_release": release,
                "windows_version": version,
                "windows_version_service_pack": csd,
                "windows_version_os_type": ptype,
            }

        if system == "Darwin":
            return {"os": "mac", "mac_version": platform.mac_ver()[0]}

        if system == "Linux":
            return {
                "os": "linux",
                "linux_distro": distro.id(),
                "linux_distro_like": distro.like(),
                "linux_distro_version": distro.version(),
            }

        # We don't collect data for any other system.
        return {"os": "unknown"}

    @staticmethod
    def python_version() -> str:
        """Returns the python version of the running interpreter.

        Returns:
            str: the python version
        """
        return platform.python_version()

    @staticmethod
    def in_container() -> bool:
        """If the current python process is running in a container.

        Returns:
            `True` if the current python process is running in a
            container, `False` otherwise.
        """
        # TODO [ENG-167]: Make this more reliable and add test.
        return INSIDE_ZENML_CONTAINER

    @staticmethod
    def in_docker() -> bool:
        """If the current python process is running in a docker container.

        Returns:
            `True` if the current python process is running in a docker
            container, `False` otherwise.
        """
        if os.path.exists("/.dockerenv") or os.path.exists("/.dockerinit"):
            return True

        try:
            with open("/proc/1/cgroup", "rt") as ifh:
                info = ifh.read()
                return "docker" in info
        except (FileNotFoundError, Exception):
            return False

    @staticmethod
    def in_kubernetes() -> bool:
        """If the current python process is running in a kubernetes pod.

        Returns:
            `True` if the current python process is running in a kubernetes
            pod, `False` otherwise.
        """
        if "KUBERNETES_SERVICE_HOST" in os.environ:
            return True

        try:
            with open("/proc/1/cgroup", "rt") as ifh:
                info = ifh.read()
                return "kubepod" in info
        except (FileNotFoundError, Exception):
            return False

    @staticmethod
    def in_google_colab() -> bool:
        """If the current Python process is running in a Google Colab.

        Returns:
            `True` if the current Python process is running in a Google Colab,
            `False` otherwise.
        """
        try:
            import google.colab  # noqa

            return True

        except ModuleNotFoundError:
            return False

    @staticmethod
    def in_notebook() -> bool:
        """If the current Python process is running in a notebook.

        Returns:
            `True` if the current Python process is running in a notebook,
            `False` otherwise.
        """
        if Environment.in_google_colab():
            return True

        try:
            ipython = get_ipython()  # type: ignore[name-defined]
        except NameError:
            return False

        if ipython.__class__.__name__ in [
            "TerminalInteractiveShell",
            "ZMQInteractiveShell",
            "DatabricksShell",
        ]:
            return True
        return False

    @staticmethod
    def in_github_codespaces() -> bool:
        """If the current Python process is running in GitHub Codespaces.

        Returns:
            `True` if the current Python process is running in GitHub Codespaces,
            `False` otherwise.
        """
        return (
            "CODESPACES" in os.environ
            or "GITHUB_CODESPACE_TOKEN" in os.environ
            or "GITHUB_CODESPACES_PORT_FORWARDING_DOMAIN" in os.environ
        )

    @staticmethod
    def in_zenml_codespace() -> bool:
        """If the current Python process is running in ZenML Codespaces.

        Returns:
            `True` if the current Python process is running in ZenML Codespaces,
            `False` otherwise.
        """
        return os.environ.get("ZENML_ENVIRONMENT") == "codespace"

    @staticmethod
    def in_vscode_remote_container() -> bool:
        """If the current Python process is running in a VS Code Remote Container.

        Returns:
            `True` if the current Python process is running in a VS Code Remote Container,
            `False` otherwise.
        """
        return (
            "REMOTE_CONTAINERS" in os.environ
            or "VSCODE_REMOTE_CONTAINERS_SESSION" in os.environ
        )

    @staticmethod
    def in_paperspace_gradient() -> bool:
        """If the current Python process is running in Paperspace Gradient.

        Returns:
            `True` if the current Python process is running in Paperspace
            Gradient, `False` otherwise.
        """
        return "PAPERSPACE_NOTEBOOK_REPO_ID" in os.environ

    @staticmethod
    def in_github_actions() -> bool:
        """If the current Python process is running in GitHub Actions.

        Returns:
            `True` if the current Python process is running in GitHub
            Actions, `False` otherwise.
        """
        return "GITHUB_ACTIONS" in os.environ

    @staticmethod
    def in_gitlab_ci() -> bool:
        """If the current Python process is running in GitLab CI.

        Returns:
            `True` if the current Python process is running in GitLab
            CI, `False` otherwise.
        """
        return "GITLAB_CI" in os.environ

    @staticmethod
    def in_circle_ci() -> bool:
        """If the current Python process is running in Circle CI.

        Returns:
            `True` if the current Python process is running in Circle
            CI, `False` otherwise.
        """
        return "CIRCLECI" in os.environ

    @staticmethod
    def in_bitbucket_ci() -> bool:
        """If the current Python process is running in Bitbucket CI.

        Returns:
            `True` if the current Python process is running in Bitbucket
            CI, `False` otherwise.
        """
        return "BITBUCKET_BUILD_NUMBER" in os.environ

    @staticmethod
    def in_ci() -> bool:
        """If the current Python process is running in any CI.

        Returns:
            `True` if the current Python process is running in any
            CI, `False` otherwise.
        """
        return "CI" in os.environ

    @staticmethod
    def in_wsl() -> bool:
        """If the current process is running in Windows Subsystem for Linux.

        source: https://www.scivision.dev/python-detect-wsl/

        Returns:
            `True` if the current process is running in WSL, `False` otherwise.
        """
        return "microsoft-standard" in platform.uname().release

    @staticmethod
    def in_lightning_ai_studio() -> bool:
        """If the current Python process is running in Lightning.ai studios.

        Returns:
            `True` if the current Python process is running in Lightning.ai studios,
            `False` otherwise.
        """
        return (
            "LIGHTNING_CLOUD_URL" in os.environ
            and "LIGHTNING_CLOUDSPACE_HOST" in os.environ
        )

    @staticmethod
    def get_python_packages() -> List[str]:
        """Returns a list of installed Python packages.

        Raises:
            RuntimeError: If the process to get the list of installed packages
                fails.

        Returns:
            List of installed packages in pip freeze format.
        """
        try:
            output = subprocess.check_output(["pip", "freeze"]).decode()
            return output.strip().split("\n")
        except subprocess.CalledProcessError:
            raise RuntimeError(
                "Failed to get list of installed Python packages"
            )

__init__()

Initializes an Environment instance.

Note: Environment is a singleton class, which means this method will only get called once. All following Environment() calls will return the previously initialized instance.

Source code in src/zenml/environment.py
def __init__(self) -> None:
    """Initializes an Environment instance.

    Note: Environment is a singleton class, which means this method will
    only get called once. All following `Environment()` calls will return
    the previously initialized instance.
    """

get_python_packages() staticmethod

Returns a list of installed Python packages.

Raises:

Type Description
RuntimeError

If the process to get the list of installed packages fails.

Returns:

Type Description
List[str]

List of installed packages in pip freeze format.

Source code in src/zenml/environment.py
@staticmethod
def get_python_packages() -> List[str]:
    """Returns a list of installed Python packages.

    Raises:
        RuntimeError: If the process to get the list of installed packages
            fails.

    Returns:
        List of installed packages in pip freeze format.
    """
    try:
        output = subprocess.check_output(["pip", "freeze"]).decode()
        return output.strip().split("\n")
    except subprocess.CalledProcessError:
        raise RuntimeError(
            "Failed to get list of installed Python packages"
        )

get_system_info() staticmethod

Information about the operating system.

Returns:

Type Description
Dict[str, str]

A dictionary containing information about the operating system.

Source code in src/zenml/environment.py
@staticmethod
def get_system_info() -> Dict[str, str]:
    """Information about the operating system.

    Returns:
        A dictionary containing information about the operating system.
    """
    system = platform.system()

    if system == "Windows":
        release, version, csd, ptype = platform.win32_ver()

        return {
            "os": "windows",
            "windows_version_release": release,
            "windows_version": version,
            "windows_version_service_pack": csd,
            "windows_version_os_type": ptype,
        }

    if system == "Darwin":
        return {"os": "mac", "mac_version": platform.mac_ver()[0]}

    if system == "Linux":
        return {
            "os": "linux",
            "linux_distro": distro.id(),
            "linux_distro_like": distro.like(),
            "linux_distro_version": distro.version(),
        }

    # We don't collect data for any other system.
    return {"os": "unknown"}

in_bitbucket_ci() staticmethod

If the current Python process is running in Bitbucket CI.

Returns:

Type Description
bool

True if the current Python process is running in Bitbucket CI, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_bitbucket_ci() -> bool:
    """If the current Python process is running in Bitbucket CI.

    Returns:
        `True` if the current Python process is running in Bitbucket
        CI, `False` otherwise.
    """
    return "BITBUCKET_BUILD_NUMBER" in os.environ

in_ci() staticmethod

If the current Python process is running in any CI.

Returns:

Type Description
bool

True if the current Python process is running in any CI, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_ci() -> bool:
    """If the current Python process is running in any CI.

    Returns:
        `True` if the current Python process is running in any
        CI, `False` otherwise.
    """
    return "CI" in os.environ

in_circle_ci() staticmethod

If the current Python process is running in Circle CI.

Returns:

Type Description
bool

True if the current Python process is running in Circle CI, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_circle_ci() -> bool:
    """If the current Python process is running in Circle CI.

    Returns:
        `True` if the current Python process is running in Circle
        CI, `False` otherwise.
    """
    return "CIRCLECI" in os.environ

in_container() staticmethod

If the current python process is running in a container.

Returns:

Type Description
bool

True if the current python process is running in a container, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_container() -> bool:
    """If the current python process is running in a container.

    Returns:
        `True` if the current python process is running in a
        container, `False` otherwise.
    """
    # TODO [ENG-167]: Make this more reliable and add test.
    return INSIDE_ZENML_CONTAINER

in_docker() staticmethod

If the current python process is running in a docker container.

Returns:

Type Description
bool

True if the current python process is running in a docker container, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_docker() -> bool:
    """If the current python process is running in a docker container.

    Returns:
        `True` if the current python process is running in a docker
        container, `False` otherwise.
    """
    if os.path.exists("/.dockerenv") or os.path.exists("/.dockerinit"):
        return True

    try:
        with open("/proc/1/cgroup", "rt") as ifh:
            info = ifh.read()
            return "docker" in info
    except (FileNotFoundError, Exception):
        return False

in_github_actions() staticmethod

If the current Python process is running in GitHub Actions.

Returns:

Type Description
bool

True if the current Python process is running in GitHub Actions, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_github_actions() -> bool:
    """If the current Python process is running in GitHub Actions.

    Returns:
        `True` if the current Python process is running in GitHub
        Actions, `False` otherwise.
    """
    return "GITHUB_ACTIONS" in os.environ

in_github_codespaces() staticmethod

If the current Python process is running in GitHub Codespaces.

Returns:

Type Description
bool

True if the current Python process is running in GitHub Codespaces, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_github_codespaces() -> bool:
    """If the current Python process is running in GitHub Codespaces.

    Returns:
        `True` if the current Python process is running in GitHub Codespaces,
        `False` otherwise.
    """
    return (
        "CODESPACES" in os.environ
        or "GITHUB_CODESPACE_TOKEN" in os.environ
        or "GITHUB_CODESPACES_PORT_FORWARDING_DOMAIN" in os.environ
    )

in_gitlab_ci() staticmethod

If the current Python process is running in GitLab CI.

Returns:

Type Description
bool

True if the current Python process is running in GitLab CI, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_gitlab_ci() -> bool:
    """If the current Python process is running in GitLab CI.

    Returns:
        `True` if the current Python process is running in GitLab
        CI, `False` otherwise.
    """
    return "GITLAB_CI" in os.environ

in_google_colab() staticmethod

If the current Python process is running in a Google Colab.

Returns:

Type Description
bool

True if the current Python process is running in a Google Colab, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_google_colab() -> bool:
    """If the current Python process is running in a Google Colab.

    Returns:
        `True` if the current Python process is running in a Google Colab,
        `False` otherwise.
    """
    try:
        import google.colab  # noqa

        return True

    except ModuleNotFoundError:
        return False

in_kubernetes() staticmethod

If the current python process is running in a kubernetes pod.

Returns:

Type Description
bool

True if the current python process is running in a kubernetes pod, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_kubernetes() -> bool:
    """If the current python process is running in a kubernetes pod.

    Returns:
        `True` if the current python process is running in a kubernetes
        pod, `False` otherwise.
    """
    if "KUBERNETES_SERVICE_HOST" in os.environ:
        return True

    try:
        with open("/proc/1/cgroup", "rt") as ifh:
            info = ifh.read()
            return "kubepod" in info
    except (FileNotFoundError, Exception):
        return False

in_lightning_ai_studio() staticmethod

If the current Python process is running in Lightning.ai studios.

Returns:

Type Description
bool

True if the current Python process is running in Lightning.ai studios, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_lightning_ai_studio() -> bool:
    """If the current Python process is running in Lightning.ai studios.

    Returns:
        `True` if the current Python process is running in Lightning.ai studios,
        `False` otherwise.
    """
    return (
        "LIGHTNING_CLOUD_URL" in os.environ
        and "LIGHTNING_CLOUDSPACE_HOST" in os.environ
    )

in_notebook() staticmethod

If the current Python process is running in a notebook.

Returns:

Type Description
bool

True if the current Python process is running in a notebook, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_notebook() -> bool:
    """If the current Python process is running in a notebook.

    Returns:
        `True` if the current Python process is running in a notebook,
        `False` otherwise.
    """
    if Environment.in_google_colab():
        return True

    try:
        ipython = get_ipython()  # type: ignore[name-defined]
    except NameError:
        return False

    if ipython.__class__.__name__ in [
        "TerminalInteractiveShell",
        "ZMQInteractiveShell",
        "DatabricksShell",
    ]:
        return True
    return False

in_paperspace_gradient() staticmethod

If the current Python process is running in Paperspace Gradient.

Returns:

Type Description
bool

True if the current Python process is running in Paperspace Gradient, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_paperspace_gradient() -> bool:
    """If the current Python process is running in Paperspace Gradient.

    Returns:
        `True` if the current Python process is running in Paperspace
        Gradient, `False` otherwise.
    """
    return "PAPERSPACE_NOTEBOOK_REPO_ID" in os.environ

in_vscode_remote_container() staticmethod

If the current Python process is running in a VS Code Remote Container.

Returns:

Type Description
bool

True if the current Python process is running in a VS Code Remote Container, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_vscode_remote_container() -> bool:
    """If the current Python process is running in a VS Code Remote Container.

    Returns:
        `True` if the current Python process is running in a VS Code Remote Container,
        `False` otherwise.
    """
    return (
        "REMOTE_CONTAINERS" in os.environ
        or "VSCODE_REMOTE_CONTAINERS_SESSION" in os.environ
    )

in_wsl() staticmethod

If the current process is running in Windows Subsystem for Linux.

source: https://www.scivision.dev/python-detect-wsl/

Returns:

Type Description
bool

True if the current process is running in WSL, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_wsl() -> bool:
    """If the current process is running in Windows Subsystem for Linux.

    source: https://www.scivision.dev/python-detect-wsl/

    Returns:
        `True` if the current process is running in WSL, `False` otherwise.
    """
    return "microsoft-standard" in platform.uname().release

in_zenml_codespace() staticmethod

If the current Python process is running in ZenML Codespaces.

Returns:

Type Description
bool

True if the current Python process is running in ZenML Codespaces, False otherwise.

Source code in src/zenml/environment.py
@staticmethod
def in_zenml_codespace() -> bool:
    """If the current Python process is running in ZenML Codespaces.

    Returns:
        `True` if the current Python process is running in ZenML Codespaces,
        `False` otherwise.
    """
    return os.environ.get("ZENML_ENVIRONMENT") == "codespace"

python_version() staticmethod

Returns the python version of the running interpreter.

Returns:

Name Type Description
str str

the python version

Source code in src/zenml/environment.py
@staticmethod
def python_version() -> str:
    """Returns the python version of the running interpreter.

    Returns:
        str: the python version
    """
    return platform.python_version()

get_environment()

Returns a string representing the execution environment of the pipeline.

Returns:

Name Type Description
str str

the execution environment

Source code in src/zenml/environment.py
def get_environment() -> str:
    """Returns a string representing the execution environment of the pipeline.

    Returns:
        str: the execution environment
    """
    # Order is important here
    if Environment.in_kubernetes():
        return EnvironmentType.KUBERNETES
    elif Environment.in_github_actions():
        return EnvironmentType.GITHUB_ACTION
    elif Environment.in_gitlab_ci():
        return EnvironmentType.GITLAB_CI
    elif Environment.in_circle_ci():
        return EnvironmentType.CIRCLE_CI
    elif Environment.in_bitbucket_ci():
        return EnvironmentType.BITBUCKET_CI
    elif Environment.in_ci():
        return EnvironmentType.GENERIC_CI
    elif Environment.in_github_codespaces():
        return EnvironmentType.GITHUB_CODESPACES
    elif Environment.in_zenml_codespace():
        return EnvironmentType.ZENML_CODESPACE
    elif Environment.in_vscode_remote_container():
        return EnvironmentType.VSCODE_REMOTE_CONTAINER
    elif Environment.in_lightning_ai_studio():
        return EnvironmentType.LIGHTNING_AI_STUDIO
    elif Environment.in_docker():
        return EnvironmentType.DOCKER
    elif Environment.in_container():
        return EnvironmentType.CONTAINER
    elif Environment.in_google_colab():
        return EnvironmentType.COLAB
    elif Environment.in_paperspace_gradient():
        return EnvironmentType.PAPERSPACE
    elif Environment.in_notebook():
        return EnvironmentType.NOTEBOOK
    elif Environment.in_wsl():
        return EnvironmentType.WSL
    else:
        return EnvironmentType.NATIVE
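
The checks above are ordered from most to least specific, so the first matching environment wins. A minimal sketch of calling it (the import path follows the module shown above):

from zenml.environment import get_environment

# Returns one of the EnvironmentType values, e.g. the native, docker or
# kubernetes value depending on where the process runs.
current_environment = get_environment()
print(current_environment)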

get_run_environment_dict()

Returns a dictionary of the current run environment.

Everything that is returned here will be saved in the DB as pipeline_run.client_environment and pipeline_run.orchestrator_environment for client and orchestrator respectively.

Returns:

Type Description
Dict[str, Any]

A dictionary of the current run environment.

Source code in src/zenml/environment.py
def get_run_environment_dict() -> Dict[str, Any]:
    """Returns a dictionary of the current run environment.

    Everything that is returned here will be saved in the DB as
    `pipeline_run.client_environment` and
    `pipeline_run.orchestrator_environment` for client and orchestrator
    respectively.

    Returns:
        A dictionary of the current run environment.
    """
    env_dict: Dict[str, Any] = {
        "environment": str(get_environment()),
        **Environment.get_system_info(),
        "python_version": Environment.python_version(),
    }

    try:
        python_packages = Environment.get_python_packages()
    except RuntimeError:
        logger.warning("Failed to get list of installed Python packages")
    else:
        # TODO: We send the python packages as a string right now to keep
        # backwards compatibility with old versions. We should update this to
        # be a list of strings eventually.
        env_dict["python_packages"] = "\n".join(python_packages)

    return env_dict
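
A minimal sketch of inspecting the collected information:

from zenml.environment import get_run_environment_dict

env = get_run_environment_dict()
# Typical keys: "environment", "python_version", the OS fields from
# get_system_info() and, if pip freeze succeeded, "python_packages".
print(sorted(env.keys()))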

Event Hub

ZenML Event Hub module.

The Event Hub is responsible for receiving all Events and dispatching them to the relevant Subscribers/Triggers.

Event Sources

Base Classes for Event Sources.

Exceptions

ZenML specific exception definitions.

ArtifactStoreInterfaceError

Bases: ZenMLBaseException

Raises exception when interacting with the Artifact Store interface in an unsupported way.

Source code in src/zenml/exceptions.py
class ArtifactStoreInterfaceError(ZenMLBaseException):
    """Raises exception when interacting with the Artifact Store interface in an unsupported way."""

AuthorizationException

Bases: ZenMLBaseException

Raised when an authorization error occurred while trying to access a ZenML resource.

Source code in src/zenml/exceptions.py
class AuthorizationException(ZenMLBaseException):
    """Raised when an authorization error occurred while trying to access a ZenML resource ."""

BackupSecretsStoreNotConfiguredError

Bases: NotImplementedError

Raised when a backup secrets store is not configured.

Source code in src/zenml/exceptions.py
class BackupSecretsStoreNotConfiguredError(NotImplementedError):
    """Raised when a backup secrets store is not configured."""

CredentialsNotValid

Bases: AuthorizationException

Raised when the credentials provided are invalid.

This is a subclass of AuthorizationException and should only be raised when the authentication credentials are invalid (e.g. expired API token, invalid username/password, invalid signature). If caught by the ZenML client, it will trigger an invalidation of the currently cached API token and a re-authentication flow.

Source code in src/zenml/exceptions.py
class CredentialsNotValid(AuthorizationException):
    """Raised when the credentials provided are invalid.

    This is a subclass of AuthorizationException and should only be raised when
    the authentication credentials are invalid (e.g. expired API token, invalid
    username/password, invalid signature). If caught by the ZenML client, it
    will trigger an invalidation of the currently cached API token and a
    re-authentication flow.
    """

CustomFlavorImportError

Bases: ImportError

Raised when failing to import a custom flavor.

Source code in src/zenml/exceptions.py
class CustomFlavorImportError(ImportError):
    """Raised when failing to import a custom flavor."""

DoesNotExistException

Bases: ZenMLBaseException

Raises exception when the entity does not exist in the system but an action is being done that requires it to be present.

Source code in src/zenml/exceptions.py
class DoesNotExistException(ZenMLBaseException):
    """Raises exception when the entity does not exist in the system but an action is being done that requires it to be present."""

    def __init__(self, message: str):
        """Initializes the exception.

        Args:
            message: Message with details of exception.
        """
        super().__init__(message)

__init__(message)

Initializes the exception.

Parameters:

Name Type Description Default
message str

Message with details of exception.

required
Source code in src/zenml/exceptions.py
def __init__(self, message: str):
    """Initializes the exception.

    Args:
        message: Message with details of exception.
    """
    super().__init__(message)

EntityCreationError

Bases: ZenMLBaseException, RuntimeError

Raised when failing to create an entity.

Source code in src/zenml/exceptions.py
class EntityCreationError(ZenMLBaseException, RuntimeError):
    """Raised when failing to create an entity."""

EntityExistsError

Bases: ZenMLBaseException

Raised when trying to register an entity that already exists.

Source code in src/zenml/exceptions.py
class EntityExistsError(ZenMLBaseException):
    """Raised when trying to register an entity that already exists."""

GitNotFoundError

Bases: ImportError

Raised when ZenML CLI is used to interact with examples on a machine with no git installation.

Source code in src/zenml/exceptions.py
class GitNotFoundError(ImportError):
    """Raised when ZenML CLI is used to interact with examples on a machine with no git installation."""

HydrationError

Bases: ZenMLBaseException

Raised when the model hydration failed.

Source code in src/zenml/exceptions.py
class HydrationError(ZenMLBaseException):
    """Raised when the model hydration failed."""

IllegalOperationError

Bases: ZenMLBaseException

Raised when an illegal operation is attempted.

Source code in src/zenml/exceptions.py
class IllegalOperationError(ZenMLBaseException):
    """Raised when an illegal operation is attempted."""

InitializationException

Bases: ZenMLBaseException

Raised when an error occurred during initialization of a ZenML repository.

Source code in src/zenml/exceptions.py
class InitializationException(ZenMLBaseException):
    """Raised when an error occurred during initialization of a ZenML repository."""

InputResolutionError

Bases: ZenMLBaseException

Raised when step input resolving failed.

Source code in src/zenml/exceptions.py
class InputResolutionError(ZenMLBaseException):
    """Raised when step input resolving failed."""

IntegrationError

Bases: ZenMLBaseException

Raises exceptions when a requested integration can not be activated.

Source code in src/zenml/exceptions.py
class IntegrationError(ZenMLBaseException):
    """Raises exceptions when a requested integration can not be activated."""

MaterializerInterfaceError

Bases: ZenMLBaseException

Raises exception when interacting with the Materializer interface in an unsupported way.

Source code in src/zenml/exceptions.py
class MaterializerInterfaceError(ZenMLBaseException):
    """Raises exception when interacting with the Materializer interface in an unsupported way."""

MaxConcurrentTasksError

Bases: ZenMLBaseException

Raised when the maximum number of concurrent tasks is reached.

Source code in src/zenml/exceptions.py
class MaxConcurrentTasksError(ZenMLBaseException):
    """Raised when the maximum number of concurrent tasks is reached."""

MethodNotAllowedError

Bases: ZenMLBaseException

Raised when the server does not allow a request method.

Source code in src/zenml/exceptions.py
class MethodNotAllowedError(ZenMLBaseException):
    """Raised when the server does not allow a request method."""

OAuthError

Bases: ValueError

OAuth2 error.

Source code in src/zenml/exceptions.py
class OAuthError(ValueError):
    """OAuth2 error."""

    def __init__(
        self,
        error: str,
        status_code: int = 400,
        error_description: Optional[str] = None,
        error_uri: Optional[str] = None,
    ) -> None:
        """Initializes the OAuthError.

        Args:
            status_code: HTTP status code.
            error: Error code.
            error_description: Error description.
            error_uri: Error URI.
        """
        self.status_code = status_code
        self.error = error
        self.error_description = error_description
        self.error_uri = error_uri

    def to_dict(self) -> Dict[str, Optional[str]]:
        """Returns the OAuthError as a dictionary.

        Returns:
            The OAuthError as a dictionary.
        """
        return {
            "error": self.error,
            "error_description": self.error_description,
            "error_uri": self.error_uri,
        }

    def __str__(self) -> str:
        """String function.

        Returns:
            the error message
        """
        return f"{self.error}: {self.error_description or ''}"

__init__(error, status_code=400, error_description=None, error_uri=None)

Initializes the OAuthError.

Parameters:

Name Type Description Default
status_code int

HTTP status code.

400
error str

Error code.

required
error_description Optional[str]

Error description.

None
error_uri Optional[str]

Error URI.

None
Source code in src/zenml/exceptions.py
def __init__(
    self,
    error: str,
    status_code: int = 400,
    error_description: Optional[str] = None,
    error_uri: Optional[str] = None,
) -> None:
    """Initializes the OAuthError.

    Args:
        status_code: HTTP status code.
        error: Error code.
        error_description: Error description.
        error_uri: Error URI.
    """
    self.status_code = status_code
    self.error = error
    self.error_description = error_description
    self.error_uri = error_uri

__str__()

String function.

Returns:

Type Description
str

the error message

Source code in src/zenml/exceptions.py
def __str__(self) -> str:
    """String function.

    Returns:
        the error message
    """
    return f"{self.error}: {self.error_description or ''}"

to_dict()

Returns the OAuthError as a dictionary.

Returns:

Type Description
Dict[str, Optional[str]]

The OAuthError as a dictionary.

Source code in src/zenml/exceptions.py
def to_dict(self) -> Dict[str, Optional[str]]:
    """Returns the OAuthError as a dictionary.

    Returns:
        The OAuthError as a dictionary.
    """
    return {
        "error": self.error,
        "error_description": self.error_description,
        "error_uri": self.error_uri,
    }

SecretsStoreNotConfiguredError

Bases: NotImplementedError

Raised when a secrets store is not configured.

Source code in src/zenml/exceptions.py
class SecretsStoreNotConfiguredError(NotImplementedError):
    """Raised when a secrets store is not configured."""

SettingsResolvingError

Bases: ZenMLBaseException

Raised when resolving settings failed.

Source code in src/zenml/exceptions.py
class SettingsResolvingError(ZenMLBaseException):
    """Raised when resolving settings failed."""

StackComponentInterfaceError

Bases: ZenMLBaseException

Raises exception when interacting with the stack components in an unsupported way.

Source code in src/zenml/exceptions.py
class StackComponentInterfaceError(ZenMLBaseException):
    """Raises exception when interacting with the stack components in an unsupported way."""

StackValidationError

Bases: ZenMLBaseException

Raised when a stack configuration is not valid.

Source code in src/zenml/exceptions.py
class StackValidationError(ZenMLBaseException):
    """Raised when a stack configuration is not valid."""

StepContextError

Bases: ZenMLBaseException

Raises exception when interacting with a StepContext in an unsupported way.

Source code in src/zenml/exceptions.py
class StepContextError(ZenMLBaseException):
    """Raises exception when interacting with a StepContext in an unsupported way."""

StepInterfaceError

Bases: ZenMLBaseException

Raises exception when interacting with the Step interface in an unsupported way.

Source code in src/zenml/exceptions.py
class StepInterfaceError(ZenMLBaseException):
    """Raises exception when interacting with the Step interface in an unsupported way."""

SubscriptionUpgradeRequiredError

Bases: ZenMLBaseException

Raised when user tries to perform an action outside their current subscription tier.

Source code in src/zenml/exceptions.py
class SubscriptionUpgradeRequiredError(ZenMLBaseException):
    """Raised when user tries to perform an action outside their current subscription tier."""

ValidationError

Bases: ZenMLBaseException

Raised when the model passed to the ZenStore is not valid.

Source code in src/zenml/exceptions.py
class ValidationError(ZenMLBaseException):
    """Raised when the Model passed to the ZenStore."""

WebhookInactiveError

Bases: ZenMLBaseException

Raised when source is inactive.

Source code in src/zenml/exceptions.py
class WebhookInactiveError(ZenMLBaseException):
    """Raised when source is inactive."""

ZenKeyError

Bases: KeyError

Specialized key error which allows error messages with line breaks.

Source code in src/zenml/exceptions.py
class ZenKeyError(KeyError):
    """Specialized key error which allows error messages with line breaks."""

    def __init__(self, message: str) -> None:
        """Initialization.

        Args:
            message:str, the error message
        """
        self.message = message

    def __str__(self) -> str:
        """String function.

        Returns:
            the error message
        """
        return self.message

__init__(message)

Initialization.

Parameters:

Name Type Description Default
message str

str, the error message

required
Source code in src/zenml/exceptions.py
def __init__(self, message: str) -> None:
    """Initialization.

    Args:
        message:str, the error message
    """
    self.message = message

__str__()

String function.

Returns:

Type Description
str

the error message

Source code in src/zenml/exceptions.py
def __str__(self) -> str:
    """String function.

    Returns:
        the error message
    """
    return self.message

ZenMLBaseException

Bases: Exception

Base exception for all ZenML Exceptions.

Source code in src/zenml/exceptions.py
class ZenMLBaseException(Exception):
    """Base exception for all ZenML Exceptions."""

    def __init__(
        self,
        message: Optional[str] = None,
        url: Optional[str] = None,
    ):
        """The BaseException used to format messages displayed to the user.

        Args:
            message: Message with details of exception. This message
                     will be appended with another message directing user to
                     `url` for more information. If `None`, then default
                     Exception behavior is used.
            url: URL to point to in exception message. If `None`, then no url
                 is appended.
        """
        if message and url:
            message += f" For more information, visit {url}."
        super().__init__(message)

__init__(message=None, url=None)

The BaseException used to format messages displayed to the user.

Parameters:

Name Type Description Default
message Optional[str]

Message with details of exception. This message will be appended with another message directing user to url for more information. If None, then default Exception behavior is used.

None
url Optional[str]

URL to point to in exception message. If None, then no url is appended.

None
Source code in src/zenml/exceptions.py
def __init__(
    self,
    message: Optional[str] = None,
    url: Optional[str] = None,
):
    """The BaseException used to format messages displayed to the user.

    Args:
        message: Message with details of exception. This message
                 will be appended with another message directing user to
                 `url` for more information. If `None`, then default
                 Exception behavior is used.
        url: URL to point to in exception message. If `None`, then no url
             is appended.
    """
    if message and url:
        message += f" For more information, visit {url}."
    super().__init__(message)
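
When both a message and a url are given, the url is appended to the message, as a minimal sketch shows:

from zenml.exceptions import ZenMLBaseException

try:
    raise ZenMLBaseException(
        "Something went wrong.", url="https://docs.zenml.io"
    )
except ZenMLBaseException as error:
    # Prints: Something went wrong. For more information, visit https://docs.zenml.io.
    print(error)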

Experiment Trackers

Experiment trackers let you track your ML experiments.

They log the parameters used and allow you to compare between runs. In the ZenML world, every pipeline run is considered an experiment, and ZenML facilitates the storage of experiment results through ExperimentTracker stack components. This establishes a clear link between pipeline runs and experiments.

BaseExperimentTracker

Bases: StackComponent, ABC

Base class for all ZenML experiment trackers.

Source code in src/zenml/experiment_trackers/base_experiment_tracker.py
class BaseExperimentTracker(StackComponent, ABC):
    """Base class for all ZenML experiment trackers."""

    @property
    def config(self) -> BaseExperimentTrackerConfig:
        """Returns the config of the experiment tracker.

        Returns:
            The config of the experiment tracker.
        """
        return cast(BaseExperimentTrackerConfig, self._config)

config property

Returns the config of the experiment tracker.

Returns:

Type Description
BaseExperimentTrackerConfig

The config of the experiment tracker.

Feature Stores

A feature store enables an offline and online serving of feature data.

Feature stores allow data teams to serve data via an offline store and an online low-latency store where data is kept in sync between the two. It also offers a centralized registry where features (and feature schemas) are stored for use within a team or wider organization.

As a data scientist training a model, the way you access your batch / 'offline' data will almost certainly differ from the way you access that same data in a real-time or online inference setting. Feast addresses the train-serve skew that can develop when those two sources of data diverge from each other.

BaseFeatureStore

Bases: StackComponent, ABC

Base class for all ZenML feature stores.

Source code in src/zenml/feature_stores/base_feature_store.py
class BaseFeatureStore(StackComponent, ABC):
    """Base class for all ZenML feature stores."""

    @property
    def config(self) -> BaseFeatureStoreConfig:
        """Returns the `BaseFeatureStoreConfig` config.

        Returns:
            The configuration.
        """
        return cast(BaseFeatureStoreConfig, self._config)

    @abstractmethod
    def get_historical_features(
        self,
        entity_df: Any,
        features: List[str],
        full_feature_names: bool = False,
    ) -> Any:
        """Returns the historical features for training or batch scoring.

        Args:
            entity_df: The entity DataFrame or entity name.
            features: The features to retrieve.
            full_feature_names: Whether to return the full feature names.

        Returns:
            The historical features.
        """

    @abstractmethod
    def get_online_features(
        self,
        entity_rows: List[Dict[str, Any]],
        features: List[str],
        full_feature_names: bool = False,
    ) -> Dict[str, Any]:
        """Returns the latest online feature data.

        Args:
            entity_rows: The entity rows to retrieve.
            features: The features to retrieve.
            full_feature_names: Whether to return the full feature names.

        Returns:
            The latest online feature data as a dictionary.
        """

config property

Returns the BaseFeatureStoreConfig config.

Returns:

Type Description
BaseFeatureStoreConfig

The configuration.

get_historical_features(entity_df, features, full_feature_names=False) abstractmethod

Returns the historical features for training or batch scoring.

Parameters:

Name Type Description Default
entity_df Any

The entity DataFrame or entity name.

required
features List[str]

The features to retrieve.

required
full_feature_names bool

Whether to return the full feature names.

False

Returns:

Type Description
Any

The historical features.

Source code in src/zenml/feature_stores/base_feature_store.py
@abstractmethod
def get_historical_features(
    self,
    entity_df: Any,
    features: List[str],
    full_feature_names: bool = False,
) -> Any:
    """Returns the historical features for training or batch scoring.

    Args:
        entity_df: The entity DataFrame or entity name.
        features: The features to retrieve.
        full_feature_names: Whether to return the full feature names.

    Returns:
        The historical features.
    """

get_online_features(entity_rows, features, full_feature_names=False) abstractmethod

Returns the latest online feature data.

Parameters:

Name Type Description Default
entity_rows List[Dict[str, Any]]

The entity rows to retrieve.

required
features List[str]

The features to retrieve.

required
full_feature_names bool

Whether to return the full feature names.

False

Returns:

Type Description
Dict[str, Any]

The latest online feature data as a dictionary.

Source code in src/zenml/feature_stores/base_feature_store.py
@abstractmethod
def get_online_features(
    self,
    entity_rows: List[Dict[str, Any]],
    features: List[str],
    full_feature_names: bool = False,
) -> Dict[str, Any]:
    """Returns the latest online feature data.

    Args:
        entity_rows: The entity rows to retrieve.
        features: The features to retrieve.
        full_feature_names: Whether to return the full feature names.

    Returns:
        The latest online feature data as a dictionary.
    """

Hooks

The hooks package exposes some standard hooks that can be used in ZenML.

Hooks are functions that run after a step has exited.
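
A short sketch of wiring these hooks into a step via the on_failure and on_success step parameters, assuming an alerter is configured in the active stack and that the hooks are importable from the hooks package shown here:

from zenml import step
from zenml.hooks import alerter_failure_hook, alerter_success_hook

@step(on_failure=alerter_failure_hook, on_success=alerter_success_hook)
def train_model() -> None:
    # If this step raises, alerter_failure_hook posts a failure message;
    # otherwise alerter_success_hook posts a success message.
    ...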

alerter_failure_hook(exception)

Standard failure hook that executes after step fails.

This hook uses any BaseAlerter that is configured within the active stack to post a message.

Parameters:

Name Type Description Default
exception BaseException

Original exception that led to the step failing.

required
Source code in src/zenml/hooks/alerter_hooks.py
def alerter_failure_hook(exception: BaseException) -> None:
    """Standard failure hook that executes after step fails.

    This hook uses any `BaseAlerter` that is configured within the active stack to post a message.

    Args:
        exception: Original exception that led to the step failing.
    """
    context = get_step_context()
    alerter = Client().active_stack.alerter
    if alerter:
        output_captured = io.StringIO()
        original_stdout = sys.stdout
        sys.stdout = output_captured
        console = Console()
        console.print_exception(show_locals=False)

        sys.stdout = original_stdout
        rich_traceback = output_captured.getvalue()

        message = "*Failure Hook Notification! Step failed!*" + "\n\n"
        message += f"Pipeline name: `{context.pipeline.name}`" + "\n"
        message += f"Run name: `{context.pipeline_run.name}`" + "\n"
        message += f"Step name: `{context.step_run.name}`" + "\n"
        message += f"Parameters: `{context.step_run.config.parameters}`" + "\n"
        message += (
            f"Exception: `({type(exception)}) {rich_traceback}`" + "\n\n"
        )
        alerter.post(message)
    else:
        logger.warning(
            "Specified standard failure hook but no alerter configured in the stack. Skipping.."
        )

alerter_success_hook()

Standard success hook that executes after step finishes successfully.

This hook uses any BaseAlerter that is configured within the active stack to post a message.

Source code in src/zenml/hooks/alerter_hooks.py, lines 63-82
def alerter_success_hook() -> None:
    """Standard success hook that executes after step finishes successfully.

    This hook uses any `BaseAlerter` that is configured within the active stack to post a message.
    """
    context = get_step_context()
    alerter = Client().active_stack.alerter
    if alerter:
        message = (
            "*Success Hook Notification! Step completed successfully*" + "\n\n"
        )
        message += f"Pipeline name: `{context.pipeline.name}`" + "\n"
        message += f"Run name: `{context.pipeline_run.name}`" + "\n"
        message += f"Step name: `{context.step_run.name}`" + "\n"
        message += f"Parameters: `{context.step_run.config.parameters}`" + "\n"
        alerter.post(message)
    else:
        logger.warning(
            "Specified standard success hook but no alerter configured in the stack. Skipping.."
        )
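
These hooks are typically attached to steps via the on_failure and on_success parameters of the step decorator; a sketch, assuming an alerter (e.g. Slack or Discord) is registered in the active stack:

from zenml import step
from zenml.hooks import alerter_failure_hook, alerter_success_hook


@step(on_failure=alerter_failure_hook, on_success=alerter_success_hook)
def train_model() -> float:
    """Illustrative step: the alerter is notified on failure or success."""
    return 0.92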

resolve_and_validate_hook(hook)

Resolves and validates a hook callback.

Parameters:

Name Type Description Default
hook HookSpecification

Hook function or source.

required

Returns:

Type Description
Source

Hook source.

Raises:

Type Description
ValueError

If hook_func is not a valid callable.

Source code in src/zenml/hooks/hook_validators.py, lines 26-76
def resolve_and_validate_hook(hook: "HookSpecification") -> Source:
    """Resolves and validates a hook callback.

    Args:
        hook: Hook function or source.

    Returns:
        Hook source.

    Raises:
        ValueError: If `hook_func` is not a valid callable.
    """
    if isinstance(hook, (str, Source)):
        func = source_utils.load(hook)
    else:
        func = hook

    if not callable(func):
        raise ValueError(f"{func} is not a valid function.")

    sig = inspect.getfullargspec(inspect.unwrap(func))
    sig_annotations = sig.annotations
    if "return" in sig_annotations:
        sig_annotations.pop("return")

    if sig.args and len(sig.args) != len(sig_annotations):
        raise ValueError(
            "You can only pass arguments to a hook that are annotated with a "
            "`BaseException` type."
        )

    if sig_annotations:
        annotations = sig_annotations.values()
        seen_annotations = set()
        for annotation in annotations:
            if annotation:
                if annotation not in (BaseException,):
                    raise ValueError(
                        "Hook arguments must be of type `BaseException`, not "
                        f"`{annotation}`."
                    )

                if annotation in seen_annotations:
                    raise ValueError(
                        "You can only pass one `BaseException` type to a hook."
                        "Currently your function has the following"
                        f"annotations: {sig_annotations}"
                    )
                seen_annotations.add(annotation)

    return source_utils.resolve(func)
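
In other words, a valid hook takes either no arguments or a single argument annotated as BaseException. A small sketch (the import path follows the source file shown above; the hook itself is hypothetical):

from zenml.hooks.hook_validators import resolve_and_validate_hook


def notify_on_failure(exception: BaseException) -> None:
    """Hypothetical hook: exactly one argument, annotated as BaseException."""
    print(f"Step failed: {exception}")


# Resolves to a Source object; a hook annotated with e.g. `int` would
# raise a ValueError during validation instead.
hook_source = resolve_and_validate_hook(notify_on_failure)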

Image Builders

Image builders allow you to build container images.

BaseImageBuilder

Bases: StackComponent, ABC

Base class for all ZenML image builders.

Source code in src/zenml/image_builders/base_image_builder.py, lines 41-142
class BaseImageBuilder(StackComponent, ABC):
    """Base class for all ZenML image builders."""

    @property
    def config(self) -> BaseImageBuilderConfig:
        """The stack component configuration.

        Returns:
            The configuration.
        """
        return cast(BaseImageBuilderConfig, self._config)

    @property
    def build_context_class(self) -> Type["BuildContext"]:
        """Build context class to use.

        The default build context class creates a build context that works
        for the Docker daemon. Override this method if your image builder
        requires a custom context.

        Returns:
            The build context class.
        """
        from zenml.image_builders import BuildContext

        return BuildContext

    @property
    @abstractmethod
    def is_building_locally(self) -> bool:
        """Whether the image builder builds the images on the client machine.

        Returns:
            True if the image builder builds locally, False otherwise.
        """

    @abstractmethod
    def build(
        self,
        image_name: str,
        build_context: "BuildContext",
        docker_build_options: Dict[str, Any],
        container_registry: Optional["BaseContainerRegistry"] = None,
    ) -> str:
        """Builds a Docker image.

        If a container registry is passed, the image will be pushed to that
        registry.

        Args:
            image_name: Name of the image to build.
            build_context: The build context to use for the image.
            docker_build_options: Docker build options.
            container_registry: Optional container registry to push to.

        Returns:
            The Docker image repo digest or name.
        """

    @staticmethod
    def _upload_build_context(
        build_context: "BuildContext",
        parent_path_directory_name: str,
        archive_type: ArchiveType = ArchiveType.TAR_GZ,
    ) -> str:
        """Uploads a Docker image build context to a remote location.

        Args:
            build_context: The build context to upload.
            parent_path_directory_name: The name of the directory to upload
                the build context to. It will be appended to the artifact
                store path to create the parent path where the build context
                will be uploaded to.
            archive_type: The type of archive to create.

        Returns:
            The path to the uploaded build context.
        """
        artifact_store = Client().active_stack.artifact_store
        parent_path = f"{artifact_store.path}/{parent_path_directory_name}"
        fileio.makedirs(parent_path)

        hash_ = hashlib.sha1()  # nosec
        with tempfile.NamedTemporaryFile(mode="w+b", delete=False) as f:
            build_context.write_archive(f, archive_type)

            while True:
                data = f.read(64 * 1024)
                if not data:
                    break
                hash_.update(data)

            filename = f"{hash_.hexdigest()}.{archive_type.value}"
            filepath = f"{parent_path}/{filename}"
            if not fileio.exists(filepath):
                logger.info("Uploading build context to `%s`.", filepath)
                fileio.copy(f.name, filepath)
            else:
                logger.info("Build context already exists, not uploading.")

        os.unlink(f.name)
        return filepath
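
A custom image builder only needs to implement is_building_locally and build; configuration handling and build-context upload are inherited. A hypothetical sketch (import paths assumed from the source layout above; the remote build call is a placeholder, not a real API):

from typing import Any, Dict, Optional

from zenml.image_builders import BaseImageBuilder, BuildContext


class MyRemoteImageBuilder(BaseImageBuilder):
    """Hypothetical image builder that delegates builds to a remote service."""

    @property
    def is_building_locally(self) -> bool:
        # Builds run on a remote service, not on the client machine.
        return False

    def build(
        self,
        image_name: str,
        build_context: "BuildContext",
        docker_build_options: Dict[str, Any],
        container_registry: Optional["BaseContainerRegistry"] = None,
    ) -> str:
        # Stage the build context in the artifact store using the inherited
        # helper, then hand it over to the (imaginary) remote build service.
        context_path = self._upload_build_context(
            build_context=build_context,
            parent_path_directory_name="image-build-contexts",
        )
        # ... trigger the remote build with `context_path` here ...
        return image_name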

build_context_class property

Build context class to use.

The default build context class creates a build context that works for the Docker daemon. Override this method if your image builder requires a custom context.

Returns:

Type Description
Type[BuildContext]

The build context class.

config property

The stack component configuration.

Returns:

Type Description
BaseImageBuilderConfig

The configuration.

is_building_locally abstractmethod property

Whether the image builder builds the images on the client machine.

Returns:

Type Description
bool

True if the image builder builds locally, False otherwise.

build(image_name, build_context, docker_build_options, container_registry=None) abstractmethod

Builds a Docker image.

If a container registry is passed, the image will be pushed to that registry.

Parameters:

Name Type Description Default
image_name str

Name of the image to build.

required
build_context BuildContext

The build context to use for the image.

required
docker_build_options Dict[str, Any]

Docker build options.

required
container_registry Optional[BaseContainerRegistry]

Optional container registry to push to.

None

Returns:

Type Description
str

The Docker image repo digest or name.

Source code in src/zenml/image_builders/base_image_builder.py, lines 77-98
@abstractmethod
def build(
    self,
    image_name: str,
    build_context: "BuildContext",
    docker_build_options: Dict[str, Any],
    container_registry: Optional["BaseContainerRegistry"] = None,
) -> str:
    """Builds a Docker image.

    If a container registry is passed, the image will be pushed to that
    registry.

    Args:
        image_name: Name of the image to build.
        build_context: The build context to use for the image.
        docker_build_options: Docker build options.
        container_registry: Optional container registry to push to.

    Returns:
        The Docker image repo digest or name.
    """

BaseImageBuilderConfig

Bases: StackComponentConfig

Base config for image builders.

Source code in src/zenml/image_builders/base_image_builder.py, lines 37-38
class BaseImageBuilderConfig(StackComponentConfig):
    """Base config for image builders."""

BaseImageBuilderFlavor

Bases: Flavor, ABC

Base class for all ZenML image builder flavors.

Source code in src/zenml/image_builders/base_image_builder.py, lines 145-173
class BaseImageBuilderFlavor(Flavor, ABC):
    """Base class for all ZenML image builder flavors."""

    @property
    def type(self) -> StackComponentType:
        """Returns the flavor type.

        Returns:
            The flavor type.
        """
        return StackComponentType.IMAGE_BUILDER

    @property
    def config_class(self) -> Type[BaseImageBuilderConfig]:
        """Config class.

        Returns:
            The config class.
        """
        return BaseImageBuilderConfig

    @property
    def implementation_class(self) -> Type[BaseImageBuilder]:
        """Implementation class.

        Returns:
            The implementation class.
        """
        return BaseImageBuilder

config_class property

Config class.

Returns:

Type Description
Type[BaseImageBuilderConfig]

The config class.

implementation_class property

Implementation class.

Returns:

Type Description
Type[BaseImageBuilder]

The implementation class.

type property

Returns the flavor type.

Returns:

Type Description
StackComponentType

The flavor type.

BuildContext

Bases: Archivable

Image build context.

This class is responsible for creating an archive of the files needed to build a container image.

Source code in src/zenml/image_builders/build_context.py, lines 28-172
class BuildContext(Archivable):
    """Image build context.

    This class is responsible for creating an archive of the files needed to
    build a container image.
    """

    def __init__(
        self,
        root: Optional[str] = None,
        dockerignore_file: Optional[str] = None,
    ) -> None:
        """Initializes a build context.

        Args:
            root: Optional root directory for the build context.
            dockerignore_file: Optional path to a dockerignore file. If not
                given, a file called `.dockerignore` in the build context root
                directory will be used instead if it exists.
        """
        super().__init__()
        self._root = root
        self._dockerignore_file = dockerignore_file

    @property
    def dockerignore_file(self) -> Optional[str]:
        """The dockerignore file to use.

        Returns:
            Path to the dockerignore file to use.
        """
        if self._dockerignore_file:
            return self._dockerignore_file

        if self._root:
            default_dockerignore_path = os.path.join(
                self._root, ".dockerignore"
            )
            if fileio.exists(default_dockerignore_path):
                return default_dockerignore_path

        return None

    def write_archive(
        self,
        output_file: IO[bytes],
        archive_type: ArchiveType = ArchiveType.TAR_GZ,
    ) -> None:
        """Writes an archive of the build context to the given file.

        Args:
            output_file: The file to write the archive to.
            archive_type: The type of archive to create.
        """
        super().write_archive(output_file, archive_type)

        build_context_size = os.path.getsize(output_file.name)
        if (
            self._root
            and build_context_size > 50 * 1024 * 1024
            and not self.dockerignore_file
        ):
            # The build context exceeds 50MiB and we didn't find any excludes
            # in dockerignore files -> remind to specify a .dockerignore file
            logger.warning(
                "Build context size for docker image: `%s`. If you believe this is "
                "unreasonably large, make sure to include a `.dockerignore` file "
                "at the root of your build context `%s` or specify a custom file "
                "in the Docker configuration when defining your pipeline.",
                string_utils.get_human_readable_filesize(build_context_size),
                os.path.join(self._root, ".dockerignore"),
            )

    def get_files(self) -> Dict[str, str]:
        """Gets all regular files that should be included in the archive.

        Returns:
            A dict {path_in_archive: path_on_filesystem} for all regular files
            in the archive.
        """
        if self._root:
            from docker.utils import build as docker_build_utils

            exclude_patterns = self._get_exclude_patterns()

            archive_paths = cast(
                Set[str],
                docker_build_utils.exclude_paths(
                    self._root, patterns=exclude_patterns
                ),
            )
            return {
                archive_path: os.path.join(self._root, archive_path)
                for archive_path in archive_paths
            }
        else:
            return {}

    def _get_exclude_patterns(self) -> List[str]:
        """Gets all exclude patterns from the dockerignore file.

        Returns:
            The exclude patterns from the dockerignore file.
        """
        dockerignore = self.dockerignore_file
        if dockerignore:
            patterns = self._parse_dockerignore(dockerignore)
            # Always include the .zen directory
            patterns.append(f"!/{REPOSITORY_DIRECTORY_NAME}")
            return patterns
        else:
            logger.info(
                "No `.dockerignore` found, including all files inside build "
                "context.",
            )
            return []

    @staticmethod
    def _parse_dockerignore(dockerignore_path: str) -> List[str]:
        """Parses a dockerignore file and returns a list of patterns to ignore.

        Args:
            dockerignore_path: Path to the dockerignore file.

        Returns:
            List of patterns to ignore.
        """
        try:
            file_content = io_utils.read_file_contents_as_string(
                dockerignore_path
            )
        except FileNotFoundError:
            logger.warning(
                "Unable to find dockerignore file at path '%s'.",
                dockerignore_path,
            )
            return []

        exclude_patterns = []
        for line in file_content.split("\n"):
            line = line.strip()
            if line and not line.startswith("#"):
                exclude_patterns.append(line)

        return exclude_patterns
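
Outside of an image builder, a BuildContext can also be exercised directly; a small sketch (the project root is a placeholder):

import tempfile

from zenml.image_builders import BuildContext

# Archives everything under ./my_project, honoring ./my_project/.dockerignore
context = BuildContext(root="./my_project")

with tempfile.NamedTemporaryFile(mode="w+b") as f:
    context.write_archive(f)  # defaults to a gzipped tarball
    print(f"Archived {len(context.get_files())} files to {f.name}")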

dockerignore_file property

The dockerignore file to use.

Returns:

Type Description
Optional[str]

Path to the dockerignore file to use.

__init__(root=None, dockerignore_file=None)

Initializes a build context.

Parameters:

Name Type Description Default
root Optional[str]

Optional root directory for the build context.

None
dockerignore_file Optional[str]

Optional path to a dockerignore file. If not given, a file called .dockerignore in the build context root directory will be used instead if it exists.

None
Source code in src/zenml/image_builders/build_context.py, lines 35-50
def __init__(
    self,
    root: Optional[str] = None,
    dockerignore_file: Optional[str] = None,
) -> None:
    """Initializes a build context.

    Args:
        root: Optional root directory for the build context.
        dockerignore_file: Optional path to a dockerignore file. If not
            given, a file called `.dockerignore` in the build context root
            directory will be used instead if it exists.
    """
    super().__init__()
    self._root = root
    self._dockerignore_file = dockerignore_file

get_files()

Gets all regular files that should be included in the archive.

Returns:

Type Description
Dict[str, str]

A dict {path_in_archive: path_on_filesystem} for all regular files in the archive.

Source code in src/zenml/image_builders/build_context.py, lines 101-124
def get_files(self) -> Dict[str, str]:
    """Gets all regular files that should be included in the archive.

    Returns:
        A dict {path_in_archive: path_on_filesystem} for all regular files
        in the archive.
    """
    if self._root:
        from docker.utils import build as docker_build_utils

        exclude_patterns = self._get_exclude_patterns()

        archive_paths = cast(
            Set[str],
            docker_build_utils.exclude_paths(
                self._root, patterns=exclude_patterns
            ),
        )
        return {
            archive_path: os.path.join(self._root, archive_path)
            for archive_path in archive_paths
        }
    else:
        return {}

write_archive(output_file, archive_type=ArchiveType.TAR_GZ)

Writes an archive of the build context to the given file.

Parameters:

Name Type Description Default
output_file IO[bytes]

The file to write the archive to.

required
archive_type ArchiveType

The type of archive to create.

TAR_GZ
Source code in src/zenml/image_builders/build_context.py, lines 71-99
def write_archive(
    self,
    output_file: IO[bytes],
    archive_type: ArchiveType = ArchiveType.TAR_GZ,
) -> None:
    """Writes an archive of the build context to the given file.

    Args:
        output_file: The file to write the archive to.
        archive_type: The type of archive to create.
    """
    super().write_archive(output_file, archive_type)

    build_context_size = os.path.getsize(output_file.name)
    if (
        self._root
        and build_context_size > 50 * 1024 * 1024
        and not self.dockerignore_file
    ):
        # The build context exceeds 50MiB and we didn't find any excludes
        # in dockerignore files -> remind to specify a .dockerignore file
        logger.warning(
            "Build context size for docker image: `%s`. If you believe this is "
            "unreasonably large, make sure to include a `.dockerignore` file "
            "at the root of your build context `%s` or specify a custom file "
            "in the Docker configuration when defining your pipeline.",
            string_utils.get_human_readable_filesize(build_context_size),
            os.path.join(self._root, ".dockerignore"),
        )

LocalImageBuilder

Bases: BaseImageBuilder

Local image builder implementation.

Source code in src/zenml/image_builders/local_image_builder.py, lines 36-134
class LocalImageBuilder(BaseImageBuilder):
    """Local image builder implementation."""

    @property
    def config(self) -> LocalImageBuilderConfig:
        """The stack component configuration.

        Returns:
            The configuration.
        """
        return cast(LocalImageBuilderConfig, self._config)

    @property
    def is_building_locally(self) -> bool:
        """Whether the image builder builds the images on the client machine.

        Returns:
            True if the image builder builds locally, False otherwise.
        """
        return True

    @staticmethod
    def _check_prerequisites() -> None:
        """Checks that all prerequisites are installed.

        Raises:
            RuntimeError: If any of the prerequisites are not installed or
                running.
        """
        if not shutil.which("docker"):
            raise RuntimeError(
                "`docker` is required to run the local image builder."
            )

        if not docker_utils.check_docker():
            # For 3., this is not supported by the python docker library
            # https://github.com/docker/docker-py/issues/3146
            raise RuntimeError(
                "Unable to connect to the Docker daemon. There are three "
                "common causes for this:\n"
                "1) The Docker daemon isn't running.\n"
                "2) The Docker client isn't configured correctly. The client "
                "loads its configuration from the following file: "
                "$HOME/.docker/config.json. If your configuration file is in a "
                "different location, set the `DOCKER_CONFIG` environment "
                "variable to the directory that contains your `config.json` "
                "file.\n"
                "3) If your Docker CLI is working fine but you ran into this "
                "issue, you might be using a non-default Docker context which "
                "is not supported by the Docker python library. To verify "
                "this, run `docker context ls` and check which context has a "
                "`*` next to it. If this is not the `default` context, copy "
                "the `DOCKER ENDPOINT` value of that context and set the "
                "`DOCKER_HOST` environment variable to that value."
            )

    def build(
        self,
        image_name: str,
        build_context: "BuildContext",
        docker_build_options: Optional[Dict[str, Any]] = None,
        container_registry: Optional["BaseContainerRegistry"] = None,
    ) -> str:
        """Builds and optionally pushes an image using the local Docker client.

        Args:
            image_name: Name of the image to build and push.
            build_context: The build context to use for the image.
            docker_build_options: Docker build options.
            container_registry: Optional container registry to push to.

        Returns:
            The Docker image repo digest.
        """
        self._check_prerequisites()

        if container_registry:
            # Use the container registry's docker client, which may be
            # authenticated to access additional registries
            docker_client = container_registry.docker_client
        else:
            docker_client = docker_utils._try_get_docker_client_from_env()

        with tempfile.TemporaryFile(mode="w+b") as f:
            build_context.write_archive(f)

            # We use the client api directly here, so we can stream the logs
            output_stream = docker_client.images.client.api.build(
                fileobj=f,
                custom_context=True,
                tag=image_name,
                **(docker_build_options or {}),
            )
        docker_utils._process_stream(output_stream)

        if container_registry:
            return container_registry.push_image(image_name)
        else:
            return image_name
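
The docker_build_options dictionary is forwarded as keyword arguments to the Docker SDK's low-level api.build() call shown above; the keys below are standard docker-py build arguments, used purely as an example:

# Illustrative Docker build options, forwarded verbatim to `api.build()`.
docker_build_options = {
    "platform": "linux/amd64",
    "buildargs": {"PYTHON_VERSION": "3.11"},
    "network_mode": "host",
}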

config property

The stack component configuration.

Returns:

Type Description
LocalImageBuilderConfig

The configuration.

is_building_locally property

Whether the image builder builds the images on the client machine.

Returns:

Type Description
bool

True if the image builder builds locally, False otherwise.

build(image_name, build_context, docker_build_options=None, container_registry=None)

Builds and optionally pushes an image using the local Docker client.

Parameters:

Name Type Description Default
image_name str

Name of the image to build and push.

required
build_context BuildContext

The build context to use for the image.

required
docker_build_options Optional[Dict[str, Any]]

Docker build options.

None
container_registry Optional[BaseContainerRegistry]

Optional container registry to push to.

None

Returns:

Type Description
str

The Docker image repo digest.

Source code in src/zenml/image_builders/local_image_builder.py, lines 92-134
def build(
    self,
    image_name: str,
    build_context: "BuildContext",
    docker_build_options: Optional[Dict[str, Any]] = None,
    container_registry: Optional["BaseContainerRegistry"] = None,
) -> str:
    """Builds and optionally pushes an image using the local Docker client.

    Args:
        image_name: Name of the image to build and push.
        build_context: The build context to use for the image.
        docker_build_options: Docker build options.
        container_registry: Optional container registry to push to.

    Returns:
        The Docker image repo digest.
    """
    self._check_prerequisites()

    if container_registry:
        # Use the container registry's docker client, which may be
        # authenticated to access additional registries
        docker_client = container_registry.docker_client
    else:
        docker_client = docker_utils._try_get_docker_client_from_env()

    with tempfile.TemporaryFile(mode="w+b") as f:
        build_context.write_archive(f)

        # We use the client api directly here, so we can stream the logs
        output_stream = docker_client.images.client.api.build(
            fileobj=f,
            custom_context=True,
            tag=image_name,
            **(docker_build_options or {}),
        )
    docker_utils._process_stream(output_stream)

    if container_registry:
        return container_registry.push_image(image_name)
    else:
        return image_name

LocalImageBuilderConfig

Bases: BaseImageBuilderConfig

Local image builder configuration.

Source code in src/zenml/image_builders/local_image_builder.py, lines 32-33
class LocalImageBuilderConfig(BaseImageBuilderConfig):
    """Local image builder configuration."""

LocalImageBuilderFlavor

Bases: BaseImageBuilderFlavor

Local image builder flavor.

Source code in src/zenml/image_builders/local_image_builder.py, lines 137-192
class LocalImageBuilderFlavor(BaseImageBuilderFlavor):
    """Local image builder flavor."""

    @property
    def name(self) -> str:
        """The flavor name.

        Returns:
            The flavor name.
        """
        return "local"

    @property
    def docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_docs_url()

    @property
    def sdk_docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_sdk_docs_url()

    @property
    def logo_url(self) -> str:
        """A url to represent the flavor in the dashboard.

        Returns:
            The flavor logo.
        """
        return "https://public-flavor-logos.s3.eu-central-1.amazonaws.com/image_builder/local.svg"

    @property
    def config_class(self) -> Type[LocalImageBuilderConfig]:
        """Config class.

        Returns:
            The config class.
        """
        return LocalImageBuilderConfig

    @property
    def implementation_class(self) -> Type[LocalImageBuilder]:
        """Implementation class.

        Returns:
            The implementation class.
        """
        return LocalImageBuilder

config_class property

Config class.

Returns:

Type Description
Type[LocalImageBuilderConfig]

The config class.

docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

implementation_class property

Implementation class.

Returns:

Type Description
Type[LocalImageBuilder]

The implementation class.

logo_url property

A url to represent the flavor in the dashboard.

Returns:

Type Description
str

The flavor logo.

name property

The flavor name.

Returns:

Type Description
str

The flavor name.

sdk_docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

Io

The io module handles file operations for the ZenML package.

It offers a standard interface for reading, writing and manipulating files and directories. It is heavily influenced and inspired by the io module of tfx.

Logger

Logger implementation.

CustomFormatter

Bases: Formatter

Formats logs according to custom specifications.

Source code in src/zenml/logger.py, lines 39-111
class CustomFormatter(logging.Formatter):
    """Formats logs according to custom specifications."""

    grey: str = "\x1b[38;21m"
    pink: str = "\x1b[35m"
    green: str = "\x1b[32m"
    yellow: str = "\x1b[33m"
    red: str = "\x1b[31m"
    cyan: str = "\x1b[1;36m"
    bold_red: str = "\x1b[31;1m"
    purple: str = "\x1b[1;35m"
    blue: str = "\x1b[34m"
    reset: str = "\x1b[0m"

    format_template: str = (
        "%(asctime)s - %(name)s - %(levelname)s - %(message)s (%("
        "filename)s:%(lineno)d)"
        if LoggingLevels[ZENML_LOGGING_VERBOSITY] == LoggingLevels.DEBUG
        else "%(message)s"
    )

    COLORS: Dict[LoggingLevels, str] = {
        LoggingLevels.DEBUG: grey,
        LoggingLevels.INFO: purple,
        LoggingLevels.WARN: yellow,
        LoggingLevels.ERROR: red,
        LoggingLevels.CRITICAL: bold_red,
    }

    def format(self, record: logging.LogRecord) -> str:
        """Converts a log record to a (colored) string.

        Args:
            record: LogRecord generated by the code.

        Returns:
            A string formatted according to specifications.
        """
        if ZENML_LOGGING_COLORS_DISABLED:
            # If color formatting is disabled, use the default format without colors
            formatter = logging.Formatter(self.format_template)
            return formatter.format(record)
        else:
            # Use color formatting
            log_fmt = (
                self.COLORS[LoggingLevels(record.levelno)]
                + self.format_template
                + self.reset
            )
            formatter = logging.Formatter(log_fmt)
            formatted_message = formatter.format(record)
            quoted_groups = re.findall("`([^`]*)`", formatted_message)
            for quoted in quoted_groups:
                formatted_message = formatted_message.replace(
                    "`" + quoted + "`",
                    self.reset
                    + self.cyan
                    + quoted
                    + self.COLORS.get(LoggingLevels(record.levelno)),
                )

            # Format URLs
            url_pattern = r"http[s]?://(?:[a-zA-Z]|[0-9]|[$-_@.&+]|[!*\\(\\),]|(?:%[0-9a-fA-F][0-9a-fA-F]))+"
            urls = re.findall(url_pattern, formatted_message)
            for url in urls:
                formatted_message = formatted_message.replace(
                    url,
                    self.reset
                    + self.blue
                    + url
                    + self.COLORS.get(LoggingLevels(record.levelno)),
                )
            return formatted_message

format(record)

Converts a log record to a (colored) string.

Parameters:

Name Type Description Default
record LogRecord

LogRecord generated by the code.

required

Returns:

Type Description
str

A string formatted according to specifications.

Source code in src/zenml/logger.py, lines 68-111
def format(self, record: logging.LogRecord) -> str:
    """Converts a log record to a (colored) string.

    Args:
        record: LogRecord generated by the code.

    Returns:
        A string formatted according to specifications.
    """
    if ZENML_LOGGING_COLORS_DISABLED:
        # If color formatting is disabled, use the default format without colors
        formatter = logging.Formatter(self.format_template)
        return formatter.format(record)
    else:
        # Use color formatting
        log_fmt = (
            self.COLORS[LoggingLevels(record.levelno)]
            + self.format_template
            + self.reset
        )
        formatter = logging.Formatter(log_fmt)
        formatted_message = formatter.format(record)
        quoted_groups = re.findall("`([^`]*)`", formatted_message)
        for quoted in quoted_groups:
            formatted_message = formatted_message.replace(
                "`" + quoted + "`",
                self.reset
                + self.cyan
                + quoted
                + self.COLORS.get(LoggingLevels(record.levelno)),
            )

        # Format URLs
        url_pattern = r"http[s]?://(?:[a-zA-Z]|[0-9]|[$-_@.&+]|[!*\\(\\),]|(?:%[0-9a-fA-F][0-9a-fA-F]))+"
        urls = re.findall(url_pattern, formatted_message)
        for url in urls:
            formatted_message = formatted_message.replace(
                url,
                self.reset
                + self.blue
                + url
                + self.COLORS.get(LoggingLevels(record.levelno)),
            )
        return formatted_message

get_console_handler()

Get console handler for logging.

Returns:

Type Description
Any

A console handler.

Source code in src/zenml/logger.py, lines 160-168
def get_console_handler() -> Any:
    """Get console handler for logging.

    Returns:
        A console handler.
    """
    console_handler = logging.StreamHandler(sys.stdout)
    console_handler.setFormatter(get_formatter())
    return console_handler

get_formatter()

Get a configured logging formatter.

Returns:

Type Description
Formatter

The formatter.

Source code in src/zenml/logger.py, lines 148-157
def get_formatter() -> logging.Formatter:
    """Get a configured logging formatter.

    Returns:
        The formatter.
    """
    if log_format := os.environ.get(ENV_ZENML_LOGGING_FORMAT, None):
        return logging.Formatter(fmt=log_format)
    else:
        return CustomFormatter()

get_logger(logger_name)

Main function to get a logger by name.

Parameters:

Name Type Description Default
logger_name str

Name of logger to initialize.

required

Returns:

Type Description
Logger

A logger object.

Source code in src/zenml/logger.py, lines 171-185
def get_logger(logger_name: str) -> logging.Logger:
    """Main function to get logger name,.

    Args:
        logger_name: Name of logger to initialize.

    Returns:
        A logger object.
    """
    logger = logging.getLogger(logger_name)
    logger.setLevel(get_logging_level().value)
    logger.addHandler(get_console_handler())

    logger.propagate = False
    return logger
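
This is the same logger factory used throughout the codebase, and it works identically in user code; text wrapped in backticks is highlighted by the CustomFormatter shown above (the step and run names below are placeholders):

from zenml.logger import get_logger

logger = get_logger(__name__)
logger.info("Starting step `trainer` for run `training_run_42`.")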

get_logging_level()

Get logging level from the env variable.

Returns:

Type Description
LoggingLevels

The logging level.

Raises:

Type Description
KeyError

If the logging level is not found.

Source code in src/zenml/logger.py, lines 114-128
def get_logging_level() -> LoggingLevels:
    """Get logging level from the env variable.

    Returns:
        The logging level.

    Raises:
        KeyError: If the logging level is not found.
    """
    verbosity = ZENML_LOGGING_VERBOSITY.upper()
    if verbosity not in LoggingLevels.__members__:
        raise KeyError(
            f"Verbosity must be one of {list(LoggingLevels.__members__.keys())}"
        )
    return LoggingLevels[verbosity]
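
The verbosity comes from the ZENML_LOGGING_VERBOSITY environment variable; as a sketch, it can be raised to DEBUG before zenml is imported (the variable is typically resolved at import time):

import os

# Set before importing zenml, since the verbosity is resolved at import time.
os.environ["ZENML_LOGGING_VERBOSITY"] = "DEBUG"

from zenml.logger import get_logging_level

print(get_logging_level())  # LoggingLevels.DEBUG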

init_logging()

Initialize logging with default levels.

Source code in src/zenml/logger.py, lines 188-223
def init_logging() -> None:
    """Initialize logging with default levels."""
    # Mute tensorflow cuda warnings
    os.environ["TF_CPP_MIN_LOG_LEVEL"] = "3"
    set_root_verbosity()

    console_handler = logging.StreamHandler(sys.stdout)
    console_handler.setFormatter(get_formatter())
    logging.root.addHandler(console_handler)

    # Suppress noisy third-party logs unless the ENV_ZENML_SUPPRESS_LOGS variable is set to False
    suppress_zenml_logs: bool = handle_bool_env_var(
        ENV_ZENML_SUPPRESS_LOGS, True
    )
    if suppress_zenml_logs:
        # suppress logger info messages
        suppressed_logger_names = [
            "urllib3",
            "azure.core.pipeline.policies.http_logging_policy",
            "grpc",
            "requests",
            "kfp",
            "tensorflow",
        ]
        for logger_name in suppressed_logger_names:
            logging.getLogger(logger_name).setLevel(logging.WARNING)

        # disable logger messages
        disabled_logger_names = [
            "rdbms_metadata_access_object",
            "backoff",
            "segment",
        ]
        for logger_name in disabled_logger_names:
            logging.getLogger(logger_name).setLevel(logging.WARNING)
            logging.getLogger(logger_name).disabled = True

set_root_verbosity()

Set the root verbosity.

Source code in src/zenml/logger.py, lines 131-145
def set_root_verbosity() -> None:
    """Set the root verbosity."""
    level = get_logging_level()
    if level != LoggingLevels.NOTSET:
        if ENABLE_RICH_TRACEBACK:
            rich_tb_install(show_locals=(level == LoggingLevels.DEBUG))

        logging.root.setLevel(level=level.value)
        get_logger(__name__).debug(
            f"Logging set to level: {logging.getLevelName(level.value)}"
        )
    else:
        logging.disable(sys.maxsize)
        logging.getLogger().disabled = True
        get_logger(__name__).debug("Logging NOTSET")

Logging

Logging utilities.

Login

ZenML login utilities.

Materializers

Initialization of ZenML materializers.

Materializers are used to convert a ZenML artifact into a specific format. They are most often used to handle the input or output of ZenML steps, and can be extended by building on the BaseMaterializer class.

BuiltInContainerMaterializer

Bases: BaseMaterializer

Handle built-in container types (dict, list, set, tuple).

Source code in src/zenml/materializers/built_in_materializer.py, lines 287-491
class BuiltInContainerMaterializer(BaseMaterializer):
    """Handle built-in container types (dict, list, set, tuple)."""

    ASSOCIATED_TYPES: ClassVar[Tuple[Type[Any], ...]] = (
        dict,
        list,
        set,
        tuple,
    )

    def __init__(
        self, uri: str, artifact_store: Optional[BaseArtifactStore] = None
    ):
        """Define `self.data_path` and `self.metadata_path`.

        Args:
            uri: The URI where the artifact data is stored.
            artifact_store: The artifact store where the artifact data is stored.
        """
        super().__init__(uri, artifact_store)
        self.data_path = os.path.join(self.uri, DEFAULT_FILENAME)
        self.metadata_path = os.path.join(self.uri, DEFAULT_METADATA_FILENAME)

    def load(self, data_type: Type[Any]) -> Any:
        """Reads a materialized built-in container object.

        If the data was serialized to JSON, deserialize it.

        Otherwise, reconstruct all elements according to the metadata file:
            1. Resolve the data type using `find_type_by_str()`,
            2. Get the materializer via the `default_materializer_registry`,
            3. Initialize the materializer with the desired path,
            4. Use `load()` of that materializer to load the element.

        Args:
            data_type: The type of the data to read.

        Returns:
            The data read.

        Raises:
            RuntimeError: If the data was not found.
        """
        # If the data was not serialized, there must be metadata present.
        if not self.artifact_store.exists(
            self.data_path
        ) and not self.artifact_store.exists(self.metadata_path):
            raise RuntimeError(
                f"Materialization of type {data_type} failed. Expected either"
                f"{self.data_path} or {self.metadata_path} to exist."
            )

        # If the data was serialized as JSON, deserialize it.
        if self.artifact_store.exists(self.data_path):
            outputs = yaml_utils.read_json(self.data_path)

        # Otherwise, use the metadata to reconstruct the data as a list.
        else:
            metadata = yaml_utils.read_json(self.metadata_path)
            outputs = []

            # Backwards compatibility for zenml <= 0.37.0
            if isinstance(metadata, dict):
                for path_, type_str in zip(
                    metadata["paths"], metadata["types"]
                ):
                    type_ = find_type_by_str(type_str)
                    materializer_class = materializer_registry[type_]
                    materializer = materializer_class(
                        uri=path_, artifact_store=self.artifact_store
                    )
                    element = materializer.load(type_)
                    outputs.append(element)

            # New format for zenml > 0.37.0
            elif isinstance(metadata, list):
                for entry in metadata:
                    path_ = entry["path"]
                    type_ = source_utils.load(entry["type"])
                    materializer_class = source_utils.load(
                        entry["materializer"]
                    )
                    materializer = materializer_class(
                        uri=path_, artifact_store=self.artifact_store
                    )
                    element = materializer.load(type_)
                    outputs.append(element)

            else:
                raise RuntimeError(f"Unknown metadata format: {metadata}.")

        # Cast the data to the correct type.
        if issubclass(data_type, dict) and not isinstance(outputs, dict):
            keys, values = outputs
            return data_type(zip(keys, values))
        if issubclass(data_type, tuple) and not isinstance(outputs, tuple):
            return data_type(outputs)
        if issubclass(data_type, set) and not isinstance(outputs, set):
            return data_type(outputs)
        return outputs

    def save(self, data: Any) -> None:
        """Materialize a built-in container object.

        If the object can be serialized to JSON, serialize it.

        Otherwise, use the `default_materializer_registry` to find the correct
        materializer for each element and materialize each element into a
        subdirectory.

        Tuples and sets are cast to list before materialization.

        For non-serializable dicts, materialize keys/values as separate lists.

        Args:
            data: The built-in container object to materialize.

        Raises:
            Exception: If any exception occurs, it is raised after cleanup.
        """
        # tuple and set: handle as list.
        if isinstance(data, tuple) or isinstance(data, set):
            data = list(data)

        # If the data is serializable, just write it into a single JSON file.
        if _is_serializable(data):
            yaml_utils.write_json(
                self.data_path,
                data,
                ensure_ascii=not ZENML_MATERIALIZER_ALLOW_NON_ASCII_JSON_DUMPS,
            )
            return

        # non-serializable dict: Handle as non-serializable list of lists.
        if isinstance(data, dict):
            data = [list(data.keys()), list(data.values())]

        # non-serializable list: Materialize each element into a subfolder.
        # Get path, type, and corresponding materializer for each element.
        metadata: List[Dict[str, str]] = []
        materializers: List[BaseMaterializer] = []
        try:
            for i, element in enumerate(data):
                element_path = os.path.join(self.uri, str(i))
                self.artifact_store.mkdir(element_path)
                type_ = type(element)
                materializer_class = materializer_registry[type_]
                materializer = materializer_class(
                    uri=element_path, artifact_store=self.artifact_store
                )
                materializers.append(materializer)
                metadata.append(
                    {
                        "path": element_path,
                        "type": source_utils.resolve(type_).import_path,
                        "materializer": source_utils.resolve(
                            materializer_class
                        ).import_path,
                    }
                )
            # Write metadata as JSON.
            yaml_utils.write_json(self.metadata_path, metadata)
            # Materialize each element.
            for element, materializer in zip(data, materializers):
                materializer.validate_save_type_compatibility(type(element))
                materializer.save(element)
        # If an error occurs, delete all created files.
        except Exception as e:
            # Delete metadata
            if self.artifact_store.exists(self.metadata_path):
                self.artifact_store.remove(self.metadata_path)
            # Delete all elements that were already saved.
            for entry in metadata:
                self.artifact_store.rmtree(entry["path"])
            raise e

    # save dict type objects to JSON file with JSON visualization type
    def save_visualizations(self, data: Any) -> Dict[str, "VisualizationType"]:
        """Save visualizations for the given data.

        Args:
            data: The data to save visualizations for.

        Returns:
            A dictionary of visualization URIs and their types.
        """
        # dict/list type objects are always saved as JSON files
        # doesn't work for non-serializable types as they
        # are saved as list of lists in different files
        if _is_serializable(data):
            return {self.data_path.replace("\\", "/"): VisualizationType.JSON}
        return {}

    def extract_metadata(self, data: Any) -> Dict[str, "MetadataType"]:
        """Extract metadata from the given built-in container object.

        Args:
            data: The built-in container object to extract metadata from.

        Returns:
            The extracted metadata as a dictionary.
        """
        if hasattr(data, "__len__"):
            return {"length": len(data)}
        return {}
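
Because dict, list, set and tuple are listed in ASSOCIATED_TYPES, step outputs of these types are handled by this materializer automatically, with no registration required. A minimal illustration:

from typing import Dict, List

from zenml import step


@step
def training_config() -> Dict[str, float]:
    # JSON-serializable, so save() writes it to a single JSON file.
    return {"learning_rate": 0.001, "dropout": 0.1}


@step
def class_names() -> List[str]:
    return ["cat", "dog", "bird"]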

__init__(uri, artifact_store=None)

Define self.data_path and self.metadata_path.

Parameters:

Name Type Description Default
uri str

The URI where the artifact data is stored.

required
artifact_store Optional[BaseArtifactStore]

The artifact store where the artifact data is stored.

None
Source code in src/zenml/materializers/built_in_materializer.py, lines 297-308
def __init__(
    self, uri: str, artifact_store: Optional[BaseArtifactStore] = None
):
    """Define `self.data_path` and `self.metadata_path`.

    Args:
        uri: The URI where the artifact data is stored.
        artifact_store: The artifact store where the artifact data is stored.
    """
    super().__init__(uri, artifact_store)
    self.data_path = os.path.join(self.uri, DEFAULT_FILENAME)
    self.metadata_path = os.path.join(self.uri, DEFAULT_METADATA_FILENAME)

extract_metadata(data)

Extract metadata from the given built-in container object.

Parameters:

Name Type Description Default
data Any

The built-in container object to extract metadata from.

required

Returns:

Type Description
Dict[str, MetadataType]

The extracted metadata as a dictionary.

Source code in src/zenml/materializers/built_in_materializer.py, lines 480-491
def extract_metadata(self, data: Any) -> Dict[str, "MetadataType"]:
    """Extract metadata from the given built-in container object.

    Args:
        data: The built-in container object to extract metadata from.

    Returns:
        The extracted metadata as a dictionary.
    """
    if hasattr(data, "__len__"):
        return {"length": len(data)}
    return {}

load(data_type)

Reads a materialized built-in container object.

If the data was serialized to JSON, deserialize it.

Otherwise, reconstruct all elements according to the metadata file:

1. Resolve the data type using find_type_by_str(),
2. Get the materializer via the default_materializer_registry,
3. Initialize the materializer with the desired path,
4. Use load() of that materializer to load the element.

Parameters:

Name Type Description Default
data_type Type[Any]

The type of the data to read.

required

Returns:

Type Description
Any

The data read.

Raises:

Type Description
RuntimeError

If the data was not found.

Source code in src/zenml/materializers/built_in_materializer.py, lines 310-386
def load(self, data_type: Type[Any]) -> Any:
    """Reads a materialized built-in container object.

    If the data was serialized to JSON, deserialize it.

    Otherwise, reconstruct all elements according to the metadata file:
        1. Resolve the data type using `find_type_by_str()`,
        2. Get the materializer via the `default_materializer_registry`,
        3. Initialize the materializer with the desired path,
        4. Use `load()` of that materializer to load the element.

    Args:
        data_type: The type of the data to read.

    Returns:
        The data read.

    Raises:
        RuntimeError: If the data was not found.
    """
    # If the data was not serialized, there must be metadata present.
    if not self.artifact_store.exists(
        self.data_path
    ) and not self.artifact_store.exists(self.metadata_path):
        raise RuntimeError(
            f"Materialization of type {data_type} failed. Expected either"
            f"{self.data_path} or {self.metadata_path} to exist."
        )

    # If the data was serialized as JSON, deserialize it.
    if self.artifact_store.exists(self.data_path):
        outputs = yaml_utils.read_json(self.data_path)

    # Otherwise, use the metadata to reconstruct the data as a list.
    else:
        metadata = yaml_utils.read_json(self.metadata_path)
        outputs = []

        # Backwards compatibility for zenml <= 0.37.0
        if isinstance(metadata, dict):
            for path_, type_str in zip(
                metadata["paths"], metadata["types"]
            ):
                type_ = find_type_by_str(type_str)
                materializer_class = materializer_registry[type_]
                materializer = materializer_class(
                    uri=path_, artifact_store=self.artifact_store
                )
                element = materializer.load(type_)
                outputs.append(element)

        # New format for zenml > 0.37.0
        elif isinstance(metadata, list):
            for entry in metadata:
                path_ = entry["path"]
                type_ = source_utils.load(entry["type"])
                materializer_class = source_utils.load(
                    entry["materializer"]
                )
                materializer = materializer_class(
                    uri=path_, artifact_store=self.artifact_store
                )
                element = materializer.load(type_)
                outputs.append(element)

        else:
            raise RuntimeError(f"Unknown metadata format: {metadata}.")

    # Cast the data to the correct type.
    if issubclass(data_type, dict) and not isinstance(outputs, dict):
        keys, values = outputs
        return data_type(zip(keys, values))
    if issubclass(data_type, tuple) and not isinstance(outputs, tuple):
        return data_type(outputs)
    if issubclass(data_type, set) and not isinstance(outputs, set):
        return data_type(outputs)
    return outputs

save(data)

Materialize a built-in container object.

If the object can be serialized to JSON, serialize it.

Otherwise, use the default_materializer_registry to find the correct materializer for each element and materialize each element into a subdirectory.

Tuples and sets are cast to list before materialization.

For non-serializable dicts, materialize keys/values as separate lists.

Parameters:

Name Type Description Default
data Any

The built-in container object to materialize.

required

Raises:

Type Description
Exception

If any exception occurs, it is raised after cleanup.

Source code in src/zenml/materializers/built_in_materializer.py
388
389
390
391
392
393
394
395
396
397
398
399
400
401
402
403
404
405
406
407
408
409
410
411
412
413
414
415
416
417
418
419
420
421
422
423
424
425
426
427
428
429
430
431
432
433
434
435
436
437
438
439
440
441
442
443
444
445
446
447
448
449
450
451
452
453
454
455
456
457
458
459
460
461
def save(self, data: Any) -> None:
    """Materialize a built-in container object.

    If the object can be serialized to JSON, serialize it.

    Otherwise, use the `default_materializer_registry` to find the correct
    materializer for each element and materialize each element into a
    subdirectory.

    Tuples and sets are cast to list before materialization.

    For non-serializable dicts, materialize keys/values as separate lists.

    Args:
        data: The built-in container object to materialize.

    Raises:
        Exception: If any exception occurs, it is raised after cleanup.
    """
    # tuple and set: handle as list.
    if isinstance(data, tuple) or isinstance(data, set):
        data = list(data)

    # If the data is serializable, just write it into a single JSON file.
    if _is_serializable(data):
        yaml_utils.write_json(
            self.data_path,
            data,
            ensure_ascii=not ZENML_MATERIALIZER_ALLOW_NON_ASCII_JSON_DUMPS,
        )
        return

    # non-serializable dict: Handle as non-serializable list of lists.
    if isinstance(data, dict):
        data = [list(data.keys()), list(data.values())]

    # non-serializable list: Materialize each element into a subfolder.
    # Get path, type, and corresponding materializer for each element.
    metadata: List[Dict[str, str]] = []
    materializers: List[BaseMaterializer] = []
    try:
        for i, element in enumerate(data):
            element_path = os.path.join(self.uri, str(i))
            self.artifact_store.mkdir(element_path)
            type_ = type(element)
            materializer_class = materializer_registry[type_]
            materializer = materializer_class(
                uri=element_path, artifact_store=self.artifact_store
            )
            materializers.append(materializer)
            metadata.append(
                {
                    "path": element_path,
                    "type": source_utils.resolve(type_).import_path,
                    "materializer": source_utils.resolve(
                        materializer_class
                    ).import_path,
                }
            )
        # Write metadata as JSON.
        yaml_utils.write_json(self.metadata_path, metadata)
        # Materialize each element.
        for element, materializer in zip(data, materializers):
            materializer.validate_save_type_compatibility(type(element))
            materializer.save(element)
    # If an error occurs, delete all created files.
    except Exception as e:
        # Delete metadata
        if self.artifact_store.exists(self.metadata_path):
            self.artifact_store.remove(self.metadata_path)
        # Delete all elements that were already saved.
        for entry in metadata:
            self.artifact_store.rmtree(entry["path"])
        raise e
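
As a usage sketch (not part of this module), a step that returns a built-in container is handled by this materializer automatically; the step and pipeline names below are illustrative and assume a working ZenML installation:

from zenml import pipeline, step


@step
def produce_scores() -> list:
    # JSON-serializable, so the materializer writes a single data.json file.
    return [0.91, 0.87, 0.95]


@step
def report(scores: list) -> None:
    print(f"Best score: {max(scores)}")


@pipeline
def scores_pipeline():
    report(produce_scores())


if __name__ == "__main__":
    scores_pipeline()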

save_visualizations(data)

Save visualizations for the given data.

Parameters:

Name Type Description Default
data Any

The data to save visualizations for.

required

Returns:

Type Description
Dict[str, VisualizationType]

A dictionary of visualization URIs and their types.

Source code in src/zenml/materializers/built_in_materializer.py
464
465
466
467
468
469
470
471
472
473
474
475
476
477
478
def save_visualizations(self, data: Any) -> Dict[str, "VisualizationType"]:
    """Save visualizations for the given data.

    Args:
        data: The data to save visualizations for.

    Returns:
        A dictionary of visualization URIs and their types.
    """
    # Serializable dict/list objects are saved as a single JSON file, which
    # can be visualized directly. Non-serializable containers are instead
    # materialized element by element into separate files, so no JSON
    # visualization is returned for them.
    if _is_serializable(data):
        return {self.data_path.replace("\\", "/"): VisualizationType.JSON}
    return {}

BuiltInMaterializer

Bases: BaseMaterializer

Handle JSON-serializable basic types (bool, float, int, str).

Source code in src/zenml/materializers/built_in_materializer.py
 60
 61
 62
 63
 64
 65
 66
 67
 68
 69
 70
 71
 72
 73
 74
 75
 76
 77
 78
 79
 80
 81
 82
 83
 84
 85
 86
 87
 88
 89
 90
 91
 92
 93
 94
 95
 96
 97
 98
 99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
class BuiltInMaterializer(BaseMaterializer):
    """Handle JSON-serializable basic types (`bool`, `float`, `int`, `str`)."""

    ASSOCIATED_ARTIFACT_TYPE: ClassVar[ArtifactType] = ArtifactType.DATA
    ASSOCIATED_TYPES: ClassVar[Tuple[Type[Any], ...]] = BASIC_TYPES

    def __init__(
        self, uri: str, artifact_store: Optional[BaseArtifactStore] = None
    ):
        """Define `self.data_path`.

        Args:
            uri: The URI where the artifact data is stored.
            artifact_store: The artifact store where the artifact data is stored.
        """
        super().__init__(uri, artifact_store)
        self.data_path = os.path.join(self.uri, DEFAULT_FILENAME)

    def load(
        self, data_type: Union[Type[bool], Type[float], Type[int], Type[str]]
    ) -> Any:
        """Reads basic primitive types from JSON.

        Args:
            data_type: The type of the data to read.

        Returns:
            The data read.
        """
        contents = yaml_utils.read_json(self.data_path)
        if type(contents) is not data_type:
            # TODO [ENG-142]: Raise error or try to coerce
            logger.debug(
                f"Contents {contents} was type {type(contents)} but expected "
                f"{data_type}"
            )
        return contents

    def save(self, data: Union[bool, float, int, str]) -> None:
        """Serialize a basic type to JSON.

        Args:
            data: The data to store.
        """
        yaml_utils.write_json(
            self.data_path,
            data,
            ensure_ascii=not ZENML_MATERIALIZER_ALLOW_NON_ASCII_JSON_DUMPS,
        )

    def save_visualizations(
        self, data: Union[bool, float, int, str]
    ) -> Dict[str, VisualizationType]:
        """Save visualizations for the given basic type.

        Args:
            data: The data to save visualizations for.

        Returns:
            A dictionary of visualization URIs and their types.
        """
        return {self.data_path.replace("\\", "/"): VisualizationType.JSON}

    def extract_metadata(
        self, data: Union[bool, float, int, str]
    ) -> Dict[str, "MetadataType"]:
        """Extract metadata from the given built-in container object.

        Args:
            data: The built-in container object to extract metadata from.

        Returns:
            The extracted metadata as a dictionary.
        """
        # For boolean and numbers, add the string representation as metadata.
        # We don't do this for strings because they can be arbitrarily long.
        if isinstance(data, (bool, float, int)):
            return {"string_representation": str(data)}

        return {}
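
The `ensure_ascii` flag passed to the JSON writer above follows the semantics of Python's `json.dumps`; the standalone snippet below only illustrates what toggling it changes and does not use ZenML itself:

import json

value = "grüße"  # a string containing non-ASCII characters

# Default behavior (ensure_ascii=True): non-ASCII characters are escaped.
print(json.dumps(value))  # "gr\u00fc\u00dfe"

# With ZENML_MATERIALIZER_ALLOW_NON_ASCII_JSON_DUMPS enabled, ensure_ascii is
# False and the characters are written verbatim.
print(json.dumps(value, ensure_ascii=False))  # "grüße"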

__init__(uri, artifact_store=None)

Define self.data_path.

Parameters:

Name Type Description Default
uri str

The URI where the artifact data is stored.

required
artifact_store Optional[BaseArtifactStore]

The artifact store where the artifact data is stored.

None
Source code in src/zenml/materializers/built_in_materializer.py
66
67
68
69
70
71
72
73
74
75
76
def __init__(
    self, uri: str, artifact_store: Optional[BaseArtifactStore] = None
):
    """Define `self.data_path`.

    Args:
        uri: The URI where the artifact data is stored.
        artifact_store: The artifact store where the artifact data is stored.
    """
    super().__init__(uri, artifact_store)
    self.data_path = os.path.join(self.uri, DEFAULT_FILENAME)

extract_metadata(data)

Extract metadata from the given built-in container object.

Parameters:

Name Type Description Default
data Union[bool, float, int, str]

The built-in container object to extract metadata from.

required

Returns:

Type Description
Dict[str, MetadataType]

The extracted metadata as a dictionary.

Source code in src/zenml/materializers/built_in_materializer.py
123
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
def extract_metadata(
    self, data: Union[bool, float, int, str]
) -> Dict[str, "MetadataType"]:
    """Extract metadata from the given built-in container object.

    Args:
        data: The built-in container object to extract metadata from.

    Returns:
        The extracted metadata as a dictionary.
    """
    # For boolean and numbers, add the string representation as metadata.
    # We don't do this for strings because they can be arbitrarily long.
    if isinstance(data, (bool, float, int)):
        return {"string_representation": str(data)}

    return {}

load(data_type)

Reads basic primitive types from JSON.

Parameters:

Name Type Description Default
data_type Union[Type[bool], Type[float], Type[int], Type[str]]

The type of the data to read.

required

Returns:

Type Description
Any

The data read.

Source code in src/zenml/materializers/built_in_materializer.py
78
79
80
81
82
83
84
85
86
87
88
89
90
91
92
93
94
95
96
def load(
    self, data_type: Union[Type[bool], Type[float], Type[int], Type[str]]
) -> Any:
    """Reads basic primitive types from JSON.

    Args:
        data_type: The type of the data to read.

    Returns:
        The data read.
    """
    contents = yaml_utils.read_json(self.data_path)
    if type(contents) is not data_type:
        # TODO [ENG-142]: Raise error or try to coerce
        logger.debug(
            f"Contents {contents} was type {type(contents)} but expected "
            f"{data_type}"
        )
    return contents

save(data)

Serialize a basic type to JSON.

Parameters:

Name Type Description Default
data Union[bool, float, int, str]

The data to store.

required
Source code in src/zenml/materializers/built_in_materializer.py
 98
 99
100
101
102
103
104
105
106
107
108
def save(self, data: Union[bool, float, int, str]) -> None:
    """Serialize a basic type to JSON.

    Args:
        data: The data to store.
    """
    yaml_utils.write_json(
        self.data_path,
        data,
        ensure_ascii=not ZENML_MATERIALIZER_ALLOW_NON_ASCII_JSON_DUMPS,
    )

save_visualizations(data)

Save visualizations for the given basic type.

Parameters:

Name Type Description Default
data Union[bool, float, int, str]

The data to save visualizations for.

required

Returns:

Type Description
Dict[str, VisualizationType]

A dictionary of visualization URIs and their types.

Source code in src/zenml/materializers/built_in_materializer.py
110
111
112
113
114
115
116
117
118
119
120
121
def save_visualizations(
    self, data: Union[bool, float, int, str]
) -> Dict[str, VisualizationType]:
    """Save visualizations for the given basic type.

    Args:
        data: The data to save visualizations for.

    Returns:
        A dictionary of visualization URIs and their types.
    """
    return {self.data_path.replace("\\", "/"): VisualizationType.JSON}

BytesMaterializer

Bases: BaseMaterializer

Handle bytes data type, which is not JSON serializable.

Source code in src/zenml/materializers/built_in_materializer.py
142
143
144
145
146
147
148
149
150
151
152
153
154
155
156
157
158
159
160
161
162
163
164
165
166
167
168
169
170
171
172
173
174
175
176
177
178
179
180
181
182
183
184
185
186
187
188
189
190
class BytesMaterializer(BaseMaterializer):
    """Handle `bytes` data type, which is not JSON serializable."""

    ASSOCIATED_ARTIFACT_TYPE: ClassVar[ArtifactType] = ArtifactType.DATA
    ASSOCIATED_TYPES: ClassVar[Tuple[Type[Any], ...]] = (bytes,)

    def __init__(
        self, uri: str, artifact_store: Optional[BaseArtifactStore] = None
    ):
        """Define `self.data_path`.

        Args:
            uri: The URI where the artifact data is stored.
            artifact_store: The artifact store where the artifact data is stored.
        """
        super().__init__(uri, artifact_store)
        self.data_path = os.path.join(self.uri, DEFAULT_BYTES_FILENAME)

    def load(self, data_type: Type[Any]) -> Any:
        """Reads a bytes object from file.

        Args:
            data_type: The type of the data to read.

        Returns:
            The data read.
        """
        with self.artifact_store.open(self.data_path, "rb") as file_:
            return file_.read()

    def save(self, data: Any) -> None:
        """Save a bytes object to file.

        Args:
            data: The data to store.
        """
        with self.artifact_store.open(self.data_path, "wb") as file_:
            file_.write(data)

    def save_visualizations(self, data: bytes) -> Dict[str, VisualizationType]:
        """Save visualizations for the given bytes data.

        Args:
            data: The bytes data to save visualizations for.

        Returns:
            A dictionary of visualization URIs and their types.
        """
        return {self.data_path.replace("\\", "/"): VisualizationType.MARKDOWN}
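
As a hedged usage sketch, a step that returns raw `bytes` is picked up by this materializer because `bytes` is listed in `ASSOCIATED_TYPES`; the step name and content below are illustrative:

from zenml import step


@step
def render_report() -> bytes:
    # Written to a single file by BytesMaterializer; the visualization is
    # registered as Markdown, as shown in save_visualizations above.
    return b"# Nightly report\n\nAll checks passed.\n"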

__init__(uri, artifact_store=None)

Define self.data_path.

Parameters:

Name Type Description Default
uri str

The URI where the artifact data is stored.

required
artifact_store Optional[BaseArtifactStore]

The artifact store where the artifact data is stored.

None
Source code in src/zenml/materializers/built_in_materializer.py
148
149
150
151
152
153
154
155
156
157
158
def __init__(
    self, uri: str, artifact_store: Optional[BaseArtifactStore] = None
):
    """Define `self.data_path`.

    Args:
        uri: The URI where the artifact data is stored.
        artifact_store: The artifact store where the artifact data is stored.
    """
    super().__init__(uri, artifact_store)
    self.data_path = os.path.join(self.uri, DEFAULT_BYTES_FILENAME)

load(data_type)

Reads a bytes object from file.

Parameters:

Name Type Description Default
data_type Type[Any]

The type of the data to read.

required

Returns:

Type Description
Any

The data read.

Source code in src/zenml/materializers/built_in_materializer.py
160
161
162
163
164
165
166
167
168
169
170
def load(self, data_type: Type[Any]) -> Any:
    """Reads a bytes object from file.

    Args:
        data_type: The type of the data to read.

    Returns:
        The data read.
    """
    with self.artifact_store.open(self.data_path, "rb") as file_:
        return file_.read()

save(data)

Save a bytes object to file.

Parameters:

Name Type Description Default
data Any

The data to store.

required
Source code in src/zenml/materializers/built_in_materializer.py
172
173
174
175
176
177
178
179
def save(self, data: Any) -> None:
    """Save a bytes object to file.

    Args:
        data: The data to store.
    """
    with self.artifact_store.open(self.data_path, "wb") as file_:
        file_.write(data)

save_visualizations(data)

Save visualizations for the given bytes data.

Parameters:

Name Type Description Default
data bytes

The bytes data to save visualizations for.

required

Returns:

Type Description
Dict[str, VisualizationType]

A dictionary of visualization URIs and their types.

Source code in src/zenml/materializers/built_in_materializer.py
181
182
183
184
185
186
187
188
189
190
def save_visualizations(self, data: bytes) -> Dict[str, VisualizationType]:
    """Save visualizations for the given bytes data.

    Args:
        data: The bytes data to save visualizations for.

    Returns:
        A dictionary of visualization URIs and their types.
    """
    return {self.data_path.replace("\\", "/"): VisualizationType.MARKDOWN}

CloudpickleMaterializer

Bases: BaseMaterializer

Materializer using cloudpickle.

This materializer can materialize (almost) any object, but does so in a non-reproducible way, since artifacts cannot be loaded under a different Python version. It is recommended to use this materializer only as a last resort.

That is also why it has SKIP_REGISTRATION set to True and is currently only used as a fallback materializer inside the materializer registry.

Source code in src/zenml/materializers/cloudpickle_materializer.py
 36
 37
 38
 39
 40
 41
 42
 43
 44
 45
 46
 47
 48
 49
 50
 51
 52
 53
 54
 55
 56
 57
 58
 59
 60
 61
 62
 63
 64
 65
 66
 67
 68
 69
 70
 71
 72
 73
 74
 75
 76
 77
 78
 79
 80
 81
 82
 83
 84
 85
 86
 87
 88
 89
 90
 91
 92
 93
 94
 95
 96
 97
 98
 99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
class CloudpickleMaterializer(BaseMaterializer):
    """Materializer using cloudpickle.

    This materializer can materialize (almost) any object, but does so in a
    non-reproducible way since artifacts cannot be loaded from other Python
    versions. It is recommended to use this materializer only as a last resort.

    That is also why it has `SKIP_REGISTRATION` set to True and is currently
    only used as a fallback materializer inside the materializer registry.
    """

    ASSOCIATED_TYPES: ClassVar[Tuple[Type[Any], ...]] = (object,)
    ASSOCIATED_ARTIFACT_TYPE: ClassVar[ArtifactType] = ArtifactType.DATA
    SKIP_REGISTRATION: ClassVar[bool] = True

    def load(self, data_type: Type[Any]) -> Any:
        """Reads an artifact from a cloudpickle file.

        Args:
            data_type: The data type of the artifact.

        Returns:
            The loaded artifact data.
        """
        # validate python version
        source_python_version = self._load_python_version()
        current_python_version = Environment().python_version()
        if source_python_version != current_python_version:
            logger.warning(
                f"Your artifact was materialized under Python version "
                f"'{source_python_version}' but you are currently using "
                f"'{current_python_version}'. This might cause unexpected "
                "behavior since pickle is not reproducible across Python "
                "versions. Attempting to load anyway..."
            )

        # load data
        filepath = os.path.join(self.uri, DEFAULT_FILENAME)
        with self.artifact_store.open(filepath, "rb") as fid:
            data = cloudpickle.load(fid)
        return data

    def _load_python_version(self) -> str:
        """Loads the Python version that was used to materialize the artifact.

        Returns:
            The Python version that was used to materialize the artifact.
        """
        filepath = os.path.join(self.uri, DEFAULT_PYTHON_VERSION_FILENAME)
        if os.path.exists(filepath):
            return read_file_contents_as_string(filepath)
        return "unknown"

    def save(self, data: Any) -> None:
        """Saves an artifact to a cloudpickle file.

        Args:
            data: The data to save.
        """
        # Log a warning if this materializer was not explicitly specified for
        # the given data type.
        if type(self) is CloudpickleMaterializer:
            logger.warning(
                f"No materializer is registered for type `{type(data)}`, so "
                "the default Pickle materializer was used. Pickle is not "
                "production ready and should only be used for prototyping as "
                "the artifacts cannot be loaded when running with a different "
                "Python version. Please consider implementing a custom "
                f"materializer for type `{type(data)}` according to the "
                "instructions at https://docs.zenml.io/concepts/artifacts/materializers"
            )

        # save python version for validation on loading
        self._save_python_version()

        # save data
        filepath = os.path.join(self.uri, DEFAULT_FILENAME)
        with self.artifact_store.open(filepath, "wb") as fid:
            cloudpickle.dump(data, fid)

    def _save_python_version(self) -> None:
        """Saves the Python version used to materialize the artifact."""
        filepath = os.path.join(self.uri, DEFAULT_PYTHON_VERSION_FILENAME)
        current_python_version = Environment().python_version()
        write_file_contents_as_string(filepath, current_python_version)
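
Because this fallback is pickle-based and tied to the Python version, one way to avoid the warning above is to provide a custom materializer for your type. A minimal sketch, under the assumption that subclassing `BaseMaterializer` with `ASSOCIATED_TYPES` is enough to register it (class names and file layout are illustrative):

import os
from typing import Any, ClassVar, Tuple, Type

from zenml.enums import ArtifactType
from zenml.materializers.base_materializer import BaseMaterializer


class MyConfig:
    """Toy user-defined type, used only for illustration."""

    def __init__(self, threshold: float):
        self.threshold = threshold


class MyConfigMaterializer(BaseMaterializer):
    """Stores MyConfig as a small text file instead of a pickle."""

    ASSOCIATED_TYPES: ClassVar[Tuple[Type[Any], ...]] = (MyConfig,)
    ASSOCIATED_ARTIFACT_TYPE: ClassVar[ArtifactType] = ArtifactType.DATA

    def load(self, data_type: Type[Any]) -> Any:
        # Read the single value back from the artifact store.
        with self.artifact_store.open(os.path.join(self.uri, "config.txt"), "r") as f:
            return MyConfig(threshold=float(f.read()))

    def save(self, data: Any) -> None:
        # Persist the value in a plain, version-independent format.
        with self.artifact_store.open(os.path.join(self.uri, "config.txt"), "w") as f:
            f.write(str(data.threshold))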

load(data_type)

Reads an artifact from a cloudpickle file.

Parameters:

Name Type Description Default
data_type Type[Any]

The data type of the artifact.

required

Returns:

Type Description
Any

The loaded artifact data.

Source code in src/zenml/materializers/cloudpickle_materializer.py
51
52
53
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
69
70
71
72
73
74
75
76
def load(self, data_type: Type[Any]) -> Any:
    """Reads an artifact from a cloudpickle file.

    Args:
        data_type: The data type of the artifact.

    Returns:
        The loaded artifact data.
    """
    # validate python version
    source_python_version = self._load_python_version()
    current_python_version = Environment().python_version()
    if source_python_version != current_python_version:
        logger.warning(
            f"Your artifact was materialized under Python version "
            f"'{source_python_version}' but you are currently using "
            f"'{current_python_version}'. This might cause unexpected "
            "behavior since pickle is not reproducible across Python "
            "versions. Attempting to load anyway..."
        )

    # load data
    filepath = os.path.join(self.uri, DEFAULT_FILENAME)
    with self.artifact_store.open(filepath, "rb") as fid:
        data = cloudpickle.load(fid)
    return data

save(data)

Saves an artifact to a cloudpickle file.

Parameters:

Name Type Description Default
data Any

The data to save.

required
Source code in src/zenml/materializers/cloudpickle_materializer.py
 89
 90
 91
 92
 93
 94
 95
 96
 97
 98
 99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
def save(self, data: Any) -> None:
    """Saves an artifact to a cloudpickle file.

    Args:
        data: The data to save.
    """
    # Log a warning if this materializer was not explicitly specified for
    # the given data type.
    if type(self) is CloudpickleMaterializer:
        logger.warning(
            f"No materializer is registered for type `{type(data)}`, so "
            "the default Pickle materializer was used. Pickle is not "
            "production ready and should only be used for prototyping as "
            "the artifacts cannot be loaded when running with a different "
            "Python version. Please consider implementing a custom "
            f"materializer for type `{type(data)}` according to the "
            "instructions at https://docs.zenml.io/concepts/artifacts/materializers"
        )

    # save python version for validation on loading
    self._save_python_version()

    # save data
    filepath = os.path.join(self.uri, DEFAULT_FILENAME)
    with self.artifact_store.open(filepath, "wb") as fid:
        cloudpickle.dump(data, fid)

PathMaterializer

Bases: BaseMaterializer

Materializer for Path objects.

This materializer handles pathlib.Path objects by storing their contents in a compressed tar archive within the artifact store if it's a directory, or directly copying the file if it's a single file.

Source code in src/zenml/materializers/path_materializer.py
 32
 33
 34
 35
 36
 37
 38
 39
 40
 41
 42
 43
 44
 45
 46
 47
 48
 49
 50
 51
 52
 53
 54
 55
 56
 57
 58
 59
 60
 61
 62
 63
 64
 65
 66
 67
 68
 69
 70
 71
 72
 73
 74
 75
 76
 77
 78
 79
 80
 81
 82
 83
 84
 85
 86
 87
 88
 89
 90
 91
 92
 93
 94
 95
 96
 97
 98
 99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
class PathMaterializer(BaseMaterializer):
    """Materializer for Path objects.

    This materializer handles `pathlib.Path` objects by storing their contents
    in a compressed tar archive within the artifact store if it's a directory,
    or directly copying the file if it's a single file.
    """

    ASSOCIATED_TYPES: ClassVar[Tuple[Type[Any], ...]] = (Path,)
    ASSOCIATED_ARTIFACT_TYPE: ClassVar[ArtifactType] = ArtifactType.DATA
    ARCHIVE_NAME: ClassVar[str] = "data.tar.gz"
    FILE_NAME: ClassVar[str] = "file_data"

    # Skip registration if the environment variable is set
    SKIP_REGISTRATION: ClassVar[bool] = handle_bool_env_var(
        ENV_ZENML_DISABLE_PATH_MATERIALIZER, default=False
    )

    def load(self, data_type: Type[Any]) -> Any:
        """Copy the artifact files to a local temp directory or file.

        Args:
            data_type: Unused.

        Returns:
            Path to the local directory or file that contains the artifact.

        Raises:
            FileNotFoundError: If the artifact is not found in the artifact store.
        """
        # Create a temporary directory that will persist until step execution ends
        with self.get_temporary_directory(delete_at_exit=False) as directory:
            # Check if we're loading a file or directory by looking for the archive
            archive_path_remote = os.path.join(self.uri, self.ARCHIVE_NAME)
            file_path_remote = os.path.join(self.uri, self.FILE_NAME)

            if fileio.exists(archive_path_remote):
                # This is a directory artifact
                archive_path_local = os.path.join(directory, self.ARCHIVE_NAME)
                fileio.copy(archive_path_remote, archive_path_local)

                # Extract the archive to the temporary directory
                with tarfile.open(archive_path_local, "r:gz") as tar:
                    # Validate archive members to prevent path traversal attacks
                    # Filter members to only those with safe paths
                    safe_members = []
                    for member in tar.getmembers():
                        if is_path_within_directory(member.name, directory):
                            safe_members.append(member)

                    # Extract only safe members
                    tar.extractall(path=directory, members=safe_members)  # nosec B202 - members are filtered through is_path_within_directory

                # Clean up the archive file
                os.remove(archive_path_local)
                return Path(directory)
            elif fileio.exists(file_path_remote):
                # This is a single file artifact
                file_path_local = os.path.join(
                    directory, os.path.basename(file_path_remote)
                )
                fileio.copy(file_path_remote, file_path_local)
                return Path(file_path_local)
            else:
                raise FileNotFoundError(
                    f"Could not find artifact at {archive_path_remote} or {file_path_remote}"
                )

    def save(self, data: Any) -> None:
        """Store the directory or file in the artifact store.

        Args:
            data: Path to a local directory or file to store. Must be a Path object.

        Raises:
            TypeError: If data is not a Path object.
        """
        if not isinstance(data, Path):
            raise TypeError(
                f"Expected a Path object, got {type(data).__name__}"
            )

        if data.is_dir():
            # Handle directory artifact
            with self.get_temporary_directory(
                delete_at_exit=True
            ) as directory:
                archive_base = os.path.join(directory, "data")

                # Create tar.gz archive - automatically uses relative paths
                shutil.make_archive(
                    base_name=archive_base, format="gztar", root_dir=str(data)
                )

                # Copy the archive to the artifact store
                fileio.copy(
                    f"{archive_base}.tar.gz",
                    os.path.join(self.uri, self.ARCHIVE_NAME),
                )
        else:
            # Handle single file artifact
            file_path_remote = os.path.join(self.uri, self.FILE_NAME)
            fileio.copy(str(data), file_path_remote)
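
A hedged usage sketch: a step that returns a `pathlib.Path` is handled by this materializer, which archives directories and copies single files; the directory contents below are illustrative:

import tempfile
from pathlib import Path

from zenml import step


@step
def export_artifacts() -> Path:
    # A directory Path is archived as data.tar.gz in the artifact store;
    # returning a single-file Path would copy the file as-is instead.
    out_dir = Path(tempfile.mkdtemp())
    (out_dir / "summary.txt").write_text("ok")
    return out_dir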

load(data_type)

Copy the artifact files to a local temp directory or file.

Parameters:

Name Type Description Default
data_type Type[Any]

Unused.

required

Returns:

Type Description
Any

Path to the local directory or file that contains the artifact.

Raises:

Type Description
FileNotFoundError

If the artifact is not found in the artifact store.

Source code in src/zenml/materializers/path_materializer.py
50
51
52
53
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
69
70
71
72
73
74
75
76
77
78
79
80
81
82
83
84
85
86
87
88
89
90
91
92
93
94
95
96
97
98
def load(self, data_type: Type[Any]) -> Any:
    """Copy the artifact files to a local temp directory or file.

    Args:
        data_type: Unused.

    Returns:
        Path to the local directory or file that contains the artifact.

    Raises:
        FileNotFoundError: If the artifact is not found in the artifact store.
    """
    # Create a temporary directory that will persist until step execution ends
    with self.get_temporary_directory(delete_at_exit=False) as directory:
        # Check if we're loading a file or directory by looking for the archive
        archive_path_remote = os.path.join(self.uri, self.ARCHIVE_NAME)
        file_path_remote = os.path.join(self.uri, self.FILE_NAME)

        if fileio.exists(archive_path_remote):
            # This is a directory artifact
            archive_path_local = os.path.join(directory, self.ARCHIVE_NAME)
            fileio.copy(archive_path_remote, archive_path_local)

            # Extract the archive to the temporary directory
            with tarfile.open(archive_path_local, "r:gz") as tar:
                # Validate archive members to prevent path traversal attacks
                # Filter members to only those with safe paths
                safe_members = []
                for member in tar.getmembers():
                    if is_path_within_directory(member.name, directory):
                        safe_members.append(member)

                # Extract only safe members
                tar.extractall(path=directory, members=safe_members)  # nosec B202 - members are filtered through is_path_within_directory

            # Clean up the archive file
            os.remove(archive_path_local)
            return Path(directory)
        elif fileio.exists(file_path_remote):
            # This is a single file artifact
            file_path_local = os.path.join(
                directory, os.path.basename(file_path_remote)
            )
            fileio.copy(file_path_remote, file_path_local)
            return Path(file_path_local)
        else:
            raise FileNotFoundError(
                f"Could not find artifact at {archive_path_remote} or {file_path_remote}"
            )

save(data)

Store the directory or file in the artifact store.

Parameters:

Name Type Description Default
data Any

Path to a local directory or file to store. Must be a Path object.

required

Raises:

Type Description
TypeError

If data is not a Path object.

Source code in src/zenml/materializers/path_materializer.py
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
def save(self, data: Any) -> None:
    """Store the directory or file in the artifact store.

    Args:
        data: Path to a local directory or file to store. Must be a Path object.

    Raises:
        TypeError: If data is not a Path object.
    """
    if not isinstance(data, Path):
        raise TypeError(
            f"Expected a Path object, got {type(data).__name__}"
        )

    if data.is_dir():
        # Handle directory artifact
        with self.get_temporary_directory(
            delete_at_exit=True
        ) as directory:
            archive_base = os.path.join(directory, "data")

            # Create tar.gz archive - automatically uses relative paths
            shutil.make_archive(
                base_name=archive_base, format="gztar", root_dir=str(data)
            )

            # Copy the archive to the artifact store
            fileio.copy(
                f"{archive_base}.tar.gz",
                os.path.join(self.uri, self.ARCHIVE_NAME),
            )
    else:
        # Handle single file artifact
        file_path_remote = os.path.join(self.uri, self.FILE_NAME)
        fileio.copy(str(data), file_path_remote)

PydanticMaterializer

Bases: BaseMaterializer

Handle Pydantic BaseModel objects.

Source code in src/zenml/materializers/pydantic_materializer.py
31
32
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
class PydanticMaterializer(BaseMaterializer):
    """Handle Pydantic BaseModel objects."""

    ASSOCIATED_ARTIFACT_TYPE: ClassVar[ArtifactType] = ArtifactType.DATA
    ASSOCIATED_TYPES: ClassVar[Tuple[Type[Any], ...]] = (BaseModel,)

    def load(self, data_type: Type[BaseModel]) -> Any:
        """Reads BaseModel from JSON.

        Args:
            data_type: The type of the data to read.

        Returns:
            The data read.
        """
        data_path = os.path.join(self.uri, DEFAULT_FILENAME)
        contents = yaml_utils.read_json(data_path)
        return data_type.model_validate_json(contents)

    def save(self, data: BaseModel) -> None:
        """Serialize a BaseModel to JSON.

        Args:
            data: The data to store.
        """
        data_path = os.path.join(self.uri, DEFAULT_FILENAME)
        yaml_utils.write_json(data_path, data.model_dump_json())

    def extract_metadata(self, data: BaseModel) -> Dict[str, "MetadataType"]:
        """Extract metadata from the given BaseModel object.

        Args:
            data: The BaseModel object to extract metadata from.

        Returns:
            The extracted metadata as a dictionary.
        """
        return {"schema": data.schema()}
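
As a usage sketch, any Pydantic model returned from a step is serialized by this materializer via `model_dump_json()`; the model below is illustrative:

from pydantic import BaseModel

from zenml import step


class TrainingConfig(BaseModel):
    learning_rate: float = 1e-3
    epochs: int = 10


@step
def make_config() -> TrainingConfig:
    # Stored as JSON; the model schema is attached as artifact metadata.
    return TrainingConfig(learning_rate=3e-4, epochs=20)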

extract_metadata(data)

Extract metadata from the given BaseModel object.

Parameters:

Name Type Description Default
data BaseModel

The BaseModel object to extract metadata from.

required

Returns:

Type Description
Dict[str, MetadataType]

The extracted metadata as a dictionary.

Source code in src/zenml/materializers/pydantic_materializer.py
59
60
61
62
63
64
65
66
67
68
def extract_metadata(self, data: BaseModel) -> Dict[str, "MetadataType"]:
    """Extract metadata from the given BaseModel object.

    Args:
        data: The BaseModel object to extract metadata from.

    Returns:
        The extracted metadata as a dictionary.
    """
    return {"schema": data.schema()}

load(data_type)

Reads BaseModel from JSON.

Parameters:

Name Type Description Default
data_type Type[BaseModel]

The type of the data to read.

required

Returns:

Type Description
Any

The data read.

Source code in src/zenml/materializers/pydantic_materializer.py
37
38
39
40
41
42
43
44
45
46
47
48
def load(self, data_type: Type[BaseModel]) -> Any:
    """Reads BaseModel from JSON.

    Args:
        data_type: The type of the data to read.

    Returns:
        The data read.
    """
    data_path = os.path.join(self.uri, DEFAULT_FILENAME)
    contents = yaml_utils.read_json(data_path)
    return data_type.model_validate_json(contents)

save(data)

Serialize a BaseModel to JSON.

Parameters:

Name Type Description Default
data BaseModel

The data to store.

required
Source code in src/zenml/materializers/pydantic_materializer.py
50
51
52
53
54
55
56
57
def save(self, data: BaseModel) -> None:
    """Serialize a BaseModel to JSON.

    Args:
        data: The data to store.
    """
    data_path = os.path.join(self.uri, DEFAULT_FILENAME)
    yaml_utils.write_json(data_path, data.model_dump_json())

ServiceMaterializer

Bases: BaseMaterializer

Materializer to read/write service instances.

Source code in src/zenml/materializers/service_materializer.py
31
32
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
69
70
71
72
73
74
75
76
77
78
79
80
81
82
83
84
class ServiceMaterializer(BaseMaterializer):
    """Materializer to read/write service instances."""

    ASSOCIATED_TYPES: ClassVar[Tuple[Type[Any], ...]] = (BaseService,)
    ASSOCIATED_ARTIFACT_TYPE: ClassVar[ArtifactType] = ArtifactType.SERVICE

    def load(self, data_type: Type[Any]) -> BaseService:
        """Creates and returns a service.

        This service is instantiated from the serialized service configuration
        and last known status information saved as artifact.

        Args:
            data_type: The type of the data to read.

        Returns:
            A ZenML service instance.
        """
        filepath = os.path.join(self.uri, SERVICE_CONFIG_FILENAME)
        with self.artifact_store.open(filepath, "r") as f:
            service_id = f.read().strip()

        service = Client().get_service(name_id_or_prefix=uuid.UUID(service_id))
        return BaseDeploymentService.from_model(service)

    def save(self, service: BaseService) -> None:
        """Writes a ZenML service.

        The configuration and last known status of the input service instance
        are serialized and saved as an artifact.

        Args:
            service: A ZenML service instance.
        """
        filepath = os.path.join(self.uri, SERVICE_CONFIG_FILENAME)
        with self.artifact_store.open(filepath, "w") as f:
            f.write(str(service.uuid))

    def extract_metadata(
        self, service: BaseService
    ) -> Dict[str, "MetadataType"]:
        """Extract metadata from the given service.

        Args:
            service: The service to extract metadata from.

        Returns:
            The extracted metadata as a dictionary.
        """
        from zenml.metadata.metadata_types import Uri

        if prediction_url := service.get_prediction_url() or None:
            return {"uri": Uri(prediction_url)}
        return {}
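
To make the indirection above concrete: only the service UUID is written to the artifact, and loading re-fetches the full service from the ZenML server. A rough equivalent of what `load()` does, with a hypothetical identifier:

import uuid

from zenml.client import Client

# The stored artifact file contains nothing but a UUID string like this one.
stored_id = "4f9c2a9e-0000-4c3e-9a57-1f2e3d4c5b6a"  # hypothetical

# load() resolves the UUID back into a full service object via the client.
service_model = Client().get_service(name_id_or_prefix=uuid.UUID(stored_id))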

extract_metadata(service)

Extract metadata from the given service.

Parameters:

Name Type Description Default
service BaseService

The service to extract metadata from.

required

Returns:

Type Description
Dict[str, MetadataType]

The extracted metadata as a dictionary.

Source code in src/zenml/materializers/service_materializer.py
69
70
71
72
73
74
75
76
77
78
79
80
81
82
83
84
def extract_metadata(
    self, service: BaseService
) -> Dict[str, "MetadataType"]:
    """Extract metadata from the given service.

    Args:
        service: The service to extract metadata from.

    Returns:
        The extracted metadata as a dictionary.
    """
    from zenml.metadata.metadata_types import Uri

    if prediction_url := service.get_prediction_url() or None:
        return {"uri": Uri(prediction_url)}
    return {}

load(data_type)

Creates and returns a service.

This service is instantiated from the serialized service configuration and last known status information saved as artifact.

Parameters:

Name Type Description Default
data_type Type[Any]

The type of the data to read.

required

Returns:

Type Description
BaseService

A ZenML service instance.

Source code in src/zenml/materializers/service_materializer.py
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
def load(self, data_type: Type[Any]) -> BaseService:
    """Creates and returns a service.

    This service is instantiated from the serialized service configuration
    and last known status information saved as artifact.

    Args:
        data_type: The type of the data to read.

    Returns:
        A ZenML service instance.
    """
    filepath = os.path.join(self.uri, SERVICE_CONFIG_FILENAME)
    with self.artifact_store.open(filepath, "r") as f:
        service_id = f.read().strip()

    service = Client().get_service(name_id_or_prefix=uuid.UUID(service_id))
    return BaseDeploymentService.from_model(service)

save(service)

Writes a ZenML service.

The configuration and last known status of the input service instance are serialized and saved as an artifact.

Parameters:

Name Type Description Default
service BaseService

A ZenML service instance.

required
Source code in src/zenml/materializers/service_materializer.py
56
57
58
59
60
61
62
63
64
65
66
67
def save(self, service: BaseService) -> None:
    """Writes a ZenML service.

    The configuration and last known status of the input service instance
    are serialized and saved as an artifact.

    Args:
        service: A ZenML service instance.
    """
    filepath = os.path.join(self.uri, SERVICE_CONFIG_FILENAME)
    with self.artifact_store.open(filepath, "w") as f:
        f.write(str(service.uuid))

StructuredStringMaterializer

Bases: BaseMaterializer

Materializer for HTML or Markdown strings.

Source code in src/zenml/materializers/structured_string_materializer.py
 35
 36
 37
 38
 39
 40
 41
 42
 43
 44
 45
 46
 47
 48
 49
 50
 51
 52
 53
 54
 55
 56
 57
 58
 59
 60
 61
 62
 63
 64
 65
 66
 67
 68
 69
 70
 71
 72
 73
 74
 75
 76
 77
 78
 79
 80
 81
 82
 83
 84
 85
 86
 87
 88
 89
 90
 91
 92
 93
 94
 95
 96
 97
 98
 99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
class StructuredStringMaterializer(BaseMaterializer):
    """Materializer for HTML or Markdown strings."""

    ASSOCIATED_TYPES = (CSVString, HTMLString, MarkdownString, JSONString)
    ASSOCIATED_ARTIFACT_TYPE = ArtifactType.DATA_ANALYSIS

    def load(self, data_type: Type[STRUCTURED_STRINGS]) -> STRUCTURED_STRINGS:
        """Loads the data from the HTML or Markdown file.

        Args:
            data_type: The type of the data to read.

        Returns:
            The loaded data.
        """
        with self.artifact_store.open(self._get_filepath(data_type), "r") as f:
            return data_type(f.read())

    def save(self, data: STRUCTURED_STRINGS) -> None:
        """Save data as an HTML or Markdown file.

        Args:
            data: The data to save as an HTML or Markdown file.
        """
        with self.artifact_store.open(
            self._get_filepath(type(data)), "w"
        ) as f:
            f.write(data)

    def save_visualizations(
        self, data: STRUCTURED_STRINGS
    ) -> Dict[str, VisualizationType]:
        """Save visualizations for the given data.

        Args:
            data: The data to save visualizations for.

        Returns:
            A dictionary of visualization URIs and their types.
        """
        filepath = self._get_filepath(type(data))
        filepath = filepath.replace("\\", "/")
        visualization_type = self._get_visualization_type(type(data))
        return {filepath: visualization_type}

    def _get_filepath(self, data_type: Type[STRUCTURED_STRINGS]) -> str:
        """Get the file path for the given data type.

        Args:
            data_type: The type of the data.

        Returns:
            The file path for the given data type.

        Raises:
            ValueError: If the data type is not supported.
        """
        if issubclass(data_type, CSVString):
            filename = CSV_FILENAME
        elif issubclass(data_type, HTMLString):
            filename = HTML_FILENAME
        elif issubclass(data_type, MarkdownString):
            filename = MARKDOWN_FILENAME
        elif issubclass(data_type, JSONString):
            filename = JSON_FILENAME
        else:
            raise ValueError(
                f"Data type {data_type} is not supported by this materializer."
            )
        return os.path.join(self.uri, filename)

    def _get_visualization_type(
        self, data_type: Type[STRUCTURED_STRINGS]
    ) -> VisualizationType:
        """Get the visualization type for the given data type.

        Args:
            data_type: The type of the data.

        Returns:
            The visualization type for the given data type.

        Raises:
            ValueError: If the data type is not supported.
        """
        if issubclass(data_type, CSVString):
            return VisualizationType.CSV
        elif issubclass(data_type, HTMLString):
            return VisualizationType.HTML
        elif issubclass(data_type, MarkdownString):
            return VisualizationType.MARKDOWN
        elif issubclass(data_type, JSONString):
            return VisualizationType.JSON
        else:
            raise ValueError(
                f"Data type {data_type} is not supported by this materializer."
            )
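
As a usage sketch, returning one of the structured string types from a step triggers this materializer and the matching visualization; the import path below assumes a recent ZenML version where these types live in `zenml.types`:

from zenml import step
from zenml.types import HTMLString


@step
def build_report() -> HTMLString:
    # Stored as an HTML file and rendered as an HTML visualization.
    return HTMLString("<h1>Data quality report</h1><p>All checks passed.</p>")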

load(data_type)

Loads the data from the HTML or Markdown file.

Parameters:

Name Type Description Default
data_type Type[STRUCTURED_STRINGS]

The type of the data to read.

required

Returns:

Type Description
STRUCTURED_STRINGS

The loaded data.

Source code in src/zenml/materializers/structured_string_materializer.py
41
42
43
44
45
46
47
48
49
50
51
def load(self, data_type: Type[STRUCTURED_STRINGS]) -> STRUCTURED_STRINGS:
    """Loads the data from the HTML or Markdown file.

    Args:
        data_type: The type of the data to read.

    Returns:
        The loaded data.
    """
    with self.artifact_store.open(self._get_filepath(data_type), "r") as f:
        return data_type(f.read())

save(data)

Save data as an HTML or Markdown file.

Parameters:

Name Type Description Default
data STRUCTURED_STRINGS

The data to save as an HTML or Markdown file.

required
Source code in src/zenml/materializers/structured_string_materializer.py
53
54
55
56
57
58
59
60
61
62
def save(self, data: STRUCTURED_STRINGS) -> None:
    """Save data as an HTML or Markdown file.

    Args:
        data: The data to save as an HTML or Markdown file.
    """
    with self.artifact_store.open(
        self._get_filepath(type(data)), "w"
    ) as f:
        f.write(data)

save_visualizations(data)

Save visualizations for the given data.

Parameters:

Name Type Description Default
data STRUCTURED_STRINGS

The data to save visualizations for.

required

Returns:

Type Description
Dict[str, VisualizationType]

A dictionary of visualization URIs and their types.

Source code in src/zenml/materializers/structured_string_materializer.py
64
65
66
67
68
69
70
71
72
73
74
75
76
77
78
def save_visualizations(
    self, data: STRUCTURED_STRINGS
) -> Dict[str, VisualizationType]:
    """Save visualizations for the given data.

    Args:
        data: The data to save visualizations for.

    Returns:
        A dictionary of visualization URIs and their types.
    """
    filepath = self._get_filepath(type(data))
    filepath = filepath.replace("\\", "/")
    visualization_type = self._get_visualization_type(type(data))
    return {filepath: visualization_type}

UUIDMaterializer

Bases: BaseMaterializer

Materializer to handle UUID objects.

Source code in src/zenml/materializers/uuid_materializer.py
28
29
30
31
32
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
69
70
71
72
73
74
75
76
77
78
79
class UUIDMaterializer(BaseMaterializer):
    """Materializer to handle UUID objects."""

    ASSOCIATED_TYPES: ClassVar[Tuple[Type[Any], ...]] = (uuid.UUID,)
    ASSOCIATED_ARTIFACT_TYPE: ClassVar[ArtifactType] = ArtifactType.DATA

    def __init__(
        self, uri: str, artifact_store: Optional[BaseArtifactStore] = None
    ):
        """Define `self.data_path`.

        Args:
            uri: The URI where the artifact data is stored.
            artifact_store: The artifact store where the artifact data is stored.
        """
        super().__init__(uri, artifact_store)
        self.data_path = os.path.join(self.uri, DEFAULT_FILENAME)

    def load(self, _: Type[uuid.UUID]) -> uuid.UUID:
        """Read UUID from artifact store.

        Args:
            _: The type of the data to be loaded.

        Returns:
            The loaded UUID.
        """
        with self.artifact_store.open(self.data_path, "r") as f:
            uuid_str = f.read().strip()
        return uuid.UUID(uuid_str)

    def save(self, data: uuid.UUID) -> None:
        """Write UUID to artifact store.

        Args:
            data: The UUID to be saved.
        """
        with self.artifact_store.open(self.data_path, "w") as f:
            f.write(str(data))

    def extract_metadata(self, data: uuid.UUID) -> Dict[str, MetadataType]:
        """Extract metadata from the UUID.

        Args:
            data: The UUID to extract metadata from.

        Returns:
            A dictionary of metadata extracted from the UUID.
        """
        return {
            "string_representation": str(data),
        }
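
A minimal usage sketch: a step returning a `uuid.UUID` is stored as its string form, with the same string attached as metadata:

import uuid

from zenml import step


@step
def make_run_token() -> uuid.UUID:
    # Persisted via str(data) and exposed as string_representation metadata.
    return uuid.uuid4()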

__init__(uri, artifact_store=None)

Define self.data_path.

Parameters:

Name Type Description Default
uri str

The URI where the artifact data is stored.

required
artifact_store Optional[BaseArtifactStore]

The artifact store where the artifact data is stored.

None
Source code in src/zenml/materializers/uuid_materializer.py
34
35
36
37
38
39
40
41
42
43
44
def __init__(
    self, uri: str, artifact_store: Optional[BaseArtifactStore] = None
):
    """Define `self.data_path`.

    Args:
        uri: The URI where the artifact data is stored.
        artifact_store: The artifact store where the artifact data is stored.
    """
    super().__init__(uri, artifact_store)
    self.data_path = os.path.join(self.uri, DEFAULT_FILENAME)

extract_metadata(data)

Extract metadata from the UUID.

Parameters:

Name Type Description Default
data UUID

The UUID to extract metadata from.

required

Returns:

Type Description
Dict[str, MetadataType]

A dictionary of metadata extracted from the UUID.

Source code in src/zenml/materializers/uuid_materializer.py
68
69
70
71
72
73
74
75
76
77
78
79
def extract_metadata(self, data: uuid.UUID) -> Dict[str, MetadataType]:
    """Extract metadata from the UUID.

    Args:
        data: The UUID to extract metadata from.

    Returns:
        A dictionary of metadata extracted from the UUID.
    """
    return {
        "string_representation": str(data),
    }

load(_)

Read UUID from artifact store.

Parameters:

Name Type Description Default
_ Type[UUID]

The type of the data to be loaded.

required

Returns:

Type Description
UUID

The loaded UUID.

Source code in src/zenml/materializers/uuid_materializer.py
46
47
48
49
50
51
52
53
54
55
56
57
def load(self, _: Type[uuid.UUID]) -> uuid.UUID:
    """Read UUID from artifact store.

    Args:
        _: The type of the data to be loaded.

    Returns:
        The loaded UUID.
    """
    with self.artifact_store.open(self.data_path, "r") as f:
        uuid_str = f.read().strip()
    return uuid.UUID(uuid_str)

save(data)

Write UUID to artifact store.

Parameters:

Name Type Description Default
data UUID

The UUID to be saved.

required
Source code in src/zenml/materializers/uuid_materializer.py
59
60
61
62
63
64
65
66
def save(self, data: uuid.UUID) -> None:
    """Write UUID to artifact store.

    Args:
        data: The UUID to be saved.
    """
    with self.artifact_store.open(self.data_path, "w") as f:
        f.write(str(data))

Metadata

Initialization of ZenML metadata.

ZenML metadata is any additional, dynamic information that is associated with your pipeline runs and artifacts at runtime.

Model Deployers

Model deployers are stack components responsible for online model serving.

Online serving is the process of hosting and loading machine-learning models as part of a managed web service and providing access to the models through an API endpoint such as HTTP or gRPC. Once deployed, you can send inference requests to the model through the web service's API and receive fast, low-latency responses.

Add a model deployer to your ZenML stack to be able to implement continuous model deployment pipelines that train models and continuously deploy them to a model prediction web service.

When present in a stack, the model deployer also acts as a registry for models that are served with ZenML. You can use it to list all models currently deployed for online inference, filter them by a particular pipeline run or step, or suspend, resume, or delete an external model server managed through ZenML.

BaseModelDeployer

Bases: StackComponent, ABC

Base class for all ZenML model deployers.

The model deployer serves three major purposes:

  1. It contains all the stack related configuration attributes required to interact with the remote model serving tool, service or platform (e.g. hostnames, URLs, references to credentials, other client related configuration parameters).

  2. It implements the continuous deployment logic necessary to deploy models in a way that updates an existing model server that is already serving a previous version of the same model instead of creating a new model server for every new model version (see the deploy_model abstract method). This functionality can be consumed directly from ZenML pipeline steps, but it can also be used outside the pipeline to deploy ad hoc models. It is also usually coupled with a standard model deployer step, implemented by each integration, that hides the details of the deployment process away from the user.

  3. It acts as a ZenML BaseService registry, where every BaseService instance is used as an internal representation of a remote model server (see the find_model_server abstract method). To achieve this, it must be able to re-create the configuration of a BaseService from information that is persisted externally, alongside or even part of the remote model server configuration itself. For example, for model servers that are implemented as Kubernetes resources, the BaseService instances can be serialized and saved as Kubernetes resource annotations. This allows the model deployer to keep track of all externally running model servers and to re-create their corresponding BaseService instance representations at any given time. The model deployer also defines methods that implement basic life-cycle management on remote model servers outside the coverage of a pipeline (see stop_model_server, start_model_server and delete_model_server).
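
As a rough usage sketch of the registry role described in point 3 (assuming `find_model_server` can be called without arguments to list all tracked servers; the available filters depend on the deployer flavor):

from zenml.client import Client

# The model deployer registered in the active stack, if any.
model_deployer = Client().active_stack.model_deployer

if model_deployer:
    # Acts as the registry lookup: without filters, all model servers
    # currently tracked by this deployer should be returned.
    for service in model_deployer.find_model_server():
        print(service.uuid, service.is_running)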

Source code in src/zenml/model_deployers/base_model_deployer.py
 49
 50
 51
 52
 53
 54
 55
 56
 57
 58
 59
 60
 61
 62
 63
 64
 65
 66
 67
 68
 69
 70
 71
 72
 73
 74
 75
 76
 77
 78
 79
 80
 81
 82
 83
 84
 85
 86
 87
 88
 89
 90
 91
 92
 93
 94
 95
 96
 97
 98
 99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
140
141
142
143
144
145
146
147
148
149
150
151
152
153
154
155
156
157
158
159
160
161
162
163
164
165
166
167
168
169
170
171
172
173
174
175
176
177
178
179
180
181
182
183
184
185
186
187
188
189
190
191
192
193
194
195
196
197
198
199
200
201
202
203
204
205
206
207
208
209
210
211
212
213
214
215
216
217
218
219
220
221
222
223
224
225
226
227
228
229
230
231
232
233
234
235
236
237
238
239
240
241
242
243
244
245
246
247
248
249
250
251
252
253
254
255
256
257
258
259
260
261
262
263
264
265
266
267
268
269
270
271
272
273
274
275
276
277
278
279
280
281
282
283
284
285
286
287
288
289
290
291
292
293
294
295
296
297
298
299
300
301
302
303
304
305
306
307
308
309
310
311
312
313
314
315
316
317
318
319
320
321
322
323
324
325
326
327
328
329
330
331
332
333
334
335
336
337
338
339
340
341
342
343
344
345
346
347
348
349
350
351
352
353
354
355
356
357
358
359
360
361
362
363
364
365
366
367
368
369
370
371
372
373
374
375
376
377
378
379
380
381
382
383
384
385
386
387
388
389
390
391
392
393
394
395
396
397
398
399
400
401
402
403
404
405
406
407
408
409
410
411
412
413
414
415
416
417
418
419
420
421
422
423
424
425
426
427
428
429
430
431
432
433
434
435
436
437
438
439
440
441
442
443
444
445
446
447
448
449
450
451
452
453
454
455
456
457
458
459
460
461
462
463
464
465
466
467
468
469
470
471
472
473
474
475
476
477
478
479
480
481
482
483
484
485
486
487
488
489
490
491
492
493
494
495
496
497
498
499
500
501
502
503
504
505
506
507
508
509
510
511
512
513
514
515
516
517
518
519
520
521
522
523
524
525
526
527
528
529
530
531
532
533
534
535
536
537
538
539
540
541
542
543
544
545
546
547
548
549
550
551
552
553
554
555
556
557
558
559
560
561
562
563
564
565
class BaseModelDeployer(StackComponent, ABC):
    """Base class for all ZenML model deployers.

    The model deployer serves three major purposes:

    1. It contains all the stack related configuration attributes required to
    interact with the remote model serving tool, service or platform (e.g.
    hostnames, URLs, references to credentials, other client related
    configuration parameters).

    2. It implements the continuous deployment logic necessary to deploy models
    in a way that updates an existing model server that is already serving a
    previous version of the same model instead of creating a new model server
    for every new model version (see the `deploy_model` abstract method).
    This functionality can be consumed directly from ZenML pipeline steps, but
    it can also be used outside the pipeline to deploy ad hoc models. It is
    also usually coupled with a standard model deployer step, implemented by
    each integration, that hides the details of the deployment process away from
    the user.

    3. It acts as a ZenML BaseService registry, where every BaseService instance
    is used as an internal representation of a remote model server (see the
    `find_model_server` abstract method). To achieve this, it must be able to
    re-create the configuration of a BaseService from information that is
    persisted externally, alongside or even as part of the remote model server
    configuration itself. For example, for model servers that are implemented as
    Kubernetes resources, the BaseService instances can be serialized and saved
    as Kubernetes resource annotations. This allows the model deployer to keep
    track of all externally running model servers and to re-create their
    corresponding BaseService instance representations at any given time.
    The model deployer also defines methods that implement basic life-cycle
    management on remote model servers outside the coverage of a pipeline
    (see `stop_model_server`, `start_model_server` and `delete_model_server`).
    """

    NAME: ClassVar[str]
    FLAVOR: ClassVar[Type["BaseModelDeployerFlavor"]]

    @property
    def config(self) -> BaseModelDeployerConfig:
        """Returns the `BaseModelDeployerConfig` config.

        Returns:
            The configuration.
        """
        return cast(BaseModelDeployerConfig, self._config)

    @classmethod
    def get_active_model_deployer(cls) -> "BaseModelDeployer":
        """Get the model deployer registered in the active stack.

        Returns:
            The model deployer registered in the active stack.

        Raises:
            TypeError: if a model deployer is not part of the
                active stack.
        """
        flavor: BaseModelDeployerFlavor = cls.FLAVOR()
        client = Client()
        model_deployer = client.active_stack.model_deployer
        if not model_deployer or not isinstance(model_deployer, cls):
            raise TypeError(
                f"The active stack needs to have a {cls.NAME} model "
                f"deployer component registered to be able deploy models "
                f"with {cls.NAME}. You can create a new stack with "
                f"a {cls.NAME} model deployer component or update your "
                f"active stack to add this component, e.g.:\n\n"
                f"  `zenml model-deployer register {flavor.name} "
                f"--flavor={flavor.name} ...`\n"
                f"  `zenml stack register <STACK-NAME> -d {flavor.name} ...`\n"
                f"  or:\n"
                f"  `zenml stack update -d {flavor.name}`\n\n"
            )

        return model_deployer

    def deploy_model(
        self,
        config: ServiceConfig,
        service_type: ServiceType,
        replace: bool = False,
        continuous_deployment_mode: bool = False,
        timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
    ) -> BaseService:
        """Deploy a model.

        The deploy_model method is the main entry point for deploying models
        using the model deployer. It is used to deploy a model to a model server
        instance that is running on a remote serving platform or service. The
        method is responsible for detecting if there is an existing model server
        instance running serving one or more previous versions of the same model
        and deploying the model to the serving platform or updating the existing
        model server instance to include the new model version. The method
        returns a Service object that is a representation of the external model
        server instance. The Service object must implement basic operational
        state tracking and lifecycle management operations for the model server
        (e.g. start, stop, etc.).

        Args:
            config: Custom Service configuration parameters for the model
                deployer. Can include the pipeline name, the run id, the step
                name, the model name, the model uri, the model type etc.
            replace: If True, it will replace any existing model server instances
                that serve the same model. If False, it does not replace any
                existing model server instance.
            continuous_deployment_mode: If True, it will replace any existing
                model server instances that serve the same model, regardless of
                the configuration. If False, it will only replace existing model
                server instances that serve the same model if the configuration
                is exactly the same.
            timeout: The maximum time in seconds to wait for the model server
                to start serving the model.
            service_type: The type of the service to deploy. If not provided,
                the default service type of the model deployer will be used.

        Raises:
            RuntimeError: if the model deployment fails.

        Returns:
            The deployment Service object.
        """
        # Instantiate the client
        client = Client()
        if not continuous_deployment_mode:
            # Find existing model server
            services = self.find_model_server(
                config=config.model_dump(),
                service_type=service_type,
            )
            if len(services) > 0:
                logger.info(
                    f"Existing model server found for {config.name or config.model_name} with the exact same configuration. Returning the existing service named {services[0].config.service_name}."
                )
                return services[0]
        else:
            # Find existing model server
            services = self.find_model_server(
                pipeline_name=config.pipeline_name,
                pipeline_step_name=config.pipeline_step_name,
                model_name=config.model_name,
                service_type=service_type,
            )
            if len(services) > 0:
                logger.info(
                    f"Existing model server found for {config.pipeline_name} and {config.pipeline_step_name}, since continuous deployment mode is enabled, replacing the existing service named {services[0].config.service_name}."
                )
                service = services[0]
                self.delete_model_server(service.uuid)
        logger.info(
            f"Deploying model server for {config.model_name} with the following configuration: {config.model_dump()}"
        )
        service_response = client.create_service(
            config=config,
            service_type=service_type,
            model_version_id=get_model_version_id_if_exists(
                config.model_name, config.model_version
            ),
        )
        try:
            service = self.perform_deploy_model(
                id=service_response.id,
                config=config,
                timeout=timeout,
            )
        except Exception as e:
            client.delete_service(service_response.id)
            raise RuntimeError(
                f"Failed to deploy model server for {config.model_name}: {e}"
            ) from e
        # Update the service in store
        client.update_service(
            id=service.uuid,
            name=service.config.service_name,
            service_source=service.model_dump().get("type"),
            admin_state=service.admin_state,
            status=service.status.model_dump(),
            endpoint=service.endpoint.model_dump()
            if service.endpoint
            else None,
            # labels=service.config.get_service_labels()  # TODO: fix labels in services and config
            prediction_url=service.get_prediction_url(),
            health_check_url=service.get_healthcheck_url(),
        )
        return service

    @abstractmethod
    def perform_deploy_model(
        self,
        id: UUID,
        config: ServiceConfig,
        timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
    ) -> BaseService:
        """Abstract method to deploy a model.

        Concrete model deployer subclasses must implement the following
        functionality in this method:
        - Detect if there is an existing model server instance running serving
        one or more previous versions of the same model
        - Deploy the model to the serving platform or update the existing model
        server instance to include the new model version
        - Return a Service object that is a representation of the external model
        server instance. The Service must implement basic operational state
        tracking and lifecycle management operations for the model server (e.g.
        start, stop, etc.)

        Args:
            id: UUID of the service that was originally used to deploy the model.
            config: Custom Service configuration parameters for the model
                deployer. Can include the pipeline name, the run id, the step
                name, the model name, the model uri, the model type etc.
            timeout: The maximum time in seconds to wait for the model server
                to start serving the model.

        Returns:
            The deployment Service object.
        """

    @staticmethod
    @abstractmethod
    def get_model_server_info(
        service: BaseService,
    ) -> Dict[str, Optional[str]]:
        """Give implementation specific way to extract relevant model server properties for the user.

        Args:
            service: Integration-specific service instance

        Returns:
            A dictionary containing the relevant model server properties.
        """

    def find_model_server(
        self,
        config: Optional[Dict[str, Any]] = None,
        running: Optional[bool] = None,
        service_uuid: Optional[UUID] = None,
        pipeline_name: Optional[str] = None,
        pipeline_step_name: Optional[str] = None,
        service_name: Optional[str] = None,
        model_name: Optional[str] = None,
        model_version: Optional[str] = None,
        service_type: Optional[ServiceType] = None,
        type: Optional[str] = None,
        flavor: Optional[str] = None,
        pipeline_run_id: Optional[str] = None,
    ) -> List[BaseService]:
        """Abstract method to find one or more a model servers that match the given criteria.

        Args:
            running: If true, only running services will be returned.
            service_uuid: The UUID of the service that was originally used
                to deploy the model.
            pipeline_step_name: The name of the pipeline step that was originally used
                to deploy the model.
            pipeline_name: The name of the pipeline that was originally used to deploy
                the model from the model registry.
            model_name: The name of the model that was originally used to deploy
                the model from the model registry.
            model_version: The version of the model that was originally used to
                deploy the model from the model registry.
            service_type: The type of the service to find.
            type: The type of the service to find.
            flavor: The flavor of the service to find.
            pipeline_run_id: The UUID of the pipeline run that was originally used
                to deploy the model.
            config: Custom Service configuration parameters for the model
                deployer. Can include the pipeline name, the run id, the step
                name, the model name, the model uri, the model type etc.
            service_name: The name of the service to find.

        Returns:
            One or more Service objects representing model servers that match
            the input search criteria.
        """
        client = Client()
        service_responses = client.list_services(
            sort_by="desc:created",
            id=service_uuid,
            running=running,
            service_name=service_name,
            pipeline_name=pipeline_name,
            pipeline_step_name=pipeline_step_name,
            model_version_id=get_model_version_id_if_exists(
                model_name, model_version
            ),
            pipeline_run_id=pipeline_run_id,
            config=config,
            type=type or service_type.type if service_type else None,
            flavor=flavor or service_type.flavor if service_type else None,
            hydrate=True,
        )
        services = []
        for service_response in service_responses.items:
            if not service_response.service_source:
                client.delete_service(service_response.id)
                continue
            service = BaseDeploymentService.from_model(service_response)
            service.update_status()
            if service.status.model_dump() != service_response.status:
                client.update_service(
                    id=service.uuid,
                    admin_state=service.admin_state,
                    status=service.status.model_dump(),
                    endpoint=service.endpoint.model_dump()
                    if service.endpoint
                    else None,
                )
            if running and not service.is_running:
                logger.warning(
                    f"Service {service.uuid} is in an unexpected state. "
                    f"Expected running={running}, but found running={service.is_running}."
                )
                continue
            services.append(service)
        return services

    @abstractmethod
    def perform_stop_model(
        self,
        service: BaseService,
        timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
        force: bool = False,
    ) -> BaseService:
        """Abstract method to stop a model server.

        This operation should be reversible. A stopped model server should still
        show up in the list of model servers returned by `find_model_server` and
        it should be possible to start it again by calling `start_model_server`.

        Args:
            service: The service to stop.
            timeout: timeout in seconds to wait for the service to stop. If
                set to 0, the method will return immediately after
                deprovisioning the service, without waiting for it to stop.
            force: if True, force the service to stop.
        """

    def stop_model_server(
        self,
        uuid: UUID,
        timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
        force: bool = False,
    ) -> None:
        """Abstract method to stop a model server.

        This operation should be reversible. A stopped model server should still
        show up in the list of model servers returned by `find_model_server` and
        it should be possible to start it again by calling `start_model_server`.

        Args:
            uuid: UUID of the model server to stop.
            timeout: timeout in seconds to wait for the service to stop. If
                set to 0, the method will return immediately after
                deprovisioning the service, without waiting for it to stop.
            force: if True, force the service to stop.

        Raises:
            RuntimeError: if the model server is not found.
        """
        client = Client()
        try:
            service = self.find_model_server(service_uuid=uuid)[0]
            updated_service = self.perform_stop_model(service, timeout, force)
            client.update_service(
                id=updated_service.uuid,
                admin_state=updated_service.admin_state,
                status=updated_service.status.model_dump(),
                endpoint=updated_service.endpoint.model_dump()
                if updated_service.endpoint
                else None,
            )
        except Exception as e:
            raise RuntimeError(
                f"Failed to stop model server with UUID {uuid}: {e}"
            ) from e

    @abstractmethod
    def perform_start_model(
        self,
        service: BaseService,
        timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
    ) -> BaseService:
        """Abstract method to start a model server.

        Args:
            service: The service to start.
            timeout: timeout in seconds to wait for the service to start. If
                set to 0, the method will return immediately after
                provisioning the service, without waiting for it to become
                active.
        """

    def start_model_server(
        self,
        uuid: UUID,
        timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
    ) -> None:
        """Abstract method to start a model server.

        Args:
            uuid: UUID of the model server to start.
            timeout: timeout in seconds to wait for the service to start. If
                set to 0, the method will return immediately after
                provisioning the service, without waiting for it to become
                active.

        Raises:
            RuntimeError: if the model server is not found.
        """
        client = Client()
        try:
            service = self.find_model_server(service_uuid=uuid)[0]
            updated_service = self.perform_start_model(service, timeout)
            client.update_service(
                id=updated_service.uuid,
                admin_state=updated_service.admin_state,
                status=updated_service.status.model_dump(),
                endpoint=updated_service.endpoint.model_dump()
                if updated_service.endpoint
                else None,
            )
        except Exception as e:
            raise RuntimeError(
                f"Failed to start model server with UUID {uuid}: {e}"
            ) from e

    @abstractmethod
    def perform_delete_model(
        self,
        service: BaseService,
        timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
        force: bool = False,
    ) -> None:
        """Abstract method to delete a model server.

        This operation is irreversible. A deleted model server must no longer
        show up in the list of model servers returned by `find_model_server`.

        Args:
            service: The service to delete.
            timeout: timeout in seconds to wait for the service to stop. If
                set to 0, the method will return immediately after
                deprovisioning the service, without waiting for it to stop.
            force: if True, force the service to stop.
        """

    def delete_model_server(
        self,
        uuid: UUID,
        timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
        force: bool = False,
    ) -> None:
        """Abstract method to delete a model server.

        This operation is irreversible. A deleted model server must no longer
        show up in the list of model servers returned by `find_model_server`.

        Args:
            uuid: UUID of the model server to stop.
            timeout: timeout in seconds to wait for the service to stop. If
                set to 0, the method will return immediately after
                deprovisioning the service, without waiting for it to stop.
            force: if True, force the service to stop.

        Raises:
            RuntimeError: if the model server is not found.
        """
        client = Client()
        try:
            service = self.find_model_server(service_uuid=uuid)[0]
            self.perform_delete_model(service, timeout, force)
            client.delete_service(uuid)
        except Exception as e:
            raise RuntimeError(
                f"Failed to delete model server with UUID {uuid}: {e}"
            ) from e

    def get_model_server_logs(
        self,
        uuid: UUID,
        follow: bool = False,
        tail: Optional[int] = None,
    ) -> Generator[str, bool, None]:
        """Get the logs of a model server.

        Args:
            uuid: UUID of the model server to get the logs of.
            follow: if True, the logs will be streamed as they are written
            tail: only retrieve the last NUM lines of log output.

        Returns:
            A generator that yields the logs of the model server.

        Raises:
            RuntimeError: if the model server is not found.
        """
        services = self.find_model_server(service_uuid=uuid)
        if len(services) == 0:
            raise RuntimeError(f"No model server found with UUID {uuid}")
        return services[0].get_logs(follow=follow, tail=tail)

    def load_service(
        self,
        service_id: UUID,
    ) -> BaseService:
        """Load a service from a URI.

        Args:
            service_id: The ID of the service to load.

        Returns:
            The loaded service.
        """
        client = Client()
        service = client.get_service(service_id)
        return BaseDeploymentService.from_model(service)
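
As a quick orientation before the per-method reference below, here is a minimal sketch of using a concrete model deployer from the active stack outside a pipeline. It assumes a stack with a model deployer component is already active and that the component implements the abstract methods above; the pipeline and model names are placeholders.

from zenml.client import Client

# Fetch the model deployer component from the currently active stack
# (the same component that pipeline steps use under the hood).
model_deployer = Client().active_stack.model_deployer

# Query the deployer's BaseService registry for model servers that were
# deployed by a given pipeline for a given model (placeholder names).
services = model_deployer.find_model_server(
    pipeline_name="training_pipeline",
    model_name="my_model",
    running=True,
)

for service in services:
    # Each BaseService wraps one remote model server instance.
    print(service.uuid, model_deployer.get_model_server_info(service))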

config property

Returns the BaseModelDeployerConfig config.

Returns:

Type Description
BaseModelDeployerConfig

The configuration.

delete_model_server(uuid, timeout=DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT, force=False)

Abstract method to delete a model server.

This operation is irreversible. A deleted model server must no longer show up in the list of model servers returned by find_model_server.

Parameters:

Name Type Description Default
uuid UUID

UUID of the model server to stop.

required
timeout int

timeout in seconds to wait for the service to stop. If set to 0, the method will return immediately after deprovisioning the service, without waiting for it to stop.

DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT
force bool

if True, force the service to stop.

False

Raises:

Type Description
RuntimeError

if the model server is not found.

Source code in src/zenml/model_deployers/base_model_deployer.py
def delete_model_server(
    self,
    uuid: UUID,
    timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
    force: bool = False,
) -> None:
    """Abstract method to delete a model server.

    This operation is irreversible. A deleted model server must no longer
    show up in the list of model servers returned by `find_model_server`.

    Args:
        uuid: UUID of the model server to stop.
        timeout: timeout in seconds to wait for the service to stop. If
            set to 0, the method will return immediately after
            deprovisioning the service, without waiting for it to stop.
        force: if True, force the service to stop.

    Raises:
        RuntimeError: if the model server is not found.
    """
    client = Client()
    try:
        service = self.find_model_server(service_uuid=uuid)[0]
        self.perform_delete_model(service, timeout, force)
        client.delete_service(uuid)
    except Exception as e:
        raise RuntimeError(
            f"Failed to delete model server with UUID {uuid}: {e}"
        ) from e

deploy_model(config, service_type, replace=False, continuous_deployment_mode=False, timeout=DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT)

Deploy a model.

The deploy_model method is the main entry point for deploying models using the model deployer. It deploys a model to a model server instance running on a remote serving platform or service. The method detects whether an existing model server instance is already serving one or more previous versions of the same model and either deploys the model to the serving platform or updates the existing model server instance to include the new model version. It returns a Service object that represents the external model server instance. The Service object must implement basic operational state tracking and lifecycle management operations for the model server (e.g. start, stop, etc.).

Parameters:

Name Type Description Default
config ServiceConfig

Custom Service configuration parameters for the model deployer. Can include the pipeline name, the run id, the step name, the model name, the model uri, the model type etc.

required
replace bool

If True, it will replace any existing model server instances that serve the same model. If False, it does not replace any existing model server instance.

False
continuous_deployment_mode bool

If True, it will replace any existing model server instances that serve the same model, regardless of the configuration. If False, it will only replace existing model server instances that serve the same model if the configuration is exactly the same.

False
timeout int

The maximum time in seconds to wait for the model server to start serving the model.

DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT
service_type ServiceType

The type of the service to deploy. If not provided, the default service type of the model deployer will be used.

required

Raises:

Type Description
RuntimeError

if the model deployment fails.

Returns:

Type Description
BaseService

The deployment Service object.

Source code in src/zenml/model_deployers/base_model_deployer.py
def deploy_model(
    self,
    config: ServiceConfig,
    service_type: ServiceType,
    replace: bool = False,
    continuous_deployment_mode: bool = False,
    timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
) -> BaseService:
    """Deploy a model.

    The deploy_model method is the main entry point for deploying models
    using the model deployer. It is used to deploy a model to a model server
    instance that is running on a remote serving platform or service. The
    method is responsible for detecting if there is an existing model server
    instance running serving one or more previous versions of the same model
    and deploying the model to the serving platform or updating the existing
    model server instance to include the new model version. The method
    returns a Service object that is a representation of the external model
    server instance. The Service object must implement basic operational
    state tracking and lifecycle management operations for the model server
    (e.g. start, stop, etc.).

    Args:
        config: Custom Service configuration parameters for the model
            deployer. Can include the pipeline name, the run id, the step
            name, the model name, the model uri, the model type etc.
        replace: If True, it will replace any existing model server instances
            that serve the same model. If False, it does not replace any
            existing model server instance.
        continuous_deployment_mode: If True, it will replace any existing
            model server instances that serve the same model, regardless of
            the configuration. If False, it will only replace existing model
            server instances that serve the same model if the configuration
            is exactly the same.
        timeout: The maximum time in seconds to wait for the model server
            to start serving the model.
        service_type: The type of the service to deploy. If not provided,
            the default service type of the model deployer will be used.

    Raises:
        RuntimeError: if the model deployment fails.

    Returns:
        The deployment Service object.
    """
    # Instantiate the client
    client = Client()
    if not continuous_deployment_mode:
        # Find existing model server
        services = self.find_model_server(
            config=config.model_dump(),
            service_type=service_type,
        )
        if len(services) > 0:
            logger.info(
                f"Existing model server found for {config.name or config.model_name} with the exact same configuration. Returning the existing service named {services[0].config.service_name}."
            )
            return services[0]
    else:
        # Find existing model server
        services = self.find_model_server(
            pipeline_name=config.pipeline_name,
            pipeline_step_name=config.pipeline_step_name,
            model_name=config.model_name,
            service_type=service_type,
        )
        if len(services) > 0:
            logger.info(
                f"Existing model server found for {config.pipeline_name} and {config.pipeline_step_name}, since continuous deployment mode is enabled, replacing the existing service named {services[0].config.service_name}."
            )
            service = services[0]
            self.delete_model_server(service.uuid)
    logger.info(
        f"Deploying model server for {config.model_name} with the following configuration: {config.model_dump()}"
    )
    service_response = client.create_service(
        config=config,
        service_type=service_type,
        model_version_id=get_model_version_id_if_exists(
            config.model_name, config.model_version
        ),
    )
    try:
        service = self.perform_deploy_model(
            id=service_response.id,
            config=config,
            timeout=timeout,
        )
    except Exception as e:
        client.delete_service(service_response.id)
        raise RuntimeError(
            f"Failed to deploy model server for {config.model_name}: {e}"
        ) from e
    # Update the service in store
    client.update_service(
        id=service.uuid,
        name=service.config.service_name,
        service_source=service.model_dump().get("type"),
        admin_state=service.admin_state,
        status=service.status.model_dump(),
        endpoint=service.endpoint.model_dump()
        if service.endpoint
        else None,
        # labels=service.config.get_service_labels()  # TODO: fix labels in services and config
        prediction_url=service.get_prediction_url(),
        health_check_url=service.get_healthcheck_url(),
    )
    return service
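
As an illustration of the flow described above, the sketch below deploys a model from outside a pipeline. Only the deploy_model signature is taken from the source above; the concrete ServiceConfig subclass, its fields, and the ServiceType value depend on the model deployer flavor in use, so they appear here as labeled placeholders.

from zenml.client import Client

model_deployer = Client().active_stack.model_deployer

# Placeholders: each flavor ships its own ServiceConfig subclass (with
# fields such as model_name, model_uri, ...) and advertises the
# ServiceType of the servers it manages.
config = MyFlavorServiceConfig(        # hypothetical config class
    model_name="my_model",
    model_uri="s3://bucket/path/to/model",
)
service_type = my_flavor_service_type  # hypothetical ServiceType instance

service = model_deployer.deploy_model(
    config=config,
    service_type=service_type,
    timeout=300,
)
print(service.get_prediction_url())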

find_model_server(config=None, running=None, service_uuid=None, pipeline_name=None, pipeline_step_name=None, service_name=None, model_name=None, model_version=None, service_type=None, type=None, flavor=None, pipeline_run_id=None)

Abstract method to find one or more model servers that match the given criteria.

Parameters:

Name Type Description Default
running Optional[bool]

If true, only running services will be returned.

None
service_uuid Optional[UUID]

The UUID of the service that was originally used to deploy the model.

None
pipeline_step_name Optional[str]

The name of the pipeline step that was originally used to deploy the model.

None
pipeline_name Optional[str]

The name of the pipeline that was originally used to deploy the model from the model registry.

None
model_name Optional[str]

The name of the model that was originally used to deploy the model from the model registry.

None
model_version Optional[str]

The version of the model that was originally used to deploy the model from the model registry.

None
service_type Optional[ServiceType]

The type of the service to find.

None
type Optional[str]

The type of the service to find.

None
flavor Optional[str]

The flavor of the service to find.

None
pipeline_run_id Optional[str]

The UUID of the pipeline run that was originally used to deploy the model.

None
config Optional[Dict[str, Any]]

Custom Service configuration parameters for the model deployer. Can include the pipeline name, the run id, the step name, the model name, the model uri, the model type etc.

None
service_name Optional[str]

The name of the service to find.

None

Returns:

Type Description
List[BaseService]

One or more Service objects representing model servers that match the input search criteria.

Source code in src/zenml/model_deployers/base_model_deployer.py
def find_model_server(
    self,
    config: Optional[Dict[str, Any]] = None,
    running: Optional[bool] = None,
    service_uuid: Optional[UUID] = None,
    pipeline_name: Optional[str] = None,
    pipeline_step_name: Optional[str] = None,
    service_name: Optional[str] = None,
    model_name: Optional[str] = None,
    model_version: Optional[str] = None,
    service_type: Optional[ServiceType] = None,
    type: Optional[str] = None,
    flavor: Optional[str] = None,
    pipeline_run_id: Optional[str] = None,
) -> List[BaseService]:
    """Abstract method to find one or more a model servers that match the given criteria.

    Args:
        running: If true, only running services will be returned.
        service_uuid: The UUID of the service that was originally used
            to deploy the model.
        pipeline_step_name: The name of the pipeline step that was originally used
            to deploy the model.
        pipeline_name: The name of the pipeline that was originally used to deploy
            the model from the model registry.
        model_name: The name of the model that was originally used to deploy
            the model from the model registry.
        model_version: The version of the model that was originally used to
            deploy the model from the model registry.
        service_type: The type of the service to find.
        type: The type of the service to find.
        flavor: The flavor of the service to find.
        pipeline_run_id: The UUID of the pipeline run that was originally used
            to deploy the model.
        config: Custom Service configuration parameters for the model
            deployer. Can include the pipeline name, the run id, the step
            name, the model name, the model uri, the model type etc.
        service_name: The name of the service to find.

    Returns:
        One or more Service objects representing model servers that match
        the input search criteria.
    """
    client = Client()
    service_responses = client.list_services(
        sort_by="desc:created",
        id=service_uuid,
        running=running,
        service_name=service_name,
        pipeline_name=pipeline_name,
        pipeline_step_name=pipeline_step_name,
        model_version_id=get_model_version_id_if_exists(
            model_name, model_version
        ),
        pipeline_run_id=pipeline_run_id,
        config=config,
        type=type or service_type.type if service_type else None,
        flavor=flavor or service_type.flavor if service_type else None,
        hydrate=True,
    )
    services = []
    for service_response in service_responses.items:
        if not service_response.service_source:
            client.delete_service(service_response.id)
            continue
        service = BaseDeploymentService.from_model(service_response)
        service.update_status()
        if service.status.model_dump() != service_response.status:
            client.update_service(
                id=service.uuid,
                admin_state=service.admin_state,
                status=service.status.model_dump(),
                endpoint=service.endpoint.model_dump()
                if service.endpoint
                else None,
            )
        if running and not service.is_running:
            logger.warning(
                f"Service {service.uuid} is in an unexpected state. "
                f"Expected running={running}, but found running={service.is_running}."
            )
            continue
        services.append(service)
    return services
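
The different keyword filters can be combined; two common lookups are sketched below. The UUID value and the pipeline/step names are placeholders.

from zenml.client import Client

model_deployer = Client().active_stack.model_deployer

# Look up a single server by the UUID of its backing service record.
servers = model_deployer.find_model_server(service_uuid=my_service_uuid)  # placeholder UUID

# Or filter by the pipeline and step that originally deployed the model,
# keeping only servers that are currently running.
running_servers = model_deployer.find_model_server(
    pipeline_name="training_pipeline",
    pipeline_step_name="model_deployer_step",
    running=True,
)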

get_active_model_deployer() classmethod

Get the model deployer registered in the active stack.

Returns:

Type Description
BaseModelDeployer

The model deployer registered in the active stack.

Raises:

Type Description
TypeError

if a model deployer is not part of the active stack.

Source code in src/zenml/model_deployers/base_model_deployer.py
@classmethod
def get_active_model_deployer(cls) -> "BaseModelDeployer":
    """Get the model deployer registered in the active stack.

    Returns:
        The model deployer registered in the active stack.

    Raises:
        TypeError: if a model deployer is not part of the
            active stack.
    """
    flavor: BaseModelDeployerFlavor = cls.FLAVOR()
    client = Client()
    model_deployer = client.active_stack.model_deployer
    if not model_deployer or not isinstance(model_deployer, cls):
        raise TypeError(
            f"The active stack needs to have a {cls.NAME} model "
            f"deployer component registered to be able deploy models "
            f"with {cls.NAME}. You can create a new stack with "
            f"a {cls.NAME} model deployer component or update your "
            f"active stack to add this component, e.g.:\n\n"
            f"  `zenml model-deployer register {flavor.name} "
            f"--flavor={flavor.name} ...`\n"
            f"  `zenml stack register <STACK-NAME> -d {flavor.name} ...`\n"
            f"  or:\n"
            f"  `zenml stack update -d {flavor.name}`\n\n"
        )

    return model_deployer
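
Because get_active_model_deployer relies on the NAME and FLAVOR class variables, it is normally called on a concrete flavor implementation rather than on BaseModelDeployer itself. The class and import path below are placeholders for whichever integration is installed.

# Hypothetical concrete model deployer class from an installed integration.
from my_integration.model_deployers import MyModelDeployer

# Raises TypeError with registration hints if the active stack does not
# contain a model deployer of this flavor.
model_deployer = MyModelDeployer.get_active_model_deployer()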

get_model_server_info(service) abstractmethod staticmethod

Give an implementation-specific way to extract relevant model server properties for the user.

Parameters:

Name Type Description Default
service BaseService

Integration-specific service instance

required

Returns:

Type Description
Dict[str, Optional[str]]

A dictionary containing the relevant model server properties.

Source code in src/zenml/model_deployers/base_model_deployer.py
@staticmethod
@abstractmethod
def get_model_server_info(
    service: BaseService,
) -> Dict[str, Optional[str]]:
    """Give implementation specific way to extract relevant model server properties for the user.

    Args:
        service: Integration-specific service instance

    Returns:
        A dictionary containing the relevant model server properties.
    """

get_model_server_logs(uuid, follow=False, tail=None)

Get the logs of a model server.

Parameters:

Name Type Description Default
uuid UUID

UUID of the model server to get the logs of.

required
follow bool

if True, the logs will be streamed as they are written

False
tail Optional[int]

only retrieve the last NUM lines of log output.

None

Returns:

Type Description
Generator[str, bool, None]

A generator that yields the logs of the model server.

Raises:

Type Description
RuntimeError

if the model server is not found.

Source code in src/zenml/model_deployers/base_model_deployer.py
def get_model_server_logs(
    self,
    uuid: UUID,
    follow: bool = False,
    tail: Optional[int] = None,
) -> Generator[str, bool, None]:
    """Get the logs of a model server.

    Args:
        uuid: UUID of the model server to get the logs of.
        follow: if True, the logs will be streamed as they are written
        tail: only retrieve the last NUM lines of log output.

    Returns:
        A generator that yields the logs of the model server.

    Raises:
        RuntimeError: if the model server is not found.
    """
    services = self.find_model_server(service_uuid=uuid)
    if len(services) == 0:
        raise RuntimeError(f"No model server found with UUID {uuid}")
    return services[0].get_logs(follow=follow, tail=tail)
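
For example, to print the last 100 log lines of an existing model server (the UUID is a placeholder, e.g. taken from a previous find_model_server call):

from zenml.client import Client

model_deployer = Client().active_stack.model_deployer

for line in model_deployer.get_model_server_logs(
    uuid=my_server_uuid,  # placeholder UUID of an existing model server
    follow=False,
    tail=100,
):
    print(line)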

load_service(service_id)

Load a service by its ID.

Parameters:

Name Type Description Default
service_id UUID

The ID of the service to load.

required

Returns:

Type Description
BaseService

The loaded service.

Source code in src/zenml/model_deployers/base_model_deployer.py
def load_service(
    self,
    service_id: UUID,
) -> BaseService:
    """Load a service from a URI.

    Args:
        service_id: The ID of the service to load.

    Returns:
        The loaded service.
    """
    client = Client()
    service = client.get_service(service_id)
    return BaseDeploymentService.from_model(service)

perform_delete_model(service, timeout=DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT, force=False) abstractmethod

Abstract method to delete a model server.

This operation is irreversible. A deleted model server must no longer show up in the list of model servers returned by find_model_server.

Parameters:

Name Type Description Default
service BaseService

The service to delete.

required
timeout int

timeout in seconds to wait for the service to stop. If set to 0, the method will return immediately after deprovisioning the service, without waiting for it to stop.

DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT
force bool

if True, force the service to stop.

False
Source code in src/zenml/model_deployers/base_model_deployer.py
@abstractmethod
def perform_delete_model(
    self,
    service: BaseService,
    timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
    force: bool = False,
) -> None:
    """Abstract method to delete a model server.

    This operation is irreversible. A deleted model server must no longer
    show up in the list of model servers returned by `find_model_server`.

    Args:
        service: The service to delete.
        timeout: timeout in seconds to wait for the service to stop. If
            set to 0, the method will return immediately after
            deprovisioning the service, without waiting for it to stop.
        force: if True, force the service to stop.
    """

perform_deploy_model(id, config, timeout=DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT) abstractmethod

Abstract method to deploy a model.

Concrete model deployer subclasses must implement the following functionality in this method:

- Detect whether an existing model server instance is already running and serving one or more previous versions of the same model.
- Deploy the model to the serving platform, or update the existing model server instance to include the new model version.
- Return a Service object that represents the external model server instance. The Service must implement basic operational state tracking and lifecycle management operations for the model server (e.g. start, stop, etc.).

Parameters:

Name Type Description Default
id UUID

UUID of the service that was originally used to deploy the model.

required
config ServiceConfig

Custom Service configuration parameters for the model deployer. Can include the pipeline name, the run id, the step name, the model name, the model uri, the model type etc.

required
timeout int

The maximum time in seconds to wait for the model server to start serving the model.

DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT

Returns:

Type Description
BaseService

The deployment Service object.

Source code in src/zenml/model_deployers/base_model_deployer.py
@abstractmethod
def perform_deploy_model(
    self,
    id: UUID,
    config: ServiceConfig,
    timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
) -> BaseService:
    """Abstract method to deploy a model.

    Concrete model deployer subclasses must implement the following
    functionality in this method:
    - Detect if there is an existing model server instance running serving
    one or more previous versions of the same model
    - Deploy the model to the serving platform or update the existing model
    server instance to include the new model version
    - Return a Service object that is a representation of the external model
    server instance. The Service must implement basic operational state
    tracking and lifecycle management operations for the model server (e.g.
    start, stop, etc.)

    Args:
        id: UUID of the service that was originally used to deploy the model.
        config: Custom Service configuration parameters for the model
            deployer. Can include the pipeline name, the run id, the step
            name, the model name, the model uri, the model type etc.
        timeout: The maximum time in seconds to wait for the model server
            to start serving the model.

    Returns:
        The deployment Service object.
    """

perform_start_model(service, timeout=DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT) abstractmethod

Abstract method to start a model server.

Parameters:

Name Type Description Default
service BaseService

The service to start.

required
timeout int

timeout in seconds to wait for the service to start. If set to 0, the method will return immediately after provisioning the service, without waiting for it to become active.

DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT
Source code in src/zenml/model_deployers/base_model_deployer.py
@abstractmethod
def perform_start_model(
    self,
    service: BaseService,
    timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
) -> BaseService:
    """Abstract method to start a model server.

    Args:
        service: The service to start.
        timeout: timeout in seconds to wait for the service to start. If
            set to 0, the method will return immediately after
            provisioning the service, without waiting for it to become
            active.
    """

perform_stop_model(service, timeout=DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT, force=False) abstractmethod

Abstract method to stop a model server.

This operation should be reversible. A stopped model server should still show up in the list of model servers returned by find_model_server and it should be possible to start it again by calling start_model_server.

Parameters:

Name Type Description Default
service BaseService

The service to stop.

required
timeout int

timeout in seconds to wait for the service to stop. If set to 0, the method will return immediately after deprovisioning the service, without waiting for it to stop.

DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT
force bool

if True, force the service to stop.

False
Source code in src/zenml/model_deployers/base_model_deployer.py
@abstractmethod
def perform_stop_model(
    self,
    service: BaseService,
    timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
    force: bool = False,
) -> BaseService:
    """Abstract method to stop a model server.

    This operation should be reversible. A stopped model server should still
    show up in the list of model servers returned by `find_model_server` and
    it should be possible to start it again by calling `start_model_server`.

    Args:
        service: The service to stop.
        timeout: timeout in seconds to wait for the service to stop. If
            set to 0, the method will return immediately after
            deprovisioning the service, without waiting for it to stop.
        force: if True, force the service to stop.
    """

start_model_server(uuid, timeout=DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT)

Abstract method to start a model server.

Parameters:

Name Type Description Default
uuid UUID

UUID of the model server to start.

required
timeout int

timeout in seconds to wait for the service to start. If set to 0, the method will return immediately after provisioning the service, without waiting for it to become active.

DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT

Raises:

Type Description
RuntimeError

if the model server is not found.

Source code in src/zenml/model_deployers/base_model_deployer.py
def start_model_server(
    self,
    uuid: UUID,
    timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
) -> None:
    """Abstract method to start a model server.

    Args:
        uuid: UUID of the model server to start.
        timeout: timeout in seconds to wait for the service to start. If
            set to 0, the method will return immediately after
            provisioning the service, without waiting for it to become
            active.

    Raises:
        RuntimeError: if the model server is not found.
    """
    client = Client()
    try:
        service = self.find_model_server(service_uuid=uuid)[0]
        updated_service = self.perform_start_model(service, timeout)
        client.update_service(
            id=updated_service.uuid,
            admin_state=updated_service.admin_state,
            status=updated_service.status.model_dump(),
            endpoint=updated_service.endpoint.model_dump()
            if updated_service.endpoint
            else None,
        )
    except Exception as e:
        raise RuntimeError(
            f"Failed to start model server with UUID {uuid}: {e}"
        ) from e

stop_model_server(uuid, timeout=DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT, force=False)

Abstract method to stop a model server.

This operation should be reversible. A stopped model server should still show up in the list of model servers returned by find_model_server and it should be possible to start it again by calling start_model_server.

Parameters:

Name Type Description Default
uuid UUID

UUID of the model server to stop.

required
timeout int

timeout in seconds to wait for the service to stop. If set to 0, the method will return immediately after deprovisioning the service, without waiting for it to stop.

DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT
force bool

if True, force the service to stop.

False

Raises:

Type Description
RuntimeError

if the model server is not found.

Source code in src/zenml/model_deployers/base_model_deployer.py
def stop_model_server(
    self,
    uuid: UUID,
    timeout: int = DEFAULT_DEPLOYMENT_START_STOP_TIMEOUT,
    force: bool = False,
) -> None:
    """Abstract method to stop a model server.

    This operation should be reversible. A stopped model server should still
    show up in the list of model servers returned by `find_model_server` and
    it should be possible to start it again by calling `start_model_server`.

    Args:
        uuid: UUID of the model server to stop.
        timeout: timeout in seconds to wait for the service to stop. If
            set to 0, the method will return immediately after
            deprovisioning the service, without waiting for it to stop.
        force: if True, force the service to stop.

    Raises:
        RuntimeError: if the model server is not found.
    """
    client = Client()
    try:
        service = self.find_model_server(service_uuid=uuid)[0]
        updated_service = self.perform_stop_model(service, timeout, force)
        client.update_service(
            id=updated_service.uuid,
            admin_state=updated_service.admin_state,
            status=updated_service.status.model_dump(),
            endpoint=updated_service.endpoint.model_dump()
            if updated_service.endpoint
            else None,
        )
    except Exception as e:
        raise RuntimeError(
            f"Failed to stop model server with UUID {uuid}: {e}"
        ) from e
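
Taken together, stop_model_server, start_model_server and delete_model_server let you manage a model server outside any pipeline once you know the UUID of its service. A minimal sketch (the UUID is a placeholder):

from zenml.client import Client

model_deployer = Client().active_stack.model_deployer
server_uuid = my_server_uuid  # placeholder UUID, e.g. from find_model_server()

# Reversible: the stopped server still shows up in find_model_server().
model_deployer.stop_model_server(server_uuid, timeout=120)
model_deployer.start_model_server(server_uuid, timeout=120)

# Irreversible: the server and its service record are removed.
model_deployer.delete_model_server(server_uuid, timeout=120, force=False)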

BaseModelDeployerFlavor

Bases: Flavor

Base class for model deployer flavors.

Source code in src/zenml/model_deployers/base_model_deployer.py
class BaseModelDeployerFlavor(Flavor):
    """Base class for model deployer flavors."""

    @property
    def type(self) -> StackComponentType:
        """Returns the flavor type.

        Returns:
            The flavor type.
        """
        return StackComponentType.MODEL_DEPLOYER

    @property
    def config_class(self) -> Type[BaseModelDeployerConfig]:
        """Returns `BaseModelDeployerConfig` config class.

        Returns:
                The config class.
        """
        return BaseModelDeployerConfig

    @property
    @abstractmethod
    def implementation_class(self) -> Type[BaseModelDeployer]:
        """The class that implements the model deployer."""

config_class property

Returns BaseModelDeployerConfig config class.

Returns:

Type Description
Type[BaseModelDeployerConfig]

The config class.

implementation_class abstractmethod property

The class that implements the model deployer.

type property

Returns the flavor type.

Returns:

Type Description
StackComponentType

The flavor type.

Model Registries

Initialization of the ZenML model registries.

Model registries are centralized repositories that facilitate the collaboration and management of machine learning models. They provide functionalities such as version control, metadata tracking, and storage of model artifacts, enabling data scientists to efficiently share and keep track of their models within a team or organization.
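
As a small, hedged sketch of how this API is typically consumed: assuming the active stack contains a model registry component (accessed below through the stack's model_registry attribute, which is an assumption, as are the model name and URI), a model and one of its versions can be registered like this.

from zenml.client import Client

model_registry = Client().active_stack.model_registry

# Register the model entry itself.
model_registry.register_model(
    name="my_model",
    description="Classifier trained by the training pipeline",
)

# Register a concrete version that points at a stored model artifact.
version = model_registry.register_model_version(
    name="my_model",
    model_source_uri="s3://bucket/path/to/model",
    description="Version produced by the latest training run",
)
print(version)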

BaseModelRegistry

Bases: StackComponent, ABC

Base class for all ZenML model registries.

Source code in src/zenml/model_registries/base_model_registry.py
class BaseModelRegistry(StackComponent, ABC):
    """Base class for all ZenML model registries."""

    @property
    def config(self) -> BaseModelRegistryConfig:
        """Returns the config of the model registries.

        Returns:
            The config of the model registries.
        """
        return cast(BaseModelRegistryConfig, self._config)

    # ---------
    # Model Registration Methods
    # ---------

    @abstractmethod
    def register_model(
        self,
        name: str,
        description: Optional[str] = None,
        metadata: Optional[Dict[str, str]] = None,
    ) -> RegisteredModel:
        """Registers a model in the model registry.

        Args:
            name: The name of the registered model.
            description: The description of the registered model.
            metadata: The metadata associated with the registered model.

        Returns:
            The registered model.

        Raises:
            zenml.exceptions.EntityExistsError: If a model with the same name already exists.
            RuntimeError: If registration fails.
        """

    @abstractmethod
    def delete_model(
        self,
        name: str,
    ) -> None:
        """Deletes a registered model from the model registry.

        Args:
            name: The name of the registered model.

        Raises:
            KeyError: If the model does not exist.
            RuntimeError: If deletion fails.
        """

    @abstractmethod
    def update_model(
        self,
        name: str,
        description: Optional[str] = None,
        metadata: Optional[Dict[str, str]] = None,
        remove_metadata: Optional[List[str]] = None,
    ) -> RegisteredModel:
        """Updates a registered model in the model registry.

        Args:
            name: The name of the registered model.
            description: The description of the registered model.
            metadata: The metadata associated with the registered model.
            remove_metadata: The metadata to remove from the registered model.

        Raises:
            KeyError: If the model does not exist.
            RuntimeError: If update fails.
        """

    @abstractmethod
    def get_model(self, name: str) -> RegisteredModel:
        """Gets a registered model from the model registry.

        Args:
            name: The name of the registered model.

        Returns:
            The registered model.

        Raises:
            zenml.exceptions.EntityExistsError: If the model does not exist.
            RuntimeError: If retrieval fails.
        """

    @abstractmethod
    def list_models(
        self,
        name: Optional[str] = None,
        metadata: Optional[Dict[str, str]] = None,
    ) -> List[RegisteredModel]:
        """Lists all registered models in the model registry.

        Args:
            name: The name of the registered model.
            metadata: The metadata associated with the registered model.

        Returns:
            A list of registered models.
        """

    # ---------
    # Model Version Methods
    # ---------

    @abstractmethod
    def register_model_version(
        self,
        name: str,
        version: Optional[str] = None,
        model_source_uri: Optional[str] = None,
        description: Optional[str] = None,
        metadata: Optional[ModelRegistryModelMetadata] = None,
        **kwargs: Any,
    ) -> RegistryModelVersion:
        """Registers a model version in the model registry.

        Args:
            name: The name of the registered model.
            model_source_uri: The source URI of the model.
            version: The version of the model version.
            description: The description of the model version.
            metadata: The metadata associated with the model
                version.
            **kwargs: Additional keyword arguments.

        Returns:
            The registered model version.

        Raises:
            RuntimeError: If registration fails.
        """

    @abstractmethod
    def delete_model_version(
        self,
        name: str,
        version: str,
    ) -> None:
        """Deletes a model version from the model registry.

        Args:
            name: The name of the registered model.
            version: The version of the model version to delete.

        Raises:
            KeyError: If the model version does not exist.
            RuntimeError: If deletion fails.
        """

    @abstractmethod
    def update_model_version(
        self,
        name: str,
        version: str,
        description: Optional[str] = None,
        metadata: Optional[ModelRegistryModelMetadata] = None,
        remove_metadata: Optional[List[str]] = None,
        stage: Optional[ModelVersionStage] = None,
    ) -> RegistryModelVersion:
        """Updates a model version in the model registry.

        Args:
            name: The name of the registered model.
            version: The version of the model version to update.
            description: The description of the model version.
            metadata: Metadata associated with this model version.
            remove_metadata: The metadata to remove from the model version.
            stage: The stage of the model version.

        Returns:
            The updated model version.

        Raises:
            KeyError: If the model version does not exist.
            RuntimeError: If update fails.
        """

    @abstractmethod
    def list_model_versions(
        self,
        name: Optional[str] = None,
        model_source_uri: Optional[str] = None,
        metadata: Optional[ModelRegistryModelMetadata] = None,
        stage: Optional[ModelVersionStage] = None,
        count: Optional[int] = None,
        created_after: Optional[datetime] = None,
        created_before: Optional[datetime] = None,
        order_by_date: Optional[str] = None,
        **kwargs: Any,
    ) -> Optional[List[RegistryModelVersion]]:
        """Lists all model versions for a registered model.

        Args:
            name: The name of the registered model.
            model_source_uri: The model source URI of the registered model.
            metadata: Metadata associated with this model version.
            stage: The stage of the model version.
            count: The number of model versions to return.
            created_after: The timestamp after which to list model versions.
            created_before: The timestamp before which to list model versions.
            order_by_date: Whether to sort by creation time, this can
                be "asc" or "desc".
            kwargs: Additional keyword arguments.

        Returns:
            A list of model versions.
        """

    def get_latest_model_version(
        self,
        name: str,
        stage: Optional[ModelVersionStage] = None,
    ) -> Optional[RegistryModelVersion]:
        """Gets the latest model version for a registered model.

        This method is used to get the latest model version for a registered
        model. If no stage is provided, the latest model version across all
        stages is returned. If a stage is provided, the latest model version
        for that stage is returned.

        Args:
            name: The name of the registered model.
            stage: The stage of the model version.

        Returns:
            The latest model version.
        """
        model_versions = self.list_model_versions(
            name=name, stage=stage, order_by_date="desc", count=1
        )
        if model_versions:
            return model_versions[0]
        return None

    @abstractmethod
    def get_model_version(
        self, name: str, version: str
    ) -> RegistryModelVersion:
        """Gets a model version for a registered model.

        Args:
            name: The name of the registered model.
            version: The version of the model version to get.

        Returns:
            The model version.

        Raises:
            KeyError: If the model version does not exist.
            RuntimeError: If retrieval fails.
        """

    @abstractmethod
    def load_model_version(
        self,
        name: str,
        version: str,
        **kwargs: Any,
    ) -> Any:
        """Loads a model version from the model registry.

        Args:
            name: The name of the registered model.
            version: The version of the model version to load.
            **kwargs: Additional keyword arguments.

        Returns:
            The loaded model version.

        Raises:
            KeyError: If the model version does not exist.
            RuntimeError: If loading fails.
        """

    @abstractmethod
    def get_model_uri_artifact_store(
        self,
        model_version: RegistryModelVersion,
    ) -> str:
        """Gets the URI artifact store for a model version.

        This method retrieves the URI of the artifact store for a specific model
        version. Its purpose is to ensure that the URI is in the correct format
        for the specific artifact store being used. This is essential for the
        model serving component, which relies on the URI to serve the model
        version. In some cases, the URI may be stored in a different format by
        certain model registry integrations. This method allows us to obtain the
        URI in the correct format, regardless of the integration being used.

        Note: In some cases the URI artifact store may not be available to the
        user, the method should save the target model in one of the other
        artifact stores supported by ZenML and return the URI of that artifact
        store.

        Args:
            model_version: The model version for which to get the URI artifact
                store.

        Returns:
            The URI artifact store for the model version.
        """

config property

Returns the config of the model registries.

Returns:

Type Description
BaseModelRegistryConfig

The config of the model registries.

delete_model(name) abstractmethod

Deletes a registered model from the model registry.

Parameters:

Name Type Description Default
name str

The name of the registered model.

required

Raises:

Type Description
KeyError

If the model does not exist.

RuntimeError

If deletion fails.

Source code in src/zenml/model_registries/base_model_registry.py
@abstractmethod
def delete_model(
    self,
    name: str,
) -> None:
    """Deletes a registered model from the model registry.

    Args:
        name: The name of the registered model.

    Raises:
        KeyError: If the model does not exist.
        RuntimeError: If deletion fails.
    """

delete_model_version(name, version) abstractmethod

Deletes a model version from the model registry.

Parameters:

Name Type Description Default
name str

The name of the registered model.

required
version str

The version of the model version to delete.

required

Raises:

Type Description
KeyError

If the model version does not exist.

RuntimeError

If deletion fails.

Source code in src/zenml/model_registries/base_model_registry.py
@abstractmethod
def delete_model_version(
    self,
    name: str,
    version: str,
) -> None:
    """Deletes a model version from the model registry.

    Args:
        name: The name of the registered model.
        version: The version of the model version to delete.

    Raises:
        KeyError: If the model version does not exist.
        RuntimeError: If deletion fails.
    """

get_latest_model_version(name, stage=None)

Gets the latest model version for a registered model.

This method is used to get the latest model version for a registered model. If no stage is provided, the latest model version across all stages is returned. If a stage is provided, the latest model version for that stage is returned.

Parameters:

Name Type Description Default
name str

The name of the registered model.

required
stage Optional[ModelVersionStage]

The stage of the model version.

None

Returns:

Type Description
Optional[RegistryModelVersion]

The latest model version.

Source code in src/zenml/model_registries/base_model_registry.py
def get_latest_model_version(
    self,
    name: str,
    stage: Optional[ModelVersionStage] = None,
) -> Optional[RegistryModelVersion]:
    """Gets the latest model version for a registered model.

    This method is used to get the latest model version for a registered
    model. If no stage is provided, the latest model version across all
    stages is returned. If a stage is provided, the latest model version
    for that stage is returned.

    Args:
        name: The name of the registered model.
        stage: The stage of the model version.

    Returns:
        The latest model version.
    """
    model_versions = self.list_model_versions(
        name=name, stage=stage, order_by_date="desc", count=1
    )
    if model_versions:
        return model_versions[0]
    return None
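
Continuing the registry sketch above (where `model_registry` was obtained from the active stack), the following hedged example fetches the newest version of a registered model, optionally scoped to a stage, and loads it. `ModelVersionStage.PRODUCTION` and the import path are assumptions; the model name is a placeholder.

from zenml.model_registries.base_model_registry import ModelVersionStage  # assumed import path

latest = model_registry.get_latest_model_version(
    name="churn-classifier",
    stage=ModelVersionStage.PRODUCTION,  # assumed enum member; omit to search all stages
)
if latest is not None:
    model = model_registry.load_model_version(
        name="churn-classifier", version=latest.version
    )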

get_model(name) abstractmethod

Gets a registered model from the model registry.

Parameters:

Name Type Description Default
name str

The name of the registered model.

required

Returns:

Type Description
RegisteredModel

The registered model.

Raises:

Type Description
EntityExistsError

If the model does not exist.

RuntimeError

If retrieval fails.

Source code in src/zenml/model_registries/base_model_registry.py
@abstractmethod
def get_model(self, name: str) -> RegisteredModel:
    """Gets a registered model from the model registry.

    Args:
        name: The name of the registered model.

    Returns:
        The registered model.

    Raises:
        zenml.exceptions.EntityExistsError: If the model does not exist.
        RuntimeError: If retrieval fails.
    """

get_model_uri_artifact_store(model_version) abstractmethod

Gets the URI artifact store for a model version.

This method retrieves the URI of the artifact store for a specific model version. Its purpose is to ensure that the URI is in the correct format for the specific artifact store being used. This is essential for the model serving component, which relies on the URI to serve the model version. In some cases, the URI may be stored in a different format by certain model registry integrations. This method allows us to obtain the URI in the correct format, regardless of the integration being used.

Note: In some cases the URI artifact store may not be available to the user; in that case, the method should save the target model in one of the other artifact stores supported by ZenML and return the URI of that artifact store.

Parameters:

Name Type Description Default
model_version RegistryModelVersion

The model version for which to get the URI artifact store.

required

Returns:

Type Description
str

The URI artifact store for the model version.

Source code in src/zenml/model_registries/base_model_registry.py
@abstractmethod
def get_model_uri_artifact_store(
    self,
    model_version: RegistryModelVersion,
) -> str:
    """Gets the URI artifact store for a model version.

    This method retrieves the URI of the artifact store for a specific model
    version. Its purpose is to ensure that the URI is in the correct format
    for the specific artifact store being used. This is essential for the
    model serving component, which relies on the URI to serve the model
    version. In some cases, the URI may be stored in a different format by
    certain model registry integrations. This method allows us to obtain the
    URI in the correct format, regardless of the integration being used.

    Note: In some cases the URI artifact store may not be available to the
    user, the method should save the target model in one of the other
    artifact stores supported by ZenML and return the URI of that artifact
    store.

    Args:
        model_version: The model version for which to get the URI artifact
            store.

    Returns:
        The URI artifact store for the model version.
    """

get_model_version(name, version) abstractmethod

Gets a model version for a registered model.

Parameters:

Name Type Description Default
name str

The name of the registered model.

required
version str

The version of the model version to get.

required

Returns:

Type Description
RegistryModelVersion

The model version.

Raises:

Type Description
KeyError

If the model version does not exist.

RuntimeError

If retrieval fails.

Source code in src/zenml/model_registries/base_model_registry.py
@abstractmethod
def get_model_version(
    self, name: str, version: str
) -> RegistryModelVersion:
    """Gets a model version for a registered model.

    Args:
        name: The name of the registered model.
        version: The version of the model version to get.

    Returns:
        The model version.

    Raises:
        KeyError: If the model version does not exist.
        RuntimeError: If retrieval fails.
    """

list_model_versions(name=None, model_source_uri=None, metadata=None, stage=None, count=None, created_after=None, created_before=None, order_by_date=None, **kwargs) abstractmethod

Lists all model versions for a registered model.

Parameters:

Name Type Description Default
name Optional[str]

The name of the registered model.

None
model_source_uri Optional[str]

The model source URI of the registered model.

None
metadata Optional[ModelRegistryModelMetadata]

Metadata associated with this model version.

None
stage Optional[ModelVersionStage]

The stage of the model version.

None
count Optional[int]

The number of model versions to return.

None
created_after Optional[datetime]

The timestamp after which to list model versions.

None
created_before Optional[datetime]

The timestamp before which to list model versions.

None
order_by_date Optional[str]

The order in which to sort by creation time; can be "asc" or "desc".

None
kwargs Any

Additional keyword arguments.

{}

Returns:

Type Description
Optional[List[RegistryModelVersion]]

A list of model versions.

Source code in src/zenml/model_registries/base_model_registry.py
@abstractmethod
def list_model_versions(
    self,
    name: Optional[str] = None,
    model_source_uri: Optional[str] = None,
    metadata: Optional[ModelRegistryModelMetadata] = None,
    stage: Optional[ModelVersionStage] = None,
    count: Optional[int] = None,
    created_after: Optional[datetime] = None,
    created_before: Optional[datetime] = None,
    order_by_date: Optional[str] = None,
    **kwargs: Any,
) -> Optional[List[RegistryModelVersion]]:
    """Lists all model versions for a registered model.

    Args:
        name: The name of the registered model.
        model_source_uri: The model source URI of the registered model.
        metadata: Metadata associated with this model version.
        stage: The stage of the model version.
        count: The number of model versions to return.
        created_after: The timestamp after which to list model versions.
        created_before: The timestamp before which to list model versions.
        order_by_date: Whether to sort by creation time, this can
            be "asc" or "desc".
        kwargs: Additional keyword arguments.

    Returns:
        A list of model versions.
    """

list_models(name=None, metadata=None) abstractmethod

Lists all registered models in the model registry.

Parameters:

Name Type Description Default
name Optional[str]

The name of the registered model.

None
metadata Optional[Dict[str, str]]

The metadata associated with the registered model.

None

Returns:

Type Description
List[RegisteredModel]

A list of registered models.

Source code in src/zenml/model_registries/base_model_registry.py
@abstractmethod
def list_models(
    self,
    name: Optional[str] = None,
    metadata: Optional[Dict[str, str]] = None,
) -> List[RegisteredModel]:
    """Lists all registered models in the model registry.

    Args:
        name: The name of the registered model.
        metadata: The metadata associated with the registered model.

    Returns:
        A list of registered models.
    """

load_model_version(name, version, **kwargs) abstractmethod

Loads a model version from the model registry.

Parameters:

Name Type Description Default
name str

The name of the registered model.

required
version str

The version of the model version to load.

required
**kwargs Any

Additional keyword arguments.

{}

Returns:

Type Description
Any

The loaded model version.

Raises:

Type Description
KeyError

If the model version does not exist.

RuntimeError

If loading fails.

Source code in src/zenml/model_registries/base_model_registry.py
@abstractmethod
def load_model_version(
    self,
    name: str,
    version: str,
    **kwargs: Any,
) -> Any:
    """Loads a model version from the model registry.

    Args:
        name: The name of the registered model.
        version: The version of the model version to load.
        **kwargs: Additional keyword arguments.

    Returns:
        The loaded model version.

    Raises:
        KeyError: If the model version does not exist.
        RuntimeError: If loading fails.
    """

register_model(name, description=None, metadata=None) abstractmethod

Registers a model in the model registry.

Parameters:

Name Type Description Default
name str

The name of the registered model.

required
description Optional[str]

The description of the registered model.

None
metadata Optional[Dict[str, str]]

The metadata associated with the registered model.

None

Returns:

Type Description
RegisteredModel

The registered model.

Raises:

Type Description
EntityExistsError

If a model with the same name already exists.

RuntimeError

If registration fails.

Source code in src/zenml/model_registries/base_model_registry.py
@abstractmethod
def register_model(
    self,
    name: str,
    description: Optional[str] = None,
    metadata: Optional[Dict[str, str]] = None,
) -> RegisteredModel:
    """Registers a model in the model registry.

    Args:
        name: The name of the registered model.
        description: The description of the registered model.
        metadata: The metadata associated with the registered model.

    Returns:
        The registered model.

    Raises:
        zenml.exceptions.EntityExistsError: If a model with the same name already exists.
        RuntimeError: If registration fails.
    """

register_model_version(name, version=None, model_source_uri=None, description=None, metadata=None, **kwargs) abstractmethod

Registers a model version in the model registry.

Parameters:

Name Type Description Default
name str

The name of the registered model.

required
model_source_uri Optional[str]

The source URI of the model.

None
version Optional[str]

The version of the model version.

None
description Optional[str]

The description of the model version.

None
metadata Optional[ModelRegistryModelMetadata]

The metadata associated with the model version.

None
**kwargs Any

Additional keyword arguments.

{}

Returns:

Type Description
RegistryModelVersion

The registered model version.

Raises:

Type Description
RuntimeError

If registration fails.

Source code in src/zenml/model_registries/base_model_registry.py
@abstractmethod
def register_model_version(
    self,
    name: str,
    version: Optional[str] = None,
    model_source_uri: Optional[str] = None,
    description: Optional[str] = None,
    metadata: Optional[ModelRegistryModelMetadata] = None,
    **kwargs: Any,
) -> RegistryModelVersion:
    """Registers a model version in the model registry.

    Args:
        name: The name of the registered model.
        model_source_uri: The source URI of the model.
        version: The version of the model version.
        description: The description of the model version.
        metadata: The metadata associated with the model
            version.
        **kwargs: Additional keyword arguments.

    Returns:
        The registered model version.

    Raises:
        RuntimeError: If registration fails.
    """

update_model(name, description=None, metadata=None, remove_metadata=None) abstractmethod

Updates a registered model in the model registry.

Parameters:

Name Type Description Default
name str

The name of the registered model.

required
description Optional[str]

The description of the registered model.

None
metadata Optional[Dict[str, str]]

The metadata associated with the registered model.

None
remove_metadata Optional[List[str]]

The metadata to remove from the registered model.

None

Raises:

Type Description
KeyError

If the model does not exist.

RuntimeError

If update fails.

Source code in src/zenml/model_registries/base_model_registry.py
@abstractmethod
def update_model(
    self,
    name: str,
    description: Optional[str] = None,
    metadata: Optional[Dict[str, str]] = None,
    remove_metadata: Optional[List[str]] = None,
) -> RegisteredModel:
    """Updates a registered model in the model registry.

    Args:
        name: The name of the registered model.
        description: The description of the registered model.
        metadata: The metadata associated with the registered model.
        remove_metadata: The metadata to remove from the registered model.

    Raises:
        KeyError: If the model does not exist.
        RuntimeError: If update fails.
    """

update_model_version(name, version, description=None, metadata=None, remove_metadata=None, stage=None) abstractmethod

Updates a model version in the model registry.

Parameters:

Name Type Description Default
name str

The name of the registered model.

required
version str

The version of the model version to update.

required
description Optional[str]

The description of the model version.

None
metadata Optional[ModelRegistryModelMetadata]

Metadata associated with this model version.

None
remove_metadata Optional[List[str]]

The metadata to remove from the model version.

None
stage Optional[ModelVersionStage]

The stage of the model version.

None

Returns:

Type Description
RegistryModelVersion

The updated model version.

Raises:

Type Description
KeyError

If the model version does not exist.

RuntimeError

If update fails.

Source code in src/zenml/model_registries/base_model_registry.py
@abstractmethod
def update_model_version(
    self,
    name: str,
    version: str,
    description: Optional[str] = None,
    metadata: Optional[ModelRegistryModelMetadata] = None,
    remove_metadata: Optional[List[str]] = None,
    stage: Optional[ModelVersionStage] = None,
) -> RegistryModelVersion:
    """Updates a model version in the model registry.

    Args:
        name: The name of the registered model.
        version: The version of the model version to update.
        description: The description of the model version.
        metadata: Metadata associated with this model version.
        remove_metadata: The metadata to remove from the model version.
        stage: The stage of the model version.

    Returns:
        The updated model version.

    Raises:
        KeyError: If the model version does not exist.
        RuntimeError: If update fails.
    """

BaseModelRegistryConfig

Bases: StackComponentConfig

Base config for model registries.

Source code in src/zenml/model_registries/base_model_registry.py
class BaseModelRegistryConfig(StackComponentConfig):
    """Base config for model registries."""

BaseModelRegistryFlavor

Bases: Flavor

Base class for all ZenML model registry flavors.

Source code in src/zenml/model_registries/base_model_registry.py
class BaseModelRegistryFlavor(Flavor):
    """Base class for all ZenML model registry flavors."""

    @property
    def type(self) -> StackComponentType:
        """Type of the flavor.

        Returns:
            StackComponentType: The type of the flavor.
        """
        return StackComponentType.MODEL_REGISTRY

    @property
    def config_class(self) -> Type[BaseModelRegistryConfig]:
        """Config class for this flavor.

        Returns:
            The config class for this flavor.
        """
        return BaseModelRegistryConfig

    @property
    @abstractmethod
    def implementation_class(self) -> Type[StackComponent]:
        """Returns the implementation class for this flavor.

        Returns:
            The implementation class for this flavor.
        """
        return BaseModelRegistry
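
For contrast with the base flavor above, a custom registry flavor typically only needs to supply a name and point at its own config and implementation classes. The sketch below is illustrative: `MyModelRegistryConfig`, `MyModelRegistry`, the flavor name, and the import paths are all hypothetical.

from typing import Type

from zenml.model_registries.base_model_registry import (  # assumed import path
    BaseModelRegistryConfig,
    BaseModelRegistryFlavor,
)


class MyModelRegistryConfig(BaseModelRegistryConfig):
    """Hypothetical config with a single connection option."""

    registry_url: str = "http://localhost:8080"


class MyModelRegistryFlavor(BaseModelRegistryFlavor):
    """Hypothetical flavor wiring name, config, and implementation together."""

    @property
    def name(self) -> str:
        return "my_registry"

    @property
    def config_class(self) -> Type[BaseModelRegistryConfig]:
        return MyModelRegistryConfig

    @property
    def implementation_class(self) -> Type["MyModelRegistry"]:
        return MyModelRegistry  # the hypothetical registry class sketched earlier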

config_class property

Config class for this flavor.

Returns:

Type Description
Type[BaseModelRegistryConfig]

The config class for this flavor.

implementation_class abstractmethod property

Returns the implementation class for this flavor.

Returns:

Type Description
Type[StackComponent]

The implementation class for this flavor.

type property

Type of the flavor.

Returns:

Name Type Description
StackComponentType StackComponentType

The type of the flavor.

Model

Concepts related to the Model Control Plane feature.

Models

Pydantic models for the various concepts in ZenML.

APIKey

Bases: BaseModel

Encoded model for API keys.

Source code in src/zenml/models/v2/core/api_key.py
class APIKey(BaseModel):
    """Encoded model for API keys."""

    id: UUID
    key: str

    @classmethod
    def decode_api_key(cls, encoded_key: str) -> "APIKey":
        """Decodes an API key from a base64 string.

        Args:
            encoded_key: The encoded API key.

        Returns:
            The decoded API key.

        Raises:
            ValueError: If the key is not valid.
        """
        if encoded_key.startswith(ZENML_API_KEY_PREFIX):
            encoded_key = encoded_key[len(ZENML_API_KEY_PREFIX) :]
        try:
            json_key = b64_decode(encoded_key)
            return cls.model_validate_json(json_key)
        except Exception:
            raise ValueError("Invalid API key.")

    def encode(self) -> str:
        """Encodes the API key in a base64 string that includes the key ID and prefix.

        Returns:
            The encoded API key.
        """
        encoded_key = b64_encode(self.model_dump_json())
        return f"{ZENML_API_KEY_PREFIX}{encoded_key}"

decode_api_key(encoded_key) classmethod

Decodes an API key from a base64 string.

Parameters:

Name Type Description Default
encoded_key str

The encoded API key.

required

Returns:

Type Description
APIKey

The decoded API key.

Raises:

Type Description
ValueError

If the key is not valid.

Source code in src/zenml/models/v2/core/api_key.py
@classmethod
def decode_api_key(cls, encoded_key: str) -> "APIKey":
    """Decodes an API key from a base64 string.

    Args:
        encoded_key: The encoded API key.

    Returns:
        The decoded API key.

    Raises:
        ValueError: If the key is not valid.
    """
    if encoded_key.startswith(ZENML_API_KEY_PREFIX):
        encoded_key = encoded_key[len(ZENML_API_KEY_PREFIX) :]
    try:
        json_key = b64_decode(encoded_key)
        return cls.model_validate_json(json_key)
    except Exception:
        raise ValueError("Invalid API key.")

encode()

Encodes the API key in a base64 string that includes the key ID and prefix.

Returns:

Type Description
str

The encoded API key.

Source code in src/zenml/models/v2/core/api_key.py
def encode(self) -> str:
    """Encodes the API key in a base64 string that includes the key ID and prefix.

    Returns:
        The encoded API key.
    """
    encoded_key = b64_encode(self.model_dump_json())
    return f"{ZENML_API_KEY_PREFIX}{encoded_key}"

APIKeyFilter

Bases: BaseFilter

Filter model for API keys.

Source code in src/zenml/models/v2/core/api_key.py
class APIKeyFilter(BaseFilter):
    """Filter model for API keys."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *BaseFilter.FILTER_EXCLUDE_FIELDS,
        "service_account",
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *BaseFilter.CLI_EXCLUDE_FIELDS,
        "service_account",
    ]

    service_account: Optional[UUID] = Field(
        default=None,
        description="The service account to scope this query to.",
    )
    name: Optional[str] = Field(
        default=None,
        description="Name of the API key",
    )
    description: Optional[str] = Field(
        default=None,
        title="Filter by the API key description.",
    )
    active: Optional[Union[bool, str]] = Field(
        default=None,
        title="Whether the API key is active.",
        union_mode="left_to_right",
    )
    last_login: Optional[Union[datetime, str]] = Field(
        default=None,
        title="Time when the API key was last used to log in.",
        union_mode="left_to_right",
    )
    last_rotated: Optional[Union[datetime, str]] = Field(
        default=None,
        title="Time when the API key was last rotated.",
        union_mode="left_to_right",
    )

    def set_service_account(self, service_account_id: UUID) -> None:
        """Set the service account by which to scope this query.

        Args:
            service_account_id: The service account ID.
        """
        self.service_account = service_account_id

    def apply_filter(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Override to apply the service account scope as an additional filter.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        query = super().apply_filter(query=query, table=table)

        if self.service_account:
            scope_filter = (
                getattr(table, "service_account_id") == self.service_account
            )
            query = query.where(scope_filter)

        return query
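
For instance, a server-side handler might scope a filter to one service account before it is applied. This is a minimal sketch; the UUID is a placeholder and the import path is an assumption.

from uuid import UUID

from zenml.models import APIKeyFilter  # assumed import path

key_filter = APIKeyFilter(name="ci-runner", active=True)
key_filter.set_service_account(UUID("12345678-1234-5678-1234-567812345678"))  # placeholder ID
# When the zen store runs the query, `apply_filter(query, table)` adds the
# `service_account_id == <id>` clause on top of the regular field filters.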

apply_filter(query, table)

Override to apply the service account scope as an additional filter.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the filter.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with filter applied.

Source code in src/zenml/models/v2/core/api_key.py
def apply_filter(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Override to apply the service account scope as an additional filter.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    query = super().apply_filter(query=query, table=table)

    if self.service_account:
        scope_filter = (
            getattr(table, "service_account_id") == self.service_account
        )
        query = query.where(scope_filter)

    return query

set_service_account(service_account_id)

Set the service account by which to scope this query.

Parameters:

Name Type Description Default
service_account_id UUID

The service account ID.

required
Source code in src/zenml/models/v2/core/api_key.py
def set_service_account(self, service_account_id: UUID) -> None:
    """Set the service account by which to scope this query.

    Args:
        service_account_id: The service account ID.
    """
    self.service_account = service_account_id

APIKeyInternalResponse

Bases: APIKeyResponse

Response model for API keys used internally.

Source code in src/zenml/models/v2/core/api_key.py
class APIKeyInternalResponse(APIKeyResponse):
    """Response model for API keys used internally."""

    previous_key: Optional[str] = Field(
        default=None,
        title="The previous API key. Only set if the key was rotated.",
    )

    def verify_key(
        self,
        key: str,
    ) -> bool:
        """Verifies a given key against the stored (hashed) key(s).

        Args:
            key: Input key to be verified.

        Returns:
            True if the keys match.
        """
        # even when the hashed key is not set, we still want to execute
        # the hash verification to protect against response discrepancy
        # attacks (https://cwe.mitre.org/data/definitions/204.html)
        key_hash: Optional[str] = None
        context = CryptContext(schemes=["bcrypt"], deprecated="auto")
        if self.key is not None and self.active:
            key_hash = self.key
        result = context.verify(key, key_hash)

        # same for the previous key, if set and if it's still valid
        key_hash = None
        if (
            self.previous_key is not None
            and self.last_rotated is not None
            and self.active
            and self.retain_period_minutes > 0
        ):
            # check if the previous key is still valid
            if utc_now(
                tz_aware=self.last_rotated
            ) - self.last_rotated < timedelta(
                minutes=self.retain_period_minutes
            ):
                key_hash = self.previous_key
        previous_result = context.verify(key, key_hash)

        return result or previous_result

verify_key(key)

Verifies a given key against the stored (hashed) key(s).

Parameters:

Name Type Description Default
key str

Input key to be verified.

required

Returns:

Type Description
bool

True if the keys match.

Source code in src/zenml/models/v2/core/api_key.py
def verify_key(
    self,
    key: str,
) -> bool:
    """Verifies a given key against the stored (hashed) key(s).

    Args:
        key: Input key to be verified.

    Returns:
        True if the keys match.
    """
    # even when the hashed key is not set, we still want to execute
    # the hash verification to protect against response discrepancy
    # attacks (https://cwe.mitre.org/data/definitions/204.html)
    key_hash: Optional[str] = None
    context = CryptContext(schemes=["bcrypt"], deprecated="auto")
    if self.key is not None and self.active:
        key_hash = self.key
    result = context.verify(key, key_hash)

    # same for the previous key, if set and if it's still valid
    key_hash = None
    if (
        self.previous_key is not None
        and self.last_rotated is not None
        and self.active
        and self.retain_period_minutes > 0
    ):
        # check if the previous key is still valid
        if utc_now(
            tz_aware=self.last_rotated
        ) - self.last_rotated < timedelta(
            minutes=self.retain_period_minutes
        ):
            key_hash = self.previous_key
    previous_result = context.verify(key, key_hash)

    return result or previous_result

APIKeyInternalUpdate

Bases: APIKeyUpdate

Update model for API keys used internally.

Source code in src/zenml/models/v2/core/api_key.py
class APIKeyInternalUpdate(APIKeyUpdate):
    """Update model for API keys used internally."""

    update_last_login: bool = Field(
        default=False,
        title="Whether to update the last login timestamp.",
    )

APIKeyRequest

Bases: BaseRequest

Request model for API keys.

Source code in src/zenml/models/v2/core/api_key.py
class APIKeyRequest(BaseRequest):
    """Request model for API keys."""

    name: str = Field(
        title="The name of the API Key.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    description: Optional[str] = Field(
        default=None,
        title="The description of the API Key.",
        max_length=TEXT_FIELD_MAX_LENGTH,
    )

APIKeyResponse

Bases: BaseIdentifiedResponse[APIKeyResponseBody, APIKeyResponseMetadata, APIKeyResponseResources]

Response model for API keys.

Source code in src/zenml/models/v2/core/api_key.py
class APIKeyResponse(
    BaseIdentifiedResponse[
        APIKeyResponseBody, APIKeyResponseMetadata, APIKeyResponseResources
    ]
):
    """Response model for API keys."""

    name: str = Field(
        title="The name of the API Key.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    _warn_on_response_updates = False

    def get_hydrated_version(self) -> "APIKeyResponse":
        """Get the hydrated version of this API key.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_api_key(
            service_account_id=self.service_account.id,
            api_key_name_or_id=self.id,
        )

    # Helper functions
    def set_key(self, key: str) -> None:
        """Sets the API key and encodes it.

        Args:
            key: The API key value to be set.
        """
        self.get_body().key = APIKey(id=self.id, key=key).encode()

    # Body and metadata properties
    @property
    def key(self) -> Optional[str]:
        """The `key` property.

        Returns:
            the value of the property.
        """
        return self.get_body().key

    @property
    def active(self) -> bool:
        """The `active` property.

        Returns:
            the value of the property.
        """
        return self.get_body().active

    @property
    def service_account(self) -> "ServiceAccountResponse":
        """The `service_account` property.

        Returns:
            the value of the property.
        """
        return self.get_body().service_account

    @property
    def description(self) -> str:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

    @property
    def retain_period_minutes(self) -> int:
        """The `retain_period_minutes` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().retain_period_minutes

    @property
    def last_login(self) -> Optional[datetime]:
        """The `last_login` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().last_login

    @property
    def last_rotated(self) -> Optional[datetime]:
        """The `last_rotated` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().last_rotated

active property

The active property.

Returns:

Type Description
bool

the value of the property.

description property

The description property.

Returns:

Type Description
str

the value of the property.

key property

The key property.

Returns:

Type Description
Optional[str]

the value of the property.

last_login property

The last_login property.

Returns:

Type Description
Optional[datetime]

the value of the property.

last_rotated property

The last_rotated property.

Returns:

Type Description
Optional[datetime]

the value of the property.

retain_period_minutes property

The retain_period_minutes property.

Returns:

Type Description
int

the value of the property.

service_account property

The service_account property.

Returns:

Type Description
ServiceAccountResponse

the value of the property.

get_hydrated_version()

Get the hydrated version of this API key.

Returns:

Type Description
APIKeyResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/api_key.py
def get_hydrated_version(self) -> "APIKeyResponse":
    """Get the hydrated version of this API key.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_api_key(
        service_account_id=self.service_account.id,
        api_key_name_or_id=self.id,
    )

set_key(key)

Sets the API key and encodes it.

Parameters:

Name Type Description Default
key str

The API key value to be set.

required
Source code in src/zenml/models/v2/core/api_key.py
def set_key(self, key: str) -> None:
    """Sets the API key and encodes it.

    Args:
        key: The API key value to be set.
    """
    self.get_body().key = APIKey(id=self.id, key=key).encode()

APIKeyResponseBody

Bases: BaseDatedResponseBody

Response body for API keys.

Source code in src/zenml/models/v2/core/api_key.py
class APIKeyResponseBody(BaseDatedResponseBody):
    """Response body for API keys."""

    key: Optional[str] = Field(
        default=None,
        title="The API key. Only set immediately after creation or rotation.",
    )
    active: bool = Field(
        default=True,
        title="Whether the API key is active.",
    )
    service_account: "ServiceAccountResponse" = Field(
        title="The service account associated with this API key."
    )

APIKeyResponseMetadata

Bases: BaseResponseMetadata

Response metadata for API keys.

Source code in src/zenml/models/v2/core/api_key.py
class APIKeyResponseMetadata(BaseResponseMetadata):
    """Response metadata for API keys."""

    description: str = Field(
        default="",
        title="The description of the API Key.",
        max_length=TEXT_FIELD_MAX_LENGTH,
    )
    retain_period_minutes: int = Field(
        title="Number of minutes for which the previous key is still valid "
        "after it has been rotated.",
    )
    last_login: Optional[datetime] = Field(
        default=None, title="Time when the API key was last used to log in."
    )
    last_rotated: Optional[datetime] = Field(
        default=None, title="Time when the API key was last rotated."
    )

APIKeyRotateRequest

Bases: BaseRequest

Request model for API key rotation.

Source code in src/zenml/models/v2/core/api_key.py
class APIKeyRotateRequest(BaseRequest):
    """Request model for API key rotation."""

    retain_period_minutes: int = Field(
        default=0,
        title="Number of minutes for which the previous key is still valid "
        "after it has been rotated.",
    )

APIKeyUpdate

Bases: BaseUpdate

Update model for API keys.

Source code in src/zenml/models/v2/core/api_key.py
class APIKeyUpdate(BaseUpdate):
    """Update model for API keys."""

    name: Optional[str] = Field(
        title="The name of the API Key.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    description: Optional[str] = Field(
        title="The description of the API Key.",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    active: Optional[bool] = Field(
        title="Whether the API key is active.",
        default=None,
    )

ActionFilter

Bases: ProjectScopedFilter

Model to enable advanced filtering of all actions.

Source code in src/zenml/models/v2/core/action.py
class ActionFilter(ProjectScopedFilter):
    """Model to enable advanced filtering of all actions."""

    name: Optional[str] = Field(
        default=None,
        description="Name of the action.",
    )
    flavor: Optional[str] = Field(
        default=None,
        title="The flavor of the action.",
    )
    plugin_subtype: Optional[str] = Field(
        default=None,
        title="The subtype of the action.",
    )

ActionFlavorResponse

Bases: BasePluginFlavorResponse[ActionFlavorResponseBody, ActionFlavorResponseMetadata, ActionFlavorResponseResources]

Response model for Action Flavors.

Source code in src/zenml/models/v2/core/action_flavor.py
class ActionFlavorResponse(
    BasePluginFlavorResponse[
        ActionFlavorResponseBody,
        ActionFlavorResponseMetadata,
        ActionFlavorResponseResources,
    ]
):
    """Response model for Action Flavors."""

    # Body and metadata properties
    @property
    def config_schema(self) -> Dict[str, Any]:
        """The `source_config_schema` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().config_schema

config_schema property

The source_config_schema property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

ActionFlavorResponseBody

Bases: BasePluginResponseBody

Response body for action flavors.

Source code in src/zenml/models/v2/core/action_flavor.py
class ActionFlavorResponseBody(BasePluginResponseBody):
    """Response body for action flavors."""

ActionFlavorResponseMetadata

Bases: BasePluginResponseMetadata

Response metadata for action flavors.

Source code in src/zenml/models/v2/core/action_flavor.py
class ActionFlavorResponseMetadata(BasePluginResponseMetadata):
    """Response metadata for action flavors."""

    config_schema: Dict[str, Any]

ActionFlavorResponseResources

Bases: BasePluginResponseResources

Response resources for action flavors.

Source code in src/zenml/models/v2/core/action_flavor.py
class ActionFlavorResponseResources(BasePluginResponseResources):
    """Response resources for action flavors."""

ActionRequest

Bases: ProjectScopedRequest

Model for creating a new action.

Source code in src/zenml/models/v2/core/action.py
class ActionRequest(ProjectScopedRequest):
    """Model for creating a new action."""

    name: str = Field(
        title="The name of the action.", max_length=STR_FIELD_MAX_LENGTH
    )
    description: str = Field(
        default="",
        title="The description of the action",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    flavor: str = Field(
        title="The flavor of the action.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    plugin_subtype: PluginSubType = Field(
        title="The subtype of the action.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    configuration: Dict[str, Any] = Field(
        title="The configuration for the action.",
    )
    service_account_id: UUID = Field(
        title="The service account that is used to execute the action.",
    )
    auth_window: Optional[int] = Field(
        default=None,
        title="The time window in minutes for which the service account is "
        "authorized to execute the action. Set this to 0 to authorize the "
        "service account indefinitely (not recommended). If not set, a "
        "default value defined for each individual action type is used.",
    )

ActionResponse

Bases: ProjectScopedResponse[ActionResponseBody, ActionResponseMetadata, ActionResponseResources]

Response model for actions.

Source code in src/zenml/models/v2/core/action.py
class ActionResponse(
    ProjectScopedResponse[
        ActionResponseBody, ActionResponseMetadata, ActionResponseResources
    ]
):
    """Response model for actions."""

    name: str = Field(
        title="The name of the action.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "ActionResponse":
        """Get the hydrated version of this action.

        Returns:
            An instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_action(self.id)

    # Body and metadata properties
    @property
    def flavor(self) -> str:
        """The `flavor` property.

        Returns:
            the value of the property.
        """
        return self.get_body().flavor

    @property
    def plugin_subtype(self) -> PluginSubType:
        """The `plugin_subtype` property.

        Returns:
            the value of the property.
        """
        return self.get_body().plugin_subtype

    @property
    def description(self) -> str:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

    @property
    def auth_window(self) -> int:
        """The `auth_window` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().auth_window

    @property
    def configuration(self) -> Dict[str, Any]:
        """The `configuration` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().configuration

    def set_configuration(self, configuration: Dict[str, Any]) -> None:
        """Set the `configuration` property.

        Args:
            configuration: The value to set.
        """
        self.get_metadata().configuration = configuration

    # Resource properties
    @property
    def service_account(self) -> "UserResponse":
        """The `service_account` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().service_account

auth_window property

The auth_window property.

Returns:

Type Description
int

the value of the property.

configuration property

The configuration property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

description property

The description property.

Returns:

Type Description
str

the value of the property.

flavor property

The flavor property.

Returns:

Type Description
str

the value of the property.

plugin_subtype property

The plugin_subtype property.

Returns:

Type Description
PluginSubType

the value of the property.

service_account property

The service_account property.

Returns:

Type Description
UserResponse

the value of the property.

get_hydrated_version()

Get the hydrated version of this action.

Returns:

Type Description
ActionResponse

An instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/action.py
def get_hydrated_version(self) -> "ActionResponse":
    """Get the hydrated version of this action.

    Returns:
        An instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_action(self.id)

set_configuration(configuration)

Set the configuration property.

Parameters:

Name Type Description Default
configuration Dict[str, Any]

The value to set.

required
Source code in src/zenml/models/v2/core/action.py
def set_configuration(self, configuration: Dict[str, Any]) -> None:
    """Set the `configuration` property.

    Args:
        configuration: The value to set.
    """
    self.get_metadata().configuration = configuration

ActionResponseBody

Bases: ProjectScopedResponseBody

Response body for actions.

Source code in src/zenml/models/v2/core/action.py
class ActionResponseBody(ProjectScopedResponseBody):
    """Response body for actions."""

    flavor: str = Field(
        title="The flavor of the action.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    plugin_subtype: PluginSubType = Field(
        title="The subtype of the action.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

ActionResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for actions.

Source code in src/zenml/models/v2/core/action.py
class ActionResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for actions."""

    description: str = Field(
        default="",
        title="The description of the action.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    configuration: Dict[str, Any] = Field(
        title="The configuration for the action.",
    )
    auth_window: int = Field(
        title="The time window in minutes for which the service account is "
        "authorized to execute the action."
    )

ActionResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the action entity.

Source code in src/zenml/models/v2/core/action.py, lines 164-169
class ActionResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the action entity."""

    service_account: UserResponse = Field(
        title="The service account that is used to execute the action.",
    )

ActionUpdate

Bases: BaseUpdate

Update model for actions.

Source code in src/zenml/models/v2/core/action.py, lines 87-128
class ActionUpdate(BaseUpdate):
    """Update model for actions."""

    name: Optional[str] = Field(
        default=None,
        title="The new name for the action.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    description: Optional[str] = Field(
        default=None,
        title="The new description for the action.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    configuration: Optional[Dict[str, Any]] = Field(
        default=None,
        title="The configuration for the action.",
    )
    service_account_id: Optional[UUID] = Field(
        default=None,
        title="The service account that is used to execute the action.",
    )
    auth_window: Optional[int] = Field(
        default=None,
        title="The time window in minutes for which the service account is "
        "authorized to execute the action. Set this to 0 to authorize the "
        "service account indefinitely (not recommended). If not set, a "
        "default value defined for each individual action type is used.",
    )

    @classmethod
    def from_response(cls, response: "ActionResponse") -> "ActionUpdate":
        """Create an update model from a response model.

        Args:
            response: The response model to create the update model from.

        Returns:
            The update model.
        """
        return ActionUpdate(
            configuration=copy.deepcopy(response.configuration),
        )

from_response(response) classmethod

Create an update model from a response model.

Parameters:

Name Type Description Default
response ActionResponse

The response model to create the update model from.

required

Returns:

Type Description
ActionUpdate

The update model.

Source code in src/zenml/models/v2/core/action.py, lines 116-128
@classmethod
def from_response(cls, response: "ActionResponse") -> "ActionUpdate":
    """Create an update model from a response model.

    Args:
        response: The response model to create the update model from.

    Returns:
        The update model.
    """
    return ActionUpdate(
        configuration=copy.deepcopy(response.configuration),
    )
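
The snippet below is a minimal sketch of how from_response is typically combined with the store: fetch the current ActionResponse, derive an ActionUpdate that carries a deep copy of its configuration, adjust it, and submit it back. The configuration key and the final update_action call (name and signature) are assumptions and may differ between ZenML versions.

from uuid import UUID

from zenml.client import Client
from zenml.models import ActionResponse, ActionUpdate


def bump_action_config(action_id: UUID) -> None:
    """Sketch: adjust an action's configuration starting from its current state."""
    client = Client()

    # Fetch the hydrated action through the store (the same call that
    # get_hydrated_version() uses internally).
    action: ActionResponse = client.zen_store.get_action(action_id)

    # from_response() deep-copies the existing configuration into the update model.
    update = ActionUpdate.from_response(action)
    update.configuration["retries"] = 3  # hypothetical configuration key
    update.auth_window = 60  # authorize the service account for 60 minutes

    # Submitting the update is assumed to go through the store as well; the exact
    # method name and signature may vary between ZenML versions.
    client.zen_store.update_action(action_id=action_id, action_update=update)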

ApiTransactionRequest

Bases: UserScopedRequest

Request model for API transactions.

Source code in src/zenml/models/v2/core/api_transaction.py, lines 43-54
class ApiTransactionRequest(UserScopedRequest):
    """Request model for API transactions."""

    transaction_id: UUID = Field(
        title="The ID of the transaction.",
    )
    method: str = Field(
        title="The HTTP method of the transaction.",
    )
    url: str = Field(
        title="The URL of the transaction.",
    )

ApiTransactionResponse

Bases: UserScopedResponse[ApiTransactionResponseBody, ApiTransactionResponseMetadata, ApiTransactionResponseResources]

Response model for API transactions.

Source code in src/zenml/models/v2/core/api_transaction.py, lines 121-193
class ApiTransactionResponse(
    UserScopedResponse[
        ApiTransactionResponseBody,
        ApiTransactionResponseMetadata,
        ApiTransactionResponseResources,
    ]
):
    """Response model for API transactions."""

    def get_hydrated_version(self) -> "ApiTransactionResponse":
        """Get the hydrated version of this API transaction.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        return self

    # Body and metadata properties

    @property
    def method(self) -> str:
        """The `method` property.

        Returns:
            the value of the property.
        """
        return self.get_body().method

    @property
    def url(self) -> str:
        """The `url` property.

        Returns:
            the value of the property.
        """
        return self.get_body().url

    @property
    def completed(self) -> bool:
        """The `completed` property.

        Returns:
            the value of the property.
        """
        return self.get_body().completed

    @property
    def result(self) -> Optional[PlainSerializedSecretStr]:
        """The `result` property.

        Returns:
            the value of the property.
        """
        return self.get_body().result

    def get_result(self) -> Optional[str]:
        """Get the result of the API transaction.

        Returns:
            the result of the API transaction.
        """
        result = self.result
        if result is None:
            return None
        return result.get_secret_value()

    def set_result(self, result: str) -> None:
        """Set the result of the API transaction.

        Args:
            result: the result of the API transaction.
        """
        self.get_body().result = SecretStr(result)

completed property

The completed property.

Returns:

Type Description
bool

the value of the property.

method property

The method property.

Returns:

Type Description
str

the value of the property.

result property

The result property.

Returns:

Type Description
Optional[PlainSerializedSecretStr]

the value of the property.

url property

The url property.

Returns:

Type Description
str

the value of the property.

get_hydrated_version()

Get the hydrated version of this API transaction.

Returns:

Type Description
ApiTransactionResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/api_transaction.py, lines 130-136
def get_hydrated_version(self) -> "ApiTransactionResponse":
    """Get the hydrated version of this API transaction.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    return self

get_result()

Get the result of the API transaction.

Returns:

Type Description
Optional[str]

the result of the API transaction.

Source code in src/zenml/models/v2/core/api_transaction.py, lines 176-185
def get_result(self) -> Optional[str]:
    """Get the result of the API transaction.

    Returns:
        the result of the API transaction.
    """
    result = self.result
    if result is None:
        return None
    return result.get_secret_value()

set_result(result)

Set the result of the API transaction.

Parameters:

Name Type Description Default
result str

the result of the API transaction.

required
Source code in src/zenml/models/v2/core/api_transaction.py, lines 187-193
def set_result(self, result: str) -> None:
    """Set the result of the API transaction.

    Args:
        result: the result of the API transaction.
    """
    self.get_body().result = SecretStr(result)

ApiTransactionResponseBody

Bases: UserScopedResponseBody

Response body for API transactions.

Source code in src/zenml/models/v2/core/api_transaction.py, lines 95-110
class ApiTransactionResponseBody(UserScopedResponseBody):
    """Response body for API transactions."""

    method: str = Field(
        title="The HTTP method of the transaction.",
    )
    url: str = Field(
        title="The URL of the transaction.",
    )
    completed: bool = Field(
        title="Whether the transaction is completed.",
    )
    result: Optional[PlainSerializedSecretStr] = Field(
        default=None,
        title="The response payload.",
    )

ApiTransactionResponseMetadata

Bases: UserScopedResponseMetadata

Response metadata for API transactions.

Source code in src/zenml/models/v2/core/api_transaction.py, lines 113-114
class ApiTransactionResponseMetadata(UserScopedResponseMetadata):
    """Response metadata for API transactions."""

ApiTransactionResponseResources

Bases: UserScopedResponseResources

Response resources for API transactions.

Source code in src/zenml/models/v2/core/api_transaction.py, lines 117-118
class ApiTransactionResponseResources(UserScopedResponseResources):
    """Response resources for API transactions."""

ApiTransactionUpdate

Bases: BaseUpdate

Update model for API transactions.

Source code in src/zenml/models/v2/core/api_transaction.py, lines 60-89
class ApiTransactionUpdate(BaseUpdate):
    """Update model for stack components."""

    result: Optional[PlainSerializedSecretStr] = Field(
        default=None,
        title="The response payload.",
    )
    cache_time: int = Field(
        title="The time in seconds that the transaction is kept around after "
        "completion."
    )

    def get_result(self) -> Optional[str]:
        """Get the result of the API transaction.

        Returns:
            the result of the API transaction.
        """
        result = self.result
        if result is None:
            return None
        return result.get_secret_value()

    def set_result(self, result: str) -> None:
        """Set the result of the API transaction.

        Args:
            result: the result of the API transaction.
        """
        self.result = SecretStr(result)

get_result()

Get the result of the API transaction.

Returns:

Type Description
Optional[str]

the result of the API transaction.

Source code in src/zenml/models/v2/core/api_transaction.py, lines 72-81
def get_result(self) -> Optional[str]:
    """Get the result of the API transaction.

    Returns:
        the result of the API transaction.
    """
    result = self.result
    if result is None:
        return None
    return result.get_secret_value()

set_result(result)

Set the result of the API transaction.

Parameters:

Name Type Description Default
result str

the result of the API transaction.

required
Source code in src/zenml/models/v2/core/api_transaction.py, lines 83-89
def set_result(self, result: str) -> None:
    """Set the result of the API transaction.

    Args:
        result: the result of the API transaction.
    """
    self.result = SecretStr(result)
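
A minimal sketch of the result round trip on ApiTransactionUpdate: set_result wraps the payload in a SecretStr so it stays masked when the model is printed or dumped, and get_result unwraps it again. The import path assumes the model is re-exported from zenml.models like the other v2 models, and the payload string is a made-up example.

from zenml.models import ApiTransactionUpdate

# cache_time has no default, so it must be provided explicitly.
update = ApiTransactionUpdate(cache_time=300)

# Store the response payload; it is wrapped in a SecretStr internally.
update.set_result('{"status": "ok"}')

print(update.result)        # masked, e.g. SecretStr('**********')
print(update.get_result())  # '{"status": "ok"}' -- the unwrapped plain value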

ArtifactFilter

Bases: ProjectScopedFilter, TaggableFilter

Model to enable advanced filtering of artifacts.

Source code in src/zenml/models/v2/core/artifact.py, lines 189-274
class ArtifactFilter(ProjectScopedFilter, TaggableFilter):
    """Model to enable advanced filtering of artifacts."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.FILTER_EXCLUDE_FIELDS,
        *TaggableFilter.FILTER_EXCLUDE_FIELDS,
    ]

    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CUSTOM_SORTING_OPTIONS,
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
        SORT_BY_LATEST_VERSION_KEY,
    ]

    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CLI_EXCLUDE_FIELDS,
        *TaggableFilter.CLI_EXCLUDE_FIELDS,
    ]

    name: Optional[str] = None
    has_custom_name: Optional[bool] = None

    def apply_sorting(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Apply sorting to the query for Artifacts.

        Args:
            query: The query to which to apply the sorting.
            table: The query table.

        Returns:
            The query with sorting applied.
        """
        from sqlmodel import asc, case, col, desc, func, select

        from zenml.enums import SorterOps
        from zenml.zen_stores.schemas import (
            ArtifactSchema,
            ArtifactVersionSchema,
        )

        sort_by, operand = self.sorting_params

        if sort_by == SORT_BY_LATEST_VERSION_KEY:
            # Subquery to find the latest version per artifact
            latest_version_subquery = (
                select(
                    ArtifactSchema.id,
                    case(
                        (
                            func.max(ArtifactVersionSchema.created).is_(None),
                            ArtifactSchema.created,
                        ),
                        else_=func.max(ArtifactVersionSchema.created),
                    ).label("latest_version_created"),
                )
                .outerjoin(
                    ArtifactVersionSchema,
                    ArtifactSchema.id == ArtifactVersionSchema.artifact_id,  # type: ignore[arg-type]
                )
                .group_by(col(ArtifactSchema.id))
                .subquery()
            )

            query = query.add_columns(
                latest_version_subquery.c.latest_version_created,
            ).where(ArtifactSchema.id == latest_version_subquery.c.id)

            # Apply sorting based on the operand
            if operand == SorterOps.ASCENDING:
                query = query.order_by(
                    asc(latest_version_subquery.c.latest_version_created),
                    asc(ArtifactSchema.id),
                )
            else:
                query = query.order_by(
                    desc(latest_version_subquery.c.latest_version_created),
                    desc(ArtifactSchema.id),
                )
            return query

        # For other sorting cases, delegate to the parent class
        return super().apply_sorting(query=query, table=table)

apply_sorting(query, table)

Apply sorting to the query for Artifacts.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the sorting.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with sorting applied.

Source code in src/zenml/models/v2/core/artifact.py, lines 211-274
def apply_sorting(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Apply sorting to the query for Artifacts.

    Args:
        query: The query to which to apply the sorting.
        table: The query table.

    Returns:
        The query with sorting applied.
    """
    from sqlmodel import asc, case, col, desc, func, select

    from zenml.enums import SorterOps
    from zenml.zen_stores.schemas import (
        ArtifactSchema,
        ArtifactVersionSchema,
    )

    sort_by, operand = self.sorting_params

    if sort_by == SORT_BY_LATEST_VERSION_KEY:
        # Subquery to find the latest version per artifact
        latest_version_subquery = (
            select(
                ArtifactSchema.id,
                case(
                    (
                        func.max(ArtifactVersionSchema.created).is_(None),
                        ArtifactSchema.created,
                    ),
                    else_=func.max(ArtifactVersionSchema.created),
                ).label("latest_version_created"),
            )
            .outerjoin(
                ArtifactVersionSchema,
                ArtifactSchema.id == ArtifactVersionSchema.artifact_id,  # type: ignore[arg-type]
            )
            .group_by(col(ArtifactSchema.id))
            .subquery()
        )

        query = query.add_columns(
            latest_version_subquery.c.latest_version_created,
        ).where(ArtifactSchema.id == latest_version_subquery.c.id)

        # Apply sorting based on the operand
        if operand == SorterOps.ASCENDING:
            query = query.order_by(
                asc(latest_version_subquery.c.latest_version_created),
                asc(ArtifactSchema.id),
            )
        else:
            query = query.order_by(
                desc(latest_version_subquery.c.latest_version_created),
                desc(ArtifactSchema.id),
            )
        return query

    # For other sorting cases, delegate to the parent class
    return super().apply_sorting(query=query, table=table)
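
As a sketch of how these filter fields usually surface, the client builds an ArtifactFilter from keyword arguments when listing artifacts. The artifact name below is hypothetical, and exact keyword support on the client may vary slightly between ZenML versions.

from zenml.client import Client

client = Client()

# List artifacts whose name matches and that carry a custom (user-chosen) name.
named_artifacts = client.list_artifacts(name="my_dataset", has_custom_name=True)

for artifact in named_artifacts:
    print(artifact.name, artifact.latest_version_name)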

ArtifactRequest

Bases: ProjectScopedRequest

Artifact request model.

Source code in src/zenml/models/v2/core/artifact.py, lines 54-69
class ArtifactRequest(ProjectScopedRequest):
    """Artifact request model."""

    name: str = Field(
        title="Name of the artifact.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    has_custom_name: bool = Field(
        title="Whether the name is custom (True) or auto-generated (False).",
        default=False,
    )
    tags: Optional[List[str]] = Field(
        title="Artifact tags.",
        description="Should be a list of plain strings, e.g., ['tag1', 'tag2']",
        default=None,
    )

ArtifactResponse

Bases: ProjectScopedResponse[ArtifactResponseBody, ArtifactResponseMetadata, ArtifactResponseResources]

Artifact response model.

Source code in src/zenml/models/v2/core/artifact.py, lines 111-183
class ArtifactResponse(
    ProjectScopedResponse[
        ArtifactResponseBody,
        ArtifactResponseMetadata,
        ArtifactResponseResources,
    ]
):
    """Artifact response model."""

    def get_hydrated_version(self) -> "ArtifactResponse":
        """Get the hydrated version of this artifact.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_artifact(self.id)

    name: str = Field(
        title="Name of the output in the parent step.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    # Body and metadata properties
    @property
    def tags(self) -> List[TagResponse]:
        """The `tags` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().tags

    @property
    def latest_version_name(self) -> Optional[str]:
        """The `latest_version_name` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().latest_version_name

    @property
    def latest_version_id(self) -> Optional[UUID]:
        """The `latest_version_id` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().latest_version_id

    @property
    def has_custom_name(self) -> bool:
        """The `has_custom_name` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().has_custom_name

    # Helper methods
    @property
    def versions(self) -> Dict[str, "ArtifactVersionResponse"]:
        """Get a list of all versions of this artifact.

        Returns:
            A list of all versions of this artifact.
        """
        from zenml.client import Client

        responses = Client().list_artifact_versions(artifact=self.name)
        return {str(response.version): response for response in responses}

has_custom_name property

The has_custom_name property.

Returns:

Type Description
bool

the value of the property.

latest_version_id property

The latest_version_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

latest_version_name property

The latest_version_name property.

Returns:

Type Description
Optional[str]

the value of the property.

tags property

The tags property.

Returns:

Type Description
List[TagResponse]

the value of the property.

versions property

Get all versions of this artifact.

Returns:

Type Description
Dict[str, ArtifactVersionResponse]

A dictionary mapping version names to the versions of this artifact.

get_hydrated_version()

Get the hydrated version of this artifact.

Returns:

Type Description
ArtifactResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/artifact.py, lines 120-128
def get_hydrated_version(self) -> "ArtifactResponse":
    """Get the hydrated version of this artifact.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_artifact(self.id)
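
A short sketch of working with an ArtifactResponse: fetch an artifact by name, read the latest-version shortcuts, and iterate the versions mapping. The artifact name is hypothetical; Client.get_artifact is assumed to accept a name, ID, or prefix.

from zenml.client import Client

client = Client()

artifact = client.get_artifact("my_dataset")  # hypothetical artifact name

print(artifact.has_custom_name)      # True for user-named artifacts
print(artifact.latest_version_name)  # e.g. "3", or None if no versions exist yet

# versions maps version names to ArtifactVersionResponse objects.
for version_name, version in artifact.versions.items():
    print(version_name, version.uri)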

ArtifactResponseBody

Bases: ProjectScopedResponseBody

Response body for artifacts.

Source code in src/zenml/models/v2/core/artifact.py, lines 87-88
class ArtifactResponseBody(ProjectScopedResponseBody):
    """Response body for artifacts."""

ArtifactResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for artifacts.

Source code in src/zenml/models/v2/core/artifact.py, lines 91-97
class ArtifactResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for artifacts."""

    has_custom_name: bool = Field(
        title="Whether the name is custom (True) or auto-generated (False).",
        default=False,
    )

ArtifactResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the Artifact Entity.

Source code in src/zenml/models/v2/core/artifact.py, lines 100-108
class ArtifactResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the Artifact Entity."""

    tags: List[TagResponse] = Field(
        title="Tags associated with the artifact.",
    )
    # TODO: maybe move these back to body or figure out a better solution
    latest_version_name: Optional[str] = None
    latest_version_id: Optional[UUID] = None

ArtifactUpdate

Bases: BaseUpdate

Artifact update model.

Source code in src/zenml/models/v2/core/artifact.py, lines 75-81
class ArtifactUpdate(BaseUpdate):
    """Artifact update model."""

    name: Optional[str] = None
    add_tags: Optional[List[str]] = None
    remove_tags: Optional[List[str]] = None
    has_custom_name: Optional[bool] = None
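
Tag changes on ArtifactUpdate are expressed as explicit add/remove lists rather than by overwriting the full tag set. The sketch below only constructs the update model; the name and tags are hypothetical, and submitting the update is assumed to go through the client's artifact update call.

from zenml.models import ArtifactUpdate

update = ArtifactUpdate(
    name="curated_dataset",   # hypothetical new name
    add_tags=["validated"],   # tags to attach
    remove_tags=["draft"],    # tags to detach
)
# The update model is then passed to the client / zen store call that updates
# the artifact (method name assumed, e.g. Client().update_artifact(...)).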

ArtifactVersionFilter

Bases: ProjectScopedFilter, TaggableFilter, RunMetadataFilterMixin

Model to enable advanced filtering of artifact versions.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 475-700
class ArtifactVersionFilter(
    ProjectScopedFilter, TaggableFilter, RunMetadataFilterMixin
):
    """Model to enable advanced filtering of artifact versions."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.FILTER_EXCLUDE_FIELDS,
        *TaggableFilter.FILTER_EXCLUDE_FIELDS,
        *RunMetadataFilterMixin.FILTER_EXCLUDE_FIELDS,
        "artifact_id",
        "artifact",
        "only_unused",
        "has_custom_name",
        "model",
        "pipeline_run",
        "model_version_id",
    ]
    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CUSTOM_SORTING_OPTIONS,
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
        *RunMetadataFilterMixin.CUSTOM_SORTING_OPTIONS,
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CLI_EXCLUDE_FIELDS,
        *TaggableFilter.CLI_EXCLUDE_FIELDS,
        *RunMetadataFilterMixin.CLI_EXCLUDE_FIELDS,
        "artifact_id",
    ]
    API_MULTI_INPUT_PARAMS: ClassVar[List[str]] = [
        *ProjectScopedFilter.API_MULTI_INPUT_PARAMS,
        *TaggableFilter.API_MULTI_INPUT_PARAMS,
        *RunMetadataFilterMixin.API_MULTI_INPUT_PARAMS,
    ]

    artifact: Optional[Union[UUID, str]] = Field(
        default=None,
        description="The name or ID of the artifact which the search is scoped "
        "to. This field must always be set and is always applied in addition "
        "to the other filters, regardless of the value of the "
        "logical_operator field.",
        union_mode="left_to_right",
    )
    artifact_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="[Deprecated] Use 'artifact' instead. ID of the artifact to which this version belongs.",
        union_mode="left_to_right",
    )
    version: Optional[str] = Field(
        default=None,
        description="Version of the artifact",
    )
    version_number: Optional[Union[int, str]] = Field(
        default=None,
        description="Version of the artifact if it is an integer",
        union_mode="left_to_right",
    )
    uri: Optional[str] = Field(
        default=None,
        description="Uri of the artifact",
    )
    materializer: Optional[str] = Field(
        default=None,
        description="Materializer used to produce the artifact",
    )
    type: Optional[str] = Field(
        default=None,
        description="Type of the artifact",
    )
    data_type: Optional[str] = Field(
        default=None,
        description="Datatype of the artifact",
    )
    artifact_store_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Artifact store for this artifact",
        union_mode="left_to_right",
    )
    model_version_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="ID of the model version that is associated with this "
        "artifact version.",
        union_mode="left_to_right",
    )
    only_unused: Optional[bool] = Field(
        default=False, description="Filter only for unused artifacts"
    )
    has_custom_name: Optional[bool] = Field(
        default=None,
        description="Filter only artifacts with/without custom names.",
    )
    model: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the model that is associated with this "
        "artifact version.",
    )
    pipeline_run: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of a pipeline run that is associated with this "
        "artifact version.",
    )

    model_config = ConfigDict(protected_namespaces=())

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List[Union["ColumnElement[bool]"]]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_, or_, select

        from zenml.zen_stores.schemas import (
            ArtifactSchema,
            ArtifactVersionSchema,
            ModelSchema,
            ModelVersionArtifactSchema,
            ModelVersionSchema,
            PipelineRunSchema,
            StepRunInputArtifactSchema,
            StepRunOutputArtifactSchema,
            StepRunSchema,
        )

        if self.artifact:
            value, operator = self._resolve_operator(self.artifact)
            artifact_filter = and_(
                ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.artifact, table=ArtifactSchema
                ),
            )
            custom_filters.append(artifact_filter)

        if self.only_unused:
            unused_filter = and_(
                ArtifactVersionSchema.id.notin_(  # type: ignore[attr-defined]
                    select(StepRunOutputArtifactSchema.artifact_id)
                ),
                ArtifactVersionSchema.id.notin_(  # type: ignore[attr-defined]
                    select(StepRunInputArtifactSchema.artifact_id)
                ),
            )
            custom_filters.append(unused_filter)

        if self.model_version_id:
            value, operator = self._resolve_operator(self.model_version_id)

            model_version_filter = and_(
                ArtifactVersionSchema.id
                == ModelVersionArtifactSchema.artifact_version_id,
                ModelVersionArtifactSchema.model_version_id
                == ModelVersionSchema.id,
                FilterGenerator(ModelVersionSchema)
                .define_filter(column="id", value=value, operator=operator)
                .generate_query_conditions(ModelVersionSchema),
            )
            custom_filters.append(model_version_filter)

        if self.has_custom_name is not None:
            custom_name_filter = and_(
                ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
                ArtifactSchema.has_custom_name == self.has_custom_name,
            )
            custom_filters.append(custom_name_filter)

        if self.model:
            model_filter = and_(
                ArtifactVersionSchema.id
                == ModelVersionArtifactSchema.artifact_version_id,
                ModelVersionArtifactSchema.model_version_id
                == ModelVersionSchema.id,
                ModelVersionSchema.model_id == ModelSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.model, table=ModelSchema
                ),
            )
            custom_filters.append(model_filter)

        if self.pipeline_run:
            pipeline_run_filter = and_(
                or_(
                    and_(
                        ArtifactVersionSchema.id
                        == StepRunOutputArtifactSchema.artifact_id,
                        StepRunOutputArtifactSchema.step_id
                        == StepRunSchema.id,
                    ),
                    and_(
                        ArtifactVersionSchema.id
                        == StepRunInputArtifactSchema.artifact_id,
                        StepRunInputArtifactSchema.step_id == StepRunSchema.id,
                    ),
                ),
                StepRunSchema.pipeline_run_id == PipelineRunSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.pipeline_run, table=PipelineRunSchema
                ),
            )
            custom_filters.append(pipeline_run_filter)

        return custom_filters

    @model_validator(mode="after")
    def _migrate_artifact_id(self) -> "ArtifactVersionFilter":
        """Migrate value from the deprecated artifact_id attribute.

        Returns:
            The filter with migrated value.
        """
        # Handle deprecated artifact_id field
        if self.artifact_id is not None:
            logger.warning(
                "The 'ArtifactVersionFilter.artifact_id' field is deprecated "
                "and will be removed in a future version. Please use "
                "'ArtifactVersionFilter.artifact' instead."
            )
            self.artifact = self.artifact or self.artifact_id

        return self

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[Union[ColumnElement[bool]]]

A list of custom filters.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 578-682
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List[Union["ColumnElement[bool]"]]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_, or_, select

    from zenml.zen_stores.schemas import (
        ArtifactSchema,
        ArtifactVersionSchema,
        ModelSchema,
        ModelVersionArtifactSchema,
        ModelVersionSchema,
        PipelineRunSchema,
        StepRunInputArtifactSchema,
        StepRunOutputArtifactSchema,
        StepRunSchema,
    )

    if self.artifact:
        value, operator = self._resolve_operator(self.artifact)
        artifact_filter = and_(
            ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.artifact, table=ArtifactSchema
            ),
        )
        custom_filters.append(artifact_filter)

    if self.only_unused:
        unused_filter = and_(
            ArtifactVersionSchema.id.notin_(  # type: ignore[attr-defined]
                select(StepRunOutputArtifactSchema.artifact_id)
            ),
            ArtifactVersionSchema.id.notin_(  # type: ignore[attr-defined]
                select(StepRunInputArtifactSchema.artifact_id)
            ),
        )
        custom_filters.append(unused_filter)

    if self.model_version_id:
        value, operator = self._resolve_operator(self.model_version_id)

        model_version_filter = and_(
            ArtifactVersionSchema.id
            == ModelVersionArtifactSchema.artifact_version_id,
            ModelVersionArtifactSchema.model_version_id
            == ModelVersionSchema.id,
            FilterGenerator(ModelVersionSchema)
            .define_filter(column="id", value=value, operator=operator)
            .generate_query_conditions(ModelVersionSchema),
        )
        custom_filters.append(model_version_filter)

    if self.has_custom_name is not None:
        custom_name_filter = and_(
            ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
            ArtifactSchema.has_custom_name == self.has_custom_name,
        )
        custom_filters.append(custom_name_filter)

    if self.model:
        model_filter = and_(
            ArtifactVersionSchema.id
            == ModelVersionArtifactSchema.artifact_version_id,
            ModelVersionArtifactSchema.model_version_id
            == ModelVersionSchema.id,
            ModelVersionSchema.model_id == ModelSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.model, table=ModelSchema
            ),
        )
        custom_filters.append(model_filter)

    if self.pipeline_run:
        pipeline_run_filter = and_(
            or_(
                and_(
                    ArtifactVersionSchema.id
                    == StepRunOutputArtifactSchema.artifact_id,
                    StepRunOutputArtifactSchema.step_id
                    == StepRunSchema.id,
                ),
                and_(
                    ArtifactVersionSchema.id
                    == StepRunInputArtifactSchema.artifact_id,
                    StepRunInputArtifactSchema.step_id == StepRunSchema.id,
                ),
            ),
            StepRunSchema.pipeline_run_id == PipelineRunSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.pipeline_run, table=PipelineRunSchema
            ),
        )
        custom_filters.append(pipeline_run_filter)

    return custom_filters
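
A sketch of the documented filter fields in use through the client. All names below are hypothetical, and the exact keyword support on Client.list_artifact_versions may vary between ZenML versions.

from zenml.client import Client

client = Client()

# Versions of one artifact that are not used as step inputs or outputs.
unused_versions = client.list_artifact_versions(
    artifact="my_dataset",
    only_unused=True,
)

# Versions linked to a given model and touched by a given pipeline run.
linked_versions = client.list_artifact_versions(
    model="churn_model",
    pipeline_run="training_run",
)

for av in unused_versions:
    print(av.name, av.version, av.uri)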

ArtifactVersionRequest

Bases: ProjectScopedRequest

Request model for artifact versions.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 75-163
class ArtifactVersionRequest(ProjectScopedRequest):
    """Request model for artifact versions."""

    artifact_id: Optional[UUID] = Field(
        default=None,
        title="ID of the artifact to which this version belongs.",
    )
    artifact_name: Optional[str] = Field(
        default=None,
        title="Name of the artifact to which this version belongs.",
    )
    version: Optional[Union[int, str]] = Field(
        default=None, title="Version of the artifact."
    )
    has_custom_name: bool = Field(
        title="Whether the name is custom (True) or auto-generated (False).",
        default=False,
    )
    type: ArtifactType = Field(title="Type of the artifact.")
    artifact_store_id: Optional[UUID] = Field(
        title="ID of the artifact store in which this artifact is stored.",
        default=None,
    )
    uri: str = Field(
        title="URI of the artifact.", max_length=TEXT_FIELD_MAX_LENGTH
    )
    materializer: SourceWithValidator = Field(
        title="Materializer class to use for this artifact.",
    )
    data_type: SourceWithValidator = Field(
        title="Data type of the artifact.",
    )
    tags: Optional[List[str]] = Field(
        title="Tags of the artifact.",
        description="Should be a list of plain strings, e.g., ['tag1', 'tag2']",
        default=None,
    )
    visualizations: Optional[List["ArtifactVisualizationRequest"]] = Field(
        default=None, title="Visualizations of the artifact."
    )
    save_type: ArtifactSaveType = Field(
        title="The save type of the artifact version.",
    )
    metadata: Optional[Dict[str, MetadataType]] = Field(
        default=None, title="Metadata of the artifact version."
    )

    @field_validator("version")
    @classmethod
    def str_field_max_length_check(cls, value: Any) -> Any:
        """Checks if the length of the value exceeds the maximum str length.

        Args:
            value: the value set in the field

        Returns:
            the value itself.

        Raises:
            AssertionError: if the length of the field is longer than the
                maximum threshold.
        """
        assert len(str(value)) < STR_FIELD_MAX_LENGTH, (
            "The length of the value for this field can not "
            f"exceed {STR_FIELD_MAX_LENGTH}"
        )
        return value

    @model_validator(mode="after")
    def _validate_request(self) -> "ArtifactVersionRequest":
        """Validate the request values.

        Raises:
            ValueError: If the request is invalid.

        Returns:
            The validated request.
        """
        if self.artifact_id and self.artifact_name:
            raise ValueError(
                "Only one of artifact_name and artifact_id can be set."
            )

        if not (self.artifact_id or self.artifact_name):
            raise ValueError(
                "Either artifact_name or artifact_id must be set."
            )

        return self

str_field_max_length_check(value) classmethod

Checks if the length of the value exceeds the maximum str length.

Parameters:

Name Type Description Default
value Any

the value set in the field

required

Returns:

Type Description
Any

the value itself.

Raises:

Type Description
AssertionError

if the length of the field is longer than the maximum threshold.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 122-141
@field_validator("version")
@classmethod
def str_field_max_length_check(cls, value: Any) -> Any:
    """Checks if the length of the value exceeds the maximum str length.

    Args:
        value: the value set in the field

    Returns:
        the value itself.

    Raises:
        AssertionError: if the length of the field is longer than the
            maximum threshold.
    """
    assert len(str(value)) < STR_FIELD_MAX_LENGTH, (
        "The length of the value for this field can not "
        f"exceed {STR_FIELD_MAX_LENGTH}"
    )
    return value

ArtifactVersionResponse

Bases: ProjectScopedResponse[ArtifactVersionResponseBody, ArtifactVersionResponseMetadata, ArtifactVersionResponseResources]

Response model for artifact versions.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 254-469
class ArtifactVersionResponse(
    ProjectScopedResponse[
        ArtifactVersionResponseBody,
        ArtifactVersionResponseMetadata,
        ArtifactVersionResponseResources,
    ]
):
    """Response model for artifact versions."""

    def get_hydrated_version(self) -> "ArtifactVersionResponse":
        """Get the hydrated version of this artifact version.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_artifact_version(self.id)

    # Body and metadata properties
    @property
    def artifact(self) -> "ArtifactResponse":
        """The `artifact` property.

        Returns:
            the value of the property.
        """
        return self.get_body().artifact

    @property
    def version(self) -> str:
        """The `version` property.

        Returns:
            the value of the property.
        """
        return self.get_body().version

    @property
    def uri(self) -> str:
        """The `uri` property.

        Returns:
            the value of the property.
        """
        return self.get_body().uri

    @property
    def type(self) -> ArtifactType:
        """The `type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().type

    @property
    def tags(self) -> List[TagResponse]:
        """The `tags` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().tags

    @property
    def producer_pipeline_run_id(self) -> Optional[UUID]:
        """The `producer_pipeline_run_id` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().producer_pipeline_run_id

    @property
    def save_type(self) -> ArtifactSaveType:
        """The `save_type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().save_type

    @property
    def artifact_store_id(self) -> Optional[UUID]:
        """The `artifact_store_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().artifact_store_id

    @property
    def producer_step_run_id(self) -> Optional[UUID]:
        """The `producer_step_run_id` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().producer_step_run_id

    @property
    def visualizations(
        self,
    ) -> Optional[List["ArtifactVisualizationResponse"]]:
        """The `visualizations` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().visualizations

    @property
    def run_metadata(self) -> Dict[str, MetadataType]:
        """The `metadata` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().run_metadata

    @property
    def materializer(self) -> Source:
        """The `materializer` property.

        Returns:
            the value of the property.
        """
        return self.get_body().materializer

    @property
    def data_type(self) -> Source:
        """The `data_type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().data_type

    # Helper methods
    @property
    def name(self) -> str:
        """The `name` property.

        Returns:
            the value of the property.
        """
        return self.artifact.name

    @property
    def step(self) -> "StepRunResponse":
        """Get the step that produced this artifact.

        Returns:
            The step that produced this artifact.
        """
        from zenml.artifacts.utils import get_producer_step_of_artifact

        return get_producer_step_of_artifact(self)

    @property
    def run(self) -> "PipelineRunResponse":
        """Get the pipeline run that produced this artifact.

        Returns:
            The pipeline run that produced this artifact.
        """
        from zenml.client import Client

        return Client().get_pipeline_run(self.step.pipeline_run_id)

    def load(self) -> Any:
        """Materializes (loads) the data stored in this artifact.

        Returns:
            The materialized data.
        """
        from zenml.artifacts.utils import load_artifact_from_response

        return load_artifact_from_response(self)

    def download_files(self, path: str, overwrite: bool = False) -> None:
        """Downloads data for an artifact with no materializing.

        Any artifacts will be saved as a zip file to the given path.

        Args:
            path: The path to save the binary data to.
            overwrite: Whether to overwrite the file if it already exists.

        Raises:
            ValueError: If the path does not end with '.zip'.
        """
        if not path.endswith(".zip"):
            raise ValueError(
                "The path should end with '.zip' to save the binary data."
            )
        from zenml.artifacts.utils import (
            download_artifact_files_from_response,
        )

        download_artifact_files_from_response(
            self,
            path=path,
            overwrite=overwrite,
        )

    def visualize(self, title: Optional[str] = None) -> None:
        """Visualize the artifact in notebook environments.

        Args:
            title: Optional title to show before the visualizations.
        """
        from zenml.utils.visualization_utils import visualize_artifact

        visualize_artifact(self, title=title)

artifact property

The artifact property.

Returns:

Type Description
ArtifactResponse

the value of the property.

artifact_store_id property

The artifact_store_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

data_type property

The data_type property.

Returns:

Type Description
Source

the value of the property.

materializer property

The materializer property.

Returns:

Type Description
Source

the value of the property.

name property

The name property.

Returns:

Type Description
str

the value of the property.

producer_pipeline_run_id property

The producer_pipeline_run_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

producer_step_run_id property

The producer_step_run_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

run property

Get the pipeline run that produced this artifact.

Returns:

Type Description
PipelineRunResponse

The pipeline run that produced this artifact.

run_metadata property

The metadata property.

Returns:

Type Description
Dict[str, MetadataType]

the value of the property.

save_type property

The save_type property.

Returns:

Type Description
ArtifactSaveType

the value of the property.

step property

Get the step that produced this artifact.

Returns:

Type Description
StepRunResponse

The step that produced this artifact.

tags property

The tags property.

Returns:

Type Description
List[TagResponse]

the value of the property.

type property

The type property.

Returns:

Type Description
ArtifactType

the value of the property.

uri property

The uri property.

Returns:

Type Description
str

the value of the property.

version property

The version property.

Returns:

Type Description
str

the value of the property.

visualizations property

The visualizations property.

Returns:

Type Description
Optional[List[ArtifactVisualizationResponse]]

the value of the property.

download_files(path, overwrite=False)

Downloads data for an artifact with no materializing.

Any artifacts will be saved as a zip file to the given path.

Parameters:

Name Type Description Default
path str

The path to save the binary data to.

required
overwrite bool

Whether to overwrite the file if it already exists.

False

Raises:

Type Description
ValueError

If the path does not end with '.zip'.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 435-459
def download_files(self, path: str, overwrite: bool = False) -> None:
    """Downloads data for an artifact with no materializing.

    Any artifacts will be saved as a zip file to the given path.

    Args:
        path: The path to save the binary data to.
        overwrite: Whether to overwrite the file if it already exists.

    Raises:
        ValueError: If the path does not end with '.zip'.
    """
    if not path.endswith(".zip"):
        raise ValueError(
            "The path should end with '.zip' to save the binary data."
        )
    from zenml.artifacts.utils import (
        download_artifact_files_from_response,
    )

    download_artifact_files_from_response(
        self,
        path=path,
        overwrite=overwrite,
    )

get_hydrated_version()

Get the hydrated version of this artifact version.

Returns:

Type Description
ArtifactVersionResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 263-271
def get_hydrated_version(self) -> "ArtifactVersionResponse":
    """Get the hydrated version of this artifact version.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_artifact_version(self.id)

load()

Materializes (loads) the data stored in this artifact.

Returns:

Type Description
Any

The materialized data.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 425-433
def load(self) -> Any:
    """Materializes (loads) the data stored in this artifact.

    Returns:
        The materialized data.
    """
    from zenml.artifacts.utils import load_artifact_from_response

    return load_artifact_from_response(self)

visualize(title=None)

Visualize the artifact in notebook environments.

Parameters:

Name Type Description Default
title Optional[str]

Optional title to show before the visualizations.

None
Source code in src/zenml/models/v2/core/artifact_version.py, lines 461-469
def visualize(self, title: Optional[str] = None) -> None:
    """Visualize the artifact in notebook environments.

    Args:
        title: Optional title to show before the visualizations.
    """
    from zenml.utils.visualization_utils import visualize_artifact

    visualize_artifact(self, title=title)
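
The helper methods above are usually combined as in the following sketch: fetch a version, materialize it, download the raw files, and render its visualizations. The artifact name and version are hypothetical.

from zenml.client import Client

client = Client()

artifact_version = client.get_artifact_version("my_dataset", version="3")

# Materialize the stored data using the registered materializer.
data = artifact_version.load()

# Or download the raw files without materializing; the target path must end in ".zip".
artifact_version.download_files("my_dataset_v3.zip", overwrite=True)

# In a notebook environment, render the attached visualizations (if any).
artifact_version.visualize(title="My dataset, version 3")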

ArtifactVersionResponseBody

Bases: ProjectScopedResponseBody

Response body for artifact versions.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 180-224
class ArtifactVersionResponseBody(ProjectScopedResponseBody):
    """Response body for artifact versions."""

    artifact: ArtifactResponse = Field(
        title="Artifact to which this version belongs."
    )
    version: str = Field(title="Version of the artifact.")
    uri: str = Field(
        title="URI of the artifact.", max_length=TEXT_FIELD_MAX_LENGTH
    )
    type: ArtifactType = Field(title="Type of the artifact.")
    materializer: SourceWithValidator = Field(
        title="Materializer class to use for this artifact.",
    )
    data_type: SourceWithValidator = Field(
        title="Data type of the artifact.",
    )
    save_type: ArtifactSaveType = Field(
        title="The save type of the artifact version.",
    )
    artifact_store_id: Optional[UUID] = Field(
        title="ID of the artifact store in which this artifact is stored.",
        default=None,
    )

    @field_validator("version")
    @classmethod
    def str_field_max_length_check(cls, value: Any) -> Any:
        """Checks if the length of the value exceeds the maximum str length.

        Args:
            value: the value set in the field

        Returns:
            the value itself.

        Raises:
            AssertionError: if the length of the field is longer than the
                maximum threshold.
        """
        assert len(str(value)) < STR_FIELD_MAX_LENGTH, (
            "The length of the value for this field can not "
            f"exceed {STR_FIELD_MAX_LENGTH}"
        )
        return value

str_field_max_length_check(value) classmethod

Checks if the length of the value exceeds the maximum str length.

Parameters:

Name Type Description Default
value Any

the value set in the field

required

Returns:

Type Description
Any

the value itself.

Raises:

Type Description
AssertionError

if the length of the field is longer than the maximum threshold.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 205-224
@field_validator("version")
@classmethod
def str_field_max_length_check(cls, value: Any) -> Any:
    """Checks if the length of the value exceeds the maximum str length.

    Args:
        value: the value set in the field

    Returns:
        the value itself.

    Raises:
        AssertionError: if the length of the field is longer than the
            maximum threshold.
    """
    assert len(str(value)) < STR_FIELD_MAX_LENGTH, (
        "The length of the value for this field can not "
        f"exceed {STR_FIELD_MAX_LENGTH}"
    )
    return value

ArtifactVersionResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for artifact versions.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 227-235
class ArtifactVersionResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for artifact versions."""

    visualizations: Optional[List["ArtifactVisualizationResponse"]] = Field(
        default=None, title="Visualizations of the artifact."
    )
    run_metadata: Dict[str, MetadataType] = Field(
        default={}, title="Metadata of the artifact."
    )

ArtifactVersionResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the artifact version entity.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 238-251
class ArtifactVersionResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the artifact version entity."""

    tags: List[TagResponse] = Field(
        title="Tags associated with the artifact version.",
    )
    producer_step_run_id: Optional[UUID] = Field(
        title="ID of the step run that produced this artifact.",
        default=None,
    )
    producer_pipeline_run_id: Optional[UUID] = Field(
        title="The ID of the pipeline run that generated this artifact version.",
        default=None,
    )

ArtifactVersionUpdate

Bases: BaseUpdate

Artifact version update model.

Source code in src/zenml/models/v2/core/artifact_version.py, lines 169-174
class ArtifactVersionUpdate(BaseUpdate):
    """Artifact version update model."""

    name: Optional[str] = None
    add_tags: Optional[List[str]] = None
    remove_tags: Optional[List[str]] = None
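
Only the fields you set in the update are applied. A small sketch of building such a payload (the import path may differ between ZenML versions, and how you submit the update depends on your setup):

```python
from zenml.models import ArtifactVersionUpdate  # import path may differ by version

# Rename the artifact version and adjust its tags in a single update payload.
update = ArtifactVersionUpdate(
    name="cleaned_dataset",
    add_tags=["validated", "v2"],
    remove_tags=["draft"],
)

# Unset fields stay untouched server-side; only what you pass here changes.
print(update.model_dump(exclude_none=True))
```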

ArtifactVisualizationRequest

Bases: BaseRequest

Request model for artifact visualization.

Source code in src/zenml/models/v2/core/artifact_visualization.py, lines 30-34
class ArtifactVisualizationRequest(BaseRequest):
    """Request model for artifact visualization."""

    type: VisualizationType
    uri: str

ArtifactVisualizationResponse

Bases: BaseIdentifiedResponse[ArtifactVisualizationResponseBody, ArtifactVisualizationResponseMetadata, ArtifactVisualizationResponseResources]

Response model for artifact visualizations.

Source code in src/zenml/models/v2/core/artifact_visualization.py, lines 61-106
class ArtifactVisualizationResponse(
    BaseIdentifiedResponse[
        ArtifactVisualizationResponseBody,
        ArtifactVisualizationResponseMetadata,
        ArtifactVisualizationResponseResources,
    ]
):
    """Response model for artifact visualizations."""

    def get_hydrated_version(self) -> "ArtifactVisualizationResponse":
        """Get the hydrated version of this artifact visualization.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_artifact_visualization(self.id)

    # Body and metadata properties
    @property
    def type(self) -> VisualizationType:
        """The `type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().type

    @property
    def uri(self) -> str:
        """The `uri` property.

        Returns:
            the value of the property.
        """
        return self.get_body().uri

    @property
    def artifact_version_id(self) -> UUID:
        """The `artifact_version_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().artifact_version_id

artifact_version_id property

The artifact_version_id property.

Returns:

Type Description
UUID

the value of the property.

type property

The type property.

Returns:

Type Description
VisualizationType

the value of the property.

uri property

The uri property.

Returns:

Type Description
str

the value of the property.

get_hydrated_version()

Get the hydrated version of this artifact visualization.

Returns:

Type Description
ArtifactVisualizationResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/artifact_visualization.py, lines 70-78
def get_hydrated_version(self) -> "ArtifactVisualizationResponse":
    """Get the hydrated version of this artifact visualization.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_artifact_visualization(self.id)
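
`type` and `uri` are served from the response body, while `artifact_version_id` lives in the metadata; reading a metadata-backed property on a non-hydrated response typically triggers `get_hydrated_version()` and therefore a server call. A hedged sketch (the helper function below is illustrative):

```python
from zenml.models import ArtifactVisualizationResponse  # import path may differ by version


def describe_visualization(viz: ArtifactVisualizationResponse) -> str:
    """Summarize a visualization without assuming it is already hydrated."""
    # `type` and `uri` live in the body and are available on any response.
    summary = f"{viz.type.value} visualization at {viz.uri}"
    # `artifact_version_id` lives in the metadata; on a non-hydrated response
    # reading it typically goes through get_hydrated_version() under the hood.
    return f"{summary} (artifact version {viz.artifact_version_id})"
```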

ArtifactVisualizationResponseBody

Bases: BaseDatedResponseBody

Response body for artifact visualizations.

Source code in src/zenml/models/v2/core/artifact_visualization.py, lines 44-48
class ArtifactVisualizationResponseBody(BaseDatedResponseBody):
    """Response body for artifact visualizations."""

    type: VisualizationType
    uri: str

ArtifactVisualizationResponseMetadata

Bases: BaseResponseMetadata

Response metadata model for artifact visualizations.

Source code in src/zenml/models/v2/core/artifact_visualization.py, lines 51-54
class ArtifactVisualizationResponseMetadata(BaseResponseMetadata):
    """Response metadata model for artifact visualizations."""

    artifact_version_id: UUID

AuthenticationMethodModel

Bases: BaseModel

Authentication method specification.

Describes the schema for the configuration and secrets that need to be provided to configure an authentication method.

Source code in src/zenml/models/v2/misc/service_connector_type.py, lines 89-216
class AuthenticationMethodModel(BaseModel):
    """Authentication method specification.

    Describes the schema for the configuration and secrets that need to be
    provided to configure an authentication method.
    """

    name: str = Field(
        title="User readable name for the authentication method.",
    )
    auth_method: str = Field(
        title="The name of the authentication method.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    description: str = Field(
        default="",
        title="A description of the authentication method.",
    )
    config_schema: Dict[str, Any] = Field(
        default_factory=dict,
        title="The JSON schema of the configuration for this authentication "
        "method.",
    )
    min_expiration_seconds: Optional[int] = Field(
        default=None,
        title="The minimum number of seconds that the authentication "
        "session can be configured to be valid for. Set to None for "
        "authentication sessions and long-lived credentials that don't expire.",
    )
    max_expiration_seconds: Optional[int] = Field(
        default=None,
        title="The maximum number of seconds that the authentication "
        "session can be configured to be valid for. Set to None for "
        "authentication sessions and long-lived credentials that don't expire.",
    )
    default_expiration_seconds: Optional[int] = Field(
        default=None,
        title="The default number of seconds that the authentication "
        "session is valid for. Set to None for authentication sessions and "
        "long-lived credentials that don't expire.",
    )
    _config_class: Optional[Type[BaseModel]] = None

    def __init__(
        self, config_class: Optional[Type[BaseModel]] = None, **values: Any
    ):
        """Initialize the authentication method.

        Args:
            config_class: The configuration class for the authentication
                method.
            **values: The data to initialize the authentication method with.
        """
        if config_class:
            values["config_schema"] = config_class.model_json_schema()

        super().__init__(**values)
        self._config_class = config_class

    @property
    def config_class(self) -> Optional[Type[BaseModel]]:
        """Get the configuration class for the authentication method.

        Returns:
            The configuration class for the authentication method.
        """
        return self._config_class

    def supports_temporary_credentials(self) -> bool:
        """Check if the authentication method supports temporary credentials.

        Returns:
            True if the authentication method supports temporary credentials,
            False otherwise.
        """
        return (
            self.min_expiration_seconds is not None
            or self.max_expiration_seconds is not None
            or self.default_expiration_seconds is not None
        )

    def validate_expiration(
        self, expiration_seconds: Optional[int]
    ) -> Optional[int]:
        """Validate the expiration time.

        Args:
            expiration_seconds: The expiration time in seconds. If None, the
                default expiration time is used, if applicable.

        Returns:
            The expiration time in seconds or None if not applicable.

        Raises:
            ValueError: If the expiration time is not valid.
        """
        if not self.supports_temporary_credentials():
            if expiration_seconds is not None:
                # Expiration is not supported
                raise ValueError(
                    "Expiration time is not supported for this authentication "
                    f"method but a value was provided: {expiration_seconds}"
                )

            return None

        expiration_seconds = (
            expiration_seconds or self.default_expiration_seconds
        )

        if expiration_seconds is None:
            return None

        if self.min_expiration_seconds is not None:
            if expiration_seconds < self.min_expiration_seconds:
                raise ValueError(
                    f"Expiration time must be at least "
                    f"{self.min_expiration_seconds} seconds."
                )

        if self.max_expiration_seconds is not None:
            if expiration_seconds > self.max_expiration_seconds:
                raise ValueError(
                    f"Expiration time must be at most "
                    f"{self.max_expiration_seconds} seconds."
                )

        return expiration_seconds

config_class property

Get the configuration class for the authentication method.

Returns:

Type Description
Optional[Type[BaseModel]]

The configuration class for the authentication method.

__init__(config_class=None, **values)

Initialize the authentication method.

Parameters:

Name Type Description Default
config_class Optional[Type[BaseModel]]

The configuration class for the authentication method.

None
**values Any

The data to initialize the authentication method with.

{}
Source code in src/zenml/models/v2/misc/service_connector_type.py, lines 132-146
def __init__(
    self, config_class: Optional[Type[BaseModel]] = None, **values: Any
):
    """Initialize the authentication method.

    Args:
        config_class: The configuration class for the authentication
            method.
        **values: The data to initialize the authentication method with.
    """
    if config_class:
        values["config_schema"] = config_class.model_json_schema()

    super().__init__(**values)
    self._config_class = config_class
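
Passing a `config_class` is what fills `config_schema` automatically via `model_json_schema()`, and the class stays reachable through the `config_class` property. A short sketch with an illustrative configuration class (not an actual ZenML connector config):

```python
from pydantic import BaseModel

from zenml.models.v2.misc.service_connector_type import AuthenticationMethodModel


class TokenAuthConfig(BaseModel):
    """Illustrative config: a single API token."""

    token: str


auth_method = AuthenticationMethodModel(
    config_class=TokenAuthConfig,
    name="API token",
    auth_method="token",
)

# The JSON schema was derived from TokenAuthConfig automatically.
print(auth_method.config_schema["properties"].keys())  # dict_keys(['token'])
print(auth_method.config_class is TokenAuthConfig)     # True
```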

supports_temporary_credentials()

Check if the authentication method supports temporary credentials.

Returns:

Type Description
bool

True if the authentication method supports temporary credentials, False otherwise.

Source code in src/zenml/models/v2/misc/service_connector_type.py, lines 157-168
def supports_temporary_credentials(self) -> bool:
    """Check if the authentication method supports temporary credentials.

    Returns:
        True if the authentication method supports temporary credentials,
        False otherwise.
    """
    return (
        self.min_expiration_seconds is not None
        or self.max_expiration_seconds is not None
        or self.default_expiration_seconds is not None
    )

validate_expiration(expiration_seconds)

Validate the expiration time.

Parameters:

Name Type Description Default
expiration_seconds Optional[int]

The expiration time in seconds. If None, the default expiration time is used, if applicable.

required

Returns:

Type Description
Optional[int]

The expiration time in seconds or None if not applicable.

Raises:

Type Description
ValueError

If the expiration time is not valid.

Source code in src/zenml/models/v2/misc/service_connector_type.py, lines 170-216
def validate_expiration(
    self, expiration_seconds: Optional[int]
) -> Optional[int]:
    """Validate the expiration time.

    Args:
        expiration_seconds: The expiration time in seconds. If None, the
            default expiration time is used, if applicable.

    Returns:
        The expiration time in seconds or None if not applicable.

    Raises:
        ValueError: If the expiration time is not valid.
    """
    if not self.supports_temporary_credentials():
        if expiration_seconds is not None:
            # Expiration is not supported
            raise ValueError(
                "Expiration time is not supported for this authentication "
                f"method but a value was provided: {expiration_seconds}"
            )

        return None

    expiration_seconds = (
        expiration_seconds or self.default_expiration_seconds
    )

    if expiration_seconds is None:
        return None

    if self.min_expiration_seconds is not None:
        if expiration_seconds < self.min_expiration_seconds:
            raise ValueError(
                f"Expiration time must be at least "
                f"{self.min_expiration_seconds} seconds."
            )

    if self.max_expiration_seconds is not None:
        if expiration_seconds > self.max_expiration_seconds:
            raise ValueError(
                f"Expiration time must be at most "
                f"{self.max_expiration_seconds} seconds."
            )

    return expiration_seconds
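
Putting the expiration logic together: if none of the three expiration fields is set, any explicit expiration is rejected; otherwise a missing value falls back to the default and the result is range-checked. A sketch with illustrative numbers:

```python
from zenml.models.v2.misc.service_connector_type import AuthenticationMethodModel

auth_method = AuthenticationMethodModel(
    name="Session token",
    auth_method="session-token",
    min_expiration_seconds=60,
    max_expiration_seconds=43200,
    default_expiration_seconds=3600,
)

print(auth_method.supports_temporary_credentials())  # True
print(auth_method.validate_expiration(None))         # 3600, falls back to the default
print(auth_method.validate_expiration(7200))         # 7200, within the allowed range

try:
    auth_method.validate_expiration(10)              # below min_expiration_seconds
except ValueError as e:
    print(e)  # Expiration time must be at least 60 seconds.
```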

BaseDatedResponseBody

Bases: BaseResponseBody

Base body model for entities that track a creation and update timestamp.

Used as a base class for all body models associated with responses. Features a creation and update timestamp.

Source code in src/zenml/models/v2/base/base.py, lines 331-343
class BaseDatedResponseBody(BaseResponseBody):
    """Base body model for entities that track a creation and update timestamp.

    Used as a base class for all body models associated with responses.
    Features a creation and update timestamp.
    """

    created: datetime = Field(
        title="The timestamp when this resource was created."
    )
    updated: datetime = Field(
        title="The timestamp when this resource was last updated."
    )

BaseFilter

Bases: BaseModel

Class to unify all filter, paginate and sort request parameters.

This Model allows fine-grained filtering, sorting and pagination of resources.

Usage example for subclasses of this class:

ResourceListModel(
    name="contains:default",
    project="default",
    count_steps="gte:5",
    sort_by="created",
    page=2,
    size=20,
)
Source code in src/zenml/models/v2/base/filter.py, lines 527-991
class BaseFilter(BaseModel):
    """Class to unify all filter, paginate and sort request parameters.

    This Model allows fine-grained filtering, sorting and pagination of
    resources.

    Usage example for subclasses of this class:
    ```
    ResourceListModel(
        name="contains:default",
        project="default"
        count_steps="gte:5"
        sort_by="created",
        page=2,
        size=20
    )
    ```
    """

    # List of fields that cannot be used as filters.
    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        "sort_by",
        "page",
        "size",
        "logical_operator",
    ]
    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = []

    # List of fields that are not even mentioned as options in the CLI.
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = []

    # List of fields that are wrapped with `fastapi.Query(default)` in API.
    API_MULTI_INPUT_PARAMS: ClassVar[List[str]] = []

    sort_by: str = Field(
        default="created", description="Which column to sort by."
    )
    logical_operator: LogicalOperators = Field(
        default=LogicalOperators.AND,
        description="Which logical operator to use between all filters "
        "['and', 'or']",
    )
    page: int = Field(
        default=PAGINATION_STARTING_PAGE, ge=1, description="Page number"
    )
    size: int = Field(
        default=PAGE_SIZE_DEFAULT,
        ge=1,
        le=PAGE_SIZE_MAXIMUM,
        description="Page size",
    )
    id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Id for this resource",
        union_mode="left_to_right",
    )
    created: Optional[Union[datetime, str]] = Field(
        default=None, description="Created", union_mode="left_to_right"
    )
    updated: Optional[Union[datetime, str]] = Field(
        default=None, description="Updated", union_mode="left_to_right"
    )

    _rbac_configuration: Optional[
        Tuple[UUID, Dict[str, Optional[Set[UUID]]]]
    ] = None

    @field_validator("sort_by", mode="before")
    @classmethod
    def validate_sort_by(cls, value: Any) -> Any:
        """Validate that the sort_column is a valid column with a valid operand.

        Args:
            value: The sort_by field value.

        Returns:
            The validated sort_by field value.

        Raises:
            ValidationError: If the sort_by field is not a string.
            ValueError: If the resource can't be sorted by this field.
        """
        # Somehow pydantic allows you to pass in int values, which will be
        #  interpreted as string, however within the validator they are still
        #  integers, which don't have a .split() method
        if not isinstance(value, str):
            raise ValidationError(
                f"str type expected for the sort_by field. "
                f"Received a {type(value)}"
            )
        column = value
        split_value = value.split(":", 1)
        if len(split_value) == 2:
            column = split_value[1]

            if split_value[0] not in SorterOps.values():
                logger.warning(
                    "Invalid operand used for column sorting. "
                    "Only the following operands are supported `%s`. "
                    "Defaulting to 'asc' on column `%s`.",
                    SorterOps.values(),
                    column,
                )
                value = column

        if column in cls.CUSTOM_SORTING_OPTIONS:
            return value
        elif column in cls.FILTER_EXCLUDE_FIELDS:
            raise ValueError(
                f"This resource can not be sorted by this field: '{value}'"
            )
        if column in cls.model_fields:
            return value
        else:
            raise ValueError(
                "You can only sort by valid fields of this resource"
            )

    @model_validator(mode="before")
    @classmethod
    @before_validator_handler
    def filter_ops(cls, data: Dict[str, Any]) -> Dict[str, Any]:
        """Parse incoming filters to ensure all filters are legal.

        Args:
            data: The values of the class.

        Returns:
            The values of the class.
        """
        cls._generate_filter_list(data)
        return data

    @property
    def list_of_filters(self) -> List[Filter]:
        """Converts the class variables into a list of usable Filter Models.

        Returns:
            A list of Filter models.
        """
        return self._generate_filter_list(
            {key: getattr(self, key) for key in type(self).model_fields}
        )

    @property
    def sorting_params(self) -> Tuple[str, SorterOps]:
        """Converts the class variables into a list of usable Filter Models.

        Returns:
            A tuple of the column to sort by and the sorting operand.
        """
        column = self.sort_by
        # The default sorting operand is asc
        operator = SorterOps.ASCENDING

        # Check if user explicitly set an operand
        split_value = self.sort_by.split(":", 1)
        if len(split_value) == 2:
            column = split_value[1]
            operator = SorterOps(split_value[0])

        return column, operator

    def configure_rbac(
        self,
        authenticated_user_id: UUID,
        **column_allowed_ids: Optional[Set[UUID]],
    ) -> None:
        """Configure RBAC allowed column values.

        Args:
            authenticated_user_id: ID of the authenticated user. All entities
                owned by this user will be included.
            column_allowed_ids: Set of IDs per column to limit the query to.
                If given, the remaining filters will be applied to entities
                within this set only. If `None`, the remaining filters will
                be applied to all entries in the table.
        """
        self._rbac_configuration = (authenticated_user_id, column_allowed_ids)

    def generate_rbac_filter(
        self,
        table: Type["AnySchema"],
    ) -> Optional["ColumnElement[bool]"]:
        """Generates an optional RBAC filter.

        Args:
            table: The query table.

        Returns:
            The RBAC filter.
        """
        from sqlmodel import or_

        if not self._rbac_configuration:
            return None

        expressions = []

        for column_name, allowed_ids in self._rbac_configuration[1].items():
            if allowed_ids is not None:
                expression = getattr(table, column_name).in_(allowed_ids)
                expressions.append(expression)

        if expressions and hasattr(table, "user_id"):
            # If `expressions` is not empty, we do not have full access to all
            # rows of the table. In this case, we also include rows which the
            # user owns.

            # Unowned entities are considered server-owned and can be seen
            # by anyone
            expressions.append(getattr(table, "user_id").is_(None))
            # The authenticated user owns this entity
            expressions.append(
                getattr(table, "user_id") == self._rbac_configuration[0]
            )

        if expressions:
            return or_(*expressions)
        else:
            return None

    @classmethod
    def _generate_filter_list(cls, values: Dict[str, Any]) -> List[Filter]:
        """Create a list of filters from a (column, value) dictionary.

        Args:
            values: A dictionary of column names and values to filter on.

        Returns:
            A list of filters.
        """
        list_of_filters: List[Filter] = []

        for key, value in values.items():
            # Ignore excluded filters
            if key in cls.FILTER_EXCLUDE_FIELDS:
                continue

            # Skip filtering for None values
            if value is None:
                continue

            # Determine the operator and filter value
            value, operator = cls._resolve_operator(value)

            # Define the filter
            filter = FilterGenerator(cls).define_filter(
                column=key, value=value, operator=operator
            )
            list_of_filters.append(filter)

        return list_of_filters

    @staticmethod
    def _resolve_operator(value: Any) -> Tuple[Any, GenericFilterOps]:
        """Determine the operator and filter value from a user-provided value.

        If the user-provided value is a string of the form "operator:value",
        then the operator is extracted and the value is returned. Otherwise,
        `GenericFilterOps.EQUALS` is used as default operator and the value
        is returned as-is.

        Args:
            value: The user-provided value.

        Returns:
            A tuple of the filter value and the operator.

        Raises:
            ValueError: when we try to use the `oneof` operator with the wrong
                value.
        """
        operator = GenericFilterOps.EQUALS  # Default operator
        if isinstance(value, str):
            split_value = value.split(":", 1)
            if (
                len(split_value) == 2
                and split_value[0] in GenericFilterOps.values()
            ):
                value = split_value[1]
                operator = GenericFilterOps(split_value[0])

            if operator == operator.ONEOF:
                try:
                    value = json.loads(value)
                    if not isinstance(value, list):
                        raise ValueError
                except ValueError:
                    raise ValueError(ONEOF_ERROR)

        return value, operator

    def generate_name_or_id_query_conditions(
        self,
        value: Union[UUID, str],
        table: Type["NamedSchema"],
        additional_columns: Optional[List[str]] = None,
    ) -> "ColumnElement[bool]":
        """Generate filter conditions for name or id of a table.

        Args:
            value: The filter value.
            table: The table to filter.
            additional_columns: Additional table columns that should also
                filter for the given value as part of the or condition.

        Returns:
            The query conditions.
        """
        from sqlmodel import or_

        value, operator = BaseFilter._resolve_operator(value)
        value = str(value)

        conditions = []

        filter_ = FilterGenerator(table).define_filter(
            column="id", value=value, operator=operator
        )
        conditions.append(filter_.generate_query_conditions(table=table))

        filter_ = FilterGenerator(table).define_filter(
            column="name", value=value, operator=operator
        )
        conditions.append(filter_.generate_query_conditions(table=table))

        for column in additional_columns or []:
            filter_ = FilterGenerator(table).define_filter(
                column=column, value=value, operator=operator
            )
            conditions.append(filter_.generate_query_conditions(table=table))

        return or_(*conditions)

    @staticmethod
    def generate_custom_query_conditions_for_column(
        value: Any,
        table: Type[SQLModel],
        column: str,
    ) -> "ColumnElement[bool]":
        """Generate custom filter conditions for a column of a table.

        Args:
            value: The filter value.
            table: The table which contains the column.
            column: The column name.

        Returns:
            The query conditions.
        """
        value, operator = BaseFilter._resolve_operator(value)
        filter_ = FilterGenerator(table).define_filter(
            column=column, value=value, operator=operator
        )
        return filter_.generate_query_conditions(table=table)

    @property
    def offset(self) -> int:
        """Returns the offset needed for the query on the data persistence layer.

        Returns:
            The offset for the query.
        """
        return self.size * (self.page - 1)

    def generate_filter(
        self, table: Type["AnySchema"]
    ) -> Union["ColumnElement[bool]"]:
        """Generate the filter for the query.

        Args:
            table: The Table that is being queried from.

        Returns:
            The filter expression for the query.

        Raises:
            RuntimeError: If a valid logical operator is not supplied.
        """
        from sqlmodel import and_, or_

        filters = []
        for column_filter in self.list_of_filters:
            filters.append(
                column_filter.generate_query_conditions(table=table)
            )
        for custom_filter in self.get_custom_filters(table):
            filters.append(custom_filter)
        if self.logical_operator == LogicalOperators.OR:
            return or_(False, *filters)
        elif self.logical_operator == LogicalOperators.AND:
            return and_(True, *filters)
        else:
            raise RuntimeError("No valid logical operator was supplied.")

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        This can be overridden by subclasses to define custom filters that are
        not based on the columns of the underlying table.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        return []

    def apply_filter(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Applies the filter to a query.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        rbac_filter = self.generate_rbac_filter(table=table)

        if rbac_filter is not None:
            query = query.where(rbac_filter)

        filters = self.generate_filter(table=table)

        if filters is not None:
            query = query.where(filters)

        return query

    def apply_sorting(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Apply sorting to the query.

        Args:
            query: The query to which to apply the sorting.
            table: The query table.

        Returns:
            The query with sorting applied.
        """
        column, operand = self.sorting_params

        if operand == SorterOps.DESCENDING:
            sort_clause = desc(getattr(table, column))  # type: ignore[var-annotated]
        else:
            sort_clause = asc(getattr(table, column))

        # We always add the `id` column as a tiebreaker to ensure a stable,
        # repeatable order of items, otherwise subsequent pages might contain
        # the same items.
        query = query.order_by(sort_clause, asc(table.id))  # type: ignore[arg-type]

        return query
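
In practice you instantiate a resource-specific subclass (pipeline runs, artifacts, ...), but the `"operator:column"` sorting syntax, the pagination offset and the filter-list generation all live on `BaseFilter` itself. A minimal sketch, assuming the module path shown above:

```python
from zenml.models.v2.base.filter import BaseFilter

filter_model = BaseFilter(
    sort_by="desc:created",  # "<operator>:<column>", checked by validate_sort_by
    page=3,
    size=20,
)

print(filter_model.sorting_params)  # ('created', <SorterOps.DESCENDING: 'desc'>)
print(filter_model.offset)          # 40 == size * (page - 1)

# Field filters accept "operator:value" strings, e.g. id="startswith:3fa8" or
# created="gte:<timestamp>"; each non-None filterable field becomes one entry
# in filter_model.list_of_filters.
```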

list_of_filters property

Converts the class variables into a list of usable Filter Models.

Returns:

Type Description
List[Filter]

A list of Filter models.

offset property

Returns the offset needed for the query on the data persistence layer.

Returns:

Type Description
int

The offset for the query.

sorting_params property

Converts the class variables into a list of usable Filter Models.

Returns:

Type Description
Tuple[str, SorterOps]

A tuple of the column to sort by and the sorting operand.

apply_filter(query, table)

Applies the filter to a query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the filter.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with filter applied.

Source code in src/zenml/models/v2/base/filter.py, lines 939-963
def apply_filter(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Applies the filter to a query.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    rbac_filter = self.generate_rbac_filter(table=table)

    if rbac_filter is not None:
        query = query.where(rbac_filter)

    filters = self.generate_filter(table=table)

    if filters is not None:
        query = query.where(filters)

    return query

apply_sorting(query, table)

Apply sorting to the query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the sorting.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with sorting applied.

Source code in src/zenml/models/v2/base/filter.py, lines 965-991
def apply_sorting(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Apply sorting to the query.

    Args:
        query: The query to which to apply the sorting.
        table: The query table.

    Returns:
        The query with sorting applied.
    """
    column, operand = self.sorting_params

    if operand == SorterOps.DESCENDING:
        sort_clause = desc(getattr(table, column))  # type: ignore[var-annotated]
    else:
        sort_clause = asc(getattr(table, column))

    # We always add the `id` column as a tiebreaker to ensure a stable,
    # repeatable order of items, otherwise subsequent pages might contain
    # the same items.
    query = query.order_by(sort_clause, asc(table.id))  # type: ignore[arg-type]

    return query
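
`apply_filter` and `apply_sorting` both take a query plus the mapped table and return the modified query, so they chain naturally with manual pagination. A hedged sketch; the schema class, filter class and `engine` below are assumptions about your setup, not part of this API:

```python
from sqlmodel import Session, select

# Assumed names -- the concrete schema and filter classes depend on your use case.
from zenml.zen_stores.schemas import PipelineRunSchema
from zenml.models import PipelineRunFilter

filter_model = PipelineRunFilter(sort_by="desc:created", page=1, size=20)

with Session(engine) as session:  # `engine` is assumed to be created elsewhere
    query = select(PipelineRunSchema)
    query = filter_model.apply_filter(query=query, table=PipelineRunSchema)
    query = filter_model.apply_sorting(query=query, table=PipelineRunSchema)

    # Manual pagination using the filter's own offset and page size.
    results = session.exec(
        query.offset(filter_model.offset).limit(filter_model.size)
    ).all()
```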

configure_rbac(authenticated_user_id, **column_allowed_ids)

Configure RBAC allowed column values.

Parameters:

Name Type Description Default
authenticated_user_id UUID

ID of the authenticated user. All entities owned by this user will be included.

required
column_allowed_ids Optional[Set[UUID]]

Set of IDs per column to limit the query to. If given, the remaining filters will be applied to entities within this set only. If None, the remaining filters will be applied to all entries in the table.

{}
Source code in src/zenml/models/v2/base/filter.py, lines 690-705
def configure_rbac(
    self,
    authenticated_user_id: UUID,
    **column_allowed_ids: Optional[Set[UUID]],
) -> None:
    """Configure RBAC allowed column values.

    Args:
        authenticated_user_id: ID of the authenticated user. All entities
            owned by this user will be included.
        column_allowed_ids: Set of IDs per column to limit the query to.
            If given, the remaining filters will be applied to entities
            within this set only. If `None`, the remaining filters will
            be applied to all entries in the table.
    """
    self._rbac_configuration = (authenticated_user_id, column_allowed_ids)
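
`configure_rbac` is server-side plumbing: before running the query, the API layer records who the caller is and which IDs they may see per column; `generate_rbac_filter` later turns that into query conditions. A hedged sketch (the `project_id` column name is illustrative):

```python
from uuid import uuid4

from zenml.models.v2.base.filter import BaseFilter

filter_model = BaseFilter(size=20)

user_id = uuid4()
visible_projects = {uuid4(), uuid4()}  # IDs this caller is allowed to see

# Restrict the (illustrative) `project_id` column to the allowed IDs. Rows
# owned by the user, or owned by nobody, remain visible as well.
filter_model.configure_rbac(
    authenticated_user_id=user_id,
    project_id=visible_projects,
)

# generate_rbac_filter(table=...) later turns this configuration into an
# OR-condition that apply_filter() prepends to the query.
```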

filter_ops(data) classmethod

Parse incoming filters to ensure all filters are legal.

Parameters:

Name Type Description Default
data Dict[str, Any]

The values of the class.

required

Returns:

Type Description
Dict[str, Any]

The values of the class.

Source code in src/zenml/models/v2/base/filter.py, lines 645-658
@model_validator(mode="before")
@classmethod
@before_validator_handler
def filter_ops(cls, data: Dict[str, Any]) -> Dict[str, Any]:
    """Parse incoming filters to ensure all filters are legal.

    Args:
        data: The values of the class.

    Returns:
        The values of the class.
    """
    cls._generate_filter_list(data)
    return data

generate_custom_query_conditions_for_column(value, table, column) staticmethod

Generate custom filter conditions for a column of a table.

Parameters:

Name Type Description Default
value Any

The filter value.

required
table Type[SQLModel]

The table which contains the column.

required
column str

The column name.

required

Returns:

Type Description
ColumnElement[bool]

The query conditions.

Source code in src/zenml/models/v2/base/filter.py, lines 862-882
@staticmethod
def generate_custom_query_conditions_for_column(
    value: Any,
    table: Type[SQLModel],
    column: str,
) -> "ColumnElement[bool]":
    """Generate custom filter conditions for a column of a table.

    Args:
        value: The filter value.
        table: The table which contains the column.
        column: The column name.

    Returns:
        The query conditions.
    """
    value, operator = BaseFilter._resolve_operator(value)
    filter_ = FilterGenerator(table).define_filter(
        column=column, value=value, operator=operator
    )
    return filter_.generate_query_conditions(table=table)

generate_filter(table)

Generate the filter for the query.

Parameters:

Name Type Description Default
table Type[AnySchema]

The Table that is being queried from.

required

Returns:

Type Description
Union[ColumnElement[bool]]

The filter expression for the query.

Raises:

Type Description
RuntimeError

If a valid logical operator is not supplied.

Source code in src/zenml/models/v2/base/filter.py, lines 893-921
def generate_filter(
    self, table: Type["AnySchema"]
) -> Union["ColumnElement[bool]"]:
    """Generate the filter for the query.

    Args:
        table: The Table that is being queried from.

    Returns:
        The filter expression for the query.

    Raises:
        RuntimeError: If a valid logical operator is not supplied.
    """
    from sqlmodel import and_, or_

    filters = []
    for column_filter in self.list_of_filters:
        filters.append(
            column_filter.generate_query_conditions(table=table)
        )
    for custom_filter in self.get_custom_filters(table):
        filters.append(custom_filter)
    if self.logical_operator == LogicalOperators.OR:
        return or_(False, *filters)
    elif self.logical_operator == LogicalOperators.AND:
        return and_(True, *filters)
    else:
        raise RuntimeError("No valid logical operator was supplied.")

generate_name_or_id_query_conditions(value, table, additional_columns=None)

Generate filter conditions for name or id of a table.

Parameters:

Name Type Description Default
value Union[UUID, str]

The filter value.

required
table Type[NamedSchema]

The table to filter.

required
additional_columns Optional[List[str]]

Additional table columns that should also filter for the given value as part of the or condition.

None

Returns:

Type Description
ColumnElement[bool]

The query conditions.

Source code in src/zenml/models/v2/base/filter.py, lines 820-860
def generate_name_or_id_query_conditions(
    self,
    value: Union[UUID, str],
    table: Type["NamedSchema"],
    additional_columns: Optional[List[str]] = None,
) -> "ColumnElement[bool]":
    """Generate filter conditions for name or id of a table.

    Args:
        value: The filter value.
        table: The table to filter.
        additional_columns: Additional table columns that should also
            filter for the given value as part of the or condition.

    Returns:
        The query conditions.
    """
    from sqlmodel import or_

    value, operator = BaseFilter._resolve_operator(value)
    value = str(value)

    conditions = []

    filter_ = FilterGenerator(table).define_filter(
        column="id", value=value, operator=operator
    )
    conditions.append(filter_.generate_query_conditions(table=table))

    filter_ = FilterGenerator(table).define_filter(
        column="name", value=value, operator=operator
    )
    conditions.append(filter_.generate_query_conditions(table=table))

    for column in additional_columns or []:
        filter_ = FilterGenerator(table).define_filter(
            column=column, value=value, operator=operator
        )
        conditions.append(filter_.generate_query_conditions(table=table))

    return or_(*conditions)

generate_rbac_filter(table)

Generates an optional RBAC filter.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
Optional[ColumnElement[bool]]

The RBAC filter.

Source code in src/zenml/models/v2/base/filter.py, lines 707-747
def generate_rbac_filter(
    self,
    table: Type["AnySchema"],
) -> Optional["ColumnElement[bool]"]:
    """Generates an optional RBAC filter.

    Args:
        table: The query table.

    Returns:
        The RBAC filter.
    """
    from sqlmodel import or_

    if not self._rbac_configuration:
        return None

    expressions = []

    for column_name, allowed_ids in self._rbac_configuration[1].items():
        if allowed_ids is not None:
            expression = getattr(table, column_name).in_(allowed_ids)
            expressions.append(expression)

    if expressions and hasattr(table, "user_id"):
        # If `expressions` is not empty, we do not have full access to all
        # rows of the table. In this case, we also include rows which the
        # user owns.

        # Unowned entities are considered server-owned and can be seen
        # by anyone
        expressions.append(getattr(table, "user_id").is_(None))
        # The authenticated user owns this entity
        expressions.append(
            getattr(table, "user_id") == self._rbac_configuration[0]
        )

    if expressions:
        return or_(*expressions)
    else:
        return None

get_custom_filters(table)

Get custom filters.

This can be overridden by subclasses to define custom filters that are not based on the columns of the underlying table.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/base/filter.py, lines 923-937
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    This can be overridden by subclasses to define custom filters that are
    not based on the columns of the underlying table.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    return []

validate_sort_by(value) classmethod

Validate that the sort_column is a valid column with a valid operand.

Parameters:

Name Type Description Default
value Any

The sort_by field value.

required

Returns:

Type Description
Any

The validated sort_by field value.

Raises:

Type Description
ValidationError

If the sort_by field is not a string.

ValueError

If the resource can't be sorted by this field.

Source code in src/zenml/models/v2/base/filter.py, lines 594-643
@field_validator("sort_by", mode="before")
@classmethod
def validate_sort_by(cls, value: Any) -> Any:
    """Validate that the sort_column is a valid column with a valid operand.

    Args:
        value: The sort_by field value.

    Returns:
        The validated sort_by field value.

    Raises:
        ValidationError: If the sort_by field is not a string.
        ValueError: If the resource can't be sorted by this field.
    """
    # Somehow pydantic allows you to pass in int values, which will be
    #  interpreted as string, however within the validator they are still
    #  integers, which don't have a .split() method
    if not isinstance(value, str):
        raise ValidationError(
            f"str type expected for the sort_by field. "
            f"Received a {type(value)}"
        )
    column = value
    split_value = value.split(":", 1)
    if len(split_value) == 2:
        column = split_value[1]

        if split_value[0] not in SorterOps.values():
            logger.warning(
                "Invalid operand used for column sorting. "
                "Only the following operands are supported `%s`. "
                "Defaulting to 'asc' on column `%s`.",
                SorterOps.values(),
                column,
            )
            value = column

    if column in cls.CUSTOM_SORTING_OPTIONS:
        return value
    elif column in cls.FILTER_EXCLUDE_FIELDS:
        raise ValueError(
            f"This resource can not be sorted by this field: '{value}'"
        )
    if column in cls.model_fields:
        return value
    else:
        raise ValueError(
            "You can only sort by valid fields of this resource"
        )
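
The validator accepts either a bare column name or an `"operator:column"` string; an unknown operator only logs a warning and falls back to ascending, while an unknown column fails validation. A short sketch using `BaseFilter`'s own columns:

```python
from zenml.models.v2.base.filter import BaseFilter

BaseFilter(sort_by="created")        # plain column name -> ascending
BaseFilter(sort_by="desc:updated")   # explicit operator
BaseFilter(sort_by="bogus:updated")  # unknown operator -> warning, falls back to asc

try:
    BaseFilter(sort_by="not_a_column")  # unknown column
except Exception as e:                  # surfaced as a pydantic validation error
    print(e)
```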

BaseIdentifiedResponse

Bases: BaseResponse[AnyDatedBody, AnyMetadata, AnyResources], Generic[AnyDatedBody, AnyMetadata, AnyResources]

Base domain model for resources with DB representation.

Source code in src/zenml/models/v2/base/base.py, lines 349-483
class BaseIdentifiedResponse(
    BaseResponse[AnyDatedBody, AnyMetadata, AnyResources],
    Generic[AnyDatedBody, AnyMetadata, AnyResources],
):
    """Base domain model for resources with DB representation."""

    id: UUID = Field(title="The unique resource id.")

    permission_denied: bool = False

    # Helper functions
    def __hash__(self) -> int:
        """Implementation of hash magic method.

        Returns:
            Hash of the UUID.
        """
        return hash((type(self),) + tuple([self.id]))

    def __eq__(self, other: Any) -> bool:
        """Implementation of equality magic method.

        Args:
            other: The other object to compare to.

        Returns:
            True if the other object is of the same type and has the same UUID.
        """
        if isinstance(other, type(self)):
            return self.id == other.id
        else:
            return False

    def _validate_hydrated_version(
        self,
        hydrated_model: "BaseResponse[AnyDatedBody, AnyMetadata, AnyResources]",
    ) -> None:
        """Helper method to validate the values within the hydrated version.

        Args:
            hydrated_model: the hydrated version of the model.

        Raises:
            HydrationError: if the hydrated version has different values set
                for either the name of the body fields and the
                _method_body_mutation is set to ResponseBodyUpdate.DENY.
        """
        super()._validate_hydrated_version(hydrated_model)

        assert isinstance(hydrated_model, type(self))

        # Check if the ID is the same
        if self.id != hydrated_model.id:
            raise HydrationError(
                "The hydrated version of the model does not have the same id."
            )

    def get_hydrated_version(
        self,
    ) -> "BaseIdentifiedResponse[AnyDatedBody, AnyMetadata, AnyResources]":
        """Abstract method to fetch the hydrated version of the model.

        Raises:
            NotImplementedError: in case the method is not implemented.
        """
        raise NotImplementedError(
            "Please implement a `get_hydrated_version` method before "
            "using/hydrating the model."
        )

    def get_body(self) -> "AnyDatedBody":
        """Fetch the body of the entity.

        Returns:
            The body field of the response.

        Raises:
            IllegalOperationError: If the user lacks permission to access the
                entity represented by this response.
        """
        if self.permission_denied:
            raise IllegalOperationError(
                f"Missing permissions to access {type(self).__name__} with "
                f"ID {self.id}."
            )

        return super().get_body()

    def get_metadata(self) -> "AnyMetadata":
        """Fetch the metadata of the entity.

        Returns:
            The metadata field of the response.

        Raises:
            IllegalOperationError: If the user lacks permission to access this
                entity represented by this response.
        """
        if self.permission_denied:
            raise IllegalOperationError(
                f"Missing permissions to access {type(self).__name__} with "
                f"ID {self.id}."
            )

        return super().get_metadata()

    # Analytics
    def get_analytics_metadata(self) -> Dict[str, Any]:
        """Fetches the analytics metadata for base response models.

        Returns:
            The analytics metadata.
        """
        metadata = super().get_analytics_metadata()
        metadata["entity_id"] = self.id
        return metadata

    # Body and metadata properties
    @property
    def created(self) -> datetime:
        """The `created` property.

        Returns:
            the value of the property.
        """
        return self.get_body().created

    @property
    def updated(self) -> datetime:
        """The `updated` property.

        Returns:
            the value of the property.
        """
        return self.get_body().updated

created property

The created property.

Returns:

Type Description
datetime

the value of the property.

updated property

The updated property.

Returns:

Type Description
datetime

the value of the property.

__eq__(other)

Implementation of equality magic method.

Parameters:

Name Type Description Default
other Any

The other object to compare to.

required

Returns:

Type Description
bool

True if the other object is of the same type and has the same UUID.

Source code in src/zenml/models/v2/base/base.py, lines 368-380
def __eq__(self, other: Any) -> bool:
    """Implementation of equality magic method.

    Args:
        other: The other object to compare to.

    Returns:
        True if the other object is of the same type and has the same UUID.
    """
    if isinstance(other, type(self)):
        return self.id == other.id
    else:
        return False

__hash__()

Implementation of hash magic method.

Returns:

Type Description
int

Hash of the UUID.

Source code in src/zenml/models/v2/base/base.py, lines 360-366
def __hash__(self) -> int:
    """Implementation of hash magic method.

    Returns:
        Hash of the UUID.
    """
    return hash((type(self),) + tuple([self.id]))
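
Because both `__eq__` and `__hash__` are based solely on the `id`, two response objects of the same concrete type that refer to the same entity behave as duplicates in sets and dict keys, regardless of how much of them is hydrated. A small sketch of exploiting that:

```python
from typing import List, TypeVar

from zenml.models.v2.base.base import BaseIdentifiedResponse  # import path assumed

R = TypeVar("R", bound=BaseIdentifiedResponse)


def deduplicate(responses: List[R]) -> List[R]:
    """Collapse duplicate fetches of the same entity.

    Works for any identified response type because equality and hashing are
    based solely on the `id`, so a set keeps one object per entity.
    """
    return list(set(responses))
```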

get_analytics_metadata()

Fetches the analytics metadata for base response models.

Returns:

Type Description
Dict[str, Any]

The analytics metadata.

Source code in src/zenml/models/v2/base/base.py, lines 456-464
def get_analytics_metadata(self) -> Dict[str, Any]:
    """Fetches the analytics metadata for base response models.

    Returns:
        The analytics metadata.
    """
    metadata = super().get_analytics_metadata()
    metadata["entity_id"] = self.id
    return metadata

get_body()

Fetch the body of the entity.

Returns:

Type Description
AnyDatedBody

The body field of the response.

Raises:

Type Description
IllegalOperationError

If the user lacks permission to access the entity represented by this response.

Source code in src/zenml/models/v2/base/base.py, lines 419-435
def get_body(self) -> "AnyDatedBody":
    """Fetch the body of the entity.

    Returns:
        The body field of the response.

    Raises:
        IllegalOperationError: If the user lacks permission to access the
            entity represented by this response.
    """
    if self.permission_denied:
        raise IllegalOperationError(
            f"Missing permissions to access {type(self).__name__} with "
            f"ID {self.id}."
        )

    return super().get_body()
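
When the server marks a response as permission-denied, only the `id` remains safely readable; every body- or metadata-backed accessor raises. A hedged sketch (the exception import path is assumed):

```python
from typing import Optional

from zenml.exceptions import IllegalOperationError  # import path assumed


def safe_created(response) -> Optional["datetime"]:
    """Return the creation timestamp, or None if access was denied."""
    if response.permission_denied:
        # Only `response.id` is safe to read; body/metadata accessors raise.
        return None
    try:
        return response.created  # property backed by get_body()
    except IllegalOperationError:
        return None
```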

get_hydrated_version()

Abstract method to fetch the hydrated version of the model.

Raises:

Type Description
NotImplementedError

in case the method is not implemented.

Source code in src/zenml/models/v2/base/base.py, lines 406-417
def get_hydrated_version(
    self,
) -> "BaseIdentifiedResponse[AnyDatedBody, AnyMetadata, AnyResources]":
    """Abstract method to fetch the hydrated version of the model.

    Raises:
        NotImplementedError: in case the method is not implemented.
    """
    raise NotImplementedError(
        "Please implement a `get_hydrated_version` method before "
        "using/hydrating the model."
    )
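
Concrete response models override this method to fetch themselves, metadata included, from the store; `ArtifactVisualizationResponse` above is one example. A hedged skeleton for a hypothetical entity (all `MyEntity*` names and the `get_my_entity` store call are made up for illustration):

```python
from zenml.models.v2.base.base import (  # import paths assumed
    BaseDatedResponseBody,
    BaseIdentifiedResponse,
    BaseResponseMetadata,
    BaseResponseResources,
)


class MyEntityResponseBody(BaseDatedResponseBody):
    """Hypothetical body model."""


class MyEntityResponseMetadata(BaseResponseMetadata):
    """Hypothetical metadata model."""


class MyEntityResponseResources(BaseResponseResources):
    """Hypothetical resources model."""


class MyEntityResponse(
    BaseIdentifiedResponse[
        MyEntityResponseBody, MyEntityResponseMetadata, MyEntityResponseResources
    ]
):
    """Hypothetical response model."""

    def get_hydrated_version(self) -> "MyEntityResponse":
        # Fetch the full (hydrated) representation of this entity from the store.
        from zenml.client import Client

        return Client().zen_store.get_my_entity(self.id)  # hypothetical store method
```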

get_metadata()

Fetch the metadata of the entity.

Returns:

Type Description
AnyMetadata

The metadata field of the response.

Raises:

Type Description
IllegalOperationError

If the user lacks permission to access this entity represented by this response.

Source code in src/zenml/models/v2/base/base.py, lines 437-453
def get_metadata(self) -> "AnyMetadata":
    """Fetch the metadata of the entity.

    Returns:
        The metadata field of the response.

    Raises:
        IllegalOperationError: If the user lacks permission to access this
            entity represented by this response.
    """
    if self.permission_denied:
        raise IllegalOperationError(
            f"Missing permissions to access {type(self).__name__} with "
            f"ID {self.id}."
        )

    return super().get_metadata()

BasePluginFlavorResponse

Bases: BaseResponse[AnyPluginBody, AnyPluginMetadata, AnyPluginResources], Generic[AnyPluginBody, AnyPluginMetadata, AnyPluginResources]

Base response for all Plugin Flavors.

Source code in src/zenml/models/v2/base/base_plugin_flavor.py, lines 50-76
class BasePluginFlavorResponse(
    BaseResponse[AnyPluginBody, AnyPluginMetadata, AnyPluginResources],
    Generic[AnyPluginBody, AnyPluginMetadata, AnyPluginResources],
):
    """Base response for all Plugin Flavors."""

    name: str = Field(title="Name of the flavor.")
    type: PluginType = Field(title="Type of the plugin.")
    subtype: PluginSubType = Field(title="Subtype of the plugin.")
    model_config = ConfigDict(extra="ignore")

    def get_hydrated_version(
        self,
    ) -> "BasePluginFlavorResponse[AnyPluginBody, AnyPluginMetadata, AnyPluginResources]":
        """Abstract method to fetch the hydrated version of the model.

        Returns:
            Hydrated version of the PluginFlavorResponse
        """
        # TODO: shouldn't this call the Zen store ? The client should not have
        #  to know about the plugin flavor registry
        from zenml.zen_server.utils import plugin_flavor_registry

        plugin_flavor = plugin_flavor_registry().get_flavor_class(
            name=self.name, _type=self.type, subtype=self.subtype
        )
        return plugin_flavor.get_flavor_response_model(hydrate=True)

get_hydrated_version()

Abstract method to fetch the hydrated version of the model.

Returns:

Type Description
BasePluginFlavorResponse[AnyPluginBody, AnyPluginMetadata, AnyPluginResources]

Hydrated version of the PluginFlavorResponse

Source code in src/zenml/models/v2/base/base_plugin_flavor.py, lines 61-76
def get_hydrated_version(
    self,
) -> "BasePluginFlavorResponse[AnyPluginBody, AnyPluginMetadata, AnyPluginResources]":
    """Abstract method to fetch the hydrated version of the model.

    Returns:
        Hydrated version of the PluginFlavorResponse
    """
    # TODO: shouldn't this call the Zen store ? The client should not have
    #  to know about the plugin flavor registry
    from zenml.zen_server.utils import plugin_flavor_registry

    plugin_flavor = plugin_flavor_registry().get_flavor_class(
        name=self.name, _type=self.type, subtype=self.subtype
    )
    return plugin_flavor.get_flavor_response_model(hydrate=True)

BaseRequest

Bases: BaseZenModel

Base request model.

Used as a base class for all request models.

Source code in src/zenml/models/v2/base/base.py, lines 52-56
class BaseRequest(BaseZenModel):
    """Base request model.

    Used as a base class for all request models.
    """

BaseResponse

Bases: BaseZenModel, Generic[AnyBody, AnyMetadata, AnyResources]

Base domain model for all responses.

Source code in src/zenml/models/v2/base/base.py, lines 97-328
class BaseResponse(BaseZenModel, Generic[AnyBody, AnyMetadata, AnyResources]):
    """Base domain model for all responses."""

    # Body and metadata pair
    body: Optional["AnyBody"] = Field(
        default=None, title="The body of the resource."
    )
    metadata: Optional["AnyMetadata"] = Field(
        default=None, title="The metadata related to this resource."
    )
    resources: Optional["AnyResources"] = Field(
        default=None, title="The resources related to this resource."
    )

    _response_update_strategy: ResponseUpdateStrategy = (
        ResponseUpdateStrategy.ALLOW
    )
    _warn_on_response_updates: bool = True

    def _validate_hydrated_version(
        self,
        hydrated_model: "BaseResponse[AnyBody, AnyMetadata, AnyResources]",
    ) -> None:
        """Helper method to validate the values within the hydrated version.

        Args:
            hydrated_model: the hydrated version of the model.

        Raises:
            HydrationError: if the hydrated version has different values set
                for either the name or the body fields while the
                _response_update_strategy is set to ResponseUpdateStrategy.DENY.
        """
        # Check whether the metadata exists in the hydrated version
        if hydrated_model.metadata is None:
            raise HydrationError(
                "The hydrated model does not have a metadata field."
            )

        # Check if the name has changed
        if "name" in type(self).model_fields:
            original_name = getattr(self, "name")
            hydrated_name = getattr(hydrated_model, "name")

            if original_name != hydrated_name:
                if (
                    self._response_update_strategy
                    == ResponseUpdateStrategy.ALLOW
                ):
                    setattr(self, "name", hydrated_name)

                    if self._warn_on_response_updates:
                        logger.warning(
                            f"The name of the entity has changed from "
                            f"`{original_name}` to `{hydrated_name}`."
                        )

                elif (
                    self._response_update_strategy
                    == ResponseUpdateStrategy.IGNORE
                ):
                    if self._warn_on_response_updates:
                        logger.warning(
                            f"Ignoring the name change in the hydrated version "
                            f"of the response: `{original_name}` to "
                            f"`{hydrated_name}`."
                        )
                elif (
                    self._response_update_strategy
                    == ResponseUpdateStrategy.DENY
                ):
                    raise HydrationError(
                        f"Failing the hydration, because there is a change in "
                        f"the name of the entity: `{original_name}` to "
                        f"`{hydrated_name}`."
                    )

        # Check all the fields in the body
        for field in type(self.get_body()).model_fields:
            original_value = getattr(self.get_body(), field)
            hydrated_value = getattr(hydrated_model.get_body(), field)

            if original_value != hydrated_value:
                if (
                    self._response_update_strategy
                    == ResponseUpdateStrategy.ALLOW
                ):
                    setattr(self.get_body(), field, hydrated_value)

                    if self._warn_on_response_updates:
                        logger.warning(
                            f"The field `{field}` in the body of the response "
                            f"has changed from `{original_value}` to "
                            f"`{hydrated_value}`."
                        )

                elif (
                    self._response_update_strategy
                    == ResponseUpdateStrategy.IGNORE
                ):
                    if self._warn_on_response_updates:
                        logger.warning(
                            f"Ignoring the change in the hydrated version of "
                            f"the field `{field}`: `{original_value}` -> "
                            f"`{hydrated_value}`."
                        )
                elif (
                    self._response_update_strategy
                    == ResponseUpdateStrategy.DENY
                ):
                    raise HydrationError(
                        f"Failing the hydration, because there is a change in "
                        f"the field `{field}`: `{original_value}` -> "
                        f"`{hydrated_value}`"
                    )

    def hydrate(self) -> None:
        """Hydrate the response."""
        hydrated_version = self.get_hydrated_version()
        self._validate_hydrated_version(hydrated_version)

        self.resources = hydrated_version.resources
        self.metadata = hydrated_version.metadata

    def get_hydrated_version(
        self,
    ) -> "BaseResponse[AnyBody, AnyMetadata, AnyResources]":
        """Abstract method to fetch the hydrated version of the model.

        Raises:
            NotImplementedError: in case the method is not implemented.
        """
        raise NotImplementedError(
            "Please implement a `get_hydrated_version` method before "
            "using/hydrating the model."
        )

    def get_body(self) -> "AnyBody":
        """Fetch the body of the entity.

        Returns:
            The body field of the response.

        Raises:
            RuntimeError: If the body was not included in the response.
        """
        if not self.body:
            raise RuntimeError(
                f"Missing response body for {type(self).__name__}."
            )

        return self.body

    def get_metadata(self) -> "AnyMetadata":
        """Fetch the metadata of the entity.

        Returns:
            The metadata field of the response.
        """
        if self.metadata is None:
            # If the metadata is not there, check the class first.
            metadata_annotation = (
                type(self).model_fields["metadata"].annotation
            )
            assert metadata_annotation is not None, (
                "For each response model, an annotated metadata"
                "field should exist."
            )

            # metadata is defined as:
            #   metadata: Optional[....ResponseMetadata] = Field(default=None)
            # We need to find the actual class inside the Optional annotation.
            from zenml.utils.typing_utils import get_args

            metadata_type = get_args(metadata_annotation)[0]
            assert issubclass(metadata_type, BaseResponseMetadata)

            if len(metadata_type.model_fields):
                # If the metadata class defines any fields, fetch the metadata
                # through the hydrated version.
                self.hydrate()
            else:
                # Otherwise, use the metadata class to create an empty metadata
                # object.
                self.metadata = metadata_type()

        assert self.metadata is not None

        return self.metadata

    def get_resources(self) -> "AnyResources":
        """Fetch the resources related to this entity.

        Returns:
            The resources field of the response.

        Raises:
            RuntimeError: If the resources field was not included in the response.
        """
        if self.resources is None:
            # If the resources are not there, check the class first.
            resources_annotation = (
                type(self).model_fields["resources"].annotation
            )
            assert resources_annotation is not None, (
                "For each response model, an annotated resources"
                "field should exist."
            )

            # resources is defined as:
            #   resources: Optional[....ResponseResources] = Field(default=None)
            # We need to find the actual class inside the Optional annotation.
            from zenml.utils.typing_utils import get_args

            resources_type = get_args(resources_annotation)[0]
            assert issubclass(resources_type, BaseResponseResources)

            if len(resources_type.model_fields):
                # If the resources class defines any fields, fetch the resources
                # through the hydrated version.
                self.hydrate()
            else:
                # Otherwise, use the resources class to create an empty
                # resources object.
                self.resources = resources_type()

        if self.resources is None:
            raise RuntimeError(
                f"Missing response resources for {type(self).__name__}."
            )

        return self.resources

get_body()

Fetch the body of the entity.

Returns:

Type Description
AnyBody

The body field of the response.

Raises:

Type Description
RuntimeError

If the body was not included in the response.

Source code in src/zenml/models/v2/base/base.py, lines 234-248
def get_body(self) -> "AnyBody":
    """Fetch the body of the entity.

    Returns:
        The body field of the response.

    Raises:
        RuntimeError: If the body was not included in the response.
    """
    if not self.body:
        raise RuntimeError(
            f"Missing response body for {type(self).__name__}."
        )

    return self.body

get_hydrated_version()

Abstract method to fetch the hydrated version of the model.

Raises:

Type Description
NotImplementedError

If the method is not implemented.

Source code in src/zenml/models/v2/base/base.py, lines 221-232
def get_hydrated_version(
    self,
) -> "BaseResponse[AnyBody, AnyMetadata, AnyResources]":
    """Abstract method to fetch the hydrated version of the model.

    Raises:
        NotImplementedError: in case the method is not implemented.
    """
    raise NotImplementedError(
        "Please implement a `get_hydrated_version` method before "
        "using/hydrating the model."
    )

get_metadata()

Fetch the metadata of the entity.

Returns:

Type Description
AnyMetadata

The metadata field of the response.

Source code in src/zenml/models/v2/base/base.py, lines 250-285
def get_metadata(self) -> "AnyMetadata":
    """Fetch the metadata of the entity.

    Returns:
        The metadata field of the response.
    """
    if self.metadata is None:
        # If the metadata is not there, check the class first.
        metadata_annotation = (
            type(self).model_fields["metadata"].annotation
        )
        assert metadata_annotation is not None, (
            "For each response model, an annotated metadata"
            "field should exist."
        )

        # metadata is defined as:
        #   metadata: Optional[....ResponseMetadata] = Field(default=None)
        # We need to find the actual class inside the Optional annotation.
        from zenml.utils.typing_utils import get_args

        metadata_type = get_args(metadata_annotation)[0]
        assert issubclass(metadata_type, BaseResponseMetadata)

        if len(metadata_type.model_fields):
            # If the metadata class defines any fields, fetch the metadata
            # through the hydrated version.
            self.hydrate()
        else:
            # Otherwise, use the metadata class to create an empty metadata
            # object.
            self.metadata = metadata_type()

    assert self.metadata is not None

    return self.metadata

get_resources()

Fetch the resources related to this entity.

Returns:

Type Description
AnyResources

The resources field of the response.

Raises:

Type Description
RuntimeError

If the resources field was not included in the response.

Source code in src/zenml/models/v2/base/base.py, lines 287-328
def get_resources(self) -> "AnyResources":
    """Fetch the resources related to this entity.

    Returns:
        The resources field of the response.

    Raises:
        RuntimeError: If the resources field was not included in the response.
    """
    if self.resources is None:
        # If the resources are not there, check the class first.
        resources_annotation = (
            type(self).model_fields["resources"].annotation
        )
        assert resources_annotation is not None, (
            "For each response model, an annotated resources"
            "field should exist."
        )

        # resources is defined as:
        #   resources: Optional[....ResponseResources] = Field(default=None)
        # We need to find the actual class inside the Optional annotation.
        from zenml.utils.typing_utils import get_args

        resources_type = get_args(resources_annotation)[0]
        assert issubclass(resources_type, BaseResponseResources)

        if len(resources_type.model_fields):
            # If the resources class defines any fields, fetch the resources
            # through the hydrated version.
            self.hydrate()
        else:
            # Otherwise, use the resources class to create an empty
            # resources object.
            self.resources = resources_type()

    if self.resources is None:
        raise RuntimeError(
            f"Missing response resources for {type(self).__name__}."
        )

    return self.resources

hydrate()

Hydrate the response.

Source code in src/zenml/models/v2/base/base.py, lines 213-219
def hydrate(self) -> None:
    """Hydrate the response."""
    hydrated_version = self.get_hydrated_version()
    self._validate_hydrated_version(hydrated_version)

    self.resources = hydrated_version.resources
    self.metadata = hydrated_version.metadata
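
The body/metadata split above is what lets list calls return slim responses while metadata is fetched lazily. The sketch below shows that flow under stated assumptions: it uses `Client().list_code_repositories()` (available in recent ZenML releases) purely as an example of obtaining a response; any other response model follows the same pattern.

# Hedged sketch of the lazy-hydration flow implemented above.
from zenml.client import Client

responses = Client().list_code_repositories()  # returns slim (non-hydrated) responses
if responses.items:
    repo = responses.items[0]
    # Body-backed fields (e.g. `source`) are available immediately via get_body().
    print(repo.source)
    # Metadata-backed fields (e.g. `config`) call get_metadata(), which hydrates
    # the response through get_hydrated_version() on first access.
    print(repo.config)
    # Hydration can also be forced explicitly:
    repo.hydrate()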

BaseResponseBody

Bases: BaseZenModel

Base body model.

Source code in src/zenml/models/v2/base/base.py, lines 72-73
class BaseResponseBody(BaseZenModel):
    """Base body model."""

BaseResponseMetadata

Bases: BaseZenModel

Base metadata model.

Used as a base class for all metadata models associated with responses.

Source code in src/zenml/models/v2/base/base.py, lines 76-80
class BaseResponseMetadata(BaseZenModel):
    """Base metadata model.

    Used as a base class for all metadata models associated with responses.
    """

BaseResponseResources

Bases: BaseZenModel

Base resources model.

Used as a base class for all resource models associated with responses.

Source code in src/zenml/models/v2/base/base.py, lines 83-89
class BaseResponseResources(BaseZenModel):
    """Base resources model.

    Used as a base class for all resource models associated with responses.
    """

    model_config = ConfigDict(extra="allow")

BaseUpdate

Bases: BaseZenModel

Base update model.

Used as a base class for all update models.

Source code in src/zenml/models/v2/base/base.py, lines 62-66
class BaseUpdate(BaseZenModel):
    """Base update model.

    Used as a base class for all update models.
    """

BaseZenModel

Bases: YAMLSerializationMixin, AnalyticsTrackedModelMixin

Base model class for all ZenML models.

This class is used as a base class for all ZenML models. It provides functionality for tracking analytics events.

Source code in src/zenml/models/v2/base/base.py, lines 33-46
class BaseZenModel(YAMLSerializationMixin, AnalyticsTrackedModelMixin):
    """Base model class for all ZenML models.

    This class is used as a base class for all ZenML models. It provides
    functionality for tracking analytics events.
    """

    model_config = ConfigDict(
        # Ignore extras on all models to support forwards and backwards
        # compatibility (e.g. new fields in newer versions of ZenML servers
        # are allowed to be passed to older versions of ZenML clients and
        # vice versa but will be ignored).
        extra="ignore",
    )

BoolFilter

Bases: Filter

Filter for all Boolean fields.

Source code in src/zenml/models/v2/base/filter.py, lines 148-168
class BoolFilter(Filter):
    """Filter for all Boolean fields."""

    ALLOWED_OPS: ClassVar[List[str]] = [
        GenericFilterOps.EQUALS,
        GenericFilterOps.NOT_EQUALS,
    ]

    def generate_query_conditions_from_column(self, column: Any) -> Any:
        """Generate query conditions for a boolean column.

        Args:
            column: The boolean column of an SQLModel table on which to filter.

        Returns:
            A list of query conditions.
        """
        if self.operation == GenericFilterOps.NOT_EQUALS:
            return column != self.value

        return column == self.value

generate_query_conditions_from_column(column)

Generate query conditions for a boolean column.

Parameters:

Name Type Description Default
column Any

The boolean column of an SQLModel table on which to filter.

required

Returns:

Type Description
Any

A list of query conditions.

Source code in src/zenml/models/v2/base/filter.py, lines 156-168
def generate_query_conditions_from_column(self, column: Any) -> Any:
    """Generate query conditions for a boolean column.

    Args:
        column: The boolean column of an SQLModel table on which to filter.

    Returns:
        A list of query conditions.
    """
    if self.operation == GenericFilterOps.NOT_EQUALS:
        return column != self.value

    return column == self.value
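
As a rough illustration of what the method above produces, the helper below mirrors its dispatch on a hypothetical SQLModel table. The table and helper are made up for the example and are not part of the ZenML API.

from typing import Optional

from sqlmodel import Field, SQLModel, select

class RunSchema(SQLModel, table=True):  # hypothetical table, for illustration only
    id: Optional[int] = Field(default=None, primary_key=True)
    is_finished: bool = False

def boolean_condition(column, operation, value):
    # Mirrors BoolFilter.generate_query_conditions_from_column above.
    if operation == "notequals":
        return column != value
    return column == value

statement = select(RunSchema).where(
    boolean_condition(RunSchema.is_finished, "equals", True)
)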

BuildItem

Bases: BaseModel

Pipeline build item.

Attributes:

Name Type Description
image str

The image name or digest.

dockerfile Optional[str]

The contents of the Dockerfile used to build the image.

requirements Optional[str]

The pip requirements installed in the image. This is a string consisting of multiple concatenated requirements.txt files.

settings_checksum Optional[str]

Checksum of the settings used for the build.

contains_code bool

Whether the image contains user files.

requires_code_download bool

Whether the image needs to download files.

Source code in src/zenml/models/v2/misc/build_item.py, lines 21-49
class BuildItem(BaseModel):
    """Pipeline build item.

    Attributes:
        image: The image name or digest.
        dockerfile: The contents of the Dockerfile used to build the image.
        requirements: The pip requirements installed in the image. This is a
            string consisting of multiple concatenated requirements.txt files.
        settings_checksum: Checksum of the settings used for the build.
        contains_code: Whether the image contains user files.
        requires_code_download: Whether the image needs to download files.
    """

    image: str = Field(title="The image name or digest.")
    dockerfile: Optional[str] = Field(
        default=None, title="The dockerfile used to build the image."
    )
    requirements: Optional[str] = Field(
        default=None, title="The pip requirements installed in the image."
    )
    settings_checksum: Optional[str] = Field(
        default=None, title="The checksum of the build settings."
    )
    contains_code: bool = Field(
        default=True, title="Whether the image contains user files."
    )
    requires_code_download: bool = Field(
        default=False, title="Whether the image needs to download files."
    )
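
A minimal construction sketch for the model above. The image reference, requirements and checksum are placeholder values, and the import path assumes `BuildItem` is re-exported from `zenml.models` as in recent releases.

from zenml.models import BuildItem

item = BuildItem(
    image="registry.example.com/zenml/pipeline@sha256:abc123",  # placeholder digest
    dockerfile="FROM python:3.11\nRUN pip install zenml\n",
    requirements="zenml\nscikit-learn\n",  # concatenated requirements.txt contents
    settings_checksum="d41d8cd9",  # placeholder checksum of the build settings
    contains_code=True,
    requires_code_download=False,
)
print(item.image)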

CodeReferenceRequest

Bases: BaseRequest

Request model for code references.

Source code in src/zenml/models/v2/core/code_reference.py, lines 36-45
class CodeReferenceRequest(BaseRequest):
    """Request model for code references."""

    commit: str = Field(description="The commit of the code reference.")
    subdirectory: str = Field(
        description="The subdirectory of the code reference."
    )
    code_repository: UUID = Field(
        description="The repository of the code reference."
    )
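
A short sketch of building the request above. The commit hash and repository UUID are placeholders, and the import assumes `CodeReferenceRequest` is re-exported from `zenml.models`.

from uuid import UUID

from zenml.models import CodeReferenceRequest

code_reference = CodeReferenceRequest(
    commit="7f3a2c1",  # placeholder commit SHA
    subdirectory="pipelines/training",
    code_repository=UUID("12345678-1234-5678-1234-567812345678"),  # placeholder ID
)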

CodeReferenceResponse

Bases: BaseIdentifiedResponse[CodeReferenceResponseBody, CodeReferenceResponseMetadata, CodeReferenceResponseResources]

Response model for code references.

Source code in src/zenml/models/v2/core/code_reference.py, lines 76-121
class CodeReferenceResponse(
    BaseIdentifiedResponse[
        CodeReferenceResponseBody,
        CodeReferenceResponseMetadata,
        CodeReferenceResponseResources,
    ]
):
    """Response model for code references."""

    def get_hydrated_version(self) -> "CodeReferenceResponse":
        """Get the hydrated version of this code reference.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_code_reference(self.id)

    # Body and metadata properties
    @property
    def commit(self) -> str:
        """The `commit` property.

        Returns:
            the value of the property.
        """
        return self.get_body().commit

    @property
    def subdirectory(self) -> str:
        """The `subdirectory` property.

        Returns:
            the value of the property.
        """
        return self.get_body().subdirectory

    @property
    def code_repository(self) -> "CodeRepositoryResponse":
        """The `code_repository` property.

        Returns:
            the value of the property.
        """
        return self.get_body().code_repository

code_repository property

The code_repository property.

Returns:

Type Description
CodeRepositoryResponse

the value of the property.

commit property

The commit property.

Returns:

Type Description
str

the value of the property.

subdirectory property

The subdirectory property.

Returns:

Type Description
str

the value of the property.

get_hydrated_version()

Get the hydrated version of this code reference.

Returns:

Type Description
CodeReferenceResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/code_reference.py, lines 85-93
def get_hydrated_version(self) -> "CodeReferenceResponse":
    """Get the hydrated version of this code reference.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_code_reference(self.id)

CodeReferenceResponseBody

Bases: BaseDatedResponseBody

Response body for code references.

Source code in src/zenml/models/v2/core/code_reference.py, lines 56-65
class CodeReferenceResponseBody(BaseDatedResponseBody):
    """Response body for code references."""

    commit: str = Field(description="The commit of the code reference.")
    subdirectory: str = Field(
        description="The subdirectory of the code reference."
    )
    code_repository: "CodeRepositoryResponse" = Field(
        description="The repository of the code reference."
    )

CodeReferenceResponseMetadata

Bases: BaseResponseMetadata

Response metadata for code references.

Source code in src/zenml/models/v2/core/code_reference.py, lines 68-69
class CodeReferenceResponseMetadata(BaseResponseMetadata):
    """Response metadata for code references."""

CodeRepositoryFilter

Bases: ProjectScopedFilter

Model to enable advanced filtering of all code repositories.

Source code in src/zenml/models/v2/core/code_repository.py, lines 184-190
class CodeRepositoryFilter(ProjectScopedFilter):
    """Model to enable advanced filtering of all code repositories."""

    name: Optional[str] = Field(
        description="Name of the code repository.",
        default=None,
    )

CodeRepositoryRequest

Bases: ProjectScopedRequest

Request model for code repositories.

Source code in src/zenml/models/v2/core/code_repository.py, lines 35-55
class CodeRepositoryRequest(ProjectScopedRequest):
    """Request model for code repositories."""

    name: str = Field(
        title="The name of the code repository.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    config: Dict[str, Any] = Field(
        description="Configuration for the code repository."
    )
    source: Source = Field(description="The code repository source.")
    logo_url: Optional[str] = Field(
        description="Optional URL of a logo (png, jpg or svg) for the "
        "code repository.",
        default=None,
    )
    description: Optional[str] = Field(
        description="Code repository description.",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )

CodeRepositoryResponse

Bases: ProjectScopedResponse[CodeRepositoryResponseBody, CodeRepositoryResponseMetadata, CodeRepositoryResponseResources]

Response model for code repositories.

Source code in src/zenml/models/v2/core/code_repository.py, lines 119-178
class CodeRepositoryResponse(
    ProjectScopedResponse[
        CodeRepositoryResponseBody,
        CodeRepositoryResponseMetadata,
        CodeRepositoryResponseResources,
    ]
):
    """Response model for code repositories."""

    name: str = Field(
        title="The name of the code repository.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "CodeRepositoryResponse":
        """Get the hydrated version of this code repository.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_code_repository(self.id)

    # Body and metadata properties
    @property
    def source(self) -> Source:
        """The `source` property.

        Returns:
            the value of the property.
        """
        return self.get_body().source

    @property
    def logo_url(self) -> Optional[str]:
        """The `logo_url` property.

        Returns:
            the value of the property.
        """
        return self.get_body().logo_url

    @property
    def config(self) -> Dict[str, Any]:
        """The `config` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().config

    @property
    def description(self) -> Optional[str]:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

config property

The config property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

description property

The description property.

Returns:

Type Description
Optional[str]

the value of the property.

logo_url property

The logo_url property.

Returns:

Type Description
Optional[str]

the value of the property.

source property

The source property.

Returns:

Type Description
Source

the value of the property.

get_hydrated_version()

Get the hydrated version of this code repository.

Returns:

Type Description
CodeRepositoryResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/code_repository.py, lines 133-141
def get_hydrated_version(self) -> "CodeRepositoryResponse":
    """Get the hydrated version of this code repository.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_code_repository(self.id)

CodeRepositoryResponseBody

Bases: ProjectScopedResponseBody

Response body for code repositories.

Source code in src/zenml/models/v2/core/code_repository.py, lines 91-99
class CodeRepositoryResponseBody(ProjectScopedResponseBody):
    """Response body for code repositories."""

    source: Source = Field(description="The code repository source.")
    logo_url: Optional[str] = Field(
        default=None,
        description="Optional URL of a logo (png, jpg or svg) for the "
        "code repository.",
    )

CodeRepositoryResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for code repositories.

Source code in src/zenml/models/v2/core/code_repository.py, lines 102-112
class CodeRepositoryResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for code repositories."""

    config: Dict[str, Any] = Field(
        description="Configuration for the code repository."
    )
    description: Optional[str] = Field(
        default=None,
        description="Code repository description.",
        max_length=TEXT_FIELD_MAX_LENGTH,
    )

CodeRepositoryResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the code repository entity.

Source code in src/zenml/models/v2/core/code_repository.py, lines 115-116
class CodeRepositoryResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the code repository entity."""

CodeRepositoryUpdate

Bases: BaseUpdate

Update model for code repositories.

Source code in src/zenml/models/v2/core/code_repository.py, lines 61-85
class CodeRepositoryUpdate(BaseUpdate):
    """Update model for code repositories."""

    name: Optional[str] = Field(
        title="The name of the code repository.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    config: Optional[Dict[str, Any]] = Field(
        description="Configuration for the code repository.",
        default=None,
    )
    source: Optional[SourceWithValidator] = Field(
        description="The code repository source.", default=None
    )
    logo_url: Optional[str] = Field(
        description="Optional URL of a logo (png, jpg or svg) for the "
        "code repository.",
        default=None,
    )
    description: Optional[str] = Field(
        description="Code repository description.",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
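
A minimal sketch of the update model above: all fields are optional, so only the attributes that should change need to be set. The values are placeholders.

from zenml.models import CodeRepositoryUpdate

update = CodeRepositoryUpdate(
    name="internal-repo",
    description="Mirror of the team monorepo used for code tracking.",
    logo_url="https://example.com/logo.png",  # placeholder URL
)
# The update is normally applied through the client / zen store, e.g.
# Client().update_code_repository(...) on recent releases (signature may differ).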

ComponentBase

Bases: BaseModel

Base model for components.

Source code in src/zenml/models/v2/core/component.py, lines 55-85
class ComponentBase(BaseModel):
    """Base model for components."""

    name: str = Field(
        title="The name of the stack component.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    type: StackComponentType = Field(
        title="The type of the stack component.",
    )

    flavor: str = Field(
        title="The flavor of the stack component.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    configuration: Dict[str, Any] = Field(
        title="The stack component configuration.",
    )

    connector_resource_id: Optional[str] = Field(
        default=None,
        description="The ID of a specific resource instance to "
        "gain access to through the connector",
    )

    labels: Optional[Dict[str, Any]] = Field(
        default=None,
        title="The stack component labels.",
    )

ComponentFilter

Bases: UserScopedFilter

Model to enable advanced stack component filtering.

Source code in src/zenml/models/v2/core/component.py, lines 344-427
class ComponentFilter(UserScopedFilter):
    """Model to enable advanced stack component filtering."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *UserScopedFilter.FILTER_EXCLUDE_FIELDS,
        "scope_type",
        "stack_id",
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *UserScopedFilter.CLI_EXCLUDE_FIELDS,
        "scope_type",
    ]
    scope_type: Optional[str] = Field(
        default=None,
        description="The type to scope this query to.",
    )
    name: Optional[str] = Field(
        default=None,
        description="Name of the stack component",
    )
    flavor: Optional[str] = Field(
        default=None,
        description="Flavor of the stack component",
    )
    type: Optional[str] = Field(
        default=None,
        description="Type of the stack component",
    )
    connector_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Connector linked to the stack component",
        union_mode="left_to_right",
    )
    stack_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Stack of the stack component",
        union_mode="left_to_right",
    )

    def set_scope_type(self, component_type: str) -> None:
        """Set the type of component on which to perform the filtering to scope the response.

        Args:
            component_type: The type of component to scope the query to.
        """
        self.scope_type = component_type

    def generate_filter(
        self, table: Type["AnySchema"]
    ) -> Union["ColumnElement[bool]"]:
        """Generate the filter for the query.

        Stack components can be scoped by type to narrow the search.

        Args:
            table: The Table that is being queried from.

        Returns:
            The filter expression for the query.
        """
        from sqlmodel import and_, or_

        from zenml.zen_stores.schemas import (
            StackComponentSchema,
            StackCompositionSchema,
        )

        base_filter = super().generate_filter(table)
        if self.scope_type:
            type_filter = getattr(table, "type") == self.scope_type
            return and_(base_filter, type_filter)

        if self.stack_id:
            operator = (
                or_ if self.logical_operator == LogicalOperators.OR else and_
            )

            stack_filter = and_(
                StackCompositionSchema.stack_id == self.stack_id,
                StackCompositionSchema.component_id == StackComponentSchema.id,
            )
            base_filter = operator(base_filter, stack_filter)

        return base_filter

generate_filter(table)

Generate the filter for the query.

Stack components can be scoped by type to narrow the search.

Parameters:

Name Type Description Default
table Type[AnySchema]

The Table that is being queried from.

required

Returns:

Type Description
Union[ColumnElement[bool]]

The filter expression for the query.

Source code in src/zenml/models/v2/core/component.py, lines 391-427
def generate_filter(
    self, table: Type["AnySchema"]
) -> Union["ColumnElement[bool]"]:
    """Generate the filter for the query.

    Stack components can be scoped by type to narrow the search.

    Args:
        table: The Table that is being queried from.

    Returns:
        The filter expression for the query.
    """
    from sqlmodel import and_, or_

    from zenml.zen_stores.schemas import (
        StackComponentSchema,
        StackCompositionSchema,
    )

    base_filter = super().generate_filter(table)
    if self.scope_type:
        type_filter = getattr(table, "type") == self.scope_type
        return and_(base_filter, type_filter)

    if self.stack_id:
        operator = (
            or_ if self.logical_operator == LogicalOperators.OR else and_
        )

        stack_filter = and_(
            StackCompositionSchema.stack_id == self.stack_id,
            StackCompositionSchema.component_id == StackComponentSchema.id,
        )
        base_filter = operator(base_filter, stack_filter)

    return base_filter

set_scope_type(component_type)

Set the type of component on which to perform the filtering to scope the response.

Parameters:

Name Type Description Default
component_type str

The type of component to scope the query to.

required
Source code in src/zenml/models/v2/core/component.py, lines 383-389
def set_scope_type(self, component_type: str) -> None:
    """Set the type of component on which to perform the filtering to scope the response.

    Args:
        component_type: The type of component to scope the query to.
    """
    self.scope_type = component_type
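
A sketch of how the filter above is typically narrowed to a single component type. The filter values are illustrative, and the listing call mentioned at the end is indicative of how such a filter is consumed rather than a guaranteed signature.

from zenml.models import ComponentFilter

component_filter = ComponentFilter(
    flavor="mlflow",           # illustrative flavor name
    name="contains:tracking",  # ZenML filter syntax: substring match on the name
)
component_filter.set_scope_type("experiment_tracker")
# A zen store / client listing method can then apply the filter, e.g.
# zen_store.list_stack_components(component_filter) on recent versions.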

ComponentInfo

Bases: BaseModel

Information about each stack component when creating a full stack.

Source code in src/zenml/models/v2/misc/info_models.py, lines 34-46
class ComponentInfo(BaseModel):
    """Information about each stack components when creating a full stack."""

    flavor: str
    service_connector_index: Optional[int] = Field(
        default=None,
        title="The id of the service connector from the list "
        "`service_connectors`.",
        description="The id of the service connector from the list "
        "`service_connectors` from `FullStackRequest`.",
    )
    service_connector_resource_id: Optional[str] = None
    configuration: Dict[str, Any] = {}

ComponentRequest

Bases: ComponentBase, UserScopedRequest

Request model for stack components.

Source code in src/zenml/models/v2/core/component.py, lines 91-120
class ComponentRequest(ComponentBase, UserScopedRequest):
    """Request model for stack components."""

    ANALYTICS_FIELDS: ClassVar[List[str]] = ["type", "flavor"]

    connector: Optional[UUID] = Field(
        default=None,
        title="The service connector linked to this stack component.",
    )

    @field_validator("name")
    @classmethod
    def name_cant_be_a_secret_reference(cls, name: str) -> str:
        """Validator to ensure that the given name is not a secret reference.

        Args:
            name: The name to validate.

        Returns:
            The name if it is not a secret reference.

        Raises:
            ValueError: If the name is a secret reference.
        """
        if secret_utils.is_secret_reference(name):
            raise ValueError(
                "Passing the `name` attribute of a stack component as a "
                "secret reference is not allowed."
            )
        return name

name_cant_be_a_secret_reference(name) classmethod

Validator to ensure that the given name is not a secret reference.

Parameters:

Name Type Description Default
name str

The name to validate.

required

Returns:

Type Description
str

The name if it is not a secret reference.

Raises:

Type Description
ValueError

If the name is a secret reference.

Source code in src/zenml/models/v2/core/component.py, lines 101-120
@field_validator("name")
@classmethod
def name_cant_be_a_secret_reference(cls, name: str) -> str:
    """Validator to ensure that the given name is not a secret reference.

    Args:
        name: The name to validate.

    Returns:
        The name if it is not a secret reference.

    Raises:
        ValueError: If the name is a secret reference.
    """
    if secret_utils.is_secret_reference(name):
        raise ValueError(
            "Passing the `name` attribute of a stack component as a "
            "secret reference is not allowed."
        )
    return name
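
To make the validator above concrete: configuration values of a stack component may be given as ZenML secret references, but the component name may not. A small, hedged illustration using the same `secret_utils` helper the validator relies on; the secret and configuration keys are made up.

from zenml.utils import secret_utils

# Secret references use the `{{ secret_name.key }}` syntax.
print(secret_utils.is_secret_reference("{{ mlflow_secret.tracking_token }}"))  # expected: True
print(secret_utils.is_secret_reference("my-experiment-tracker"))               # expected: False

configuration = {
    "tracking_uri": "https://mlflow.example.com",            # plain value
    "tracking_token": "{{ mlflow_secret.tracking_token }}",  # secret reference, allowed here
}
name = "{{ mlflow_secret.name }}"  # passing this as the component name raises ValueError above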

ComponentResponse

Bases: UserScopedResponse[ComponentResponseBody, ComponentResponseMetadata, ComponentResponseResources]

Response model for stack components.

Source code in src/zenml/models/v2/core/component.py, lines 212-338
class ComponentResponse(
    UserScopedResponse[
        ComponentResponseBody,
        ComponentResponseMetadata,
        ComponentResponseResources,
    ]
):
    """Response model for stack components."""

    ANALYTICS_FIELDS: ClassVar[List[str]] = ["type"]

    name: str = Field(
        title="The name of the stack component.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_analytics_metadata(self) -> Dict[str, Any]:
        """Add the component labels to analytics metadata.

        Returns:
            Dict of analytics metadata.
        """
        metadata = super().get_analytics_metadata()

        if self.labels is not None:
            metadata.update(
                {
                    label[6:]: value
                    for label, value in self.labels.items()
                    if label.startswith("zenml:")
                }
            )
        metadata["flavor"] = self.flavor_name

        return metadata

    def get_hydrated_version(self) -> "ComponentResponse":
        """Get the hydrated version of this component.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_stack_component(self.id)

    # Body and metadata properties
    @property
    def type(self) -> StackComponentType:
        """The `type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().type

    @property
    def flavor_name(self) -> str:
        """The `flavor_name` property.

        Returns:
            the value of the property.
        """
        return self.get_body().flavor_name

    @property
    def integration(self) -> Optional[str]:
        """The `integration` property.

        Returns:
            the value of the property.
        """
        return self.get_body().integration

    @property
    def logo_url(self) -> Optional[str]:
        """The `logo_url` property.

        Returns:
            the value of the property.
        """
        return self.get_body().logo_url

    @property
    def configuration(self) -> Dict[str, Any]:
        """The `configuration` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().configuration

    @property
    def labels(self) -> Optional[Dict[str, Any]]:
        """The `labels` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().labels

    @property
    def connector_resource_id(self) -> Optional[str]:
        """The `connector_resource_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().connector_resource_id

    @property
    def connector(self) -> Optional["ServiceConnectorResponse"]:
        """The `connector` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().connector

    @property
    def flavor(self) -> "FlavorResponse":
        """The `flavor` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().flavor

configuration property

The configuration property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

connector property

The connector property.

Returns:

Type Description
Optional[ServiceConnectorResponse]

the value of the property.

connector_resource_id property

The connector_resource_id property.

Returns:

Type Description
Optional[str]

the value of the property.

flavor property

The flavor property.

Returns:

Type Description
FlavorResponse

the value of the property.

flavor_name property

The flavor_name property.

Returns:

Type Description
str

the value of the property.

integration property

The integration property.

Returns:

Type Description
Optional[str]

the value of the property.

labels property

The labels property.

Returns:

Type Description
Optional[Dict[str, Any]]

the value of the property.

logo_url property

The logo_url property.

Returns:

Type Description
Optional[str]

the value of the property.

type property

The type property.

Returns:

Type Description
StackComponentType

the value of the property.

get_analytics_metadata()

Add the component labels to analytics metadata.

Returns:

Type Description
Dict[str, Any]

Dict of analytics metadata.

Source code in src/zenml/models/v2/core/component.py, lines 228-246
def get_analytics_metadata(self) -> Dict[str, Any]:
    """Add the component labels to analytics metadata.

    Returns:
        Dict of analytics metadata.
    """
    metadata = super().get_analytics_metadata()

    if self.labels is not None:
        metadata.update(
            {
                label[6:]: value
                for label, value in self.labels.items()
                if label.startswith("zenml:")
            }
        )
    metadata["flavor"] = self.flavor_name

    return metadata
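
The label handling above only forwards labels carrying the `zenml:` prefix to analytics and strips that prefix via `label[6:]` (six characters being the length of `zenml:`). A standalone illustration:

labels = {"zenml:pro": "true", "zenml:provider": "aws", "team": "platform"}
analytics_labels = {
    label[6:]: value
    for label, value in labels.items()
    if label.startswith("zenml:")
}
print(analytics_labels)  # {'pro': 'true', 'provider': 'aws'}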

get_hydrated_version()

Get the hydrated version of this component.

Returns:

Type Description
ComponentResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/component.py, lines 248-256
def get_hydrated_version(self) -> "ComponentResponse":
    """Get the hydrated version of this component.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_stack_component(self.id)

ComponentResponseBody

Bases: UserScopedResponseBody

Response body for stack components.

Source code in src/zenml/models/v2/core/component.py, lines 160-180
class ComponentResponseBody(UserScopedResponseBody):
    """Response body for stack components."""

    type: StackComponentType = Field(
        title="The type of the stack component.",
    )
    flavor_name: str = Field(
        title="The flavor of the stack component.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    integration: Optional[str] = Field(
        default=None,
        title="The name of the integration that the component's flavor "
        "belongs to.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    logo_url: Optional[str] = Field(
        default=None,
        title="Optionally, a url pointing to a png,"
        "svg or jpg can be attached.",
    )

ComponentResponseMetadata

Bases: UserScopedResponseMetadata

Response metadata for stack components.

Source code in src/zenml/models/v2/core/component.py, lines 183-201
class ComponentResponseMetadata(UserScopedResponseMetadata):
    """Response metadata for stack components."""

    configuration: Dict[str, Any] = Field(
        title="The stack component configuration.",
    )
    labels: Optional[Dict[str, Any]] = Field(
        default=None,
        title="The stack component labels.",
    )
    connector_resource_id: Optional[str] = Field(
        default=None,
        description="The ID of a specific resource instance to "
        "gain access to through the connector",
    )
    connector: Optional["ServiceConnectorResponse"] = Field(
        default=None,
        title="The service connector linked to this stack component.",
    )

ComponentResponseResources

Bases: UserScopedResponseResources

Response resources for stack components.

Source code in src/zenml/models/v2/core/component.py, lines 204-209
class ComponentResponseResources(UserScopedResponseResources):
    """Response resources for stack components."""

    flavor: "FlavorResponse" = Field(
        title="The flavor of this stack component.",
    )

ComponentUpdate

Bases: BaseUpdate

Update model for stack components.

Source code in src/zenml/models/v2/core/component.py, lines 130-154
class ComponentUpdate(BaseUpdate):
    """Update model for stack components."""

    name: Optional[str] = Field(
        title="The name of the stack component.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    configuration: Optional[Dict[str, Any]] = Field(
        title="The stack component configuration.",
        default=None,
    )
    connector_resource_id: Optional[str] = Field(
        description="The ID of a specific resource instance to "
        "gain access to through the connector",
        default=None,
    )
    labels: Optional[Dict[str, Any]] = Field(
        title="The stack component labels.",
        default=None,
    )
    connector: Optional[UUID] = Field(
        title="The service connector linked to this stack component.",
        default=None,
    )

DefaultComponentRequest

Bases: ComponentRequest

Internal component request model used only for default stack components.

Source code in src/zenml/models/v2/core/component.py, lines 123-124
class DefaultComponentRequest(ComponentRequest):
    """Internal component request model used only for default stack components."""

DefaultStackRequest

Bases: StackRequest

Internal stack request model used only for default stacks.

Source code in src/zenml/models/v2/core/stack.py, lines 145-146
class DefaultStackRequest(StackRequest):
    """Internal stack request model used only for default stacks."""

DeployedStack

Bases: BaseModel

Information about a deployed stack.

Source code in src/zenml/models/v2/misc/stack_deployment.py, lines 85-96
class DeployedStack(BaseModel):
    """Information about a deployed stack."""

    stack: StackResponse = Field(
        title="The stack that was deployed.",
        description="The stack that was deployed.",
    )
    service_connector: Optional[ServiceConnectorResponse] = Field(
        default=None,
        title="The service connector for the deployed stack.",
        description="The service connector for the deployed stack.",
    )

EventSourceFilter

Bases: ProjectScopedFilter

Model to enable advanced filtering of all EventSourceModels.

Source code in src/zenml/models/v2/core/event_source.py, lines 229-244
class EventSourceFilter(ProjectScopedFilter):
    """Model to enable advanced filtering of all EventSourceModels."""

    name: Optional[str] = Field(
        default=None,
        description="Name of the event source",
    )
    flavor: Optional[str] = Field(
        default=None,
        description="Flavor of the event source",
    )
    plugin_subtype: Optional[str] = Field(
        default=None,
        title="The plugin sub type of the event source.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

EventSourceFlavorResponse

Bases: BasePluginFlavorResponse[EventSourceFlavorResponseBody, EventSourceFlavorResponseMetadata, EventSourceFlavorResponseResources]

Response model for Event Source Flavors.

Source code in src/zenml/models/v2/core/event_source_flavor.py, lines 41-67
class EventSourceFlavorResponse(
    BasePluginFlavorResponse[
        EventSourceFlavorResponseBody,
        EventSourceFlavorResponseMetadata,
        EventSourceFlavorResponseResources,
    ]
):
    """Response model for Event Source Flavors."""

    # Body and metadata properties
    @property
    def source_config_schema(self) -> Dict[str, Any]:
        """The `source_config_schema` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().source_config_schema

    @property
    def filter_config_schema(self) -> Dict[str, Any]:
        """The `filter_config_schema` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().filter_config_schema

filter_config_schema property

The filter_config_schema property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

source_config_schema property

The source_config_schema property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

EventSourceFlavorResponseBody

Bases: BasePluginResponseBody

Response body for event flavors.

Source code in src/zenml/models/v2/core/event_source_flavor.py, lines 26-27
class EventSourceFlavorResponseBody(BasePluginResponseBody):
    """Response body for event flavors."""

EventSourceFlavorResponseMetadata

Bases: BasePluginResponseMetadata

Response metadata for event flavors.

Source code in src/zenml/models/v2/core/event_source_flavor.py, lines 30-34
class EventSourceFlavorResponseMetadata(BasePluginResponseMetadata):
    """Response metadata for event flavors."""

    source_config_schema: Dict[str, Any]
    filter_config_schema: Dict[str, Any]

EventSourceFlavorResponseResources

Bases: BasePluginResponseResources

Response resources for event source flavors.

Source code in src/zenml/models/v2/core/event_source_flavor.py, lines 37-38
class EventSourceFlavorResponseResources(BasePluginResponseResources):
    """Response resources for event source flavors."""

EventSourceRequest

Bases: ProjectScopedRequest

BaseModel for all event sources.

Source code in src/zenml/models/v2/core/event_source.py, lines 38-60
class EventSourceRequest(ProjectScopedRequest):
    """BaseModel for all event sources."""

    name: str = Field(
        title="The name of the event source.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    flavor: str = Field(
        title="The flavor of event source.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    plugin_subtype: PluginSubType = Field(
        title="The plugin subtype of the event source.",
    )
    description: str = Field(
        default="",
        title="The description of the event source.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    configuration: Dict[str, Any] = Field(
        title="The event source configuration.",
    )

EventSourceResponse

Bases: ProjectScopedResponse[EventSourceResponseBody, EventSourceResponseMetadata, EventSourceResponseResources]

Response model for event sources.

Source code in src/zenml/models/v2/core/event_source.py
147
148
149
150
151
152
153
154
155
156
157
158
159
160
161
162
163
164
165
166
167
168
169
170
171
172
173
174
175
176
177
178
179
180
181
182
183
184
185
186
187
188
189
190
191
192
193
194
195
196
197
198
199
200
201
202
203
204
205
206
207
208
209
210
211
212
213
214
215
216
217
218
219
220
221
222
223
class EventSourceResponse(
    ProjectScopedResponse[
        EventSourceResponseBody,
        EventSourceResponseMetadata,
        EventSourceResponseResources,
    ]
):
    """Response model for event sources."""

    name: str = Field(
        title="The name of the event source.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "EventSourceResponse":
        """Get the hydrated version of this event source.

        Returns:
            An instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_event_source(self.id)

    # Body and metadata properties
    @property
    def flavor(self) -> str:
        """The `flavor` property.

        Returns:
            the value of the property.
        """
        return self.get_body().flavor

    @property
    def is_active(self) -> bool:
        """The `is_active` property.

        Returns:
            the value of the property.
        """
        return self.get_body().is_active

    @property
    def plugin_subtype(self) -> PluginSubType:
        """The `plugin_subtype` property.

        Returns:
            the value of the property.
        """
        return self.get_body().plugin_subtype

    @property
    def description(self) -> str:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

    @property
    def configuration(self) -> Dict[str, Any]:
        """The `configuration` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().configuration

    def set_configuration(self, configuration: Dict[str, Any]) -> None:
        """Set the `configuration` property.

        Args:
            configuration: The value to set.
        """
        self.get_metadata().configuration = configuration

configuration property

The configuration property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

description property

The description property.

Returns:

Type Description
str

the value of the property.

flavor property

The flavor property.

Returns:

Type Description
str

the value of the property.

is_active property

The is_active property.

Returns:

Type Description
bool

the value of the property.

plugin_subtype property

The plugin_subtype property.

Returns:

Type Description
PluginSubType

the value of the property.

get_hydrated_version()

Get the hydrated version of this event source.

Returns:

Type Description
EventSourceResponse

An instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/event_source.py
161
162
163
164
165
166
167
168
169
def get_hydrated_version(self) -> "EventSourceResponse":
    """Get the hydrated version of this event source.

    Returns:
        An instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_event_source(self.id)

set_configuration(configuration)

Set the configuration property.

Parameters:

Name Type Description Default
configuration Dict[str, Any]

The value to set.

required
Source code in src/zenml/models/v2/core/event_source.py
217
218
219
220
221
222
223
def set_configuration(self, configuration: Dict[str, Any]) -> None:
    """Set the `configuration` property.

    Args:
        configuration: The value to set.
    """
    self.get_metadata().configuration = configuration
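
Example (a minimal sketch): flavor and is_active come from the response body, while description and configuration live in the metadata and are typically fetched via get_hydrated_version() on first access. The event source ID is a placeholder.

from uuid import UUID
from zenml.client import Client

event_source = Client().zen_store.get_event_source(
    UUID("00000000-0000-0000-0000-000000000000")  # placeholder ID
)
print(event_source.name, event_source.flavor, event_source.is_active)
print(event_source.configuration)  # metadata field, may trigger hydration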

EventSourceResponseBody

Bases: ProjectScopedResponseBody

ResponseBody for event sources.

Source code in src/zenml/models/v2/core/event_source.py
111
112
113
114
115
116
117
118
119
120
121
122
123
class EventSourceResponseBody(ProjectScopedResponseBody):
    """ResponseBody for event sources."""

    flavor: str = Field(
        title="The flavor of event source.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    plugin_subtype: PluginSubType = Field(
        title="The plugin subtype of the event source.",
    )
    is_active: bool = Field(
        title="Whether the event source is active.",
    )

EventSourceResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for event sources.

Source code in src/zenml/models/v2/core/event_source.py
126
127
128
129
130
131
132
133
134
135
136
class EventSourceResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for event sources."""

    description: str = Field(
        default="",
        title="The description of the event source.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    configuration: Dict[str, Any] = Field(
        title="The event source configuration.",
    )

EventSourceResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the event source entity.

Source code in src/zenml/models/v2/core/event_source.py
139
140
141
142
143
144
class EventSourceResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the code repository entity."""

    triggers: Page[TriggerResponse] = Field(
        title="The triggers configured with this event source.",
    )

EventSourceUpdate

Bases: BaseUpdate

Update model for event sources.

Source code in src/zenml/models/v2/core/event_source.py
 66
 67
 68
 69
 70
 71
 72
 73
 74
 75
 76
 77
 78
 79
 80
 81
 82
 83
 84
 85
 86
 87
 88
 89
 90
 91
 92
 93
 94
 95
 96
 97
 98
 99
100
101
102
103
104
105
class EventSourceUpdate(BaseUpdate):
    """Update model for event sources."""

    name: Optional[str] = Field(
        default=None,
        title="The updated name of the event source.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    description: Optional[str] = Field(
        default=None,
        title="The updated description of the event source.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    configuration: Optional[Dict[str, Any]] = Field(
        default=None,
        title="The updated event source configuration.",
    )
    is_active: Optional[bool] = Field(
        default=None,
        title="The status of the event source.",
    )

    @classmethod
    def from_response(
        cls, response: "EventSourceResponse"
    ) -> "EventSourceUpdate":
        """Create an update model from a response model.

        Args:
            response: The response model to create the update model from.

        Returns:
            The update model.
        """
        return EventSourceUpdate(
            name=response.name,
            description=response.description,
            configuration=copy.deepcopy(response.configuration),
            is_active=response.is_active,
        )

from_response(response) classmethod

Create an update model from a response model.

Parameters:

Name Type Description Default
response EventSourceResponse

The response model to create the update model from.

required

Returns:

Type Description
EventSourceUpdate

The update model.

Source code in src/zenml/models/v2/core/event_source.py
 88
 89
 90
 91
 92
 93
 94
 95
 96
 97
 98
 99
100
101
102
103
104
105
@classmethod
def from_response(
    cls, response: "EventSourceResponse"
) -> "EventSourceUpdate":
    """Create an update model from a response model.

    Args:
        response: The response model to create the update model from.

    Returns:
        The update model.
    """
    return EventSourceUpdate(
        name=response.name,
        description=response.description,
        configuration=copy.deepcopy(response.configuration),
        is_active=response.is_active,
    )
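
Example (a read-modify-write sketch built on from_response): the configuration key shown is an illustrative, flavor-specific assumption; persisting the update goes through whichever client or store call your code already uses.

update = EventSourceUpdate.from_response(existing_event_source)
update.is_active = False                # pause the event source
update.configuration = {
    **(update.configuration or {}),
    "branch": "main",                   # illustrative, flavor-specific key
}
# The update model is then passed to the usual update call on the client/store.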

ExternalUserModel

Bases: BaseModel

External user model.

Source code in src/zenml/models/v2/misc/external_user.py
22
23
24
25
26
27
28
29
30
class ExternalUserModel(BaseModel):
    """External user model."""

    id: UUID
    email: str
    name: Optional[str] = None
    is_admin: bool = False

    model_config = ConfigDict(extra="ignore")
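
Example (a minimal sketch): because extra keys are ignored, the model can be populated directly from a third-party identity payload. The claim names below are illustrative; the import assumes the usual re-export from zenml.models.

from uuid import uuid4

from zenml.models import ExternalUserModel

payload = {
    "id": str(uuid4()),
    "email": "jane@example.com",
    "name": "Jane Doe",
    "nickname": "jd",             # unknown claim, silently dropped by extra="ignore"
}
user = ExternalUserModel(**payload)
assert user.is_admin is False     # default when the claim is absent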

FlavorFilter

Bases: UserScopedFilter

Model to enable advanced stack component flavor filtering.

Source code in src/zenml/models/v2/core/flavor.py
388
389
390
391
392
393
394
395
396
397
398
399
400
401
402
class FlavorFilter(UserScopedFilter):
    """Model to enable advanced stack component flavor filtering."""

    name: Optional[str] = Field(
        default=None,
        description="Name of the flavor",
    )
    type: Optional[str] = Field(
        default=None,
        description="Stack Component Type of the stack flavor",
    )
    integration: Optional[str] = Field(
        default=None,
        description="Integration associated with the flavor",
    )
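
Example (a construction sketch): type and integration are plain equality filters here, while the "contains:" prefix on name follows ZenML's generic filter-string syntax (stated as an assumption). The import assumes the usual re-export from zenml.models.

from zenml.models import FlavorFilter

flavor_filter = FlavorFilter(
    type="orchestrator",           # stack component type to match
    name="contains:kube",          # assumed operator-prefix syntax
    integration="kubernetes",
)
# The filter is then passed to the flavor listing call on the client or store.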

FlavorRequest

Bases: UserScopedRequest

Request model for stack component flavors.

Source code in src/zenml/models/v2/core/flavor.py
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
69
70
71
72
73
74
75
76
77
78
79
80
81
82
83
84
85
86
87
88
89
90
91
92
93
94
95
96
97
98
class FlavorRequest(UserScopedRequest):
    """Request model for stack component flavors."""

    ANALYTICS_FIELDS: ClassVar[List[str]] = [
        "type",
        "integration",
    ]

    name: str = Field(
        title="The name of the Flavor.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    type: StackComponentType = Field(title="The type of the Flavor.")
    config_schema: Dict[str, Any] = Field(
        title="The JSON schema of this flavor's corresponding configuration.",
    )
    connector_type: Optional[str] = Field(
        default=None,
        title="The type of the connector that this flavor uses.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    connector_resource_type: Optional[str] = Field(
        default=None,
        title="The resource type of the connector that this flavor uses.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    connector_resource_id_attr: Optional[str] = Field(
        default=None,
        title="The name of an attribute in the stack component configuration "
        "that plays the role of resource ID when linked to a service "
        "connector.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    source: str = Field(
        title="The path to the module which contains this Flavor.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    integration: Optional[str] = Field(
        title="The name of the integration that the Flavor belongs to.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    logo_url: Optional[str] = Field(
        default=None,
        title="Optionally, a url pointing to a png,"
        "svg or jpg can be attached.",
    )
    docs_url: Optional[str] = Field(
        default=None,
        title="Optionally, a url pointing to docs, within docs.zenml.io.",
    )
    sdk_docs_url: Optional[str] = Field(
        default=None,
        title="Optionally, a url pointing to SDK docs,"
        "within sdkdocs.zenml.io.",
    )
    is_custom: bool = Field(
        title="Whether or not this flavor is a custom, user created flavor.",
        default=True,
    )
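
Example (a minimal sketch for registering a custom flavor): the source path and config schema are illustrative, and any UserScopedRequest base fields are assumed to be supplied as well.

from zenml.enums import StackComponentType
from zenml.models import FlavorRequest

flavor_request = FlavorRequest(
    name="my_orchestrator",
    type=StackComponentType.ORCHESTRATOR,
    config_schema={"type": "object", "properties": {}},  # illustrative JSON schema
    source="my_package.flavors.MyOrchestratorFlavor",    # illustrative module path
    integration=None,
    is_custom=True,
    # ...plus any UserScopedRequest base fields your server requires.
)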

FlavorResponse

Bases: UserScopedResponse[FlavorResponseBody, FlavorResponseMetadata, FlavorResponseResources]

Response model for stack component flavors.

Source code in src/zenml/models/v2/core/flavor.py
230
231
232
233
234
235
236
237
238
239
240
241
242
243
244
245
246
247
248
249
250
251
252
253
254
255
256
257
258
259
260
261
262
263
264
265
266
267
268
269
270
271
272
273
274
275
276
277
278
279
280
281
282
283
284
285
286
287
288
289
290
291
292
293
294
295
296
297
298
299
300
301
302
303
304
305
306
307
308
309
310
311
312
313
314
315
316
317
318
319
320
321
322
323
324
325
326
327
328
329
330
331
332
333
334
335
336
337
338
339
340
341
342
343
344
345
346
347
348
349
350
351
352
353
354
355
356
357
358
359
360
361
362
363
364
365
366
367
368
369
370
371
372
373
374
375
376
377
378
379
380
381
382
class FlavorResponse(
    UserScopedResponse[
        FlavorResponseBody,
        FlavorResponseMetadata,
        FlavorResponseResources,
    ]
):
    """Response model for stack component flavors."""

    # Analytics
    ANALYTICS_FIELDS: ClassVar[List[str]] = [
        "id",
        "type",
        "integration",
    ]

    name: str = Field(
        title="The name of the Flavor.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "FlavorResponse":
        """Get the hydrated version of the flavor.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_flavor(self.id)

    # Helper methods
    @property
    def connector_requirements(
        self,
    ) -> Optional["ServiceConnectorRequirements"]:
        """Returns the connector requirements for the flavor.

        Returns:
            The connector requirements for the flavor.
        """
        from zenml.models import (
            ServiceConnectorRequirements,
        )

        if not self.connector_resource_type:
            return None

        return ServiceConnectorRequirements(
            connector_type=self.connector_type,
            resource_type=self.connector_resource_type,
            resource_id_attr=self.connector_resource_id_attr,
        )

    # Body and metadata properties
    @property
    def type(self) -> StackComponentType:
        """The `type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().type

    @property
    def integration(self) -> Optional[str]:
        """The `integration` property.

        Returns:
            the value of the property.
        """
        return self.get_body().integration

    @property
    def source(self) -> str:
        """The `source` property.

        Returns:
            the value of the property.
        """
        return self.get_body().source

    @property
    def logo_url(self) -> Optional[str]:
        """The `logo_url` property.

        Returns:
            the value of the property.
        """
        return self.get_body().logo_url

    @property
    def is_custom(self) -> bool:
        """The `is_custom` property.

        Returns:
            the value of the property.
        """
        return self.get_body().is_custom

    @property
    def config_schema(self) -> Dict[str, Any]:
        """The `config_schema` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().config_schema

    @property
    def connector_type(self) -> Optional[str]:
        """The `connector_type` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().connector_type

    @property
    def connector_resource_type(self) -> Optional[str]:
        """The `connector_resource_type` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().connector_resource_type

    @property
    def connector_resource_id_attr(self) -> Optional[str]:
        """The `connector_resource_id_attr` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().connector_resource_id_attr

    @property
    def docs_url(self) -> Optional[str]:
        """The `docs_url` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().docs_url

    @property
    def sdk_docs_url(self) -> Optional[str]:
        """The `sdk_docs_url` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().sdk_docs_url

config_schema property

The config_schema property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

connector_requirements property

Returns the connector requirements for the flavor.

Returns:

Type Description
Optional[ServiceConnectorRequirements]

The connector requirements for the flavor.

connector_resource_id_attr property

The connector_resource_id_attr property.

Returns:

Type Description
Optional[str]

the value of the property.

connector_resource_type property

The connector_resource_type property.

Returns:

Type Description
Optional[str]

the value of the property.

connector_type property

The connector_type property.

Returns:

Type Description
Optional[str]

the value of the property.

docs_url property

The docs_url property.

Returns:

Type Description
Optional[str]

the value of the property.

integration property

The integration property.

Returns:

Type Description
Optional[str]

the value of the property.

is_custom property

The is_custom property.

Returns:

Type Description
bool

the value of the property.

logo_url property

The logo_url property.

Returns:

Type Description
Optional[str]

the value of the property.

sdk_docs_url property

The sdk_docs_url property.

Returns:

Type Description
Optional[str]

the value of the property.

source property

The source property.

Returns:

Type Description
str

the value of the property.

type property

The type property.

Returns:

Type Description
StackComponentType

the value of the property.

get_hydrated_version()

Get the hydrated version of the flavor.

Returns:

Type Description
FlavorResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/flavor.py
251
252
253
254
255
256
257
258
259
def get_hydrated_version(self) -> "FlavorResponse":
    """Get the hydrated version of the flavor.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_flavor(self.id)
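
Example (a minimal sketch): connector_requirements returns None unless the flavor declares a connector resource type; otherwise it bundles the connector fields into a ServiceConnectorRequirements object. The flavor ID below is a placeholder.

from uuid import UUID
from zenml.client import Client

flavor = Client().zen_store.get_flavor(
    UUID("00000000-0000-0000-0000-000000000000")  # placeholder ID
)
requirements = flavor.connector_requirements
if requirements is None:
    print(f"{flavor.name} does not use a service connector.")
else:
    print("Needs connector resource type:", requirements.resource_type)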

FlavorResponseBody

Bases: UserScopedResponseBody

Response body for stack component flavors.

Source code in src/zenml/models/v2/core/flavor.py
169
170
171
172
173
174
175
176
177
178
179
180
181
182
183
184
185
186
187
188
189
class FlavorResponseBody(UserScopedResponseBody):
    """Response body for stack component flavors."""

    type: StackComponentType = Field(title="The type of the Flavor.")
    integration: Optional[str] = Field(
        title="The name of the integration that the Flavor belongs to.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    source: str = Field(
        title="The path to the module which contains this Flavor.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    logo_url: Optional[str] = Field(
        default=None,
        title="Optionally, a url pointing to a png,"
        "svg or jpg can be attached.",
    )
    is_custom: bool = Field(
        title="Whether or not this flavor is a custom, user created flavor.",
        default=True,
    )

FlavorResponseMetadata

Bases: UserScopedResponseMetadata

Response metadata for stack component flavors.

Source code in src/zenml/models/v2/core/flavor.py
192
193
194
195
196
197
198
199
200
201
202
203
204
205
206
207
208
209
210
211
212
213
214
215
216
217
218
219
220
221
222
223
class FlavorResponseMetadata(UserScopedResponseMetadata):
    """Response metadata for stack component flavors."""

    config_schema: Dict[str, Any] = Field(
        title="The JSON schema of this flavor's corresponding configuration.",
    )
    connector_type: Optional[str] = Field(
        default=None,
        title="The type of the connector that this flavor uses.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    connector_resource_type: Optional[str] = Field(
        default=None,
        title="The resource type of the connector that this flavor uses.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    connector_resource_id_attr: Optional[str] = Field(
        default=None,
        title="The name of an attribute in the stack component configuration "
        "that plays the role of resource ID when linked to a service "
        "connector.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    docs_url: Optional[str] = Field(
        default=None,
        title="Optionally, a url pointing to docs, within docs.zenml.io.",
    )
    sdk_docs_url: Optional[str] = Field(
        default=None,
        title="Optionally, a url pointing to SDK docs,"
        "within sdkdocs.zenml.io.",
    )

FlavorResponseResources

Bases: UserScopedResponseResources

Response resources for stack component flavors.

Source code in src/zenml/models/v2/core/flavor.py
226
227
class FlavorResponseResources(UserScopedResponseResources):
    """Response resources for stack component flavors."""

FlavorUpdate

Bases: BaseUpdate

Update model for stack component flavors.

Source code in src/zenml/models/v2/core/flavor.py
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
140
141
142
143
144
145
146
147
148
149
150
151
152
153
154
155
156
157
158
159
160
161
162
163
class FlavorUpdate(BaseUpdate):
    """Update model for stack component flavors."""

    name: Optional[str] = Field(
        title="The name of the Flavor.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    type: Optional[StackComponentType] = Field(
        title="The type of the Flavor.", default=None
    )
    config_schema: Optional[Dict[str, Any]] = Field(
        title="The JSON schema of this flavor's corresponding configuration.",
        default=None,
    )
    connector_type: Optional[str] = Field(
        title="The type of the connector that this flavor uses.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    connector_resource_type: Optional[str] = Field(
        title="The resource type of the connector that this flavor uses.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    connector_resource_id_attr: Optional[str] = Field(
        title="The name of an attribute in the stack component configuration "
        "that plays the role of resource ID when linked to a service "
        "connector.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    source: Optional[str] = Field(
        title="The path to the module which contains this Flavor.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    integration: Optional[str] = Field(
        title="The name of the integration that the Flavor belongs to.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    logo_url: Optional[str] = Field(
        title="Optionally, a url pointing to a png,"
        "svg or jpg can be attached.",
        default=None,
    )
    docs_url: Optional[str] = Field(
        title="Optionally, a url pointing to docs, within docs.zenml.io.",
        default=None,
    )
    sdk_docs_url: Optional[str] = Field(
        title="Optionally, a url pointing to SDK docs,"
        "within sdkdocs.zenml.io.",
        default=None,
    )
    is_custom: Optional[bool] = Field(
        title="Whether or not this flavor is a custom, user created flavor.",
        default=None,
    )

LoadedVisualization

Bases: BaseModel

Model for loaded visualizations.

Source code in src/zenml/models/v2/misc/loaded_visualization.py
23
24
25
26
27
class LoadedVisualization(BaseModel):
    """Model for loaded visualizations."""

    type: VisualizationType
    value: Union[str, bytes] = Field(union_mode="left_to_right")
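
Example (a minimal sketch): VisualizationType.MARKDOWN is an assumed illustrative enum member; the import assumes the usual re-exports from zenml.enums and zenml.models.

from zenml.enums import VisualizationType
from zenml.models import LoadedVisualization

viz = LoadedVisualization(
    type=VisualizationType.MARKDOWN,   # assumed illustrative member
    value="# Confusion matrix\n...",   # str or bytes, tried left to right
)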

LogsRequest

Bases: BaseRequest

Request model for logs.

Source code in src/zenml/models/v2/core/logs.py
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
59
60
61
class LogsRequest(BaseRequest):
    """Request model for logs."""

    uri: str = Field(title="The uri of the logs file")

    artifact_store_id: UUID = Field(
        title="The artifact store ID to associate the logs with.",
    )

    @field_validator("uri")
    @classmethod
    def text_field_max_length_check(cls, value: Any) -> Any:
        """Checks if the length of the value exceeds the maximum text length.

        Args:
            value: the value set in the field

        Returns:
            the value itself.

        Raises:
            AssertionError: if the length of the field is longer than the
                maximum threshold.
        """
        assert len(str(value)) < TEXT_FIELD_MAX_LENGTH, (
            "The length of the value for this field can not "
            f"exceed {TEXT_FIELD_MAX_LENGTH}"
        )
        return value

text_field_max_length_check(value) classmethod

Checks if the length of the value exceeds the maximum text length.

Parameters:

Name Type Description Default
value Any

the value set in the field

required

Returns:

Type Description
Any

the value itself.

Raises:

Type Description
AssertionError

if the length of the field is longer than the maximum threshold.

Source code in src/zenml/models/v2/core/logs.py
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
59
60
61
@field_validator("uri")
@classmethod
def text_field_max_length_check(cls, value: Any) -> Any:
    """Checks if the length of the value exceeds the maximum text length.

    Args:
        value: the value set in the field

    Returns:
        the value itself.

    Raises:
        AssertionError: if the length of the field is longer than the
            maximum threshold.
    """
    assert len(str(value)) < TEXT_FIELD_MAX_LENGTH, (
        "The length of the value for this field can not "
        f"exceed {TEXT_FIELD_MAX_LENGTH}"
    )
    return value
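
Example (a sketch with placeholder values): because the check runs as a Pydantic field validator, an over-long URI is rejected at construction time, with the failed assertion surfaced as a validation error. The constant's import location is an assumption.

from uuid import uuid4

from pydantic import ValidationError
from zenml.constants import TEXT_FIELD_MAX_LENGTH  # assumed import location
from zenml.models import LogsRequest

try:
    LogsRequest(
        uri="s3://bucket/logs/" + "x" * TEXT_FIELD_MAX_LENGTH,  # too long on purpose
        artifact_store_id=uuid4(),
    )
except ValidationError as err:
    print("Rejected:", err.errors()[0]["msg"])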

LogsResponse

Bases: BaseIdentifiedResponse[LogsResponseBody, LogsResponseMetadata, LogsResponseResources]

Response model for logs.

Source code in src/zenml/models/v2/core/logs.py
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
140
141
142
143
144
145
146
147
148
149
150
151
152
153
154
class LogsResponse(
    BaseIdentifiedResponse[
        LogsResponseBody, LogsResponseMetadata, LogsResponseResources
    ]
):
    """Response model for logs."""

    def get_hydrated_version(self) -> "LogsResponse":
        """Get the hydrated version of these logs.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_logs(self.id)

    # Body and metadata properties
    @property
    def uri(self) -> str:
        """The `uri` property.

        Returns:
            the value of the property.
        """
        return self.get_body().uri

    @property
    def step_run_id(self) -> Optional[UUID]:
        """The `step_run_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().step_run_id

    @property
    def pipeline_run_id(self) -> Optional[UUID]:
        """The `pipeline_run_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().pipeline_run_id

    @property
    def artifact_store_id(self) -> UUID:
        """The `artifact_store_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().artifact_store_id

artifact_store_id property

The artifact_store_id property.

Returns:

Type Description
UUID

the value of the property.

pipeline_run_id property

The pipeline_run_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

step_run_id property

The step_run_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

uri property

The uri property.

Returns:

Type Description
str

the value of the property.

get_hydrated_version()

Get the hydrated version of these logs.

Returns:

Type Description
LogsResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/logs.py
109
110
111
112
113
114
115
116
117
def get_hydrated_version(self) -> "LogsResponse":
    """Get the hydrated version of these logs.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_logs(self.id)

LogsResponseBody

Bases: BaseDatedResponseBody

Response body for logs.

Source code in src/zenml/models/v2/core/logs.py
71
72
73
74
75
76
77
class LogsResponseBody(BaseDatedResponseBody):
    """Response body for logs."""

    uri: str = Field(
        title="The uri of the logs file",
        max_length=TEXT_FIELD_MAX_LENGTH,
    )

LogsResponseMetadata

Bases: BaseResponseMetadata

Response metadata for logs.

Source code in src/zenml/models/v2/core/logs.py
80
81
82
83
84
85
86
87
88
89
90
91
92
93
94
95
class LogsResponseMetadata(BaseResponseMetadata):
    """Response metadata for logs."""

    step_run_id: Optional[UUID] = Field(
        title="Step ID to associate the logs with.",
        default=None,
        description="When this is set, pipeline_run_id should be set to None.",
    )
    pipeline_run_id: Optional[UUID] = Field(
        title="Pipeline run ID to associate the logs with.",
        default=None,
        description="When this is set, step_run_id should be set to None.",
    )
    artifact_store_id: UUID = Field(
        title="The artifact store ID to associate the logs with.",
    )

ModelFilter

Bases: ProjectScopedFilter, TaggableFilter

Model to enable advanced filtering of all models.

Source code in src/zenml/models/v2/core/model.py
337
338
339
340
341
342
343
344
345
346
347
348
349
350
351
352
353
354
355
356
357
358
359
360
361
362
363
364
365
366
367
368
369
370
371
372
373
374
375
376
377
378
379
380
381
382
383
384
385
386
387
388
389
390
391
392
393
394
395
396
397
398
399
400
401
402
403
404
405
406
407
408
409
410
411
412
413
414
415
416
417
418
419
420
421
422
class ModelFilter(ProjectScopedFilter, TaggableFilter):
    """Model to enable advanced filtering of all models."""

    name: Optional[str] = Field(
        default=None,
        description="Name of the Model",
    )

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.FILTER_EXCLUDE_FIELDS,
        *TaggableFilter.FILTER_EXCLUDE_FIELDS,
    ]
    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CUSTOM_SORTING_OPTIONS,
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
        SORT_BY_LATEST_VERSION_KEY,
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CLI_EXCLUDE_FIELDS,
        *TaggableFilter.CLI_EXCLUDE_FIELDS,
    ]

    def apply_sorting(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Apply sorting to the query for Models.

        Args:
            query: The query to which to apply the sorting.
            table: The query table.

        Returns:
            The query with sorting applied.
        """
        from sqlmodel import asc, case, col, desc, func, select

        from zenml.enums import SorterOps
        from zenml.zen_stores.schemas import (
            ModelSchema,
            ModelVersionSchema,
        )

        sort_by, operand = self.sorting_params

        if sort_by == SORT_BY_LATEST_VERSION_KEY:
            # Subquery to find the latest version per model
            latest_version_subquery = (
                select(
                    ModelSchema.id,
                    case(
                        (
                            func.max(ModelVersionSchema.created).is_(None),
                            ModelSchema.created,
                        ),
                        else_=func.max(ModelVersionSchema.created),
                    ).label("latest_version_created"),
                )
                .outerjoin(
                    ModelVersionSchema,
                    ModelSchema.id == ModelVersionSchema.model_id,  # type: ignore[arg-type]
                )
                .group_by(col(ModelSchema.id))
                .subquery()
            )

            query = query.add_columns(
                latest_version_subquery.c.latest_version_created,
            ).where(ModelSchema.id == latest_version_subquery.c.id)

            # Apply sorting based on the operand
            if operand == SorterOps.ASCENDING:
                query = query.order_by(
                    asc(latest_version_subquery.c.latest_version_created),
                    asc(ModelSchema.id),
                )
            else:
                query = query.order_by(
                    desc(latest_version_subquery.c.latest_version_created),
                    desc(ModelSchema.id),
                )
            return query

        # For other sorting cases, delegate to the parent class
        return super().apply_sorting(query=query, table=table)

apply_sorting(query, table)

Apply sorting to the query for Models.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the sorting.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with sorting applied.

Source code in src/zenml/models/v2/core/model.py
359
360
361
362
363
364
365
366
367
368
369
370
371
372
373
374
375
376
377
378
379
380
381
382
383
384
385
386
387
388
389
390
391
392
393
394
395
396
397
398
399
400
401
402
403
404
405
406
407
408
409
410
411
412
413
414
415
416
417
418
419
420
421
422
def apply_sorting(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Apply sorting to the query for Models.

    Args:
        query: The query to which to apply the sorting.
        table: The query table.

    Returns:
        The query with sorting applied.
    """
    from sqlmodel import asc, case, col, desc, func, select

    from zenml.enums import SorterOps
    from zenml.zen_stores.schemas import (
        ModelSchema,
        ModelVersionSchema,
    )

    sort_by, operand = self.sorting_params

    if sort_by == SORT_BY_LATEST_VERSION_KEY:
        # Subquery to find the latest version per model
        latest_version_subquery = (
            select(
                ModelSchema.id,
                case(
                    (
                        func.max(ModelVersionSchema.created).is_(None),
                        ModelSchema.created,
                    ),
                    else_=func.max(ModelVersionSchema.created),
                ).label("latest_version_created"),
            )
            .outerjoin(
                ModelVersionSchema,
                ModelSchema.id == ModelVersionSchema.model_id,  # type: ignore[arg-type]
            )
            .group_by(col(ModelSchema.id))
            .subquery()
        )

        query = query.add_columns(
            latest_version_subquery.c.latest_version_created,
        ).where(ModelSchema.id == latest_version_subquery.c.id)

        # Apply sorting based on the operand
        if operand == SorterOps.ASCENDING:
            query = query.order_by(
                asc(latest_version_subquery.c.latest_version_created),
                asc(ModelSchema.id),
            )
        else:
            query = query.order_by(
                desc(latest_version_subquery.c.latest_version_created),
                desc(ModelSchema.id),
            )
        return query

    # For other sorting cases, delegate to the parent class
    return super().apply_sorting(query=query, table=table)
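
Example (a sketch): from the caller's side the latest-version key is just another sort field. The constant's import location and the "desc:" / "contains:" prefix syntax follow ZenML's generic filter conventions and are stated here as assumptions.

from zenml.constants import SORT_BY_LATEST_VERSION_KEY  # assumed import location
from zenml.models import ModelFilter

model_filter = ModelFilter(
    name="contains:churn",                          # assumed operator-prefix syntax
    sort_by=f"desc:{SORT_BY_LATEST_VERSION_KEY}",   # most recently versioned models first
)
# Passing this filter to the model listing call triggers the subquery-based
# sorting implemented in apply_sorting above.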

ModelRequest

Bases: ProjectScopedRequest

Request model for models.

Source code in src/zenml/models/v2/core/model.py
 58
 59
 60
 61
 62
 63
 64
 65
 66
 67
 68
 69
 70
 71
 72
 73
 74
 75
 76
 77
 78
 79
 80
 81
 82
 83
 84
 85
 86
 87
 88
 89
 90
 91
 92
 93
 94
 95
 96
 97
 98
 99
100
101
102
103
104
105
106
107
class ModelRequest(ProjectScopedRequest):
    """Request model for models."""

    name: str = Field(
        title="The name of the model",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    license: Optional[str] = Field(
        title="The license model created under",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    description: Optional[str] = Field(
        title="The description of the model",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    audience: Optional[str] = Field(
        title="The target audience of the model",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    use_cases: Optional[str] = Field(
        title="The use cases of the model",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    limitations: Optional[str] = Field(
        title="The know limitations of the model",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    trade_offs: Optional[str] = Field(
        title="The trade offs of the model",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    ethics: Optional[str] = Field(
        title="The ethical implications of the model",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    tags: Optional[List[str]] = Field(
        title="Tags associated with the model",
        default=None,
    )
    save_models_to_registry: bool = Field(
        title="Whether to save all ModelArtifacts to Model Registry",
        default=True,
    )
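
Example (a minimal construction sketch): the descriptive fields are optional and the values below are illustrative, with any ProjectScopedRequest base fields assumed to be supplied separately.

from zenml.models import ModelRequest

model_request = ModelRequest(
    name="churn-classifier",
    license="Apache-2.0",
    description="Predicts customer churn from usage data.",
    tags=["classification", "tabular"],
    save_models_to_registry=True,
    # ...plus any ProjectScopedRequest base fields your server requires.
)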

ModelResponse

Bases: ProjectScopedResponse[ModelResponseBody, ModelResponseMetadata, ModelResponseResources]

Response model for models.

Source code in src/zenml/models/v2/core/model.py
190
191
192
193
194
195
196
197
198
199
200
201
202
203
204
205
206
207
208
209
210
211
212
213
214
215
216
217
218
219
220
221
222
223
224
225
226
227
228
229
230
231
232
233
234
235
236
237
238
239
240
241
242
243
244
245
246
247
248
249
250
251
252
253
254
255
256
257
258
259
260
261
262
263
264
265
266
267
268
269
270
271
272
273
274
275
276
277
278
279
280
281
282
283
284
285
286
287
288
289
290
291
292
293
294
295
296
297
298
299
300
301
302
303
304
305
306
307
308
309
310
311
312
313
314
315
316
317
318
319
320
321
322
323
324
325
326
327
328
329
330
331
class ModelResponse(
    ProjectScopedResponse[
        ModelResponseBody, ModelResponseMetadata, ModelResponseResources
    ]
):
    """Response model for models."""

    name: str = Field(
        title="The name of the model",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "ModelResponse":
        """Get the hydrated version of this model.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_model(self.id)

    # Body and metadata properties
    @property
    def tags(self) -> List["TagResponse"]:
        """The `tags` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().tags

    @property
    def latest_version_name(self) -> Optional[str]:
        """The `latest_version_name` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().latest_version_name

    @property
    def latest_version_id(self) -> Optional[UUID]:
        """The `latest_version_id` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().latest_version_id

    @property
    def license(self) -> Optional[str]:
        """The `license` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().license

    @property
    def description(self) -> Optional[str]:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

    @property
    def audience(self) -> Optional[str]:
        """The `audience` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().audience

    @property
    def use_cases(self) -> Optional[str]:
        """The `use_cases` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().use_cases

    @property
    def limitations(self) -> Optional[str]:
        """The `limitations` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().limitations

    @property
    def trade_offs(self) -> Optional[str]:
        """The `trade_offs` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().trade_offs

    @property
    def ethics(self) -> Optional[str]:
        """The `ethics` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().ethics

    @property
    def save_models_to_registry(self) -> bool:
        """The `save_models_to_registry` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().save_models_to_registry

    # Helper functions
    @property
    def versions(self) -> List["Model"]:
        """List all versions of the model.

        Returns:
            The list of all model versions.
        """
        from zenml.client import Client

        client = Client()
        model_versions = depaginate(
            client.list_model_versions,
            model_name_or_id=self.id,
            project=self.project_id,
        )
        return [
            mv.to_model_class(suppress_class_validation_warnings=True)
            for mv in model_versions
        ]

audience property

The audience property.

Returns:

Type Description
Optional[str]

the value of the property.

description property

The description property.

Returns:

Type Description
Optional[str]

the value of the property.

ethics property

The ethics property.

Returns:

Type Description
Optional[str]

the value of the property.

latest_version_id property

The latest_version_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

latest_version_name property

The latest_version_name property.

Returns:

Type Description
Optional[str]

the value of the property.

license property

The license property.

Returns:

Type Description
Optional[str]

the value of the property.

limitations property

The limitations property.

Returns:

Type Description
Optional[str]

the value of the property.

save_models_to_registry property

The save_models_to_registry property.

Returns:

Type Description
bool

the value of the property.

tags property

The tags property.

Returns:

Type Description
List[TagResponse]

the value of the property.

trade_offs property

The trade_offs property.

Returns:

Type Description
Optional[str]

the value of the property.

use_cases property

The use_cases property.

Returns:

Type Description
Optional[str]

the value of the property.

versions property

List all versions of the model.

Returns:

Type Description
List[Model]

The list of all model versions.

get_hydrated_version()

Get the hydrated version of this model.

Returns:

Type Description
ModelResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/model.py
202
203
204
205
206
207
208
209
210
def get_hydrated_version(self) -> "ModelResponse":
    """Get the hydrated version of this model.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_model(self.id)
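
Example (a sketch combining the resource and helper properties): the model ID is a placeholder. Note that the versions helper depaginates over the client, so it may issue several requests for models with many versions.

from uuid import UUID
from zenml.client import Client

model = Client().zen_store.get_model(
    UUID("00000000-0000-0000-0000-000000000000")  # placeholder ID
)
print("Latest version:", model.latest_version_name)
print("Tags:", [tag.name for tag in model.tags])
for version in model.versions:   # one zenml Model object per version
    print(version.version)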

ModelResponseBody

Bases: ProjectScopedResponseBody

Response body for models.

Source code in src/zenml/models/v2/core/model.py
132
133
class ModelResponseBody(ProjectScopedResponseBody):
    """Response body for models."""

ModelResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for models.

Source code in src/zenml/models/v2/core/model.py
136
137
138
139
140
141
142
143
144
145
146
147
148
149
150
151
152
153
154
155
156
157
158
159
160
161
162
163
164
165
166
167
168
169
170
171
172
173
174
175
176
177
class ModelResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for models."""

    license: Optional[str] = Field(
        title="The license model created under",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    description: Optional[str] = Field(
        title="The description of the model",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    audience: Optional[str] = Field(
        title="The target audience of the model",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    use_cases: Optional[str] = Field(
        title="The use cases of the model",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    limitations: Optional[str] = Field(
        title="The know limitations of the model",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    trade_offs: Optional[str] = Field(
        title="The trade offs of the model",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    ethics: Optional[str] = Field(
        title="The ethical implications of the model",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    save_models_to_registry: bool = Field(
        title="Whether to save all ModelArtifacts to Model Registry",
        default=True,
    )

ModelResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the model entity.

Source code in src/zenml/models/v2/core/model.py
180
181
182
183
184
185
186
187
class ModelResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the model entity."""

    tags: List["TagResponse"] = Field(
        title="Tags associated with the model",
    )
    latest_version_name: Optional[str] = None
    latest_version_id: Optional[UUID] = None

ModelUpdate

Bases: BaseUpdate

Update model for models.

Source code in src/zenml/models/v2/core/model.py
113
114
115
116
117
118
119
120
121
122
123
124
125
126
class ModelUpdate(BaseUpdate):
    """Update model for models."""

    name: Optional[str] = None
    license: Optional[str] = None
    description: Optional[str] = None
    audience: Optional[str] = None
    use_cases: Optional[str] = None
    limitations: Optional[str] = None
    trade_offs: Optional[str] = None
    ethics: Optional[str] = None
    add_tags: Optional[List[str]] = None
    remove_tags: Optional[List[str]] = None
    save_models_to_registry: Optional[bool] = None
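
Example (a minimal sketch): only the fields that are set are applied, so a typical update touches just a couple of attributes. The tag names below are illustrative.

from zenml.models import ModelUpdate

model_update = ModelUpdate(
    description="Retrained on the 2024 dataset.",
    add_tags=["production-candidate"],   # illustrative tag names
    remove_tags=["experimental"],
)
# Unset fields stay None and leave the stored model untouched.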

ModelVersionArtifactFilter

Bases: BaseFilter

Filter model for links between model versions and artifacts.

Source code in src/zenml/models/v2/core/model_version_artifact.py
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
140
141
142
143
144
145
146
147
148
149
150
151
152
153
154
155
156
157
158
159
160
161
162
163
164
165
166
167
168
169
170
171
172
173
174
175
176
177
178
179
180
181
182
183
184
185
186
187
188
189
190
191
192
193
194
195
196
197
198
199
200
201
202
203
204
205
206
207
208
209
210
211
212
213
214
215
216
217
218
219
220
221
222
223
224
225
226
227
228
229
230
231
232
233
234
235
236
237
238
239
240
241
242
243
244
245
246
247
248
249
250
251
252
253
254
255
class ModelVersionArtifactFilter(BaseFilter):
    """Model version pipeline run links filter model."""

    # Artifact name and type are not DB fields and need to be handled separately
    FILTER_EXCLUDE_FIELDS = [
        *BaseFilter.FILTER_EXCLUDE_FIELDS,
        "artifact_name",
        "only_data_artifacts",
        "only_model_artifacts",
        "only_deployment_artifacts",
        "has_custom_name",
        "user",
    ]
    CLI_EXCLUDE_FIELDS = [
        *BaseFilter.CLI_EXCLUDE_FIELDS,
        "only_data_artifacts",
        "only_model_artifacts",
        "only_deployment_artifacts",
        "has_custom_name",
        "model_version_id",
        "updated",
        "id",
    ]

    model_version_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Filter by model version ID",
        union_mode="left_to_right",
    )
    artifact_version_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Filter by artifact ID",
        union_mode="left_to_right",
    )
    artifact_name: Optional[str] = Field(
        default=None,
        description="Name of the artifact",
    )
    only_data_artifacts: Optional[bool] = False
    only_model_artifacts: Optional[bool] = False
    only_deployment_artifacts: Optional[bool] = False
    has_custom_name: Optional[bool] = None
    user: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the user that created the artifact.",
    )

    # TODO: In Pydantic v2, the `model_` is a protected namespaces for all
    #  fields defined under base models. If not handled, this raises a warning.
    #  It is possible to suppress this warning message with the following
    #  configuration, however the ultimate solution is to rename these fields.
    #  Even though they do not cause any problems right now, if we are not
    #  careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List[Union["ColumnElement[bool]"]]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_, col

        from zenml.zen_stores.schemas import (
            ArtifactSchema,
            ArtifactVersionSchema,
            ModelVersionArtifactSchema,
            UserSchema,
        )

        if self.artifact_name:
            value, filter_operator = self._resolve_operator(self.artifact_name)
            filter_ = StrFilter(
                operation=GenericFilterOps(filter_operator),
                column="name",
                value=value,
            )
            artifact_name_filter = and_(
                ModelVersionArtifactSchema.artifact_version_id
                == ArtifactVersionSchema.id,
                ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
                filter_.generate_query_conditions(ArtifactSchema),
            )
            custom_filters.append(artifact_name_filter)

        if self.only_data_artifacts:
            data_artifact_filter = and_(
                ModelVersionArtifactSchema.artifact_version_id
                == ArtifactVersionSchema.id,
                col(ArtifactVersionSchema.type).not_in(
                    ["ServiceArtifact", "ModelArtifact"]
                ),
            )
            custom_filters.append(data_artifact_filter)

        if self.only_model_artifacts:
            model_artifact_filter = and_(
                ModelVersionArtifactSchema.artifact_version_id
                == ArtifactVersionSchema.id,
                ArtifactVersionSchema.type == "ModelArtifact",
            )
            custom_filters.append(model_artifact_filter)

        if self.only_deployment_artifacts:
            deployment_artifact_filter = and_(
                ModelVersionArtifactSchema.artifact_version_id
                == ArtifactVersionSchema.id,
                ArtifactVersionSchema.type == "ServiceArtifact",
            )
            custom_filters.append(deployment_artifact_filter)

        if self.has_custom_name is not None:
            custom_name_filter = and_(
                ModelVersionArtifactSchema.artifact_version_id
                == ArtifactVersionSchema.id,
                ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
                ArtifactSchema.has_custom_name == self.has_custom_name,
            )
            custom_filters.append(custom_name_filter)

        if self.user:
            user_filter = and_(
                ModelVersionArtifactSchema.artifact_version_id
                == ArtifactVersionSchema.id,
                ArtifactVersionSchema.user_id == UserSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.user,
                    table=UserSchema,
                    additional_columns=["full_name"],
                ),
            )
            custom_filters.append(user_filter)

        return custom_filters

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[Union[ColumnElement[bool]]]

A list of custom filters.

Source code in src/zenml/models/v2/core/model_version_artifact.py
170
171
172
173
174
175
176
177
178
179
180
181
182
183
184
185
186
187
188
189
190
191
192
193
194
195
196
197
198
199
200
201
202
203
204
205
206
207
208
209
210
211
212
213
214
215
216
217
218
219
220
221
222
223
224
225
226
227
228
229
230
231
232
233
234
235
236
237
238
239
240
241
242
243
244
245
246
247
248
249
250
251
252
253
254
255
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List[Union["ColumnElement[bool]"]]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_, col

    from zenml.zen_stores.schemas import (
        ArtifactSchema,
        ArtifactVersionSchema,
        ModelVersionArtifactSchema,
        UserSchema,
    )

    if self.artifact_name:
        value, filter_operator = self._resolve_operator(self.artifact_name)
        filter_ = StrFilter(
            operation=GenericFilterOps(filter_operator),
            column="name",
            value=value,
        )
        artifact_name_filter = and_(
            ModelVersionArtifactSchema.artifact_version_id
            == ArtifactVersionSchema.id,
            ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
            filter_.generate_query_conditions(ArtifactSchema),
        )
        custom_filters.append(artifact_name_filter)

    if self.only_data_artifacts:
        data_artifact_filter = and_(
            ModelVersionArtifactSchema.artifact_version_id
            == ArtifactVersionSchema.id,
            col(ArtifactVersionSchema.type).not_in(
                ["ServiceArtifact", "ModelArtifact"]
            ),
        )
        custom_filters.append(data_artifact_filter)

    if self.only_model_artifacts:
        model_artifact_filter = and_(
            ModelVersionArtifactSchema.artifact_version_id
            == ArtifactVersionSchema.id,
            ArtifactVersionSchema.type == "ModelArtifact",
        )
        custom_filters.append(model_artifact_filter)

    if self.only_deployment_artifacts:
        deployment_artifact_filter = and_(
            ModelVersionArtifactSchema.artifact_version_id
            == ArtifactVersionSchema.id,
            ArtifactVersionSchema.type == "ServiceArtifact",
        )
        custom_filters.append(deployment_artifact_filter)

    if self.has_custom_name is not None:
        custom_name_filter = and_(
            ModelVersionArtifactSchema.artifact_version_id
            == ArtifactVersionSchema.id,
            ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
            ArtifactSchema.has_custom_name == self.has_custom_name,
        )
        custom_filters.append(custom_name_filter)

    if self.user:
        user_filter = and_(
            ModelVersionArtifactSchema.artifact_version_id
            == ArtifactVersionSchema.id,
            ArtifactVersionSchema.user_id == UserSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.user,
                table=UserSchema,
                additional_columns=["full_name"],
            ),
        )
        custom_filters.append(user_filter)

    return custom_filters
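
Example (a construction sketch): the boolean flags and artifact_name are what callers usually set; each set field becomes one of the custom SQL conditions built above. The operator prefix on artifact_name follows ZenML's generic filter-string syntax (an assumption), and the UUID is a placeholder.

from uuid import UUID

from zenml.models import ModelVersionArtifactFilter

link_filter = ModelVersionArtifactFilter(
    model_version_id=UUID("00000000-0000-0000-0000-000000000000"),  # placeholder
    only_model_artifacts=True,                 # keep only ModelArtifact links
    artifact_name="startswith:trained_",       # assumed operator-prefix syntax
)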

ModelVersionArtifactRequest

Bases: BaseRequest

Request model for links between model versions and artifacts.

Source code in src/zenml/models/v2/core/model_version_artifact.py
43
44
45
46
47
48
49
50
51
52
53
54
55
class ModelVersionArtifactRequest(BaseRequest):
    """Request model for links between model versions and artifacts."""

    model_version: UUID
    artifact_version: UUID

    # TODO: In Pydantic v2, the `model_` is a protected namespaces for all
    #  fields defined under base models. If not handled, this raises a warning.
    #  It is possible to suppress this warning message with the following
    #  configuration, however the ultimate solution is to rename these fields.
    #  Even though they do not cause any problems right now, if we are not
    #  careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())
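
As a quick, hedged illustration (the UUIDs below are placeholders and the import path is assumed), a link request simply pairs the two IDs:

from uuid import UUID

from zenml.models import ModelVersionArtifactRequest  # assumed import path

link_request = ModelVersionArtifactRequest(
    model_version=UUID("11111111-1111-1111-1111-111111111111"),
    artifact_version=UUID("22222222-2222-2222-2222-222222222222"),
)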

ModelVersionArtifactResponse

Bases: BaseIdentifiedResponse[ModelVersionArtifactResponseBody, BaseResponseMetadata, ModelVersionArtifactResponseResources]

Response model for links between model versions and artifacts.

Source code in src/zenml/models/v2/core/model_version_artifact.py
class ModelVersionArtifactResponse(
    BaseIdentifiedResponse[
        ModelVersionArtifactResponseBody,
        BaseResponseMetadata,
        ModelVersionArtifactResponseResources,
    ]
):
    """Response model for links between model versions and artifacts."""

    @property
    def model_version(self) -> UUID:
        """The `model_version` property.

        Returns:
            the value of the property.
        """
        return self.get_body().model_version

    @property
    def artifact_version(self) -> "ArtifactVersionResponse":
        """The `artifact_version` property.

        Returns:
            the value of the property.
        """
        return self.get_body().artifact_version

artifact_version property

The artifact_version property.

Returns:

Type Description
ArtifactVersionResponse

the value of the property.

model_version property

The model_version property.

Returns:

Type Description
UUID

the value of the property.

ModelVersionArtifactResponseBody

Bases: BaseDatedResponseBody

Response body for links between model versions and artifacts.

Source code in src/zenml/models/v2/core/model_version_artifact.py
class ModelVersionArtifactResponseBody(BaseDatedResponseBody):
    """Response body for links between model versions and artifacts."""

    model_version: UUID
    artifact_version: "ArtifactVersionResponse"

    # TODO: In Pydantic v2, the `model_` is a protected namespaces for all
    #  fields defined under base models. If not handled, this raises a warning.
    #  It is possible to suppress this warning message with the following
    #  configuration, however the ultimate solution is to rename these fields.
    #  Even though they do not cause any problems right now, if we are not
    #  careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

ModelVersionFilter

Bases: ProjectScopedFilter, TaggableFilter, RunMetadataFilterMixin

Filter model for model versions.

Source code in src/zenml/models/v2/core/model_version.py
class ModelVersionFilter(
    ProjectScopedFilter, TaggableFilter, RunMetadataFilterMixin
):
    """Filter model for model versions."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.FILTER_EXCLUDE_FIELDS,
        *TaggableFilter.FILTER_EXCLUDE_FIELDS,
        *RunMetadataFilterMixin.FILTER_EXCLUDE_FIELDS,
        "model",
    ]
    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CUSTOM_SORTING_OPTIONS,
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
        *RunMetadataFilterMixin.CUSTOM_SORTING_OPTIONS,
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CLI_EXCLUDE_FIELDS,
        *TaggableFilter.CLI_EXCLUDE_FIELDS,
        *RunMetadataFilterMixin.CLI_EXCLUDE_FIELDS,
    ]
    API_MULTI_INPUT_PARAMS: ClassVar[List[str]] = [
        *ProjectScopedFilter.API_MULTI_INPUT_PARAMS,
        *TaggableFilter.API_MULTI_INPUT_PARAMS,
        *RunMetadataFilterMixin.API_MULTI_INPUT_PARAMS,
    ]

    name: Optional[str] = Field(
        default=None,
        description="The name of the Model Version",
    )
    number: Optional[int] = Field(
        default=None,
        description="The number of the Model Version",
    )
    stage: Optional[Union[str, ModelStages]] = Field(
        description="The model version stage",
        default=None,
        union_mode="left_to_right",
    )
    model: Optional[Union[str, UUID]] = Field(
        default=None,
        description="The name or ID of the model which the search is scoped "
        "to. This field must always be set and is always applied in addition "
        "to the other filters, regardless of the value of the "
        "logical_operator field.",
        union_mode="left_to_right",
    )

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List[Union["ColumnElement[bool]"]]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        from sqlalchemy import and_

        from zenml.zen_stores.schemas import (
            ModelSchema,
            ModelVersionSchema,
        )

        custom_filters = super().get_custom_filters(table)

        if self.model:
            model_filter = and_(
                ModelVersionSchema.model_id == ModelSchema.id,  # type: ignore[arg-type]
                self.generate_name_or_id_query_conditions(
                    value=self.model, table=ModelSchema
                ),
            )
            custom_filters.append(model_filter)

        return custom_filters
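
A small sketch of how this filter is typically populated; the string values use the generic "operator:value" syntax seen in the other filter models, and the import path plus the store call are assumptions, not something documented on this page.

from zenml.client import Client
from zenml.models import ModelVersionFilter  # assumed import path

# All production versions of a model whose version name contains "v2".
mv_filter = ModelVersionFilter(
    name="contains:v2",
    stage="production",
    model="my_classifier",  # name or ID; always applied on top of the other filters
)

# Assumed usage: the exact listing method and argument name may differ.
versions = Client().zen_store.list_model_versions(
    model_version_filter_model=mv_filter
)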

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[Union[ColumnElement[bool]]]

A list of custom filters.

Source code in src/zenml/models/v2/core/model_version.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List[Union["ColumnElement[bool]"]]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    from sqlalchemy import and_

    from zenml.zen_stores.schemas import (
        ModelSchema,
        ModelVersionSchema,
    )

    custom_filters = super().get_custom_filters(table)

    if self.model:
        model_filter = and_(
            ModelVersionSchema.model_id == ModelSchema.id,  # type: ignore[arg-type]
            self.generate_name_or_id_query_conditions(
                value=self.model, table=ModelSchema
            ),
        )
        custom_filters.append(model_filter)

    return custom_filters

ModelVersionPipelineRunFilter

Bases: BaseFilter

Model version pipeline run links filter model.

Source code in src/zenml/models/v2/core/model_version_pipeline_run.py
class ModelVersionPipelineRunFilter(BaseFilter):
    """Model version pipeline run links filter model."""

    FILTER_EXCLUDE_FIELDS = [
        *BaseFilter.FILTER_EXCLUDE_FIELDS,
        "pipeline_run_name",
        "user",
    ]
    CLI_EXCLUDE_FIELDS = [
        *BaseFilter.CLI_EXCLUDE_FIELDS,
        "model_version_id",
        "updated",
        "id",
    ]

    model_version_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Filter by model version ID",
        union_mode="left_to_right",
    )
    pipeline_run_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Filter by pipeline run ID",
        union_mode="left_to_right",
    )
    pipeline_run_name: Optional[str] = Field(
        default=None,
        description="Name of the pipeline run",
    )
    user: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the user that created the pipeline run.",
    )

    # TODO: In Pydantic v2, the `model_` is a protected namespaces for all
    #  fields defined under base models. If not handled, this raises a warning.
    #  It is possible to suppress this warning message with the following
    #  configuration, however the ultimate solution is to rename these fields.
    #  Even though they do not cause any problems right now, if we are not
    #  careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_

        from zenml.zen_stores.schemas import (
            ModelVersionPipelineRunSchema,
            PipelineRunSchema,
            UserSchema,
        )

        if self.pipeline_run_name:
            value, filter_operator = self._resolve_operator(
                self.pipeline_run_name
            )
            filter_ = StrFilter(
                operation=GenericFilterOps(filter_operator),
                column="name",
                value=value,
            )
            pipeline_run_name_filter = and_(
                ModelVersionPipelineRunSchema.pipeline_run_id
                == PipelineRunSchema.id,
                filter_.generate_query_conditions(PipelineRunSchema),
            )
            custom_filters.append(pipeline_run_name_filter)

        if self.user:
            user_filter = and_(
                ModelVersionPipelineRunSchema.pipeline_run_id
                == PipelineRunSchema.id,
                PipelineRunSchema.user_id == UserSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.user,
                    table=UserSchema,
                    additional_columns=["full_name"],
                ),
            )
            custom_filters.append(user_filter)

        return custom_filters
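
A hedged example of combining these fields (placeholders throughout, import path assumed); the "startswith:" prefix follows the generic filter operator syntax resolved by _resolve_operator above.

from zenml.models import ModelVersionPipelineRunFilter  # assumed import path

run_link_filter = ModelVersionPipelineRunFilter(
    pipeline_run_name="startswith:training_",  # resolved into a StrFilter on the run name
    user="alice",                              # matched against user name, ID, or full name
)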

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/core/model_version_pipeline_run.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_

    from zenml.zen_stores.schemas import (
        ModelVersionPipelineRunSchema,
        PipelineRunSchema,
        UserSchema,
    )

    if self.pipeline_run_name:
        value, filter_operator = self._resolve_operator(
            self.pipeline_run_name
        )
        filter_ = StrFilter(
            operation=GenericFilterOps(filter_operator),
            column="name",
            value=value,
        )
        pipeline_run_name_filter = and_(
            ModelVersionPipelineRunSchema.pipeline_run_id
            == PipelineRunSchema.id,
            filter_.generate_query_conditions(PipelineRunSchema),
        )
        custom_filters.append(pipeline_run_name_filter)

    if self.user:
        user_filter = and_(
            ModelVersionPipelineRunSchema.pipeline_run_id
            == PipelineRunSchema.id,
            PipelineRunSchema.user_id == UserSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.user,
                table=UserSchema,
                additional_columns=["full_name"],
            ),
        )
        custom_filters.append(user_filter)

    return custom_filters

ModelVersionPipelineRunRequest

Bases: BaseRequest

Request model for links between model versions and pipeline runs.

Source code in src/zenml/models/v2/core/model_version_pipeline_run.py
class ModelVersionPipelineRunRequest(BaseRequest):
    """Request model for links between model versions and pipeline runs."""

    model_version: UUID
    pipeline_run: UUID

    # TODO: In Pydantic v2, the `model_` is a protected namespaces for all
    #  fields defined under base models. If not handled, this raises a warning.
    #  It is possible to suppress this warning message with the following
    #  configuration, however the ultimate solution is to rename these fields.
    #  Even though they do not cause any problems right now, if we are not
    #  careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

ModelVersionPipelineRunResponse

Bases: BaseIdentifiedResponse[ModelVersionPipelineRunResponseBody, BaseResponseMetadata, ModelVersionPipelineRunResponseResources]

Response model for links between model versions and pipeline runs.

Source code in src/zenml/models/v2/core/model_version_pipeline_run.py
class ModelVersionPipelineRunResponse(
    BaseIdentifiedResponse[
        ModelVersionPipelineRunResponseBody,
        BaseResponseMetadata,
        ModelVersionPipelineRunResponseResources,
    ]
):
    """Response model for links between model versions and pipeline runs."""

    @property
    def model_version(self) -> UUID:
        """The `model_version` property.

        Returns:
            the value of the property.
        """
        return self.get_body().model_version

    @property
    def pipeline_run(self) -> "PipelineRunResponse":
        """The `pipeline_run` property.

        Returns:
            the value of the property.
        """
        return self.get_body().pipeline_run

model_version property

The model_version property.

Returns:

Type Description
UUID

the value of the property.

pipeline_run property

The pipeline_run property.

Returns:

Type Description
PipelineRunResponse

the value of the property.

ModelVersionPipelineRunResponseBody

Bases: BaseDatedResponseBody

Response body for links between model versions and pipeline runs.

Source code in src/zenml/models/v2/core/model_version_pipeline_run.py
class ModelVersionPipelineRunResponseBody(BaseDatedResponseBody):
    """Response body for links between model versions and pipeline runs."""

    model_version: UUID
    pipeline_run: PipelineRunResponse

    # TODO: In Pydantic v2, the `model_` is a protected namespaces for all
    #  fields defined under base models. If not handled, this raises a warning.
    #  It is possible to suppress this warning message with the following
    #  configuration, however the ultimate solution is to rename these fields.
    #  Even though they do not cause any problems right now, if we are not
    #  careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

ModelVersionRequest

Bases: ProjectScopedRequest

Request model for model versions.

Source code in src/zenml/models/v2/core/model_version.py
class ModelVersionRequest(ProjectScopedRequest):
    """Request model for model versions."""

    name: Optional[str] = Field(
        description="The name of the model version",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    description: Optional[str] = Field(
        description="The description of the model version",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    stage: Optional[str] = Field(
        description="The stage of the model version",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )

    model: UUID = Field(
        description="The ID of the model containing version",
    )
    tags: Optional[List[str]] = Field(
        title="Tags associated with the model version",
        default=None,
    )
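
A minimal sketch of constructing such a request. All IDs are placeholders; the import path and the project field (assumed to come from the project-scoped base class) are assumptions.

from uuid import UUID

from zenml.models import ModelVersionRequest  # assumed import path

mv_request = ModelVersionRequest(
    model=UUID("11111111-1111-1111-1111-111111111111"),
    name="2024-06-01-rc1",
    description="Release candidate trained on the June data snapshot.",
    tags=["candidate", "tabular"],
    project=UUID("22222222-2222-2222-2222-222222222222"),  # assumed to be required by ProjectScopedRequest
)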

ModelVersionResponse

Bases: ProjectScopedResponse[ModelVersionResponseBody, ModelVersionResponseMetadata, ModelVersionResponseResources]

Response model for model versions.

Source code in src/zenml/models/v2/core/model_version.py
class ModelVersionResponse(
    ProjectScopedResponse[
        ModelVersionResponseBody,
        ModelVersionResponseMetadata,
        ModelVersionResponseResources,
    ]
):
    """Response model for model versions."""

    name: Optional[str] = Field(
        description="The name of the model version",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )

    @property
    def stage(self) -> Optional[str]:
        """The `stage` property.

        Returns:
            the value of the property.
        """
        return self.get_body().stage

    @property
    def number(self) -> int:
        """The `number` property.

        Returns:
            the value of the property.
        """
        return self.get_body().number

    @property
    def model(self) -> "ModelResponse":
        """The `model` property.

        Returns:
            the value of the property.
        """
        return self.get_body().model

    @property
    def tags(self) -> List[TagResponse]:
        """The `tags` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().tags

    @property
    def description(self) -> Optional[str]:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

    @property
    def run_metadata(self) -> Dict[str, MetadataType]:
        """The `run_metadata` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().run_metadata

    def get_hydrated_version(self) -> "ModelVersionResponse":
        """Get the hydrated version of this model version.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_model_version(self.id)

    # Helper functions
    def to_model_class(
        self,
        suppress_class_validation_warnings: bool = True,
    ) -> "Model":
        """Convert response model to Model object.

        Args:
            suppress_class_validation_warnings: internally used to suppress
                repeated warnings.

        Returns:
            Model object
        """
        from zenml.model.model import Model

        mv = Model(
            name=self.model.name,
            license=self.model.license,
            description=self.description,
            audience=self.model.audience,
            use_cases=self.model.use_cases,
            limitations=self.model.limitations,
            trade_offs=self.model.trade_offs,
            ethics=self.model.ethics,
            tags=[t.name for t in self.tags],
            version=self.name,
            suppress_class_validation_warnings=suppress_class_validation_warnings,
            model_version_id=self.id,
        )

        return mv

    @property
    def model_artifacts(
        self,
    ) -> Dict[str, Dict[str, "ArtifactVersionResponse"]]:
        """Get all model artifacts linked to this model version.

        Returns:
            Dictionary of model artifacts with versions as
            Dict[str, Dict[str, ArtifactResponse]]
        """
        logger.warning(
            "ModelVersionResponse.model_artifacts is deprecated and will be "
            "removed in a future release."
        )
        from zenml.client import Client

        artifact_versions = pagination_utils.depaginate(
            Client().list_artifact_versions,
            model_version_id=self.id,
            type=ArtifactType.MODEL,
            project=self.project_id,
        )

        result: Dict[str, Dict[str, "ArtifactVersionResponse"]] = {}
        for artifact_version in artifact_versions:
            result.setdefault(artifact_version.name, {})
            result[artifact_version.name][artifact_version.version] = (
                artifact_version
            )

        return result

    @property
    def data_artifact_ids(self) -> Dict[str, Dict[str, UUID]]:
        """Data artifacts linked to this model version.

        Returns:
            Data artifacts linked to this model version.
        """
        logger.warning(
            "ModelVersionResponse.data_artifact_ids is deprecated and will "
            "be removed in a future release."
        )

        return {
            artifact_name: {
                version_name: version_response.id
                for version_name, version_response in artifact_versions.items()
            }
            for artifact_name, artifact_versions in self.data_artifacts.items()
        }

    @property
    def model_artifact_ids(self) -> Dict[str, Dict[str, UUID]]:
        """Model artifacts linked to this model version.

        Returns:
            Model artifacts linked to this model version.
        """
        logger.warning(
            "ModelVersionResponse.model_artifact_ids is deprecated and will "
            "be removed in a future release."
        )

        return {
            artifact_name: {
                version_name: version_response.id
                for version_name, version_response in artifact_versions.items()
            }
            for artifact_name, artifact_versions in self.model_artifacts.items()
        }

    @property
    def deployment_artifact_ids(self) -> Dict[str, Dict[str, UUID]]:
        """Deployment artifacts linked to this model version.

        Returns:
            Deployment artifacts linked to this model version.
        """
        logger.warning(
            "ModelVersionResponse.deployment_artifact_ids is deprecated and "
            "will be removed in a future release."
        )

        return {
            artifact_name: {
                version_name: version_response.id
                for version_name, version_response in artifact_versions.items()
            }
            for artifact_name, artifact_versions in self.deployment_artifacts.items()
        }

    @property
    def data_artifacts(
        self,
    ) -> Dict[str, Dict[str, "ArtifactVersionResponse"]]:
        """Get all data artifacts linked to this model version.

        Returns:
            Dictionary of data artifacts with versions as
            Dict[str, Dict[str, ArtifactResponse]]
        """
        logger.warning(
            "ModelVersionResponse.data_artifacts is deprecated and will be "
            "removed in a future release."
        )

        from zenml.client import Client

        artifact_versions = pagination_utils.depaginate(
            Client().list_artifact_versions,
            model_version_id=self.id,
            type=ArtifactType.DATA,
            project=self.project_id,
        )

        result: Dict[str, Dict[str, "ArtifactVersionResponse"]] = {}
        for artifact_version in artifact_versions:
            result.setdefault(artifact_version.name, {})
            result[artifact_version.name][artifact_version.version] = (
                artifact_version
            )

        return result

    @property
    def deployment_artifacts(
        self,
    ) -> Dict[str, Dict[str, "ArtifactVersionResponse"]]:
        """Get all deployment artifacts linked to this model version.

        Returns:
            Dictionary of deployment artifacts with versions as
            Dict[str, Dict[str, ArtifactResponse]]
        """
        logger.warning(
            "ModelVersionResponse.deployment_artifacts is deprecated and will "
            "be removed in a future release."
        )

        from zenml.client import Client

        artifact_versions = pagination_utils.depaginate(
            Client().list_artifact_versions,
            model_version_id=self.id,
            type=ArtifactType.SERVICE,
            project=self.project_id,
        )

        result: Dict[str, Dict[str, "ArtifactVersionResponse"]] = {}
        for artifact_version in artifact_versions:
            result.setdefault(artifact_version.name, {})
            result[artifact_version.name][artifact_version.version] = (
                artifact_version
            )

        return result

    @property
    def pipeline_run_ids(self) -> Dict[str, UUID]:
        """Pipeline runs linked to this model version.

        Returns:
            Pipeline runs linked to this model version.
        """
        logger.warning(
            "ModelVersionResponse.pipeline_run_ids is deprecated and will be "
            "removed in a future release."
        )

        from zenml.client import Client

        return {
            link.pipeline_run.name: link.pipeline_run.id
            for link in pagination_utils.depaginate(
                Client().list_model_version_pipeline_run_links,
                model_version_id=self.id,
            )
        }

    @property
    def pipeline_runs(self) -> Dict[str, "PipelineRunResponse"]:
        """Get all pipeline runs linked to this version.

        Returns:
            Dictionary of Pipeline Runs as PipelineRunResponseModel
        """
        logger.warning(
            "ModelVersionResponse.pipeline_runs is deprecated and will be "
            "removed in a future release."
        )

        from zenml.client import Client

        return {
            link.pipeline_run.name: link.pipeline_run
            for link in pagination_utils.depaginate(
                Client().list_model_version_pipeline_run_links,
                model_version_id=self.id,
            )
        }

    def _get_linked_object(
        self,
        name: str,
        version: Optional[str] = None,
        type: Optional[ArtifactType] = None,
    ) -> Optional["ArtifactVersionResponse"]:
        """Get the artifact linked to this model version given type.

        Args:
            name: The name of the artifact to retrieve.
            version: The version of the artifact to retrieve (None for
                latest/non-versioned)
            type: The type of the artifact to filter by.

        Returns:
            Specific version of an artifact from collection or None
        """
        from zenml.client import Client

        artifact_versions = Client().list_artifact_versions(
            sort_by="desc:created",
            size=1,
            artifact=name,
            version=version,
            model_version_id=self.id,
            type=type,
            project=self.project_id,
            hydrate=True,
        )

        if not artifact_versions.items:
            return None
        return artifact_versions.items[0]

    def get_artifact(
        self,
        name: str,
        version: Optional[str] = None,
    ) -> Optional["ArtifactVersionResponse"]:
        """Get the artifact linked to this model version.

        Args:
            name: The name of the artifact to retrieve.
            version: The version of the artifact to retrieve (None for
                latest/non-versioned)

        Returns:
            Specific version of an artifact or None
        """
        return self._get_linked_object(name, version)

    def get_model_artifact(
        self,
        name: str,
        version: Optional[str] = None,
    ) -> Optional["ArtifactVersionResponse"]:
        """Get the model artifact linked to this model version.

        Args:
            name: The name of the model artifact to retrieve.
            version: The version of the model artifact to retrieve (None for
                latest/non-versioned)

        Returns:
            Specific version of the model artifact or None
        """
        return self._get_linked_object(name, version, ArtifactType.MODEL)

    def get_data_artifact(
        self,
        name: str,
        version: Optional[str] = None,
    ) -> Optional["ArtifactVersionResponse"]:
        """Get the data artifact linked to this model version.

        Args:
            name: The name of the data artifact to retrieve.
            version: The version of the data artifact to retrieve (None for
                latest/non-versioned)

        Returns:
            Specific version of the data artifact or None
        """
        return self._get_linked_object(name, version, ArtifactType.DATA)

    def get_deployment_artifact(
        self,
        name: str,
        version: Optional[str] = None,
    ) -> Optional["ArtifactVersionResponse"]:
        """Get the deployment artifact linked to this model version.

        Args:
            name: The name of the deployment artifact to retrieve.
            version: The version of the deployment artifact to retrieve (None for
                latest/non-versioned)

        Returns:
            Specific version of the deployment artifact or None
        """
        return self._get_linked_object(name, version, ArtifactType.SERVICE)

    def set_stage(
        self, stage: Union[str, ModelStages], force: bool = False
    ) -> None:
        """Sets this Model Version to a desired stage.

        Args:
            stage: the target stage for model version.
            force: whether to force archiving of current model version in
                target stage or raise.

        Raises:
            ValueError: if model_stage is not valid.
        """
        from zenml.client import Client

        stage = getattr(stage, "value", stage)
        if stage not in [stage.value for stage in ModelStages]:
            raise ValueError(f"`{stage}` is not a valid model stage.")

        Client().update_model_version(
            model_name_or_id=self.model.id,
            version_name_or_id=self.id,
            stage=stage,
            force=force,
        )
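
The helper methods above are easiest to see end to end. The sketch below assumes a Client().get_model_version(...) accessor and illustrative model, version, and artifact names; none of these are documented on this page.

from zenml.client import Client
from zenml.enums import ModelStages

# Assumed accessor; the exact Client method and signature may differ.
mv = Client().get_model_version("my_classifier", "2024-06-01-rc1")

trained_model = mv.get_model_artifact("trained_model")        # latest model artifact, or None
test_set = mv.get_data_artifact("test_dataset", version="3")  # a specific data artifact version

mv.set_stage(ModelStages.PRODUCTION, force=True)  # archives any existing production version

model_obj = mv.to_model_class()  # lightweight Model object usable in pipeline code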

data_artifact_ids property

Data artifacts linked to this model version.

Returns:

Type Description
Dict[str, Dict[str, UUID]]

Data artifacts linked to this model version.

data_artifacts property

Get all data artifacts linked to this model version.

Returns:

Type Description
Dict[str, Dict[str, ArtifactVersionResponse]]

Dictionary of data artifacts with versions as Dict[str, Dict[str, ArtifactResponse]]

deployment_artifact_ids property

Deployment artifacts linked to this model version.

Returns:

Type Description
Dict[str, Dict[str, UUID]]

Deployment artifacts linked to this model version.

deployment_artifacts property

Get all deployment artifacts linked to this model version.

Returns:

Type Description
Dict[str, Dict[str, ArtifactVersionResponse]]

Dictionary of deployment artifacts with versions as Dict[str, Dict[str, ArtifactResponse]]

description property

The description property.

Returns:

Type Description
Optional[str]

the value of the property.

model property

The model property.

Returns:

Type Description
ModelResponse

the value of the property.

model_artifact_ids property

Model artifacts linked to this model version.

Returns:

Type Description
Dict[str, Dict[str, UUID]]

Model artifacts linked to this model version.

model_artifacts property

Get all model artifacts linked to this model version.

Returns:

Type Description
Dict[str, Dict[str, ArtifactVersionResponse]]

Dictionary of model artifacts with versions as Dict[str, Dict[str, ArtifactResponse]]

number property

The number property.

Returns:

Type Description
int

the value of the property.

pipeline_run_ids property

Pipeline runs linked to this model version.

Returns:

Type Description
Dict[str, UUID]

Pipeline runs linked to this model version.

pipeline_runs property

Get all pipeline runs linked to this version.

Returns:

Type Description
Dict[str, PipelineRunResponse]

Dictionary of Pipeline Runs as PipelineRunResponseModel

run_metadata property

The run_metadata property.

Returns:

Type Description
Dict[str, MetadataType]

the value of the property.

stage property

The stage property.

Returns:

Type Description
Optional[str]

the value of the property.

tags property

The tags property.

Returns:

Type Description
List[TagResponse]

the value of the property.

get_artifact(name, version=None)

Get the artifact linked to this model version.

Parameters:

Name Type Description Default
name str

The name of the artifact to retrieve.

required
version Optional[str]

The version of the artifact to retrieve (None for latest/non-versioned)

None

Returns:

Type Description
Optional[ArtifactVersionResponse]

Specific version of an artifact or None

Source code in src/zenml/models/v2/core/model_version.py
def get_artifact(
    self,
    name: str,
    version: Optional[str] = None,
) -> Optional["ArtifactVersionResponse"]:
    """Get the artifact linked to this model version.

    Args:
        name: The name of the artifact to retrieve.
        version: The version of the artifact to retrieve (None for
            latest/non-versioned)

    Returns:
        Specific version of an artifact or None
    """
    return self._get_linked_object(name, version)

get_data_artifact(name, version=None)

Get the data artifact linked to this model version.

Parameters:

Name Type Description Default
name str

The name of the data artifact to retrieve.

required
version Optional[str]

The version of the data artifact to retrieve (None for latest/non-versioned)

None

Returns:

Type Description
Optional[ArtifactVersionResponse]

Specific version of the data artifact or None

Source code in src/zenml/models/v2/core/model_version.py
def get_data_artifact(
    self,
    name: str,
    version: Optional[str] = None,
) -> Optional["ArtifactVersionResponse"]:
    """Get the data artifact linked to this model version.

    Args:
        name: The name of the data artifact to retrieve.
        version: The version of the data artifact to retrieve (None for
            latest/non-versioned)

    Returns:
        Specific version of the data artifact or None
    """
    return self._get_linked_object(name, version, ArtifactType.DATA)

get_deployment_artifact(name, version=None)

Get the deployment artifact linked to this model version.

Parameters:

Name Type Description Default
name str

The name of the deployment artifact to retrieve.

required
version Optional[str]

The version of the deployment artifact to retrieve (None for latest/non-versioned)

None

Returns:

Type Description
Optional[ArtifactVersionResponse]

Specific version of the deployment artifact or None

Source code in src/zenml/models/v2/core/model_version.py
def get_deployment_artifact(
    self,
    name: str,
    version: Optional[str] = None,
) -> Optional["ArtifactVersionResponse"]:
    """Get the deployment artifact linked to this model version.

    Args:
        name: The name of the deployment artifact to retrieve.
        version: The version of the deployment artifact to retrieve (None for
            latest/non-versioned)

    Returns:
        Specific version of the deployment artifact or None
    """
    return self._get_linked_object(name, version, ArtifactType.SERVICE)

get_hydrated_version()

Get the hydrated version of this model version.

Returns:

Type Description
ModelVersionResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/model_version.py
def get_hydrated_version(self) -> "ModelVersionResponse":
    """Get the hydrated version of this model version.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_model_version(self.id)

get_model_artifact(name, version=None)

Get the model artifact linked to this model version.

Parameters:

Name Type Description Default
name str

The name of the model artifact to retrieve.

required
version Optional[str]

The version of the model artifact to retrieve (None for latest/non-versioned)

None

Returns:

Type Description
Optional[ArtifactVersionResponse]

Specific version of the model artifact or None

Source code in src/zenml/models/v2/core/model_version.py
def get_model_artifact(
    self,
    name: str,
    version: Optional[str] = None,
) -> Optional["ArtifactVersionResponse"]:
    """Get the model artifact linked to this model version.

    Args:
        name: The name of the model artifact to retrieve.
        version: The version of the model artifact to retrieve (None for
            latest/non-versioned)

    Returns:
        Specific version of the model artifact or None
    """
    return self._get_linked_object(name, version, ArtifactType.MODEL)

set_stage(stage, force=False)

Sets this Model Version to a desired stage.

Parameters:

Name Type Description Default
stage Union[str, ModelStages]

the target stage for model version.

required
force bool

whether to force archiving of current model version in target stage or raise.

False

Raises:

Type Description
ValueError

if model_stage is not valid.

Source code in src/zenml/models/v2/core/model_version.py
def set_stage(
    self, stage: Union[str, ModelStages], force: bool = False
) -> None:
    """Sets this Model Version to a desired stage.

    Args:
        stage: the target stage for model version.
        force: whether to force archiving of current model version in
            target stage or raise.

    Raises:
        ValueError: if model_stage is not valid.
    """
    from zenml.client import Client

    stage = getattr(stage, "value", stage)
    if stage not in [stage.value for stage in ModelStages]:
        raise ValueError(f"`{stage}` is not a valid model stage.")

    Client().update_model_version(
        model_name_or_id=self.model.id,
        version_name_or_id=self.id,
        stage=stage,
        force=force,
    )

to_model_class(suppress_class_validation_warnings=True)

Convert response model to Model object.

Parameters:

Name Type Description Default
suppress_class_validation_warnings bool

internally used to suppress repeated warnings.

True

Returns:

Type Description
Model

Model object

Source code in src/zenml/models/v2/core/model_version.py
def to_model_class(
    self,
    suppress_class_validation_warnings: bool = True,
) -> "Model":
    """Convert response model to Model object.

    Args:
        suppress_class_validation_warnings: internally used to suppress
            repeated warnings.

    Returns:
        Model object
    """
    from zenml.model.model import Model

    mv = Model(
        name=self.model.name,
        license=self.model.license,
        description=self.description,
        audience=self.model.audience,
        use_cases=self.model.use_cases,
        limitations=self.model.limitations,
        trade_offs=self.model.trade_offs,
        ethics=self.model.ethics,
        tags=[t.name for t in self.tags],
        version=self.name,
        suppress_class_validation_warnings=suppress_class_validation_warnings,
        model_version_id=self.id,
    )

    return mv

ModelVersionResponseBody

Bases: ProjectScopedResponseBody

Response body for model versions.

Source code in src/zenml/models/v2/core/model_version.py
class ModelVersionResponseBody(ProjectScopedResponseBody):
    """Response body for model versions."""

    stage: Optional[str] = Field(
        description="The stage of the model version",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    number: int = Field(
        description="The number of the model version",
    )
    model: "ModelResponse" = Field(
        description="The model containing version",
    )

    # TODO: In Pydantic v2, the `model_` is a protected namespaces for all
    #  fields defined under base models. If not handled, this raises a warning.
    #  It is possible to suppress this warning message with the following
    #  configuration, however the ultimate solution is to rename these fields.
    #  Even though they do not cause any problems right now, if we are not
    #  careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

ModelVersionResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for model versions.

Source code in src/zenml/models/v2/core/model_version.py
class ModelVersionResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for model versions."""

    description: Optional[str] = Field(
        description="The description of the model version",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    run_metadata: Dict[str, MetadataType] = Field(
        description="Metadata linked to the model version",
        default={},
    )

ModelVersionResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the model version entity.

Source code in src/zenml/models/v2/core/model_version.py
class ModelVersionResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the model version entity."""

    services: Page[ServiceResponse] = Field(
        description="Services linked to the model version",
    )
    tags: List[TagResponse] = Field(
        title="Tags associated with the model version", default=[]
    )

ModelVersionUpdate

Bases: BaseUpdate

Update model for model versions.

Source code in src/zenml/models/v2/core/model_version.py
class ModelVersionUpdate(BaseUpdate):
    """Update model for model versions."""

    stage: Optional[Union[str, ModelStages]] = Field(
        description="Target model version stage to be set",
        default=None,
        union_mode="left_to_right",
    )
    force: bool = Field(
        description="Whether existing model version in target stage should be "
        "silently archived or an error should be raised.",
        default=False,
    )
    name: Optional[str] = Field(
        description="Target model version name to be set",
        default=None,
    )
    description: Optional[str] = Field(
        description="Target model version description to be set",
        default=None,
    )
    add_tags: Optional[List[str]] = Field(
        description="Tags to be added to the model version",
        default=None,
    )
    remove_tags: Optional[List[str]] = Field(
        description="Tags to be removed from the model version",
        default=None,
    )

    @field_validator("stage")
    @classmethod
    def _validate_stage(cls, stage: str) -> str:
        stage = getattr(stage, "value", stage)
        if stage is not None and stage not in [
            stage.value for stage in ModelStages
        ]:
            raise ValueError(f"`{stage}` is not a valid model stage.")
        return stage
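
A short, hedged example of an update payload (import path assumed; the stage value is validated by the field validator above).

from zenml.enums import ModelStages
from zenml.models import ModelVersionUpdate  # assumed import path

update = ModelVersionUpdate(
    stage=ModelStages.STAGING,
    force=False,                 # raise instead of silently archiving an existing staging version
    add_tags=["reviewed"],
    remove_tags=["candidate"],
)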

NumericFilter

Bases: Filter

Filter for all numeric fields.

Source code in src/zenml/models/v2/base/filter.py
class NumericFilter(Filter):
    """Filter for all numeric fields."""

    value: Union[float, datetime] = Field(union_mode="left_to_right")

    ALLOWED_OPS: ClassVar[List[str]] = [
        GenericFilterOps.EQUALS,
        GenericFilterOps.NOT_EQUALS,
        GenericFilterOps.GT,
        GenericFilterOps.GTE,
        GenericFilterOps.LT,
        GenericFilterOps.LTE,
    ]

    def generate_query_conditions_from_column(self, column: Any) -> Any:
        """Generate query conditions for a numeric column.

        Args:
            column: The numeric column of an SQLModel table on which to filter.

        Returns:
            A list of query conditions.
        """
        if self.operation == GenericFilterOps.GTE:
            return column >= self.value
        if self.operation == GenericFilterOps.GT:
            return column > self.value
        if self.operation == GenericFilterOps.LTE:
            return column <= self.value
        if self.operation == GenericFilterOps.LT:
            return column < self.value
        if self.operation == GenericFilterOps.NOT_EQUALS:
            return column != self.value
        return column == self.value
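
As a minimal illustration, a "created after 2024-01-01" condition can be expressed with this filter; the schema class named in the comment is only an example.

from datetime import datetime, timezone

from zenml.enums import GenericFilterOps
from zenml.models.v2.base.filter import NumericFilter

created_after = NumericFilter(
    operation=GenericFilterOps.GTE,
    column="created",
    value=datetime(2024, 1, 1, tzinfo=timezone.utc),
)

# Given an SQLModel table class such as PipelineRunSchema, this produces a
# condition equivalent to `PipelineRunSchema.created >= created_after.value`:
# condition = created_after.generate_query_conditions(PipelineRunSchema)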

generate_query_conditions_from_column(column)

Generate query conditions for a numeric column.

Parameters:

Name Type Description Default
column Any

The numeric column of an SQLModel table on which to filter.

required

Returns:

Type Description
Any

A list of query conditions.

Source code in src/zenml/models/v2/base/filter.py
def generate_query_conditions_from_column(self, column: Any) -> Any:
    """Generate query conditions for a numeric column.

    Args:
        column: The numeric column of an SQLModel table on which to filter.

    Returns:
        A list of query conditions.
    """
    if self.operation == GenericFilterOps.GTE:
        return column >= self.value
    if self.operation == GenericFilterOps.GT:
        return column > self.value
    if self.operation == GenericFilterOps.LTE:
        return column <= self.value
    if self.operation == GenericFilterOps.LT:
        return column < self.value
    if self.operation == GenericFilterOps.NOT_EQUALS:
        return column != self.value
    return column == self.value

OAuthDeviceAuthorizationRequest

Bases: BaseModel

OAuth2 device authorization grant request.

Source code in src/zenml/models/v2/misc/auth_models.py
class OAuthDeviceAuthorizationRequest(BaseModel):
    """OAuth2 device authorization grant request."""

    client_id: UUID
    device_id: Optional[UUID] = None

OAuthDeviceAuthorizationResponse

Bases: BaseModel

OAuth2 device authorization grant response.

Source code in src/zenml/models/v2/misc/auth_models.py
class OAuthDeviceAuthorizationResponse(BaseModel):
    """OAuth2 device authorization grant response."""

    device_code: str
    user_code: str
    verification_uri: str
    verification_uri_complete: Optional[str] = None
    expires_in: int
    interval: int

OAuthDeviceFilter

Bases: UserScopedFilter

Model to enable advanced filtering of OAuth2 devices.

Source code in src/zenml/models/v2/core/device.py
class OAuthDeviceFilter(UserScopedFilter):
    """Model to enable advanced filtering of OAuth2 devices."""

    expires: Optional[Union[datetime, str, None]] = Field(
        default=None,
        description="The expiration date of the OAuth2 device.",
        union_mode="left_to_right",
    )
    client_id: Union[UUID, str, None] = Field(
        default=None,
        description="The client ID of the OAuth2 device.",
        union_mode="left_to_right",
    )
    status: Union[OAuthDeviceStatus, str, None] = Field(
        default=None,
        description="The status of the OAuth2 device.",
        union_mode="left_to_right",
    )
    trusted_device: Union[bool, str, None] = Field(
        default=None,
        description="Whether the OAuth2 device was marked as trusted.",
        union_mode="left_to_right",
    )
    failed_auth_attempts: Union[int, str, None] = Field(
        default=None,
        description="The number of failed authentication attempts.",
        union_mode="left_to_right",
    )
    last_login: Optional[Union[datetime, str, None]] = Field(
        default=None,
        description="The date of the last successful login.",
        union_mode="left_to_right",
    )
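
A hedged sketch of filtering devices (the import path and the OAuthDeviceStatus member name are assumptions; numeric fields also accept the generic "operator:value" string syntax).

from zenml.enums import OAuthDeviceStatus
from zenml.models import OAuthDeviceFilter  # assumed import path

device_filter = OAuthDeviceFilter(
    status=OAuthDeviceStatus.ACTIVE,   # assumed enum member
    trusted_device=True,
    failed_auth_attempts="lt:3",       # fewer than three failed attempts
)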

OAuthDeviceInternalRequest

Bases: BaseRequest

Internal request model for OAuth2 devices.

Source code in src/zenml/models/v2/core/device.py
class OAuthDeviceInternalRequest(BaseRequest):
    """Internal request model for OAuth2 devices."""

    client_id: UUID = Field(description="The client ID of the OAuth2 device.")
    expires_in: int = Field(
        description="The number of seconds after which the OAuth2 device "
        "expires and can no longer be used for authentication."
    )
    os: Optional[str] = Field(
        default=None,
        description="The operating system of the device used for "
        "authentication.",
    )
    ip_address: Optional[str] = Field(
        default=None,
        description="The IP address of the device used for authentication.",
    )
    hostname: Optional[str] = Field(
        default=None,
        description="The hostname of the device used for authentication.",
    )
    python_version: Optional[str] = Field(
        default=None,
        description="The Python version of the device used for authentication.",
    )
    zenml_version: Optional[str] = Field(
        default=None,
        description="The ZenML version of the device used for authentication.",
    )
    city: Optional[str] = Field(
        default=None,
        description="The city where the device is located.",
    )
    region: Optional[str] = Field(
        default=None,
        description="The region where the device is located.",
    )
    country: Optional[str] = Field(
        default=None,
        description="The country where the device is located.",
    )

OAuthDeviceInternalResponse

Bases: OAuthDeviceResponse

OAuth2 device response model used internally for authentication.

Source code in src/zenml/models/v2/core/device.py
class OAuthDeviceInternalResponse(OAuthDeviceResponse):
    """OAuth2 device response model used internally for authentication."""

    user_code: str = Field(
        title="The user code.",
    )
    device_code: str = Field(
        title="The device code.",
    )

    def _verify_code(
        self,
        code: str,
        code_hash: Optional[str],
    ) -> bool:
        """Verifies a given code against the stored (hashed) code.

        Args:
            code: The code to verify.
            code_hash: The hashed code to verify against.

        Returns:
            True if the code is valid, False otherwise.
        """
        context = CryptContext(schemes=["bcrypt"], deprecated="auto")
        result = context.verify(code, code_hash)

        return result

    def verify_user_code(
        self,
        user_code: str,
    ) -> bool:
        """Verifies a given user code against the stored (hashed) user code.

        Args:
            user_code: The user code to verify.

        Returns:
            True if the user code is valid, False otherwise.
        """
        return self._verify_code(user_code, self.user_code)

    def verify_device_code(
        self,
        device_code: str,
    ) -> bool:
        """Verifies a given device code against the stored (hashed) device code.

        Args:
            device_code: The device code to verify.

        Returns:
            True if the device code is valid, False otherwise.
        """
        return self._verify_code(device_code, self.device_code)
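
Because the stored codes are bcrypt hashes, _verify_code reduces to a passlib check. The following standalone snippet only illustrates that mechanism; it does not call ZenML itself.

from passlib.context import CryptContext

context = CryptContext(schemes=["bcrypt"], deprecated="auto")

hashed = context.hash("ABCD-1234")            # what the server stores for a code
assert context.verify("ABCD-1234", hashed)    # what _verify_code does internally
assert not context.verify("WRONG-CODE", hashed)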

verify_device_code(device_code)

Verifies a given device code against the stored (hashed) device code.

Parameters:

Name Type Description Default
device_code str

The device code to verify.

required

Returns:

Type Description
bool

True if the device code is valid, False otherwise.

Source code in src/zenml/models/v2/core/device.py
def verify_device_code(
    self,
    device_code: str,
) -> bool:
    """Verifies a given device code against the stored (hashed) device code.

    Args:
        device_code: The device code to verify.

    Returns:
        True if the device code is valid, False otherwise.
    """
    return self._verify_code(device_code, self.device_code)

verify_user_code(user_code)

Verifies a given user code against the stored (hashed) user code.

Parameters:

Name Type Description Default
user_code str

The user code to verify.

required

Returns:

Type Description
bool

True if the user code is valid, False otherwise.

Source code in src/zenml/models/v2/core/device.py
def verify_user_code(
    self,
    user_code: str,
) -> bool:
    """Verifies a given user code against the stored (hashed) user code.

    Args:
        user_code: The user code to verify.

    Returns:
        True if the user code is valid, False otherwise.
    """
    return self._verify_code(user_code, self.user_code)

OAuthDeviceInternalUpdate

Bases: OAuthDeviceUpdate

OAuth2 device update model used internally for authentication.

Source code in src/zenml/models/v2/core/device.py
 95
 96
 97
 98
 99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
140
141
142
143
144
145
146
147
148
149
150
151
152
153
154
155
156
157
class OAuthDeviceInternalUpdate(OAuthDeviceUpdate):
    """OAuth2 device update model used internally for authentication."""

    user_id: Optional[UUID] = Field(
        default=None, description="User that owns the OAuth2 device."
    )
    status: Optional[OAuthDeviceStatus] = Field(
        default=None, description="The new status of the OAuth2 device."
    )
    expires_in: Optional[int] = Field(
        default=None,
        description="Set the device to expire in the given number of seconds. "
        "If the value is 0 or negative, the device is set to never expire.",
    )
    failed_auth_attempts: Optional[int] = Field(
        default=None,
        description="Set the number of failed authentication attempts.",
    )
    trusted_device: Optional[bool] = Field(
        default=None,
        description="Whether to mark the OAuth2 device as trusted. A trusted "
        "device has a much longer validity time.",
    )
    update_last_login: bool = Field(
        default=False, description="Whether to update the last login date."
    )
    generate_new_codes: bool = Field(
        default=False,
        description="Whether to generate new user and device codes.",
    )
    os: Optional[str] = Field(
        default=None,
        description="The operating system of the device used for "
        "authentication.",
    )
    ip_address: Optional[str] = Field(
        default=None,
        description="The IP address of the device used for authentication.",
    )
    hostname: Optional[str] = Field(
        default=None,
        description="The hostname of the device used for authentication.",
    )
    python_version: Optional[str] = Field(
        default=None,
        description="The Python version of the device used for authentication.",
    )
    zenml_version: Optional[str] = Field(
        default=None,
        description="The ZenML version of the device used for authentication.",
    )
    city: Optional[str] = Field(
        default=None,
        description="The city where the device is located.",
    )
    region: Optional[str] = Field(
        default=None,
        description="The region where the device is located.",
    )
    country: Optional[str] = Field(
        default=None,
        description="The country where the device is located.",
    )

OAuthDeviceResponse

Bases: UserScopedResponse[OAuthDeviceResponseBody, OAuthDeviceResponseMetadata, OAuthDeviceResponseResources]

Response model for OAuth2 devices.

Source code in src/zenml/models/v2/core/device.py
class OAuthDeviceResponse(
    UserScopedResponse[
        OAuthDeviceResponseBody,
        OAuthDeviceResponseMetadata,
        OAuthDeviceResponseResources,
    ]
):
    """Response model for OAuth2 devices."""

    _warn_on_response_updates = False

    def get_hydrated_version(self) -> "OAuthDeviceResponse":
        """Get the hydrated version of this OAuth2 device.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_authorized_device(self.id)

    # Body and metadata properties
    @property
    def client_id(self) -> UUID:
        """The `client_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().client_id

    @property
    def expires(self) -> Optional[datetime]:
        """The `expires` property.

        Returns:
            the value of the property.
        """
        return self.get_body().expires

    @property
    def trusted_device(self) -> bool:
        """The `trusted_device` property.

        Returns:
            the value of the property.
        """
        return self.get_body().trusted_device

    @property
    def status(self) -> OAuthDeviceStatus:
        """The `status` property.

        Returns:
            the value of the property.
        """
        return self.get_body().status

    @property
    def os(self) -> Optional[str]:
        """The `os` property.

        Returns:
            the value of the property.
        """
        return self.get_body().os

    @property
    def ip_address(self) -> Optional[str]:
        """The `ip_address` property.

        Returns:
            the value of the property.
        """
        return self.get_body().ip_address

    @property
    def hostname(self) -> Optional[str]:
        """The `hostname` property.

        Returns:
            the value of the property.
        """
        return self.get_body().hostname

    @property
    def python_version(self) -> Optional[str]:
        """The `python_version` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().python_version

    @property
    def zenml_version(self) -> Optional[str]:
        """The `zenml_version` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().zenml_version

    @property
    def city(self) -> Optional[str]:
        """The `city` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().city

    @property
    def region(self) -> Optional[str]:
        """The `region` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().region

    @property
    def country(self) -> Optional[str]:
        """The `country` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().country

    @property
    def failed_auth_attempts(self) -> int:
        """The `failed_auth_attempts` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().failed_auth_attempts

    @property
    def last_login(self) -> Optional[datetime]:
        """The `last_login` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().last_login

city property

The city property.

Returns:

Type Description
Optional[str]

the value of the property.

client_id property

The client_id property.

Returns:

Type Description
UUID

the value of the property.

country property

The country property.

Returns:

Type Description
Optional[str]

the value of the property.

expires property

The expires property.

Returns:

Type Description
Optional[datetime]

the value of the property.

failed_auth_attempts property

The failed_auth_attempts property.

Returns:

Type Description
int

the value of the property.

hostname property

The hostname property.

Returns:

Type Description
Optional[str]

the value of the property.

ip_address property

The ip_address property.

Returns:

Type Description
Optional[str]

the value of the property.

last_login property

The last_login property.

Returns:

Type Description
Optional[datetime]

the value of the property.

os property

The os property.

Returns:

Type Description
Optional[str]

the value of the property.

python_version property

The python_version property.

Returns:

Type Description
Optional[str]

the value of the property.

region property

The region property.

Returns:

Type Description
Optional[str]

the value of the property.

status property

The status property.

Returns:

Type Description
OAuthDeviceStatus

the value of the property.

trusted_device property

The trusted_device property.

Returns:

Type Description
bool

the value of the property.

zenml_version property

The zenml_version property.

Returns:

Type Description
Optional[str]

the value of the property.

get_hydrated_version()

Get the hydrated version of this OAuth2 device.

Returns:

Type Description
OAuthDeviceResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/device.py
def get_hydrated_version(self) -> "OAuthDeviceResponse":
    """Get the hydrated version of this OAuth2 device.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_authorized_device(self.id)
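
Body properties such as client_id, status or trusted_device are available on every response, while metadata properties such as failed_auth_attempts or last_login live on the hydrated version. A minimal sketch, assuming device is an OAuthDeviceResponse that was obtained elsewhere (for example from the zen store):

def summarize_device(device):
    """Print a few body and metadata fields of an OAuth2 device."""
    # Body properties are always present on the response.
    print(device.client_id, device.status, device.trusted_device)

    # Metadata properties require the hydrated version of the entity.
    hydrated = device.get_hydrated_version()
    print(hydrated.failed_auth_attempts, hydrated.last_login)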

OAuthDeviceResponseBody

Bases: UserScopedResponseBody

Response body for OAuth2 devices.

Source code in src/zenml/models/v2/core/device.py
class OAuthDeviceResponseBody(UserScopedResponseBody):
    """Response body for OAuth2 devices."""

    client_id: UUID = Field(description="The client ID of the OAuth2 device.")
    expires: Optional[datetime] = Field(
        default=None,
        description="The expiration date of the OAuth2 device after which "
        "the device is no longer valid and cannot be used for "
        "authentication.",
    )
    trusted_device: bool = Field(
        description="Whether the OAuth2 device was marked as trusted. A "
        "trusted device has a much longer validity time.",
    )
    status: OAuthDeviceStatus = Field(
        description="The status of the OAuth2 device."
    )
    os: Optional[str] = Field(
        default=None,
        description="The operating system of the device used for "
        "authentication.",
    )
    ip_address: Optional[str] = Field(
        default=None,
        description="The IP address of the device used for authentication.",
    )
    hostname: Optional[str] = Field(
        default=None,
        description="The hostname of the device used for authentication.",
    )

OAuthDeviceResponseMetadata

Bases: UserScopedResponseMetadata

Response metadata for OAuth2 devices.

Source code in src/zenml/models/v2/core/device.py
class OAuthDeviceResponseMetadata(UserScopedResponseMetadata):
    """Response metadata for OAuth2 devices."""

    python_version: Optional[str] = Field(
        default=None,
        description="The Python version of the device used for authentication.",
    )
    zenml_version: Optional[str] = Field(
        default=None,
        description="The ZenML version of the device used for authentication.",
    )
    city: Optional[str] = Field(
        default=None,
        description="The city where the device is located.",
    )
    region: Optional[str] = Field(
        default=None,
        description="The region where the device is located.",
    )
    country: Optional[str] = Field(
        default=None,
        description="The country where the device is located.",
    )
    failed_auth_attempts: int = Field(
        description="The number of failed authentication attempts.",
    )
    last_login: Optional[datetime] = Field(
        description="The date of the last successful login."
    )

OAuthDeviceResponseResources

Bases: UserScopedResponseResources

Class for all resource models associated with the OAuthDevice entity.

Source code in src/zenml/models/v2/core/device.py
class OAuthDeviceResponseResources(UserScopedResponseResources):
    """Class for all resource models associated with the OAuthDevice entity."""

OAuthDeviceTokenRequest

Bases: BaseModel

OAuth2 device authorization grant request.

Source code in src/zenml/models/v2/misc/auth_models.py
class OAuthDeviceTokenRequest(BaseModel):
    """OAuth2 device authorization grant request."""

    grant_type: str = OAuthGrantTypes.OAUTH_DEVICE_CODE
    client_id: UUID
    device_code: str

OAuthDeviceUpdate

Bases: BaseUpdate

OAuth2 device update model.

Source code in src/zenml/models/v2/core/device.py
class OAuthDeviceUpdate(BaseUpdate):
    """OAuth2 device update model."""

    locked: Optional[bool] = Field(
        default=None,
        description="Whether to lock or unlock the OAuth2 device. A locked "
        "device cannot be used for authentication.",
    )
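
A minimal sketch of constructing the update payload used to lock a device; how the update is submitted (REST API or zen store call) depends on your setup and is not shown here:

from zenml.models.v2.core.device import OAuthDeviceUpdate

# A locked device can no longer be used for authentication.
lock_update = OAuthDeviceUpdate(locked=True)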

OAuthDeviceUserAgentHeader

Bases: BaseModel

OAuth2 device user agent header.

Source code in src/zenml/models/v2/misc/auth_models.py
class OAuthDeviceUserAgentHeader(BaseModel):
    """OAuth2 device user agent header."""

    hostname: Optional[str] = None
    os: Optional[str] = None
    python_version: Optional[str] = None
    zenml_version: Optional[str] = None

    @classmethod
    def decode(cls, header_str: str) -> "OAuthDeviceUserAgentHeader":
        """Decode the user agent header.

        Args:
            header_str: The user agent header string value.

        Returns:
            The decoded user agent header.
        """
        header = cls()
        properties = header_str.strip().split(" ")
        for property in properties:
            try:
                key, value = property.split("/", maxsplit=1)
            except ValueError:
                continue
            if key == "Host":
                header.hostname = value
            elif key == "ZenML":
                header.zenml_version = value
            elif key == "Python":
                header.python_version = value
            elif key == "OS":
                header.os = value
        return header

    def encode(self) -> str:
        """Encode the user agent header.

        Returns:
            The encoded user agent header.
        """
        return (
            f"Host/{self.hostname} "
            f"ZenML/{self.zenml_version} "
            f"Python/{self.python_version} "
            f"OS/{self.os}"
        )

decode(header_str) classmethod

Decode the user agent header.

Parameters:

Name Type Description Default
header_str str

The user agent header string value.

required

Returns:

Type Description
OAuthDeviceUserAgentHeader

The decoded user agent header.

Source code in src/zenml/models/v2/misc/auth_models.py
@classmethod
def decode(cls, header_str: str) -> "OAuthDeviceUserAgentHeader":
    """Decode the user agent header.

    Args:
        header_str: The user agent header string value.

    Returns:
        The decoded user agent header.
    """
    header = cls()
    properties = header_str.strip().split(" ")
    for property in properties:
        try:
            key, value = property.split("/", maxsplit=1)
        except ValueError:
            continue
        if key == "Host":
            header.hostname = value
        elif key == "ZenML":
            header.zenml_version = value
        elif key == "Python":
            header.python_version = value
        elif key == "OS":
            header.os = value
    return header

encode()

Encode the user agent header.

Returns:

Type Description
str

The encoded user agent header.

Source code in src/zenml/models/v2/misc/auth_models.py
def encode(self) -> str:
    """Encode the user agent header.

    Returns:
        The encoded user agent header.
    """
    return (
        f"Host/{self.hostname} "
        f"ZenML/{self.zenml_version} "
        f"Python/{self.python_version} "
        f"OS/{self.os}"
    )
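
The header is a simple space-separated list of key/value pairs. A short round-trip sketch, assuming a ZenML installation so that the module shown above is importable; the version strings are made up:

from zenml.models.v2.misc.auth_models import OAuthDeviceUserAgentHeader

header = OAuthDeviceUserAgentHeader(
    hostname="my-laptop",
    os="linux",
    python_version="3.11.4",
    zenml_version="0.0.0",  # hypothetical version string
)
encoded = header.encode()
# e.g. "Host/my-laptop ZenML/0.0.0 Python/3.11.4 OS/linux"
decoded = OAuthDeviceUserAgentHeader.decode(encoded)
assert decoded.hostname == "my-laptop" and decoded.os == "linux"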

OAuthDeviceVerificationRequest

Bases: BaseModel

OAuth2 device authorization verification request.

Source code in src/zenml/models/v2/misc/auth_models.py
class OAuthDeviceVerificationRequest(BaseModel):
    """OAuth2 device authorization verification request."""

    user_code: str
    trusted_device: bool = False

OAuthRedirectResponse

Bases: BaseModel

Redirect response.

Source code in src/zenml/models/v2/misc/auth_models.py
class OAuthRedirectResponse(BaseModel):
    """Redirect response."""

    authorization_url: str

OAuthTokenResponse

Bases: BaseModel

OAuth2 device authorization token response.

Source code in src/zenml/models/v2/misc/auth_models.py
class OAuthTokenResponse(BaseModel):
    """OAuth2 device authorization token response."""

    access_token: str
    token_type: str
    expires_in: Optional[int] = None
    refresh_token: Optional[str] = None
    csrf_token: Optional[str] = None
    scope: Optional[str] = None
    device_id: Optional[UUID] = None
    device_metadata: Optional[Dict[str, Any]] = None

    model_config = ConfigDict(
        # Allow extra attributes to allow compatibility with different versions
        extra="allow",
    )
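
Because the model is configured with extra="allow", token payloads that carry additional keys (for example from a newer server version) still validate. A hedged sketch with a made-up payload, assuming the module shown above is importable:

from zenml.models.v2.misc.auth_models import OAuthTokenResponse

payload = {
    "access_token": "eyJ...",  # hypothetical (truncated) JWT
    "token_type": "bearer",
    "expires_in": 3600,
    "some_future_field": "tolerated",  # unknown keys are accepted
}
token = OAuthTokenResponse.model_validate(payload)
print(token.token_type, token.expires_in)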

Page

Bases: BaseModel, Generic[B]

Return Model for List Models to accommodate pagination.

Source code in src/zenml/models/v2/base/page.py
class Page(BaseModel, Generic[B]):
    """Return Model for List Models to accommodate pagination."""

    index: PositiveInt
    max_size: PositiveInt
    total_pages: NonNegativeInt
    total: NonNegativeInt
    items: List[B]

    __params_type__ = BaseFilter

    @property
    def size(self) -> int:
        """Return the item count of the page.

        Returns:
            The amount of items in the page.
        """
        return len(self.items)

    def __len__(self) -> int:
        """Return the item count of the page.

        This enables `len(page)`.

        Returns:
            The amount of items in the page.
        """
        return len(self.items)

    def __getitem__(self, index: int) -> B:
        """Return the item at the given index.

        This enables `page[index]`.

        Args:
            index: The index to get the item from.

        Returns:
            The item at the given index.
        """
        return self.items[index]

    def __iter__(self) -> Generator[B, None, None]:  # type: ignore[override]
        """Return an iterator over the items in the page.

        This enables `for item in page` loops, but breaks `dict(page)`.

        Yields:
            An iterator over the items in the page.
        """
        for item in self.items.__iter__():
            yield item

    def __contains__(self, item: B) -> bool:
        """Returns whether the page contains a specific item.

        This enables `item in page` checks.

        Args:
            item: The item to check for.

        Returns:
            Whether the item is in the page.
        """
        return item in self.items

size property

Return the item count of the page.

Returns:

Type Description
int

The amount of items in the page.

__contains__(item)

Returns whether the page contains a specific item.

This enables item in page checks.

Parameters:

Name Type Description Default
item B

The item to check for.

required

Returns:

Type Description
bool

Whether the item is in the page.

Source code in src/zenml/models/v2/base/page.py
def __contains__(self, item: B) -> bool:
    """Returns whether the page contains a specific item.

    This enables `item in page` checks.

    Args:
        item: The item to check for.

    Returns:
        Whether the item is in the page.
    """
    return item in self.items

__getitem__(index)

Return the item at the given index.

This enables page[index].

Parameters:

Name Type Description Default
index int

The index to get the item from.

required

Returns:

Type Description
B

The item at the given index.

Source code in src/zenml/models/v2/base/page.py
def __getitem__(self, index: int) -> B:
    """Return the item at the given index.

    This enables `page[index]`.

    Args:
        index: The index to get the item from.

    Returns:
        The item at the given index.
    """
    return self.items[index]

__iter__()

Return an iterator over the items in the page.

This enables for item in page loops, but breaks dict(page).

Yields:

Type Description
B

An iterator over the items in the page.

Source code in src/zenml/models/v2/base/page.py
def __iter__(self) -> Generator[B, None, None]:  # type: ignore[override]
    """Return an iterator over the items in the page.

    This enables `for item in page` loops, but breaks `dict(page)`.

    Yields:
        An iterator over the items in the page.
    """
    for item in self.items.__iter__():
        yield item

__len__()

Return the item count of the page.

This enables len(page).

Returns:

Type Description
int

The amount of items in the page.

Source code in src/zenml/models/v2/base/page.py
def __len__(self) -> int:
    """Return the item count of the page.

    This enables `len(page)`.

    Returns:
        The amount of items in the page.
    """
    return len(self.items)
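
Because Page implements __len__, __getitem__, __iter__ and __contains__, it can be treated like a read-only sequence. A minimal sketch, assuming page is a Page returned by any list method of the client (for example Client().list_pipelines()):

def show_page(page):
    """Demonstrate the sequence-like helpers of a Page."""
    print(f"{len(page)} items on page {page.index} of {page.total_pages}")
    if len(page) > 0:
        first = page[0]        # __getitem__
        print(first in page)   # __contains__ -> True
    for item in page:          # __iter__
        print(item)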

PipelineBuildBase

Bases: BaseZenModel

Base model for pipeline builds.

Source code in src/zenml/models/v2/core/pipeline_build.py
class PipelineBuildBase(BaseZenModel):
    """Base model for pipeline builds."""

    images: Dict[str, BuildItem] = Field(
        default={}, title="The images of this build."
    )
    is_local: bool = Field(
        title="Whether the build images are stored in a container registry "
        "or locally.",
    )
    contains_code: bool = Field(
        title="Whether any image of the build contains user code.",
    )
    zenml_version: Optional[str] = Field(
        title="The version of ZenML used for this build.", default=None
    )
    python_version: Optional[str] = Field(
        title="The Python version used for this build.", default=None
    )
    duration: Optional[int] = Field(
        title="The duration of the build in seconds.", default=None
    )

    # Helper methods
    @property
    def requires_code_download(self) -> bool:
        """Whether the build requires code download.

        Returns:
            Whether the build requires code download.
        """
        return any(
            item.requires_code_download for item in self.images.values()
        )

    @staticmethod
    def get_image_key(component_key: str, step: Optional[str] = None) -> str:
        """Get the image key.

        Args:
            component_key: The component key.
            step: The pipeline step for which the image was built.

        Returns:
            The image key.
        """
        if step:
            return f"{step}.{component_key}"
        else:
            return component_key

    def get_image(self, component_key: str, step: Optional[str] = None) -> str:
        """Get the image built for a specific key.

        Args:
            component_key: The key for which to get the image.
            step: The pipeline step for which to get the image. If no image
                exists for this step, will fall back to the pipeline image for
                the same key.

        Returns:
            The image name or digest.
        """
        return self._get_item(component_key=component_key, step=step).image

    def get_settings_checksum(
        self, component_key: str, step: Optional[str] = None
    ) -> Optional[str]:
        """Get the settings checksum for a specific key.

        Args:
            component_key: The key for which to get the checksum.
            step: The pipeline step for which to get the checksum. If no
                image exists for this step, will fall back to the pipeline image
                for the same key.

        Returns:
            The settings checksum.
        """
        return self._get_item(
            component_key=component_key, step=step
        ).settings_checksum

    def _get_item(
        self, component_key: str, step: Optional[str] = None
    ) -> "BuildItem":
        """Get the item for a specific key.

        Args:
            component_key: The key for which to get the item.
            step: The pipeline step for which to get the item. If no item
                exists for this step, will fall back to the item for
                the same key.

        Raises:
            KeyError: If no item exists for the given key.

        Returns:
            The build item.
        """
        if step:
            try:
                combined_key = self.get_image_key(
                    component_key=component_key, step=step
                )
                return self.images[combined_key]
            except KeyError:
                pass

        try:
            return self.images[component_key]
        except KeyError:
            raise KeyError(
                f"Unable to find image for key {component_key}. Available keys: "
                f"{set(self.images)}."
            )

requires_code_download property

Whether the build requires code download.

Returns:

Type Description
bool

Whether the build requires code download.

get_image(component_key, step=None)

Get the image built for a specific key.

Parameters:

Name Type Description Default
component_key str

The key for which to get the image.

required
step Optional[str]

The pipeline step for which to get the image. If no image exists for this step, will fall back to the pipeline image for the same key.

None

Returns:

Type Description
str

The image name or digest.

Source code in src/zenml/models/v2/core/pipeline_build.py
def get_image(self, component_key: str, step: Optional[str] = None) -> str:
    """Get the image built for a specific key.

    Args:
        component_key: The key for which to get the image.
        step: The pipeline step for which to get the image. If no image
            exists for this step, will fall back to the pipeline image for
            the same key.

    Returns:
        The image name or digest.
    """
    return self._get_item(component_key=component_key, step=step).image

get_image_key(component_key, step=None) staticmethod

Get the image key.

Parameters:

Name Type Description Default
component_key str

The component key.

required
step Optional[str]

The pipeline step for which the image was built.

None

Returns:

Type Description
str

The image key.

Source code in src/zenml/models/v2/core/pipeline_build.py
@staticmethod
def get_image_key(component_key: str, step: Optional[str] = None) -> str:
    """Get the image key.

    Args:
        component_key: The component key.
        step: The pipeline step for which the image was built.

    Returns:
        The image key.
    """
    if step:
        return f"{step}.{component_key}"
    else:
        return component_key

get_settings_checksum(component_key, step=None)

Get the settings checksum for a specific key.

Parameters:

Name Type Description Default
component_key str

The key for which to get the checksum.

required
step Optional[str]

The pipeline step for which to get the checksum. If no image exists for this step, will fall back to the pipeline image for the same key.

None

Returns:

Type Description
Optional[str]

The settings checksum.

Source code in src/zenml/models/v2/core/pipeline_build.py
def get_settings_checksum(
    self, component_key: str, step: Optional[str] = None
) -> Optional[str]:
    """Get the settings checksum for a specific key.

    Args:
        component_key: The key for which to get the checksum.
        step: The pipeline step for which to get the checksum. If no
            image exists for this step, will fall back to the pipeline image
            for the same key.

    Returns:
        The settings checksum.
    """
    return self._get_item(
        component_key=component_key, step=step
    ).settings_checksum
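
The lookup first tries the step-scoped key "<step>.<component_key>" and only then falls back to the pipeline-level key. A standalone sketch of that fallback, using a plain dictionary in place of the images mapping; the keys and image names are made up:

from typing import Optional

# Hypothetical build images: one pipeline-level image and one step-specific one.
images = {
    "orchestrator": "registry.example.com/pipeline:latest",
    "trainer.orchestrator": "registry.example.com/trainer:latest",
}

def get_image(component_key: str, step: Optional[str] = None) -> str:
    """Mirror of the documented fallback: step-scoped key first, then pipeline key."""
    if step:
        step_key = f"{step}.{component_key}"
        if step_key in images:
            return images[step_key]
    return images[component_key]

assert get_image("orchestrator", step="trainer") == "registry.example.com/trainer:latest"
assert get_image("orchestrator", step="evaluator") == "registry.example.com/pipeline:latest"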

PipelineBuildFilter

Bases: ProjectScopedFilter

Model to enable advanced filtering of all pipeline builds.

Source code in src/zenml/models/v2/core/pipeline_build.py
class PipelineBuildFilter(ProjectScopedFilter):
    """Model to enable advanced filtering of all pipeline builds."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.FILTER_EXCLUDE_FIELDS,
        "container_registry_id",
    ]

    pipeline_id: Optional[Union[UUID, str]] = Field(
        description="Pipeline associated with the pipeline build.",
        default=None,
        union_mode="left_to_right",
    )
    stack_id: Optional[Union[UUID, str]] = Field(
        description="Stack associated with the pipeline build.",
        default=None,
        union_mode="left_to_right",
    )
    container_registry_id: Optional[Union[UUID, str]] = Field(
        description="Container registry associated with the pipeline build.",
        default=None,
        union_mode="left_to_right",
    )
    is_local: Optional[bool] = Field(
        description="Whether the build images are stored in a container "
        "registry or locally.",
        default=None,
    )
    contains_code: Optional[bool] = Field(
        description="Whether any image of the build contains user code.",
        default=None,
    )
    zenml_version: Optional[str] = Field(
        description="The version of ZenML used for this build.", default=None
    )
    python_version: Optional[str] = Field(
        description="The Python version used for this build.", default=None
    )
    checksum: Optional[str] = Field(
        description="The build checksum.", default=None
    )
    stack_checksum: Optional[str] = Field(
        description="The stack checksum.", default=None
    )
    duration: Optional[Union[int, str]] = Field(
        description="The duration of the build in seconds.", default=None
    )

    def get_custom_filters(
        self,
        table: Type["AnySchema"],
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_

        from zenml.enums import StackComponentType
        from zenml.zen_stores.schemas import (
            PipelineBuildSchema,
            StackComponentSchema,
            StackCompositionSchema,
            StackSchema,
        )

        if self.container_registry_id:
            container_registry_filter = and_(
                PipelineBuildSchema.stack_id == StackSchema.id,
                StackSchema.id == StackCompositionSchema.stack_id,
                StackCompositionSchema.component_id == StackComponentSchema.id,
                StackComponentSchema.type
                == StackComponentType.CONTAINER_REGISTRY.value,
                StackComponentSchema.id == self.container_registry_id,
            )
            custom_filters.append(container_registry_filter)

        return custom_filters

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/core/pipeline_build.py
def get_custom_filters(
    self,
    table: Type["AnySchema"],
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_

    from zenml.enums import StackComponentType
    from zenml.zen_stores.schemas import (
        PipelineBuildSchema,
        StackComponentSchema,
        StackCompositionSchema,
        StackSchema,
    )

    if self.container_registry_id:
        container_registry_filter = and_(
            PipelineBuildSchema.stack_id == StackSchema.id,
            StackSchema.id == StackCompositionSchema.stack_id,
            StackCompositionSchema.component_id == StackComponentSchema.id,
            StackComponentSchema.type
            == StackComponentType.CONTAINER_REGISTRY.value,
            StackComponentSchema.id == self.container_registry_id,
        )
        custom_filters.append(container_registry_filter)

    return custom_filters
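
A hedged sketch of constructing such a filter; only the model construction is shown because the way the filter is passed to a list call depends on the client or store API you use, and the UUID below is a placeholder:

from uuid import UUID
from zenml.models.v2.core.pipeline_build import PipelineBuildFilter

build_filter = PipelineBuildFilter(
    is_local=False,
    contains_code=True,
    # Placeholder ID: translated into a join on the stack's container registry
    # by `get_custom_filters` instead of a plain column comparison.
    container_registry_id=UUID("00000000-0000-0000-0000-000000000000"),
)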

PipelineBuildRequest

Bases: PipelineBuildBase, ProjectScopedRequest

Request model for pipeline builds.

Source code in src/zenml/models/v2/core/pipeline_build.py
class PipelineBuildRequest(PipelineBuildBase, ProjectScopedRequest):
    """Request model for pipelines builds."""

    checksum: Optional[str] = Field(title="The build checksum.", default=None)
    stack_checksum: Optional[str] = Field(
        title="The stack checksum.", default=None
    )

    stack: Optional[UUID] = Field(
        title="The stack that was used for this build.", default=None
    )
    pipeline: Optional[UUID] = Field(
        title="The pipeline that was used for this build.", default=None
    )

PipelineBuildResponse

Bases: ProjectScopedResponse[PipelineBuildResponseBody, PipelineBuildResponseMetadata, PipelineBuildResponseResources]

Response model for pipeline builds.

Source code in src/zenml/models/v2/core/pipeline_build.py
class PipelineBuildResponse(
    ProjectScopedResponse[
        PipelineBuildResponseBody,
        PipelineBuildResponseMetadata,
        PipelineBuildResponseResources,
    ]
):
    """Response model for pipeline builds."""

    def get_hydrated_version(self) -> "PipelineBuildResponse":
        """Return the hydrated version of this pipeline build.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_build(self.id)

    # Helper methods
    def to_yaml(self) -> Dict[str, Any]:
        """Create a yaml representation of the pipeline build.

        Create a yaml representation of the pipeline build that can be used
        to create a PipelineBuildBase instance.

        Returns:
            The yaml representation of the pipeline build.
        """
        # Get the base attributes
        yaml_dict: Dict[str, Any] = json.loads(
            self.model_dump_json(
                exclude={
                    "body",
                    "metadata",
                }
            )
        )
        images = json.loads(
            self.get_metadata().model_dump_json(
                exclude={
                    "pipeline",
                    "stack",
                    "project",
                }
            )
        )
        yaml_dict.update(images)
        return yaml_dict

    @property
    def requires_code_download(self) -> bool:
        """Whether the build requires code download.

        Returns:
            Whether the build requires code download.
        """
        return any(
            item.requires_code_download for item in self.images.values()
        )

    @staticmethod
    def get_image_key(component_key: str, step: Optional[str] = None) -> str:
        """Get the image key.

        Args:
            component_key: The component key.
            step: The pipeline step for which the image was built.

        Returns:
            The image key.
        """
        if step:
            return f"{step}.{component_key}"
        else:
            return component_key

    def get_image(self, component_key: str, step: Optional[str] = None) -> str:
        """Get the image built for a specific key.

        Args:
            component_key: The key for which to get the image.
            step: The pipeline step for which to get the image. If no image
                exists for this step, will fall back to the pipeline image for
                the same key.

        Returns:
            The image name or digest.
        """
        return self._get_item(component_key=component_key, step=step).image

    def get_settings_checksum(
        self, component_key: str, step: Optional[str] = None
    ) -> Optional[str]:
        """Get the settings checksum for a specific key.

        Args:
            component_key: The key for which to get the checksum.
            step: The pipeline step for which to get the checksum. If no
                image exists for this step, will fall back to the pipeline image
                for the same key.

        Returns:
            The settings checksum.
        """
        return self._get_item(
            component_key=component_key, step=step
        ).settings_checksum

    def _get_item(
        self, component_key: str, step: Optional[str] = None
    ) -> "BuildItem":
        """Get the item for a specific key.

        Args:
            component_key: The key for which to get the item.
            step: The pipeline step for which to get the item. If no item
                exists for this step, will fall back to the item for
                the same key.

        Raises:
            KeyError: If no item exists for the given key.

        Returns:
            The build item.
        """
        if step:
            try:
                combined_key = self.get_image_key(
                    component_key=component_key, step=step
                )
                return self.images[combined_key]
            except KeyError:
                pass

        try:
            return self.images[component_key]
        except KeyError:
            raise KeyError(
                f"Unable to find image for key {component_key}. Available keys: "
                f"{set(self.images)}."
            )

    # Body and metadata properties
    @property
    def pipeline(self) -> Optional["PipelineResponse"]:
        """The `pipeline` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().pipeline

    @property
    def stack(self) -> Optional["StackResponse"]:
        """The `stack` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().stack

    @property
    def images(self) -> Dict[str, "BuildItem"]:
        """The `images` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().images

    @property
    def zenml_version(self) -> Optional[str]:
        """The `zenml_version` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().zenml_version

    @property
    def python_version(self) -> Optional[str]:
        """The `python_version` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().python_version

    @property
    def checksum(self) -> Optional[str]:
        """The `checksum` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().checksum

    @property
    def stack_checksum(self) -> Optional[str]:
        """The `stack_checksum` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().stack_checksum

    @property
    def is_local(self) -> bool:
        """The `is_local` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().is_local

    @property
    def contains_code(self) -> bool:
        """The `contains_code` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().contains_code

    @property
    def duration(self) -> Optional[int]:
        """The `duration` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().duration

checksum property

The checksum property.

Returns:

Type Description
Optional[str]

the value of the property.

contains_code property

The contains_code property.

Returns:

Type Description
bool

the value of the property.

duration property

The duration property.

Returns:

Type Description
Optional[int]

the value of the property.

images property

The images property.

Returns:

Type Description
Dict[str, BuildItem]

the value of the property.

is_local property

The is_local property.

Returns:

Type Description
bool

the value of the property.

pipeline property

The pipeline property.

Returns:

Type Description
Optional[PipelineResponse]

the value of the property.

python_version property

The python_version property.

Returns:

Type Description
Optional[str]

the value of the property.

requires_code_download property

Whether the build requires code download.

Returns:

Type Description
bool

Whether the build requires code download.

stack property

The stack property.

Returns:

Type Description
Optional[StackResponse]

the value of the property.

stack_checksum property

The stack_checksum property.

Returns:

Type Description
Optional[str]

the value of the property.

zenml_version property

The zenml_version property.

Returns:

Type Description
Optional[str]

the value of the property.

get_hydrated_version()

Return the hydrated version of this pipeline build.

Returns:

Type Description
PipelineBuildResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/pipeline_build.py
def get_hydrated_version(self) -> "PipelineBuildResponse":
    """Return the hydrated version of this pipeline build.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_build(self.id)

get_image(component_key, step=None)

Get the image built for a specific key.

Parameters:

Name Type Description Default
component_key str

The key for which to get the image.

required
step Optional[str]

The pipeline step for which to get the image. If no image exists for this step, will fall back to the pipeline image for the same key.

None

Returns:

Type Description
str

The image name or digest.

Source code in src/zenml/models/v2/core/pipeline_build.py
def get_image(self, component_key: str, step: Optional[str] = None) -> str:
    """Get the image built for a specific key.

    Args:
        component_key: The key for which to get the image.
        step: The pipeline step for which to get the image. If no image
            exists for this step, will fall back to the pipeline image for
            the same key.

    Returns:
        The image name or digest.
    """
    return self._get_item(component_key=component_key, step=step).image

get_image_key(component_key, step=None) staticmethod

Get the image key.

Parameters:

Name Type Description Default
component_key str

The component key.

required
step Optional[str]

The pipeline step for which the image was built.

None

Returns:

Type Description
str

The image key.

Source code in src/zenml/models/v2/core/pipeline_build.py
@staticmethod
def get_image_key(component_key: str, step: Optional[str] = None) -> str:
    """Get the image key.

    Args:
        component_key: The component key.
        step: The pipeline step for which the image was built.

    Returns:
        The image key.
    """
    if step:
        return f"{step}.{component_key}"
    else:
        return component_key

get_settings_checksum(component_key, step=None)

Get the settings checksum for a specific key.

Parameters:

Name Type Description Default
component_key str

The key for which to get the checksum.

required
step Optional[str]

The pipeline step for which to get the checksum. If no image exists for this step, will fall back to the pipeline image for the same key.

None

Returns:

Type Description
Optional[str]

The settings checksum.

Source code in src/zenml/models/v2/core/pipeline_build.py
def get_settings_checksum(
    self, component_key: str, step: Optional[str] = None
) -> Optional[str]:
    """Get the settings checksum for a specific key.

    Args:
        component_key: The key for which to get the checksum.
        step: The pipeline step for which to get the checksum. If no
            image exists for this step, will fall back to the pipeline image
            for the same key.

    Returns:
        The settings checksum.
    """
    return self._get_item(
        component_key=component_key, step=step
    ).settings_checksum

to_yaml()

Create a yaml representation of the pipeline build.

Create a yaml representation of the pipeline build that can be used to create a PipelineBuildBase instance.

Returns:

Type Description
Dict[str, Any]

The yaml representation of the pipeline build.

Source code in src/zenml/models/v2/core/pipeline_build.py
def to_yaml(self) -> Dict[str, Any]:
    """Create a yaml representation of the pipeline build.

    Create a yaml representation of the pipeline build that can be used
    to create a PipelineBuildBase instance.

    Returns:
        The yaml representation of the pipeline build.
    """
    # Get the base attributes
    yaml_dict: Dict[str, Any] = json.loads(
        self.model_dump_json(
            exclude={
                "body",
                "metadata",
            }
        )
    )
    images = json.loads(
        self.get_metadata().model_dump_json(
            exclude={
                "pipeline",
                "stack",
                "project",
            }
        )
    )
    yaml_dict.update(images)
    return yaml_dict
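
A short sketch that fetches a build through the zen store (the same call used by get_hydrated_version above) and writes its YAML representation to disk; build_id is a placeholder and PyYAML is assumed to be installed:

import yaml

from zenml.client import Client

def export_build(build_id, path="build.yaml"):
    """Dump a pipeline build to a YAML file usable as a PipelineBuildBase."""
    build = Client().zen_store.get_build(build_id)
    with open(path, "w") as f:
        yaml.safe_dump(build.to_yaml(), f)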

PipelineBuildResponseBody

Bases: ProjectScopedResponseBody

Response body for pipeline builds.

Source code in src/zenml/models/v2/core/pipeline_build.py
class PipelineBuildResponseBody(ProjectScopedResponseBody):
    """Response body for pipeline builds."""

PipelineBuildResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for pipeline builds.

Source code in src/zenml/models/v2/core/pipeline_build.py
class PipelineBuildResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for pipeline builds."""

    __zenml_skip_dehydration__: ClassVar[List[str]] = [
        "images",
    ]

    pipeline: Optional["PipelineResponse"] = Field(
        default=None, title="The pipeline that was used for this build."
    )
    stack: Optional["StackResponse"] = Field(
        default=None, title="The stack that was used for this build."
    )
    images: Dict[str, "BuildItem"] = Field(
        default={}, title="The images of this build."
    )
    zenml_version: Optional[str] = Field(
        default=None, title="The version of ZenML used for this build."
    )
    python_version: Optional[str] = Field(
        default=None, title="The Python version used for this build."
    )
    checksum: Optional[str] = Field(default=None, title="The build checksum.")
    stack_checksum: Optional[str] = Field(
        default=None, title="The stack checksum."
    )
    is_local: bool = Field(
        title="Whether the build images are stored in a container "
        "registry or locally.",
    )
    contains_code: bool = Field(
        title="Whether any image of the build contains user code.",
    )
    duration: Optional[int] = Field(
        title="The duration of the build in seconds.", default=None
    )

PipelineBuildResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the pipeline build entity.

Source code in src/zenml/models/v2/core/pipeline_build.py
class PipelineBuildResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the pipeline build entity."""

PipelineDeploymentBase

Bases: BaseZenModel

Base model for pipeline deployments.

Source code in src/zenml/models/v2/core/pipeline_deployment.py
class PipelineDeploymentBase(BaseZenModel):
    """Base model for pipeline deployments."""

    run_name_template: str = Field(
        title="The run name template for runs created using this deployment.",
    )
    pipeline_configuration: PipelineConfiguration = Field(
        title="The pipeline configuration for this deployment."
    )
    step_configurations: Dict[str, Step] = Field(
        default={}, title="The step configurations for this deployment."
    )
    client_environment: Dict[str, Any] = Field(
        default={}, title="The client environment for this deployment."
    )
    client_version: Optional[str] = Field(
        default=None,
        title="The version of the ZenML installation on the client side.",
    )
    server_version: Optional[str] = Field(
        default=None,
        title="The version of the ZenML installation on the server side.",
    )
    pipeline_version_hash: Optional[str] = Field(
        default=None,
        title="The pipeline version hash of the deployment.",
    )
    pipeline_spec: Optional[PipelineSpec] = Field(
        default=None,
        title="The pipeline spec of the deployment.",
    )

    @property
    def should_prevent_build_reuse(self) -> bool:
        """Whether the deployment prevents a build reuse.

        Returns:
            Whether the deployment prevents a build reuse.
        """
        return any(
            step.config.docker_settings.prevent_build_reuse
            for step in self.step_configurations.values()
        )

should_prevent_build_reuse property

Whether the deployment prevents a build reuse.

Returns:

Type Description
bool

Whether the deployment prevents a build reuse.

PipelineDeploymentFilter

Bases: ProjectScopedFilter

Model to enable advanced filtering of all pipeline deployments.

Source code in src/zenml/models/v2/core/pipeline_deployment.py
class PipelineDeploymentFilter(ProjectScopedFilter):
    """Model to enable advanced filtering of all pipeline deployments."""

    pipeline_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Pipeline associated with the deployment.",
        union_mode="left_to_right",
    )
    stack_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Stack associated with the deployment.",
        union_mode="left_to_right",
    )
    build_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Build associated with the deployment.",
        union_mode="left_to_right",
    )
    schedule_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Schedule associated with the deployment.",
        union_mode="left_to_right",
    )
    template_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Template used as base for the deployment.",
        union_mode="left_to_right",
    )

PipelineDeploymentRequest

Bases: PipelineDeploymentBase, ProjectScopedRequest

Request model for pipeline deployments.

Source code in src/zenml/models/v2/core/pipeline_deployment.py
class PipelineDeploymentRequest(PipelineDeploymentBase, ProjectScopedRequest):
    """Request model for pipeline deployments."""

    stack: UUID = Field(title="The stack associated with the deployment.")
    pipeline: Optional[UUID] = Field(
        default=None, title="The pipeline associated with the deployment."
    )
    build: Optional[UUID] = Field(
        default=None, title="The build associated with the deployment."
    )
    schedule: Optional[UUID] = Field(
        default=None, title="The schedule associated with the deployment."
    )
    code_reference: Optional["CodeReferenceRequest"] = Field(
        default=None,
        title="The code reference associated with the deployment.",
    )
    code_path: Optional[str] = Field(
        default=None,
        title="Optional path where the code is stored in the artifact store.",
    )
    template: Optional[UUID] = Field(
        default=None,
        description="Template used for the deployment.",
    )

PipelineDeploymentResponse

Bases: ProjectScopedResponse[PipelineDeploymentResponseBody, PipelineDeploymentResponseMetadata, PipelineDeploymentResponseResources]

Response model for pipeline deployments.

Source code in src/zenml/models/v2/core/pipeline_deployment.py
class PipelineDeploymentResponse(
    ProjectScopedResponse[
        PipelineDeploymentResponseBody,
        PipelineDeploymentResponseMetadata,
        PipelineDeploymentResponseResources,
    ]
):
    """Response model for pipeline deployments."""

    def get_hydrated_version(self) -> "PipelineDeploymentResponse":
        """Return the hydrated version of this pipeline deployment.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_deployment(self.id)

    # Body and metadata properties
    @property
    def run_name_template(self) -> str:
        """The `run_name_template` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().run_name_template

    @property
    def pipeline_configuration(self) -> PipelineConfiguration:
        """The `pipeline_configuration` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().pipeline_configuration

    @property
    def step_configurations(self) -> Dict[str, Step]:
        """The `step_configurations` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().step_configurations

    @property
    def client_environment(self) -> Dict[str, Any]:
        """The `client_environment` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().client_environment

    @property
    def client_version(self) -> Optional[str]:
        """The `client_version` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().client_version

    @property
    def server_version(self) -> Optional[str]:
        """The `server_version` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().server_version

    @property
    def pipeline_version_hash(self) -> Optional[str]:
        """The `pipeline_version_hash` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().pipeline_version_hash

    @property
    def pipeline_spec(self) -> Optional[PipelineSpec]:
        """The `pipeline_spec` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().pipeline_spec

    @property
    def code_path(self) -> Optional[str]:
        """The `code_path` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().code_path

    @property
    def pipeline(self) -> Optional[PipelineResponse]:
        """The `pipeline` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().pipeline

    @property
    def stack(self) -> Optional[StackResponse]:
        """The `stack` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().stack

    @property
    def build(self) -> Optional[PipelineBuildResponse]:
        """The `build` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().build

    @property
    def schedule(self) -> Optional[ScheduleResponse]:
        """The `schedule` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().schedule

    @property
    def code_reference(self) -> Optional[CodeReferenceResponse]:
        """The `code_reference` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().code_reference

    @property
    def template_id(self) -> Optional[UUID]:
        """The `template_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().template_id

build property

The build property.

Returns:

Type Description
Optional[PipelineBuildResponse]

the value of the property.

client_environment property

The client_environment property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

client_version property

The client_version property.

Returns:

Type Description
Optional[str]

the value of the property.

code_path property

The code_path property.

Returns:

Type Description
Optional[str]

the value of the property.

code_reference property

The code_reference property.

Returns:

Type Description
Optional[CodeReferenceResponse]

the value of the property.

pipeline property

The pipeline property.

Returns:

Type Description
Optional[PipelineResponse]

the value of the property.

pipeline_configuration property

The pipeline_configuration property.

Returns:

Type Description
PipelineConfiguration

the value of the property.

pipeline_spec property

The pipeline_spec property.

Returns:

Type Description
Optional[PipelineSpec]

the value of the property.

pipeline_version_hash property

The pipeline_version_hash property.

Returns:

Type Description
Optional[str]

the value of the property.

run_name_template property

The run_name_template property.

Returns:

Type Description
str

the value of the property.

schedule property

The schedule property.

Returns:

Type Description
Optional[ScheduleResponse]

the value of the property.

server_version property

The server_version property.

Returns:

Type Description
Optional[str]

the value of the property.

stack property

The stack property.

Returns:

Type Description
Optional[StackResponse]

the value of the property.

step_configurations property

The step_configurations property.

Returns:

Type Description
Dict[str, Step]

the value of the property.

template_id property

The template_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

get_hydrated_version()

Return the hydrated version of this pipeline deployment.

Returns:

Type Description
PipelineDeploymentResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/pipeline_deployment.py
def get_hydrated_version(self) -> "PipelineDeploymentResponse":
    """Return the hydrated version of this pipeline deployment.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_deployment(self.id)
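
To make the hydration mechanism concrete, here is a minimal sketch (not taken from this page; the deployment ID is a placeholder): metadata-backed properties such as run_name_template call get_metadata(), which fetches the hydrated version via get_hydrated_version() when the metadata has not been loaded yet.

from uuid import UUID

from zenml.client import Client

# Hypothetical deployment ID, used purely for illustration.
deployment_id = UUID("12345678-1234-5678-1234-567812345678")

deployment = Client().zen_store.get_deployment(deployment_id)

# Accessing a metadata-backed property triggers hydration if the metadata
# has not been fetched yet.
print(deployment.run_name_template)
print(deployment.pipeline_configuration.name)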

PipelineDeploymentResponseBody

Bases: ProjectScopedResponseBody

Response body for pipeline deployments.

Source code in src/zenml/models/v2/core/pipeline_deployment.py
class PipelineDeploymentResponseBody(ProjectScopedResponseBody):
    """Response body for pipeline deployments."""

PipelineDeploymentResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for pipeline deployments.

Source code in src/zenml/models/v2/core/pipeline_deployment.py
class PipelineDeploymentResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for pipeline deployments."""

    __zenml_skip_dehydration__: ClassVar[List[str]] = [
        "pipeline_configuration",
        "step_configurations",
        "client_environment",
        "pipeline_spec",
    ]

    run_name_template: str = Field(
        title="The run name template for runs created using this deployment.",
    )
    pipeline_configuration: PipelineConfiguration = Field(
        title="The pipeline configuration for this deployment."
    )
    step_configurations: Dict[str, Step] = Field(
        default={}, title="The step configurations for this deployment."
    )
    client_environment: Dict[str, Any] = Field(
        default={}, title="The client environment for this deployment."
    )
    client_version: Optional[str] = Field(
        title="The version of the ZenML installation on the client side."
    )
    server_version: Optional[str] = Field(
        title="The version of the ZenML installation on the server side."
    )
    pipeline_version_hash: Optional[str] = Field(
        default=None, title="The pipeline version hash of the deployment."
    )
    pipeline_spec: Optional[PipelineSpec] = Field(
        default=None, title="The pipeline spec of the deployment."
    )
    code_path: Optional[str] = Field(
        default=None,
        title="Optional path where the code is stored in the artifact store.",
    )

    pipeline: Optional[PipelineResponse] = Field(
        default=None, title="The pipeline associated with the deployment."
    )
    stack: Optional[StackResponse] = Field(
        default=None, title="The stack associated with the deployment."
    )
    build: Optional[PipelineBuildResponse] = Field(
        default=None,
        title="The pipeline build associated with the deployment.",
    )
    schedule: Optional[ScheduleResponse] = Field(
        default=None, title="The schedule associated with the deployment."
    )
    code_reference: Optional[CodeReferenceResponse] = Field(
        default=None,
        title="The code reference associated with the deployment.",
    )
    template_id: Optional[UUID] = Field(
        default=None,
        description="Template used for the pipeline run.",
    )

PipelineDeploymentResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the pipeline deployment entity.

Source code in src/zenml/models/v2/core/pipeline_deployment.py
class PipelineDeploymentResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the pipeline deployment entity."""

PipelineFilter

Bases: ProjectScopedFilter, TaggableFilter

Pipeline filter model.

Source code in src/zenml/models/v2/core/pipeline.py
class PipelineFilter(ProjectScopedFilter, TaggableFilter):
    """Pipeline filter model."""

    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CUSTOM_SORTING_OPTIONS,
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
        SORT_PIPELINES_BY_LATEST_RUN_KEY,
    ]
    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.FILTER_EXCLUDE_FIELDS,
        *TaggableFilter.FILTER_EXCLUDE_FIELDS,
        "latest_run_status",
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CLI_EXCLUDE_FIELDS,
        *TaggableFilter.CLI_EXCLUDE_FIELDS,
    ]

    name: Optional[str] = Field(
        default=None,
        description="Name of the Pipeline",
    )
    latest_run_status: Optional[str] = Field(
        default=None,
        description="Filter by the status of the latest run of a pipeline. "
        "This will always be applied as an `AND` filter for now.",
    )

    def apply_filter(
        self, query: AnyQuery, table: Type["AnySchema"]
    ) -> AnyQuery:
        """Applies the filter to a query.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        query = super().apply_filter(query, table)

        from sqlmodel import and_, col, func, select

        from zenml.zen_stores.schemas import PipelineRunSchema, PipelineSchema

        if self.latest_run_status:
            latest_pipeline_run_subquery = (
                select(
                    PipelineRunSchema.pipeline_id,
                    func.max(PipelineRunSchema.created).label("created"),
                )
                .where(col(PipelineRunSchema.pipeline_id).is_not(None))
                .group_by(col(PipelineRunSchema.pipeline_id))
                .subquery()
            )

            query = (
                query.join(
                    PipelineRunSchema,
                    PipelineSchema.id == PipelineRunSchema.pipeline_id,
                )
                .join(
                    latest_pipeline_run_subquery,
                    and_(
                        PipelineRunSchema.pipeline_id
                        == latest_pipeline_run_subquery.c.pipeline_id,
                        PipelineRunSchema.created
                        == latest_pipeline_run_subquery.c.created,
                    ),
                )
                .where(
                    self.generate_custom_query_conditions_for_column(
                        value=self.latest_run_status,
                        table=PipelineRunSchema,
                        column="status",
                    )
                )
            )

        return query

    def apply_sorting(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Apply sorting to the query.

        Args:
            query: The query to which to apply the sorting.
            table: The query table.

        Returns:
            The query with sorting applied.
        """
        from sqlmodel import asc, case, col, desc, func, select

        from zenml.enums import SorterOps
        from zenml.zen_stores.schemas import PipelineRunSchema, PipelineSchema

        sort_by, operand = self.sorting_params

        if sort_by == SORT_PIPELINES_BY_LATEST_RUN_KEY:
            # Subquery to find the latest run per pipeline
            latest_run_subquery = (
                select(
                    PipelineSchema.id,
                    case(
                        (
                            func.max(PipelineRunSchema.created).is_(None),
                            PipelineSchema.created,
                        ),
                        else_=func.max(PipelineRunSchema.created),
                    ).label("latest_run"),
                )
                .outerjoin(
                    PipelineRunSchema,
                    PipelineSchema.id == PipelineRunSchema.pipeline_id,  # type: ignore[arg-type]
                )
                .group_by(col(PipelineSchema.id))
                .subquery()
            )

            query = query.add_columns(
                latest_run_subquery.c.latest_run,
            ).where(PipelineSchema.id == latest_run_subquery.c.id)

            if operand == SorterOps.ASCENDING:
                query = query.order_by(
                    asc(latest_run_subquery.c.latest_run),
                    asc(PipelineSchema.id),
                )
            else:
                query = query.order_by(
                    desc(latest_run_subquery.c.latest_run),
                    desc(PipelineSchema.id),
                )
            return query
        else:
            return super().apply_sorting(query=query, table=table)

apply_filter(query, table)

Applies the filter to a query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the filter.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with filter applied.

Source code in src/zenml/models/v2/core/pipeline.py
def apply_filter(
    self, query: AnyQuery, table: Type["AnySchema"]
) -> AnyQuery:
    """Applies the filter to a query.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    query = super().apply_filter(query, table)

    from sqlmodel import and_, col, func, select

    from zenml.zen_stores.schemas import PipelineRunSchema, PipelineSchema

    if self.latest_run_status:
        latest_pipeline_run_subquery = (
            select(
                PipelineRunSchema.pipeline_id,
                func.max(PipelineRunSchema.created).label("created"),
            )
            .where(col(PipelineRunSchema.pipeline_id).is_not(None))
            .group_by(col(PipelineRunSchema.pipeline_id))
            .subquery()
        )

        query = (
            query.join(
                PipelineRunSchema,
                PipelineSchema.id == PipelineRunSchema.pipeline_id,
            )
            .join(
                latest_pipeline_run_subquery,
                and_(
                    PipelineRunSchema.pipeline_id
                    == latest_pipeline_run_subquery.c.pipeline_id,
                    PipelineRunSchema.created
                    == latest_pipeline_run_subquery.c.created,
                ),
            )
            .where(
                self.generate_custom_query_conditions_for_column(
                    value=self.latest_run_status,
                    table=PipelineRunSchema,
                    column="status",
                )
            )
        )

    return query

apply_sorting(query, table)

Apply sorting to the query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the sorting.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with sorting applied.

Source code in src/zenml/models/v2/core/pipeline.py
def apply_sorting(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Apply sorting to the query.

    Args:
        query: The query to which to apply the sorting.
        table: The query table.

    Returns:
        The query with sorting applied.
    """
    from sqlmodel import asc, case, col, desc, func, select

    from zenml.enums import SorterOps
    from zenml.zen_stores.schemas import PipelineRunSchema, PipelineSchema

    sort_by, operand = self.sorting_params

    if sort_by == SORT_PIPELINES_BY_LATEST_RUN_KEY:
        # Subquery to find the latest run per pipeline
        latest_run_subquery = (
            select(
                PipelineSchema.id,
                case(
                    (
                        func.max(PipelineRunSchema.created).is_(None),
                        PipelineSchema.created,
                    ),
                    else_=func.max(PipelineRunSchema.created),
                ).label("latest_run"),
            )
            .outerjoin(
                PipelineRunSchema,
                PipelineSchema.id == PipelineRunSchema.pipeline_id,  # type: ignore[arg-type]
            )
            .group_by(col(PipelineSchema.id))
            .subquery()
        )

        query = query.add_columns(
            latest_run_subquery.c.latest_run,
        ).where(PipelineSchema.id == latest_run_subquery.c.id)

        if operand == SorterOps.ASCENDING:
            query = query.order_by(
                asc(latest_run_subquery.c.latest_run),
                asc(PipelineSchema.id),
            )
        else:
            query = query.order_by(
                desc(latest_run_subquery.c.latest_run),
                desc(PipelineSchema.id),
            )
        return query
    else:
        return super().apply_sorting(query=query, table=table)
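
As a rough usage sketch (assuming the filter model is passed to the store's list_pipelines method and that SORT_PIPELINES_BY_LATEST_RUN_KEY resolves to "latest_run"; both are assumptions, not confirmed by this page), the custom filter and sorting branches above can be exercised like this:

from zenml.client import Client
from zenml.models import PipelineFilter

# `latest_run_status` is applied as an AND filter on the status of each
# pipeline's most recent run; sorting by the latest-run key uses the custom
# branch of `apply_sorting` shown above.
pipeline_filter = PipelineFilter(
    latest_run_status="failed",
    sort_by="desc:latest_run",
)

# Assumption: the store accepts this filter model; the argument name may vary.
pipelines = Client().zen_store.list_pipelines(pipeline_filter)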

PipelineRequest

Bases: ProjectScopedRequest

Request model for pipelines.

Source code in src/zenml/models/v2/core/pipeline.py
class PipelineRequest(ProjectScopedRequest):
    """Request model for pipelines."""

    name: str = Field(
        title="The name of the pipeline.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    description: Optional[str] = Field(
        default=None,
        title="The description of the pipeline.",
        max_length=TEXT_FIELD_MAX_LENGTH,
    )
    tags: Optional[List[str]] = Field(
        default=None,
        title="Tags of the pipeline.",
    )
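
A minimal construction sketch; the project field is assumed to be inherited from ProjectScopedRequest, and the UUID below is a placeholder rather than a real project ID.

from uuid import uuid4

from zenml.models import PipelineRequest

request = PipelineRequest(
    name="training_pipeline",          # hypothetical pipeline name
    description="Trains and evaluates the demo model.",
    tags=["demo", "training"],
    project=uuid4(),                   # placeholder project ID (assumed field)
)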

PipelineResponse

Bases: ProjectScopedResponse[PipelineResponseBody, PipelineResponseMetadata, PipelineResponseResources]

Response model for pipelines.

Source code in src/zenml/models/v2/core/pipeline.py
class PipelineResponse(
    ProjectScopedResponse[
        PipelineResponseBody,
        PipelineResponseMetadata,
        PipelineResponseResources,
    ]
):
    """Response model for pipelines."""

    name: str = Field(
        title="The name of the pipeline.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "PipelineResponse":
        """Get the hydrated version of this pipeline.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_pipeline(self.id)

    # Helper methods
    def get_runs(self, **kwargs: Any) -> List["PipelineRunResponse"]:
        """Get runs of this pipeline.

        Can be used to fetch runs other than `self.runs` and supports
        fine-grained filtering and pagination.

        Args:
            **kwargs: Further arguments for filtering or pagination that are
                passed to `client.list_pipeline_runs()`.

        Returns:
            List of runs of this pipeline.
        """
        from zenml.client import Client

        return Client().list_pipeline_runs(pipeline_id=self.id, **kwargs).items

    @property
    def runs(self) -> List["PipelineRunResponse"]:
        """Returns the 20 most recent runs of this pipeline in descending order.

        Returns:
            The 20 most recent runs of this pipeline in descending order.
        """
        return self.get_runs()

    @property
    def num_runs(self) -> int:
        """Returns the number of runs of this pipeline.

        Returns:
            The number of runs of this pipeline.
        """
        from zenml.client import Client

        return Client().list_pipeline_runs(pipeline_id=self.id, size=1).total

    @property
    def last_run(self) -> "PipelineRunResponse":
        """Returns the last run of this pipeline.

        Returns:
            The last run of this pipeline.

        Raises:
            RuntimeError: If no runs were found for this pipeline.
        """
        runs = self.get_runs(size=1)
        if not runs:
            raise RuntimeError(
                f"No runs found for pipeline '{self.name}' with id {self.id}."
            )
        return runs[0]

    @property
    def last_successful_run(self) -> "PipelineRunResponse":
        """Returns the last successful run of this pipeline.

        Returns:
            The last successful run of this pipeline.

        Raises:
            RuntimeError: If no successful runs were found for this pipeline.
        """
        runs = self.get_runs(status=ExecutionStatus.COMPLETED, size=1)
        if not runs:
            raise RuntimeError(
                f"No successful runs found for pipeline '{self.name}' with id "
                f"{self.id}."
            )
        return runs[0]

    @property
    def latest_run_id(self) -> Optional[UUID]:
        """The `latest_run_id` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().latest_run_id

    @property
    def latest_run_status(self) -> Optional[ExecutionStatus]:
        """The `latest_run_status` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().latest_run_status

    @property
    def tags(self) -> List[TagResponse]:
        """The `tags` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().tags

last_run property

Returns the last run of this pipeline.

Returns:

Type Description
PipelineRunResponse

The last run of this pipeline.

Raises:

Type Description
RuntimeError

If no runs were found for this pipeline.

last_successful_run property

Returns the last successful run of this pipeline.

Returns:

Type Description
PipelineRunResponse

The last successful run of this pipeline.

Raises:

Type Description
RuntimeError

If no successful runs were found for this pipeline.

latest_run_id property

The latest_run_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

latest_run_status property

The latest_run_status property.

Returns:

Type Description
Optional[ExecutionStatus]

the value of the property.

num_runs property

Returns the number of runs of this pipeline.

Returns:

Type Description
int

The number of runs of this pipeline.

runs property

Returns the 20 most recent runs of this pipeline in descending order.

Returns:

Type Description
List[PipelineRunResponse]

The 20 most recent runs of this pipeline in descending order.

tags property

The tags property.

Returns:

Type Description
List[TagResponse]

the value of the property.

get_hydrated_version()

Get the hydrated version of this pipeline.

Returns:

Type Description
PipelineResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/pipeline.py
def get_hydrated_version(self) -> "PipelineResponse":
    """Get the hydrated version of this pipeline.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_pipeline(self.id)

get_runs(**kwargs)

Get runs of this pipeline.

Can be used to fetch runs other than self.runs and supports fine-grained filtering and pagination.

Parameters:

Name Type Description Default
**kwargs Any

Further arguments for filtering or pagination that are passed to client.list_pipeline_runs().

{}

Returns:

Type Description
List[PipelineRunResponse]

List of runs of this pipeline.

Source code in src/zenml/models/v2/core/pipeline.py
def get_runs(self, **kwargs: Any) -> List["PipelineRunResponse"]:
    """Get runs of this pipeline.

    Can be used to fetch runs other than `self.runs` and supports
    fine-grained filtering and pagination.

    Args:
        **kwargs: Further arguments for filtering or pagination that are
            passed to `client.list_pipeline_runs()`.

    Returns:
        List of runs of this pipeline.
    """
    from zenml.client import Client

    return Client().list_pipeline_runs(pipeline_id=self.id, **kwargs).items
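
The helper methods and properties above can be combined as in the following sketch (assuming a pipeline named "training_pipeline" exists in the active project and that Client().get_pipeline accepts a pipeline name):

from zenml.client import Client

pipeline = Client().get_pipeline("training_pipeline")

print(pipeline.num_runs)            # total number of runs
recent = pipeline.get_runs(size=5)  # kwargs are forwarded to
                                    # Client().list_pipeline_runs()
last = pipeline.last_run            # raises RuntimeError if there are no runs
print(last.name, last.status)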

PipelineResponseBody

Bases: ProjectScopedResponseBody

Response body for pipelines.

Source code in src/zenml/models/v2/core/pipeline.py
class PipelineResponseBody(ProjectScopedResponseBody):
    """Response body for pipelines."""

PipelineResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for pipelines.

Source code in src/zenml/models/v2/core/pipeline.py
class PipelineResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for pipelines."""

    description: Optional[str] = Field(
        default=None,
        title="The description of the pipeline.",
    )

PipelineResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the pipeline entity.

Source code in src/zenml/models/v2/core/pipeline.py
class PipelineResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the pipeline entity."""

    latest_run_user: Optional["UserResponse"] = Field(
        default=None,
        title="The user that created the latest run of this pipeline.",
    )
    latest_run_id: Optional[UUID] = Field(
        default=None,
        title="The ID of the latest run of the pipeline.",
    )
    latest_run_status: Optional[ExecutionStatus] = Field(
        default=None,
        title="The status of the latest run of the pipeline.",
    )
    tags: List[TagResponse] = Field(
        title="Tags associated with the pipeline.",
    )

PipelineRunDAG

Bases: BaseModel

Pipeline run DAG.

Source code in src/zenml/models/v2/misc/pipeline_run_dag.py
class PipelineRunDAG(BaseModel):
    """Pipeline run DAG."""

    id: UUID
    status: ExecutionStatus
    nodes: List["Node"]
    edges: List["Edge"]

    class Node(BaseModel):
        """Node in the pipeline run DAG."""

        node_id: str
        type: str
        id: Optional[UUID] = None
        name: str
        metadata: Dict[str, Any] = {}

    class Edge(BaseModel):
        """Edge in the pipeline run DAG."""

        source: str
        target: str
        metadata: Dict[str, Any] = {}

Edge

Bases: BaseModel

Edge in the pipeline run DAG.

Source code in src/zenml/models/v2/misc/pipeline_run_dag.py
class Edge(BaseModel):
    """Edge in the pipeline run DAG."""

    source: str
    target: str
    metadata: Dict[str, Any] = {}

Node

Bases: BaseModel

Node in the pipeline run DAG.

Source code in src/zenml/models/v2/misc/pipeline_run_dag.py
class Node(BaseModel):
    """Node in the pipeline run DAG."""

    node_id: str
    type: str
    id: Optional[UUID] = None
    name: str
    metadata: Dict[str, Any] = {}
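
Since the DAG model is a plain Pydantic structure, a tiny hand-built instance (illustrative values only) looks like this:

from uuid import uuid4

from zenml.enums import ExecutionStatus
from zenml.models.v2.misc.pipeline_run_dag import PipelineRunDAG

# Two step nodes connected by a single edge.
dag = PipelineRunDAG(
    id=uuid4(),
    status=ExecutionStatus.COMPLETED,
    nodes=[
        PipelineRunDAG.Node(node_id="step_1", type="step", name="loader"),
        PipelineRunDAG.Node(node_id="step_2", type="step", name="trainer"),
    ],
    edges=[PipelineRunDAG.Edge(source="step_1", target="step_2")],
)
print(len(dag.nodes), len(dag.edges))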

PipelineRunFilter

Bases: ProjectScopedFilter, TaggableFilter, RunMetadataFilterMixin

Model to enable advanced filtering of all pipeline runs.

Source code in src/zenml/models/v2/core/pipeline_run.py
class PipelineRunFilter(
    ProjectScopedFilter, TaggableFilter, RunMetadataFilterMixin
):
    """Model to enable advanced filtering of all pipeline runs."""

    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CUSTOM_SORTING_OPTIONS,
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
        *RunMetadataFilterMixin.CUSTOM_SORTING_OPTIONS,
        "tag",
        "stack",
        "pipeline",
        "model",
        "model_version",
    ]
    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.FILTER_EXCLUDE_FIELDS,
        *TaggableFilter.FILTER_EXCLUDE_FIELDS,
        *RunMetadataFilterMixin.FILTER_EXCLUDE_FIELDS,
        "unlisted",
        "code_repository_id",
        "build_id",
        "schedule_id",
        "stack_id",
        "template_id",
        "pipeline",
        "stack",
        "code_repository",
        "model",
        "stack_component",
        "pipeline_name",
        "templatable",
    ]
    CLI_EXCLUDE_FIELDS = [
        *ProjectScopedFilter.CLI_EXCLUDE_FIELDS,
        *TaggableFilter.CLI_EXCLUDE_FIELDS,
        *RunMetadataFilterMixin.CLI_EXCLUDE_FIELDS,
    ]
    API_MULTI_INPUT_PARAMS: ClassVar[List[str]] = [
        *ProjectScopedFilter.API_MULTI_INPUT_PARAMS,
        *TaggableFilter.API_MULTI_INPUT_PARAMS,
        *RunMetadataFilterMixin.API_MULTI_INPUT_PARAMS,
    ]

    name: Optional[str] = Field(
        default=None,
        description="Name of the Pipeline Run",
    )
    orchestrator_run_id: Optional[str] = Field(
        default=None,
        description="Name of the Pipeline Run within the orchestrator",
    )
    pipeline_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Pipeline associated with the Pipeline Run",
        union_mode="left_to_right",
    )
    stack_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Stack used for the Pipeline Run",
        union_mode="left_to_right",
    )
    schedule_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Schedule that triggered the Pipeline Run",
        union_mode="left_to_right",
    )
    build_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Build used for the Pipeline Run",
        union_mode="left_to_right",
    )
    deployment_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Deployment used for the Pipeline Run",
        union_mode="left_to_right",
    )
    code_repository_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Code repository used for the Pipeline Run",
        union_mode="left_to_right",
    )
    template_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Template used for the pipeline run.",
        union_mode="left_to_right",
    )
    model_version_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Model version associated with the pipeline run.",
        union_mode="left_to_right",
    )
    status: Optional[str] = Field(
        default=None,
        description="Name of the Pipeline Run",
    )
    start_time: Optional[Union[datetime, str]] = Field(
        default=None,
        description="Start time for this run",
        union_mode="left_to_right",
    )
    end_time: Optional[Union[datetime, str]] = Field(
        default=None,
        description="End time for this run",
        union_mode="left_to_right",
    )
    unlisted: Optional[bool] = None
    # TODO: Remove once frontend is ready for it. This is replaced by the more
    #   generic `pipeline` filter below.
    pipeline_name: Optional[str] = Field(
        default=None,
        description="Name of the pipeline associated with the run",
    )
    pipeline: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the pipeline associated with the run.",
    )
    stack: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the stack associated with the run.",
    )
    code_repository: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the code repository associated with the run.",
    )
    model: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the model associated with the run.",
    )
    stack_component: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the stack component associated with the run.",
    )
    templatable: Optional[bool] = Field(
        default=None, description="Whether the run is templatable."
    )
    model_config = ConfigDict(protected_namespaces=())

    def get_custom_filters(
        self,
        table: Type["AnySchema"],
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_, col, or_

        from zenml.zen_stores.schemas import (
            CodeReferenceSchema,
            CodeRepositorySchema,
            ModelSchema,
            ModelVersionSchema,
            PipelineBuildSchema,
            PipelineDeploymentSchema,
            PipelineRunSchema,
            PipelineSchema,
            ScheduleSchema,
            StackComponentSchema,
            StackCompositionSchema,
            StackSchema,
        )

        if self.unlisted is not None:
            if self.unlisted is True:
                unlisted_filter = PipelineRunSchema.pipeline_id.is_(None)  # type: ignore[union-attr]
            else:
                unlisted_filter = PipelineRunSchema.pipeline_id.is_not(None)  # type: ignore[union-attr]
            custom_filters.append(unlisted_filter)

        if self.code_repository_id:
            code_repo_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.code_reference_id
                == CodeReferenceSchema.id,
                CodeReferenceSchema.code_repository_id
                == self.code_repository_id,
            )
            custom_filters.append(code_repo_filter)

        if self.stack_id:
            stack_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.stack_id == StackSchema.id,
                StackSchema.id == self.stack_id,
            )
            custom_filters.append(stack_filter)

        if self.schedule_id:
            schedule_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.schedule_id == ScheduleSchema.id,
                ScheduleSchema.id == self.schedule_id,
            )
            custom_filters.append(schedule_filter)

        if self.build_id:
            pipeline_build_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.build_id == PipelineBuildSchema.id,
                PipelineBuildSchema.id == self.build_id,
            )
            custom_filters.append(pipeline_build_filter)

        if self.template_id:
            run_template_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.template_id == self.template_id,
            )
            custom_filters.append(run_template_filter)

        if self.pipeline:
            pipeline_filter = and_(
                PipelineRunSchema.pipeline_id == PipelineSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.pipeline, table=PipelineSchema
                ),
            )
            custom_filters.append(pipeline_filter)

        if self.stack:
            stack_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.stack_id == StackSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.stack,
                    table=StackSchema,
                ),
            )
            custom_filters.append(stack_filter)

        if self.code_repository:
            code_repo_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.code_reference_id
                == CodeReferenceSchema.id,
                CodeReferenceSchema.code_repository_id
                == CodeRepositorySchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.code_repository,
                    table=CodeRepositorySchema,
                ),
            )
            custom_filters.append(code_repo_filter)

        if self.model:
            model_filter = and_(
                PipelineRunSchema.model_version_id == ModelVersionSchema.id,
                ModelVersionSchema.model_id == ModelSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.model, table=ModelSchema
                ),
            )
            custom_filters.append(model_filter)

        if self.stack_component:
            component_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.stack_id == StackSchema.id,
                StackSchema.id == StackCompositionSchema.stack_id,
                StackCompositionSchema.component_id == StackComponentSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.stack_component,
                    table=StackComponentSchema,
                ),
            )
            custom_filters.append(component_filter)

        if self.pipeline_name:
            pipeline_name_filter = and_(
                PipelineRunSchema.pipeline_id == PipelineSchema.id,
                self.generate_custom_query_conditions_for_column(
                    value=self.pipeline_name,
                    table=PipelineSchema,
                    column="name",
                ),
            )
            custom_filters.append(pipeline_name_filter)

        if self.templatable is not None:
            if self.templatable is True:
                templatable_filter = and_(
                    # The following condition is not perfect as it does not
                    # consider stacks with custom flavor components or local
                    # components, but the best we can do currently with our
                    # table columns.
                    PipelineRunSchema.deployment_id
                    == PipelineDeploymentSchema.id,
                    PipelineDeploymentSchema.build_id
                    == PipelineBuildSchema.id,
                    col(PipelineBuildSchema.is_local).is_(False),
                    col(PipelineBuildSchema.stack_id).is_not(None),
                )
            else:
                templatable_filter = or_(
                    col(PipelineRunSchema.deployment_id).is_(None),
                    and_(
                        PipelineRunSchema.deployment_id
                        == PipelineDeploymentSchema.id,
                        col(PipelineDeploymentSchema.build_id).is_(None),
                    ),
                    and_(
                        PipelineRunSchema.deployment_id
                        == PipelineDeploymentSchema.id,
                        PipelineDeploymentSchema.build_id
                        == PipelineBuildSchema.id,
                        or_(
                            col(PipelineBuildSchema.is_local).is_(True),
                            col(PipelineBuildSchema.stack_id).is_(None),
                        ),
                    ),
                )

            custom_filters.append(templatable_filter)

        return custom_filters

    def apply_sorting(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Apply sorting to the query.

        Args:
            query: The query to which to apply the sorting.
            table: The query table.

        Returns:
            The query with sorting applied.
        """
        from sqlmodel import asc, desc

        from zenml.enums import SorterOps
        from zenml.zen_stores.schemas import (
            ModelSchema,
            ModelVersionSchema,
            PipelineDeploymentSchema,
            PipelineRunSchema,
            PipelineSchema,
            StackSchema,
        )

        sort_by, operand = self.sorting_params

        if sort_by == "pipeline":
            query = query.outerjoin(
                PipelineSchema,
                PipelineRunSchema.pipeline_id == PipelineSchema.id,
            )
            column = PipelineSchema.name
        elif sort_by == "stack":
            query = query.outerjoin(
                PipelineDeploymentSchema,
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            ).outerjoin(
                StackSchema,
                PipelineDeploymentSchema.stack_id == StackSchema.id,
            )
            column = StackSchema.name
        elif sort_by == "model":
            query = query.outerjoin(
                ModelVersionSchema,
                PipelineRunSchema.model_version_id == ModelVersionSchema.id,
            ).outerjoin(
                ModelSchema,
                ModelVersionSchema.model_id == ModelSchema.id,
            )
            column = ModelSchema.name
        elif sort_by == "model_version":
            query = query.outerjoin(
                ModelVersionSchema,
                PipelineRunSchema.model_version_id == ModelVersionSchema.id,
            )
            column = ModelVersionSchema.name
        else:
            return super().apply_sorting(query=query, table=table)

        query = query.add_columns(column)

        if operand == SorterOps.ASCENDING:
            query = query.order_by(asc(column))
        else:
            query = query.order_by(desc(column))

        return query

apply_sorting(query, table)

Apply sorting to the query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the sorting.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with sorting applied.

Source code in src/zenml/models/v2/core/pipeline_run.py
def apply_sorting(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Apply sorting to the query.

    Args:
        query: The query to which to apply the sorting.
        table: The query table.

    Returns:
        The query with sorting applied.
    """
    from sqlmodel import asc, desc

    from zenml.enums import SorterOps
    from zenml.zen_stores.schemas import (
        ModelSchema,
        ModelVersionSchema,
        PipelineDeploymentSchema,
        PipelineRunSchema,
        PipelineSchema,
        StackSchema,
    )

    sort_by, operand = self.sorting_params

    if sort_by == "pipeline":
        query = query.outerjoin(
            PipelineSchema,
            PipelineRunSchema.pipeline_id == PipelineSchema.id,
        )
        column = PipelineSchema.name
    elif sort_by == "stack":
        query = query.outerjoin(
            PipelineDeploymentSchema,
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
        ).outerjoin(
            StackSchema,
            PipelineDeploymentSchema.stack_id == StackSchema.id,
        )
        column = StackSchema.name
    elif sort_by == "model":
        query = query.outerjoin(
            ModelVersionSchema,
            PipelineRunSchema.model_version_id == ModelVersionSchema.id,
        ).outerjoin(
            ModelSchema,
            ModelVersionSchema.model_id == ModelSchema.id,
        )
        column = ModelSchema.name
    elif sort_by == "model_version":
        query = query.outerjoin(
            ModelVersionSchema,
            PipelineRunSchema.model_version_id == ModelVersionSchema.id,
        )
        column = ModelVersionSchema.name
    else:
        return super().apply_sorting(query=query, table=table)

    query = query.add_columns(column)

    if operand == SorterOps.ASCENDING:
        query = query.order_by(asc(column))
    else:
        query = query.order_by(desc(column))

    return query

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/core/pipeline_run.py
def get_custom_filters(
    self,
    table: Type["AnySchema"],
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_, col, or_

    from zenml.zen_stores.schemas import (
        CodeReferenceSchema,
        CodeRepositorySchema,
        ModelSchema,
        ModelVersionSchema,
        PipelineBuildSchema,
        PipelineDeploymentSchema,
        PipelineRunSchema,
        PipelineSchema,
        ScheduleSchema,
        StackComponentSchema,
        StackCompositionSchema,
        StackSchema,
    )

    if self.unlisted is not None:
        if self.unlisted is True:
            unlisted_filter = PipelineRunSchema.pipeline_id.is_(None)  # type: ignore[union-attr]
        else:
            unlisted_filter = PipelineRunSchema.pipeline_id.is_not(None)  # type: ignore[union-attr]
        custom_filters.append(unlisted_filter)

    if self.code_repository_id:
        code_repo_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.code_reference_id
            == CodeReferenceSchema.id,
            CodeReferenceSchema.code_repository_id
            == self.code_repository_id,
        )
        custom_filters.append(code_repo_filter)

    if self.stack_id:
        stack_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.stack_id == StackSchema.id,
            StackSchema.id == self.stack_id,
        )
        custom_filters.append(stack_filter)

    if self.schedule_id:
        schedule_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.schedule_id == ScheduleSchema.id,
            ScheduleSchema.id == self.schedule_id,
        )
        custom_filters.append(schedule_filter)

    if self.build_id:
        pipeline_build_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.build_id == PipelineBuildSchema.id,
            PipelineBuildSchema.id == self.build_id,
        )
        custom_filters.append(pipeline_build_filter)

    if self.template_id:
        run_template_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.template_id == self.template_id,
        )
        custom_filters.append(run_template_filter)

    if self.pipeline:
        pipeline_filter = and_(
            PipelineRunSchema.pipeline_id == PipelineSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.pipeline, table=PipelineSchema
            ),
        )
        custom_filters.append(pipeline_filter)

    if self.stack:
        stack_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.stack_id == StackSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.stack,
                table=StackSchema,
            ),
        )
        custom_filters.append(stack_filter)

    if self.code_repository:
        code_repo_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.code_reference_id
            == CodeReferenceSchema.id,
            CodeReferenceSchema.code_repository_id
            == CodeRepositorySchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.code_repository,
                table=CodeRepositorySchema,
            ),
        )
        custom_filters.append(code_repo_filter)

    if self.model:
        model_filter = and_(
            PipelineRunSchema.model_version_id == ModelVersionSchema.id,
            ModelVersionSchema.model_id == ModelSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.model, table=ModelSchema
            ),
        )
        custom_filters.append(model_filter)

    if self.stack_component:
        component_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.stack_id == StackSchema.id,
            StackSchema.id == StackCompositionSchema.stack_id,
            StackCompositionSchema.component_id == StackComponentSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.stack_component,
                table=StackComponentSchema,
            ),
        )
        custom_filters.append(component_filter)

    if self.pipeline_name:
        pipeline_name_filter = and_(
            PipelineRunSchema.pipeline_id == PipelineSchema.id,
            self.generate_custom_query_conditions_for_column(
                value=self.pipeline_name,
                table=PipelineSchema,
                column="name",
            ),
        )
        custom_filters.append(pipeline_name_filter)

    if self.templatable is not None:
        if self.templatable is True:
            templatable_filter = and_(
                # The following condition is not perfect as it does not
                # consider stacks with custom flavor components or local
                # components, but the best we can do currently with our
                # table columns.
                PipelineRunSchema.deployment_id
                == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.build_id
                == PipelineBuildSchema.id,
                col(PipelineBuildSchema.is_local).is_(False),
                col(PipelineBuildSchema.stack_id).is_not(None),
            )
        else:
            templatable_filter = or_(
                col(PipelineRunSchema.deployment_id).is_(None),
                and_(
                    PipelineRunSchema.deployment_id
                    == PipelineDeploymentSchema.id,
                    col(PipelineDeploymentSchema.build_id).is_(None),
                ),
                and_(
                    PipelineRunSchema.deployment_id
                    == PipelineDeploymentSchema.id,
                    PipelineDeploymentSchema.build_id
                    == PipelineBuildSchema.id,
                    or_(
                        col(PipelineBuildSchema.is_local).is_(True),
                        col(PipelineBuildSchema.stack_id).is_(None),
                    ),
                ),
            )

        custom_filters.append(templatable_filter)

    return custom_filters
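
As a usage sketch (assuming the filter is handed to the store's run-listing method, whose exact name and signature are not shown on this page), several of the fields above can be combined; the joins they require are assembled in get_custom_filters:

from zenml.client import Client
from zenml.models import PipelineRunFilter

run_filter = PipelineRunFilter(
    pipeline="training_pipeline",  # hypothetical pipeline name or ID
    templatable=True,              # only runs with remote, stack-backed builds
    sort_by="desc:pipeline",       # handled by the custom `apply_sorting` branch
)

# Assumption: the store accepts this filter model; the argument name may vary.
runs = Client().zen_store.list_runs(run_filter)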

PipelineRunRequest

Bases: ProjectScopedRequest

Request model for pipeline runs.

Source code in src/zenml/models/v2/core/pipeline_run.py
class PipelineRunRequest(ProjectScopedRequest):
    """Request model for pipeline runs."""

    name: str = Field(
        title="The name of the pipeline run.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    deployment: UUID = Field(
        title="The deployment associated with the pipeline run."
    )
    pipeline: Optional[UUID] = Field(
        title="The pipeline associated with the pipeline run.",
        default=None,
    )
    orchestrator_run_id: Optional[str] = Field(
        title="The orchestrator run ID.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    start_time: Optional[datetime] = Field(
        title="The start time of the pipeline run.",
        default=None,
    )
    end_time: Optional[datetime] = Field(
        title="The end time of the pipeline run.",
        default=None,
    )
    status: ExecutionStatus = Field(
        title="The status of the pipeline run.",
    )
    orchestrator_environment: Dict[str, Any] = Field(
        default={},
        title=(
            "Environment of the orchestrator that executed this pipeline run "
            "(OS, Python version, etc.)."
        ),
    )
    trigger_execution_id: Optional[UUID] = Field(
        default=None,
        title="ID of the trigger execution that triggered this run.",
    )
    tags: Optional[List[Union[str, Tag]]] = Field(
        default=None,
        title="Tags of the pipeline run.",
    )
    logs: Optional[LogsRequest] = Field(
        default=None,
        title="Logs of the pipeline run.",
    )

    model_config = ConfigDict(protected_namespaces=())
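
Pipeline run requests are normally created internally by ZenML when a run is registered, but the model can also be constructed directly. A minimal sketch with placeholder UUIDs; depending on the ZenML version, additional scoping fields inherited from the parent request models (e.g. the user) may also be required:

from uuid import uuid4

from zenml.enums import ExecutionStatus
from zenml.models import PipelineRunRequest

# Placeholder IDs: in practice these come from an existing project and
# pipeline deployment.
run_request = PipelineRunRequest(
    name="training_pipeline-2024_01_01-12_00_00",
    project=uuid4(),
    deployment=uuid4(),
    status=ExecutionStatus.INITIALIZING,
)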

PipelineRunResponse

Bases: ProjectScopedResponse[PipelineRunResponseBody, PipelineRunResponseMetadata, PipelineRunResponseResources]

Response model for pipeline runs.

Source code in src/zenml/models/v2/core/pipeline_run.py
class PipelineRunResponse(
    ProjectScopedResponse[
        PipelineRunResponseBody,
        PipelineRunResponseMetadata,
        PipelineRunResponseResources,
    ]
):
    """Response model for pipeline runs."""

    name: str = Field(
        title="The name of the pipeline run.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "PipelineRunResponse":
        """Get the hydrated version of this pipeline run.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_run(self.id)

    # Helper methods
    @property
    def artifact_versions(self) -> List["ArtifactVersionResponse"]:
        """Get all artifact versions that are outputs of steps of this run.

        Returns:
            All output artifact versions of this run (including cached ones).
        """
        from zenml.artifacts.utils import (
            get_artifacts_versions_of_pipeline_run,
        )

        return get_artifacts_versions_of_pipeline_run(self)

    @property
    def produced_artifact_versions(self) -> List["ArtifactVersionResponse"]:
        """Get all artifact versions produced during this pipeline run.

        Returns:
            A list of all artifact versions produced during this pipeline run.
        """
        from zenml.artifacts.utils import (
            get_artifacts_versions_of_pipeline_run,
        )

        return get_artifacts_versions_of_pipeline_run(self, only_produced=True)

    def refresh_run_status(self) -> "PipelineRunResponse":
        """Method to refresh the status of a run if it is initializing/running.

        Returns:
            The updated pipeline run.

        Raises:
            ValueError: If the stack of the run response is None.
        """
        if self.status in [
            ExecutionStatus.INITIALIZING,
            ExecutionStatus.RUNNING,
        ]:
            # Check if the stack is still accessible
            if self.stack is None:
                raise ValueError(
                    "The stack that this pipeline run response was executed on "
                    "has been deleted."
                )

            # Create the orchestrator instance
            from zenml.enums import StackComponentType
            from zenml.orchestrators.base_orchestrator import BaseOrchestrator
            from zenml.stack.stack_component import StackComponent

            # Check if the orchestrator is still accessible
            orchestrator_list = self.stack.components.get(
                StackComponentType.ORCHESTRATOR, []
            )
            if len(orchestrator_list) == 0:
                raise ValueError(
                    "The orchestrator that this pipeline run response was "
                    "executed with has been deleted."
                )

            orchestrator = cast(
                BaseOrchestrator,
                StackComponent.from_model(
                    component_model=orchestrator_list[0]
                ),
            )

            # Fetch the status
            status = orchestrator.fetch_status(run=self)

            # If it is different from the current status, update it
            if status != self.status:
                from zenml.client import Client
                from zenml.models import PipelineRunUpdate

                client = Client()
                return client.zen_store.update_run(
                    run_id=self.id,
                    run_update=PipelineRunUpdate(status=status),
                )

        return self

    # Body and metadata properties
    @property
    def status(self) -> ExecutionStatus:
        """The `status` property.

        Returns:
            the value of the property.
        """
        return self.get_body().status

    @property
    def stack(self) -> Optional["StackResponse"]:
        """The `stack` property.

        Returns:
            the value of the property.
        """
        return self.get_body().stack

    @property
    def pipeline(self) -> Optional["PipelineResponse"]:
        """The `pipeline` property.

        Returns:
            the value of the property.
        """
        return self.get_body().pipeline

    @property
    def build(self) -> Optional["PipelineBuildResponse"]:
        """The `build` property.

        Returns:
            the value of the property.
        """
        return self.get_body().build

    @property
    def schedule(self) -> Optional["ScheduleResponse"]:
        """The `schedule` property.

        Returns:
            the value of the property.
        """
        return self.get_body().schedule

    @property
    def trigger_execution(self) -> Optional["TriggerExecutionResponse"]:
        """The `trigger_execution` property.

        Returns:
            the value of the property.
        """
        return self.get_body().trigger_execution

    @property
    def code_reference(self) -> Optional["CodeReferenceResponse"]:
        """The `schedule` property.

        Returns:
            the value of the property.
        """
        return self.get_body().code_reference

    @property
    def deployment_id(self) -> Optional["UUID"]:
        """The `deployment_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().deployment_id

    @property
    def model_version_id(self) -> Optional[UUID]:
        """The `model_version_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().model_version_id

    @property
    def run_metadata(self) -> Dict[str, MetadataType]:
        """The `run_metadata` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().run_metadata

    @property
    def steps(self) -> Dict[str, "StepRunResponse"]:
        """The `steps` property.

        Returns:
            the value of the property.
        """
        from zenml.client import Client

        return {
            step.name: step
            for step in pagination_utils.depaginate(
                Client().list_run_steps,
                pipeline_run_id=self.id,
            )
        }

    @property
    def config(self) -> PipelineConfiguration:
        """The `config` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().config

    @property
    def start_time(self) -> Optional[datetime]:
        """The `start_time` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().start_time

    @property
    def end_time(self) -> Optional[datetime]:
        """The `end_time` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().end_time

    @property
    def client_environment(self) -> Dict[str, Any]:
        """The `client_environment` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().client_environment

    @property
    def orchestrator_environment(self) -> Dict[str, Any]:
        """The `orchestrator_environment` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().orchestrator_environment

    @property
    def orchestrator_run_id(self) -> Optional[str]:
        """The `orchestrator_run_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().orchestrator_run_id

    @property
    def code_path(self) -> Optional[str]:
        """The `code_path` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().code_path

    @property
    def template_id(self) -> Optional[UUID]:
        """The `template_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().template_id

    @property
    def is_templatable(self) -> bool:
        """The `is_templatable` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().is_templatable

    @property
    def model_version(self) -> Optional[ModelVersionResponse]:
        """The `model_version` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().model_version

    @property
    def tags(self) -> List[TagResponse]:
        """The `tags` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().tags

    @property
    def logs(self) -> Optional["LogsResponse"]:
        """The `logs` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().logs

artifact_versions property

Get all artifact versions that are outputs of steps of this run.

Returns:

Type Description
List[ArtifactVersionResponse]

All output artifact versions of this run (including cached ones).

build property

The build property.

Returns:

Type Description
Optional[PipelineBuildResponse]

the value of the property.

client_environment property

The client_environment property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

code_path property

The code_path property.

Returns:

Type Description
Optional[str]

the value of the property.

code_reference property

The code_reference property.

Returns:

Type Description
Optional[CodeReferenceResponse]

the value of the property.

config property

The config property.

Returns:

Type Description
PipelineConfiguration

the value of the property.

deployment_id property

The deployment_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

end_time property

The end_time property.

Returns:

Type Description
Optional[datetime]

the value of the property.

is_templatable property

The is_templatable property.

Returns:

Type Description
bool

the value of the property.

logs property

The logs property.

Returns:

Type Description
Optional[LogsResponse]

the value of the property.

model_version property

The model_version property.

Returns:

Type Description
Optional[ModelVersionResponse]

the value of the property.

model_version_id property

The model_version_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

orchestrator_environment property

The orchestrator_environment property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

orchestrator_run_id property

The orchestrator_run_id property.

Returns:

Type Description
Optional[str]

the value of the property.

pipeline property

The pipeline property.

Returns:

Type Description
Optional[PipelineResponse]

the value of the property.

produced_artifact_versions property

Get all artifact versions produced during this pipeline run.

Returns:

Type Description
List[ArtifactVersionResponse]

A list of all artifact versions produced during this pipeline run.

run_metadata property

The run_metadata property.

Returns:

Type Description
Dict[str, MetadataType]

the value of the property.

schedule property

The schedule property.

Returns:

Type Description
Optional[ScheduleResponse]

the value of the property.

stack property

The stack property.

Returns:

Type Description
Optional[StackResponse]

the value of the property.

start_time property

The start_time property.

Returns:

Type Description
Optional[datetime]

the value of the property.

status property

The status property.

Returns:

Type Description
ExecutionStatus

the value of the property.

steps property

The steps property.

Returns:

Type Description
Dict[str, StepRunResponse]

the value of the property.

tags property

The tags property.

Returns:

Type Description
List[TagResponse]

the value of the property.

template_id property

The template_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

trigger_execution property

The trigger_execution property.

Returns:

Type Description
Optional[TriggerExecutionResponse]

the value of the property.

get_hydrated_version()

Get the hydrated version of this pipeline run.

Returns:

Type Description
PipelineRunResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/pipeline_run.py
def get_hydrated_version(self) -> "PipelineRunResponse":
    """Get the hydrated version of this pipeline run.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_run(self.id)

refresh_run_status()

Method to refresh the status of a run if it is initializing/running.

Returns:

Type Description
PipelineRunResponse

The updated pipeline run.

Raises:

Type Description
ValueError

If the stack of the run response is None.

Source code in src/zenml/models/v2/core/pipeline_run.py
def refresh_run_status(self) -> "PipelineRunResponse":
    """Method to refresh the status of a run if it is initializing/running.

    Returns:
        The updated pipeline run.

    Raises:
        ValueError: If the stack of the run response is None.
    """
    if self.status in [
        ExecutionStatus.INITIALIZING,
        ExecutionStatus.RUNNING,
    ]:
        # Check if the stack is still accessible
        if self.stack is None:
            raise ValueError(
                "The stack that this pipeline run response was executed on "
                "has been deleted."
            )

        # Create the orchestrator instance
        from zenml.enums import StackComponentType
        from zenml.orchestrators.base_orchestrator import BaseOrchestrator
        from zenml.stack.stack_component import StackComponent

        # Check if the orchestrator is still accessible
        orchestrator_list = self.stack.components.get(
            StackComponentType.ORCHESTRATOR, []
        )
        if len(orchestrator_list) == 0:
            raise ValueError(
                "The orchestrator that this pipeline run response was "
                "executed with has been deleted."
            )

        orchestrator = cast(
            BaseOrchestrator,
            StackComponent.from_model(
                component_model=orchestrator_list[0]
            ),
        )

        # Fetch the status
        status = orchestrator.fetch_status(run=self)

        # If it is different from the current status, update it
        if status != self.status:
            from zenml.client import Client
            from zenml.models import PipelineRunUpdate

            client = Client()
            return client.zen_store.update_run(
                run_id=self.id,
                run_update=PipelineRunUpdate(status=status),
            )

    return self
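
A PipelineRunResponse is usually obtained through the client, after which the helper properties above can be used to inspect the run. A minimal sketch, assuming a run named "my_run" exists in the active project:

from zenml.client import Client

client = Client()
run = client.get_pipeline_run("my_run")  # hypothetical run name

print(run.status)                  # ExecutionStatus of the run
print(list(run.steps))             # step names, fetched lazily
print(len(run.artifact_versions))  # output artifact versions (incl. cached)

# Ask the orchestrator for a fresh status if the run is still
# initializing or running.
run = run.refresh_run_status()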

PipelineRunResponseBody

Bases: ProjectScopedResponseBody

Response body for pipeline runs.

Source code in src/zenml/models/v2/core/pipeline_run.py
class PipelineRunResponseBody(ProjectScopedResponseBody):
    """Response body for pipeline runs."""

    status: ExecutionStatus = Field(
        title="The status of the pipeline run.",
    )
    stack: Optional["StackResponse"] = Field(
        default=None, title="The stack that was used for this run."
    )
    pipeline: Optional["PipelineResponse"] = Field(
        default=None, title="The pipeline this run belongs to."
    )
    build: Optional["PipelineBuildResponse"] = Field(
        default=None, title="The pipeline build that was used for this run."
    )
    schedule: Optional["ScheduleResponse"] = Field(
        default=None, title="The schedule that was used for this run."
    )
    code_reference: Optional["CodeReferenceResponse"] = Field(
        default=None, title="The code reference that was used for this run."
    )
    deployment_id: Optional[UUID] = Field(
        default=None, title="The deployment that was used for this run."
    )
    trigger_execution: Optional["TriggerExecutionResponse"] = Field(
        default=None, title="The trigger execution that triggered this run."
    )
    model_version_id: Optional[UUID] = Field(
        title="The ID of the model version that was "
        "configured by this pipeline run explicitly.",
        default=None,
    )

    model_config = ConfigDict(protected_namespaces=())

PipelineRunResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for pipeline runs.

Source code in src/zenml/models/v2/core/pipeline_run.py
class PipelineRunResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for pipeline runs."""

    __zenml_skip_dehydration__: ClassVar[List[str]] = [
        "run_metadata",
        "config",
        "client_environment",
        "orchestrator_environment",
    ]

    run_metadata: Dict[str, MetadataType] = Field(
        default={},
        title="Metadata associated with this pipeline run.",
    )
    config: PipelineConfiguration = Field(
        title="The pipeline configuration used for this pipeline run.",
    )
    start_time: Optional[datetime] = Field(
        title="The start time of the pipeline run.",
        default=None,
    )
    end_time: Optional[datetime] = Field(
        title="The end time of the pipeline run.",
        default=None,
    )
    client_environment: Dict[str, Any] = Field(
        default={},
        title=(
            "Environment of the client that initiated this pipeline run "
            "(OS, Python version, etc.)."
        ),
    )
    orchestrator_environment: Dict[str, Any] = Field(
        default={},
        title=(
            "Environment of the orchestrator that executed this pipeline run "
            "(OS, Python version, etc.)."
        ),
    )
    orchestrator_run_id: Optional[str] = Field(
        title="The orchestrator run ID.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    code_path: Optional[str] = Field(
        default=None,
        title="Optional path where the code is stored in the artifact store.",
    )
    template_id: Optional[UUID] = Field(
        default=None,
        description="Template used for the pipeline run.",
    )
    is_templatable: bool = Field(
        default=False,
        description="Whether a template can be created from this run.",
    )

PipelineRunResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the pipeline run entity.

Source code in src/zenml/models/v2/core/pipeline_run.py
class PipelineRunResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the pipeline run entity."""

    model_version: Optional[ModelVersionResponse] = None
    tags: List[TagResponse] = Field(
        title="Tags associated with the pipeline run.",
    )
    logs: Optional["LogsResponse"] = Field(
        title="Logs associated with this pipeline run.",
        default=None,
    )

    # TODO: In Pydantic v2, the `model_` is a protected namespaces for all
    #  fields defined under base models. If not handled, this raises a warning.
    #  It is possible to suppress this warning message with the following
    #  configuration, however the ultimate solution is to rename these fields.
    #  Even though they do not cause any problems right now, if we are not
    #  careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

PipelineRunUpdate

Bases: BaseUpdate

Pipeline run update model.

Source code in src/zenml/models/v2/core/pipeline_run.py
class PipelineRunUpdate(BaseUpdate):
    """Pipeline run update model."""

    status: Optional[ExecutionStatus] = None
    end_time: Optional[datetime] = None
    # TODO: we should maybe have a different update model here, the upper
    #  three attributes should only be for internal use
    add_tags: Optional[List[str]] = Field(
        default=None, title="New tags to add to the pipeline run."
    )
    remove_tags: Optional[List[str]] = Field(
        default=None, title="Tags to remove from the pipeline run."
    )

    model_config = ConfigDict(protected_namespaces=())
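
A minimal sketch of applying a run update through the store, mirroring the update_run call used in refresh_run_status above; the run name and tag values are hypothetical:

from zenml.client import Client
from zenml.models import PipelineRunUpdate

client = Client()
run = client.get_pipeline_run("my_run")  # hypothetical run name

# Add and remove tags on the run; `status` and `end_time` are mainly
# intended for internal use.
client.zen_store.update_run(
    run_id=run.id,
    run_update=PipelineRunUpdate(
        add_tags=["validated"],
        remove_tags=["draft"],
    ),
)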

PipelineUpdate

Bases: BaseUpdate

Update model for pipelines.

Source code in src/zenml/models/v2/core/pipeline.py
class PipelineUpdate(BaseUpdate):
    """Update model for pipelines."""

    description: Optional[str] = Field(
        default=None,
        title="The description of the pipeline.",
        max_length=TEXT_FIELD_MAX_LENGTH,
    )
    add_tags: Optional[List[str]] = Field(
        default=None, title="New tags to add to the pipeline."
    )
    remove_tags: Optional[List[str]] = Field(
        default=None, title="Tags to remove from the pipeline."
    )
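
A minimal sketch of constructing a pipeline update; how the update is applied (e.g. through the client or the store) depends on the surrounding API and is not shown here:

from zenml.models import PipelineUpdate

update = PipelineUpdate(
    description="Trains and evaluates the churn model.",  # hypothetical
    add_tags=["production"],
    remove_tags=["experimental"],
)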

ProjectFilter

Bases: BaseFilter

Model to enable advanced filtering of all projects.

Source code in src/zenml/models/v2/core/project.py
class ProjectFilter(BaseFilter):
    """Model to enable advanced filtering of all projects."""

    name: Optional[str] = Field(
        default=None,
        description="Name of the project",
    )

    display_name: Optional[str] = Field(
        default=None,
        description="Display name of the project",
    )

ProjectRequest

Bases: BaseRequest

Request model for projects.

Source code in src/zenml/models/v2/core/project.py
class ProjectRequest(BaseRequest):
    """Request model for projects."""

    name: str = Field(
        title="The unique name of the project. The project name must only "
        "contain only lowercase letters, numbers, underscores, and hyphens and "
        "be at most 50 characters long.",
        min_length=1,
        max_length=STR_ID_FIELD_MAX_LENGTH,
        pattern=r"^[a-z0-9_-]+$",
    )
    display_name: str = Field(
        default="",
        title="The display name of the project.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    description: str = Field(
        default="",
        title="The description of the project.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    @model_validator(mode="before")
    @classmethod
    @before_validator_handler
    def _validate_project_name(cls, data: Dict[str, Any]) -> Dict[str, Any]:
        """Validate the project name.

        Args:
            data: The values to validate.

        Returns:
            The validated values.
        """
        name = data.get("name")
        display_name = data.get("display_name")

        if not name and not display_name:
            return data

        if not name:
            assert display_name

            project_name = display_name.lower().replace(" ", "-")
            project_name = re.sub(r"[^a-z0-9_-]", "", project_name)

            data["name"] = project_name

        if not display_name:
            # We just use the name as the display name
            data["display_name"] = name

        return data
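
The validator above derives a URL-safe name from the display name (and reuses the name as display name) when only one of the two is given. A minimal sketch:

from zenml.models import ProjectRequest

# Only the display name is provided: it is lowercased, spaces become
# hyphens and disallowed characters are stripped.
request = ProjectRequest(display_name="My Demo Project!")
print(request.name)          # my-demo-project
print(request.display_name)  # My Demo Project!

# Only the name is provided: it is reused as the display name.
request = ProjectRequest(name="analytics")
print(request.display_name)  # analytics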

ProjectResponse

Bases: BaseIdentifiedResponse[ProjectResponseBody, ProjectResponseMetadata, ProjectResponseResources]

Response model for projects.

Source code in src/zenml/models/v2/core/project.py
class ProjectResponse(
    BaseIdentifiedResponse[
        ProjectResponseBody,
        ProjectResponseMetadata,
        ProjectResponseResources,
    ]
):
    """Response model for projects."""

    name: str = Field(
        title="The unique name of the project.",
        max_length=STR_ID_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "ProjectResponse":
        """Get the hydrated version of this project.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_project(self.id)

    # Body and metadata properties

    @property
    def display_name(self) -> str:
        """The `display_name` property.

        Returns:
            the value of the property.
        """
        return self.get_body().display_name

    @property
    def description(self) -> str:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

description property

The description property.

Returns:

Type Description
str

the value of the property.

display_name property

The display_name property.

Returns:

Type Description
str

the value of the property.

get_hydrated_version()

Get the hydrated version of this project.

Returns:

Type Description
ProjectResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/project.py
def get_hydrated_version(self) -> "ProjectResponse":
    """Get the hydrated version of this project.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_project(self.id)

ProjectResponseBody

Bases: BaseDatedResponseBody

Response body for projects.

Source code in src/zenml/models/v2/core/project.py
class ProjectResponseBody(BaseDatedResponseBody):
    """Response body for projects."""

    display_name: str = Field(
        title="The display name of the project.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

ProjectResponseMetadata

Bases: BaseResponseMetadata

Response metadata for projects.

Source code in src/zenml/models/v2/core/project.py
class ProjectResponseMetadata(BaseResponseMetadata):
    """Response metadata for projects."""

    description: str = Field(
        default="",
        title="The description of the project.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

ProjectScopedFilter

Bases: UserScopedFilter

Model to enable advanced scoping with project.

Source code in src/zenml/models/v2/base/scoped.py
class ProjectScopedFilter(UserScopedFilter):
    """Model to enable advanced scoping with project."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *UserScopedFilter.FILTER_EXCLUDE_FIELDS,
        "project",
    ]
    project: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the project which the search is scoped to. "
        "This field must always be set and is always applied in addition to "
        "the other filters, regardless of the value of the "
        "logical_operator field.",
        union_mode="left_to_right",
    )

    def apply_filter(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Applies the filter to a query.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.

        Raises:
            ValueError: If the project scope is missing from the filter.
        """
        query = super().apply_filter(query=query, table=table)

        # The project scope must always be set and must be a UUID. If the
        # client sets this to a string, the server will try to resolve it to a
        # project ID.
        #
        # If not set by the client, the server will fall back to using the
        # user's default project or even the server's default project, if
        # they are configured. If this also fails to yield a project, this
        # method will raise a ValueError.
        #
        # See: SqlZenStore._set_filter_project_id

        if not self.project:
            raise ValueError("Project scope missing from the filter.")

        if not isinstance(self.project, UUID):
            raise ValueError(
                f"Project scope must be a UUID, got {type(self.project)}."
            )

        scope_filter = getattr(table, "project_id") == self.project
        query = query.where(scope_filter)

        return query

apply_filter(query, table)

Applies the filter to a query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the filter.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with filter applied.

Raises:

Type Description
ValueError

If the project scope is missing from the filter.

Source code in src/zenml/models/v2/base/scoped.py
def apply_filter(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Applies the filter to a query.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.

    Raises:
        ValueError: If the project scope is missing from the filter.
    """
    query = super().apply_filter(query=query, table=table)

    # The project scope must always be set and must be a UUID. If the
    # client sets this to a string, the server will try to resolve it to a
    # project ID.
    #
    # If not set by the client, the server will fall back to using the
    # user's default project or even the server's default project, if
    # they are configured. If this also fails to yield a project, this
    # method will raise a ValueError.
    #
    # See: SqlZenStore._set_filter_project_id

    if not self.project:
        raise ValueError("Project scope missing from the filter.")

    if not isinstance(self.project, UUID):
        raise ValueError(
            f"Project scope must be a UUID, got {type(self.project)}."
        )

    scope_filter = getattr(table, "project_id") == self.project
    query = query.where(scope_filter)

    return query

ProjectScopedRequest

Bases: UserScopedRequest

Base project-scoped request domain model.

Used as a base class for all domain models that are project-scoped.

Source code in src/zenml/models/v2/base/scoped.py
class ProjectScopedRequest(UserScopedRequest):
    """Base project-scoped request domain model.

    Used as a base class for all domain models that are project-scoped.
    """

    project: UUID = Field(title="The project to which this resource belongs.")

    def get_analytics_metadata(self) -> Dict[str, Any]:
        """Fetches the analytics metadata for project scoped models.

        Returns:
            The analytics metadata.
        """
        metadata = super().get_analytics_metadata()
        metadata["project_id"] = self.project
        return metadata

get_analytics_metadata()

Fetches the analytics metadata for project scoped models.

Returns:

Type Description
Dict[str, Any]

The analytics metadata.

Source code in src/zenml/models/v2/base/scoped.py
def get_analytics_metadata(self) -> Dict[str, Any]:
    """Fetches the analytics metadata for project scoped models.

    Returns:
        The analytics metadata.
    """
    metadata = super().get_analytics_metadata()
    metadata["project_id"] = self.project
    return metadata

ProjectScopedResponse

Bases: UserScopedResponse[ProjectBody, ProjectMetadata, ProjectResources], Generic[ProjectBody, ProjectMetadata, ProjectResources]

Base project-scoped domain model.

Used as a base class for all domain models that are project-scoped.

Source code in src/zenml/models/v2/base/scoped.py
class ProjectScopedResponse(
    UserScopedResponse[ProjectBody, ProjectMetadata, ProjectResources],
    Generic[ProjectBody, ProjectMetadata, ProjectResources],
):
    """Base project-scoped domain model.

    Used as a base class for all domain models that are project-scoped.
    """

    # Analytics
    def get_analytics_metadata(self) -> Dict[str, Any]:
        """Fetches the analytics metadata for project scoped models.

        Returns:
            The analytics metadata.
        """
        metadata = super().get_analytics_metadata()
        metadata["project_id"] = self.project_id
        return metadata

    @property
    def project_id(self) -> UUID:
        """The project ID property.

        Returns:
            the value of the property.
        """
        return self.get_body().project_id

    # Helper
    @property
    def project(self) -> "ProjectResponse":
        """The project property.

        Returns:
            the value of the property.
        """
        from zenml.client import Client

        return Client().get_project(self.project_id)

project property

The project property.

Returns:

Type Description
ProjectResponse

the value of the property.

project_id property

The project ID property.

Returns:

Type Description
UUID

the value of the property.

get_analytics_metadata()

Fetches the analytics metadata for project scoped models.

Returns:

Type Description
Dict[str, Any]

The analytics metadata.

Source code in src/zenml/models/v2/base/scoped.py
def get_analytics_metadata(self) -> Dict[str, Any]:
    """Fetches the analytics metadata for project scoped models.

    Returns:
        The analytics metadata.
    """
    metadata = super().get_analytics_metadata()
    metadata["project_id"] = self.project_id
    return metadata

ProjectScopedResponseBody

Bases: UserScopedResponseBody

Base project-scoped body.

Source code in src/zenml/models/v2/base/scoped.py
class ProjectScopedResponseBody(UserScopedResponseBody):
    """Base project-scoped body."""

    project_id: UUID = Field(title="The project id.")

ProjectScopedResponseMetadata

Bases: UserScopedResponseMetadata

Base project-scoped metadata.

Source code in src/zenml/models/v2/base/scoped.py
class ProjectScopedResponseMetadata(UserScopedResponseMetadata):
    """Base project-scoped metadata."""

ProjectScopedResponseResources

Bases: UserScopedResponseResources

Base project-scoped resources.

Source code in src/zenml/models/v2/base/scoped.py
class ProjectScopedResponseResources(UserScopedResponseResources):
    """Base project-scoped resources."""

ProjectStatistics

Bases: BaseZenModel

Project statistics.

Source code in src/zenml/models/v2/misc/statistics.py
class ProjectStatistics(BaseZenModel):
    """Project statistics."""

    pipelines: int = Field(
        title="The number of pipelines.",
    )
    runs: int = Field(
        title="The number of pipeline runs.",
    )

ProjectUpdate

Bases: BaseUpdate

Update model for projects.

Source code in src/zenml/models/v2/core/project.py
class ProjectUpdate(BaseUpdate):
    """Update model for projects."""

    name: Optional[str] = Field(
        title="The unique name of the project. The project name must only "
        "contain only lowercase letters, numbers, underscores, and hyphens and "
        "be at most 50 characters long.",
        min_length=1,
        max_length=STR_ID_FIELD_MAX_LENGTH,
        pattern=r"^[a-z0-9_-]+$",
        default=None,
    )
    display_name: Optional[str] = Field(
        title="The display name of the project.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    description: Optional[str] = Field(
        title="The description of the project.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )

ResourceTypeModel

Bases: BaseModel

Resource type specification.

Describes the authentication methods and resource instantiation model for one or more resource types.

Source code in src/zenml/models/v2/misc/service_connector_type.py
class ResourceTypeModel(BaseModel):
    """Resource type specification.

    Describes the authentication methods and resource instantiation model for
    one or more resource types.
    """

    name: str = Field(
        title="User readable name for the resource type.",
    )
    resource_type: str = Field(
        title="Resource type identifier.",
    )
    description: str = Field(
        default="",
        title="A description of the resource type.",
    )
    auth_methods: List[str] = Field(
        title="The list of authentication methods that can be used to access "
        "resources of this type.",
    )
    supports_instances: bool = Field(
        default=False,
        title="Specifies if a single connector instance can be used to access "
        "multiple instances of this resource type. If set to True, the "
        "connector is able to provide a list of resource IDs identifying all "
        "the resources that it can access and a resource ID needs to be "
        "explicitly configured or supplied when access to a resource is "
        "requested. If set to False, a connector instance is only able to "
        "access a single resource and a resource ID is not required to access "
        "the resource.",
    )
    logo_url: Optional[str] = Field(
        default=None,
        title="Optionally, a URL pointing to a png,"
        "svg or jpg file can be attached.",
    )
    emoji: Optional[str] = Field(
        default=None,
        title="Optionally, a python-rich emoji can be attached.",
    )

    @property
    def emojified_resource_type(self) -> str:
        """Get the emojified resource type.

        Returns:
            The emojified resource type.
        """
        if not self.emoji:
            return self.resource_type
        return f"{self.emoji} {self.resource_type}"

emojified_resource_type property

Get the emojified resource type.

Returns:

Type Description
str

The emojified resource type.
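
A minimal sketch of the emojified_resource_type helper; the field values are illustrative only:

from zenml.models.v2.misc.service_connector_type import ResourceTypeModel

resource_type = ResourceTypeModel(
    name="Docker registry",
    resource_type="docker-registry",
    auth_methods=["password", "service-account"],
    emoji=":whale:",
)
print(resource_type.emojified_resource_type)  # :whale: docker-registry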

ResourcesInfo

Bases: BaseModel

Information about the resources needed for CLI and UI.

Source code in src/zenml/models/v2/misc/info_models.py
class ResourcesInfo(BaseModel):
    """Information about the resources needed for CLI and UI."""

    flavor: str
    flavor_display_name: str
    required_configuration: Dict[str, str] = {}
    use_resource_value_as_fixed_config: bool = False

    accessible_by_service_connector: List[str]
    connected_through_service_connector: List["ComponentResponse"]

    @model_validator(mode="after")
    def _validate_resource_info(self) -> "ResourcesInfo":
        if (
            self.use_resource_value_as_fixed_config
            and len(self.required_configuration) > 1
        ):
            raise ValueError(
                "Cannot use resource value as fixed config if more than "
                "one required configuration key is provided."
            )
        return self

RunMetadataEntry

Bases: BaseModel

Utility class to sort/list run metadata entries.

Source code in src/zenml/models/v2/misc/run_metadata.py
class RunMetadataEntry(BaseModel):
    """Utility class to sort/list run metadata entries."""

    value: MetadataType = Field(title="The value for the run metadata entry")
    created: datetime = Field(
        title="The timestamp when this resource was created."
    )

RunMetadataFilterMixin

Bases: BaseFilter

Model to enable filtering and sorting by run metadata.

Source code in src/zenml/models/v2/base/scoped.py
class RunMetadataFilterMixin(BaseFilter):
    """Model to enable filtering and sorting by run metadata."""

    run_metadata: Optional[List[str]] = Field(
        default=None,
        description="The run_metadata to filter the pipeline runs by.",
    )
    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *BaseFilter.FILTER_EXCLUDE_FIELDS,
        "run_metadata",
    ]
    API_MULTI_INPUT_PARAMS: ClassVar[List[str]] = [
        *BaseFilter.API_MULTI_INPUT_PARAMS,
        "run_metadata",
    ]

    @model_validator(mode="after")
    def validate_run_metadata_format(self) -> "RunMetadataFilterMixin":
        """Validates that run_metadata entries are in the correct format.

        Each run_metadata entry must be in one of the following formats:
        1. "key:value" - Direct equality comparison (key equals value)
        2. "key:filterop:value" - Where filterop is one of the GenericFilterOps:
           - equals: Exact match
           - notequals: Not equal to
           - contains: String contains value
           - startswith: String starts with value
           - endswith: String ends with value
           - oneof: Value is one of the specified options
           - gte: Greater than or equal to
           - gt: Greater than
           - lte: Less than or equal to
           - lt: Less than
           - in: Value is in a list

        Examples:
        - "status:completed" - Find entries where status equals "completed"
        - "name:contains:test" - Find entries where name contains "test"
        - "duration:gt:10" - Find entries where duration is greater than 10

        Returns:
            self

        Raises:
            ValueError: If any entry in run_metadata does not contain a colon.
        """
        if self.run_metadata:
            for entry in self.run_metadata:
                if ":" not in entry:
                    raise ValueError(
                        f"Invalid run_metadata entry format: '{entry}'. "
                        "Entry must be in format 'key:value' for direct "
                        "equality comparison or 'key:filterop:value' where "
                        "filterop is one of: equals, notequals, "
                        f"contains, startswith, endswith, oneof, gte, gt, "
                        f"lte, lt, in."
                    )
        return self

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom run metadata filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        if self.run_metadata is not None:
            from sqlmodel import exists, select

            from zenml.enums import MetadataResourceTypes
            from zenml.zen_stores.schemas import (
                ArtifactVersionSchema,
                ModelVersionSchema,
                PipelineRunSchema,
                RunMetadataResourceSchema,
                RunMetadataSchema,
                ScheduleSchema,
                StepRunSchema,
            )

            resource_type_mapping = {
                ArtifactVersionSchema: MetadataResourceTypes.ARTIFACT_VERSION,
                ModelVersionSchema: MetadataResourceTypes.MODEL_VERSION,
                PipelineRunSchema: MetadataResourceTypes.PIPELINE_RUN,
                StepRunSchema: MetadataResourceTypes.STEP_RUN,
                ScheduleSchema: MetadataResourceTypes.SCHEDULE,
            }

            # Create an EXISTS subquery for each run_metadata filter
            for entry in self.run_metadata:
                # Split at the first colon to get the key
                key, value = entry.split(":", 1)

                # Create an exists subquery
                exists_subquery = exists(
                    select(RunMetadataResourceSchema.id)
                    .join(
                        RunMetadataSchema,
                        RunMetadataSchema.id  # type: ignore[arg-type]
                        == RunMetadataResourceSchema.run_metadata_id,
                    )
                    .where(
                        RunMetadataResourceSchema.resource_id == table.id,
                        RunMetadataResourceSchema.resource_type
                        == resource_type_mapping[table].value,
                        self.generate_custom_query_conditions_for_column(
                            value=key,
                            table=RunMetadataSchema,
                            column="key",
                        ),
                        self.generate_custom_query_conditions_for_column(
                            value=value,
                            table=RunMetadataSchema,
                            column="value",
                        ),
                    )
                )
                custom_filters.append(exists_subquery)

        return custom_filters

get_custom_filters(table)

Get custom run metadata filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/base/scoped.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom run metadata filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    if self.run_metadata is not None:
        from sqlmodel import exists, select

        from zenml.enums import MetadataResourceTypes
        from zenml.zen_stores.schemas import (
            ArtifactVersionSchema,
            ModelVersionSchema,
            PipelineRunSchema,
            RunMetadataResourceSchema,
            RunMetadataSchema,
            ScheduleSchema,
            StepRunSchema,
        )

        resource_type_mapping = {
            ArtifactVersionSchema: MetadataResourceTypes.ARTIFACT_VERSION,
            ModelVersionSchema: MetadataResourceTypes.MODEL_VERSION,
            PipelineRunSchema: MetadataResourceTypes.PIPELINE_RUN,
            StepRunSchema: MetadataResourceTypes.STEP_RUN,
            ScheduleSchema: MetadataResourceTypes.SCHEDULE,
        }

        # Create an EXISTS subquery for each run_metadata filter
        for entry in self.run_metadata:
            # Split at the first colon to get the key
            key, value = entry.split(":", 1)

            # Create an exists subquery
            exists_subquery = exists(
                select(RunMetadataResourceSchema.id)
                .join(
                    RunMetadataSchema,
                    RunMetadataSchema.id  # type: ignore[arg-type]
                    == RunMetadataResourceSchema.run_metadata_id,
                )
                .where(
                    RunMetadataResourceSchema.resource_id == table.id,
                    RunMetadataResourceSchema.resource_type
                    == resource_type_mapping[table].value,
                    self.generate_custom_query_conditions_for_column(
                        value=key,
                        table=RunMetadataSchema,
                        column="key",
                    ),
                    self.generate_custom_query_conditions_for_column(
                        value=value,
                        table=RunMetadataSchema,
                        column="value",
                    ),
                )
            )
            custom_filters.append(exists_subquery)

    return custom_filters

validate_run_metadata_format()

Validates that run_metadata entries are in the correct format.

Each run_metadata entry must be in one of the following formats:

1. "key:value" - Direct equality comparison (key equals value)
2. "key:filterop:value" - Where filterop is one of the GenericFilterOps:
   - equals: Exact match
   - notequals: Not equal to
   - contains: String contains value
   - startswith: String starts with value
   - endswith: String ends with value
   - oneof: Value is one of the specified options
   - gte: Greater than or equal to
   - gt: Greater than
   - lte: Less than or equal to
   - lt: Less than
   - in: Value is in a list

Examples:

- "status:completed" - Find entries where status equals "completed"
- "name:contains:test" - Find entries where name contains "test"
- "duration:gt:10" - Find entries where duration is greater than 10

Returns:

Type Description
RunMetadataFilterMixin

self

Raises:

Type Description
ValueError

If any entry in run_metadata does not contain a colon.

Source code in src/zenml/models/v2/base/scoped.py
@model_validator(mode="after")
def validate_run_metadata_format(self) -> "RunMetadataFilterMixin":
    """Validates that run_metadata entries are in the correct format.

    Each run_metadata entry must be in one of the following formats:
    1. "key:value" - Direct equality comparison (key equals value)
    2. "key:filterop:value" - Where filterop is one of the GenericFilterOps:
       - equals: Exact match
       - notequals: Not equal to
       - contains: String contains value
       - startswith: String starts with value
       - endswith: String ends with value
       - oneof: Value is one of the specified options
       - gte: Greater than or equal to
       - gt: Greater than
       - lte: Less than or equal to
       - lt: Less than
       - in: Value is in a list

    Examples:
    - "status:completed" - Find entries where status equals "completed"
    - "name:contains:test" - Find entries where name contains "test"
    - "duration:gt:10" - Find entries where duration is greater than 10

    Returns:
        self

    Raises:
        ValueError: If any entry in run_metadata does not contain a colon.
    """
    if self.run_metadata:
        for entry in self.run_metadata:
            if ":" not in entry:
                raise ValueError(
                    f"Invalid run_metadata entry format: '{entry}'. "
                    "Entry must be in format 'key:value' for direct "
                    "equality comparison or 'key:filterop:value' where "
                    "filterop is one of: equals, notequals, "
                    f"contains, startswith, endswith, oneof, gte, gt, "
                    f"lte, lt, in."
                )
    return self
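
A minimal sketch of the entry formats accepted by the run_metadata filter field; the listing call is an assumption and may differ by version, but the string formats follow the validator above:

from zenml.client import Client

client = Client()

# Each entry is either "key:value" (equality) or "key:filterop:value".
metadata_filters = [
    "status:completed",    # key equals value
    "name:contains:test",  # string containment
    "duration:gt:10",      # numeric comparison
]

# Hypothetical usage, assuming `run_metadata` is exposed as a keyword
# argument on the run listing call.
runs = client.list_pipeline_runs(run_metadata=metadata_filters)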

RunMetadataRequest

Bases: ProjectScopedRequest

Request model for run metadata.

Source code in src/zenml/models/v2/core/run_metadata.py
class RunMetadataRequest(ProjectScopedRequest):
    """Request model for run metadata."""

    resources: List[RunMetadataResource] = Field(
        title="The list of resources that this metadata belongs to."
    )
    stack_component_id: Optional[UUID] = Field(
        title="The ID of the stack component that this metadata belongs to.",
        default=None,
    )
    values: Dict[str, "MetadataType"] = Field(
        title="The metadata to be created.",
    )
    types: Dict[str, "MetadataTypeEnum"] = Field(
        title="The types of the metadata to be created.",
    )
    publisher_step_id: Optional[UUID] = Field(
        title="The ID of the step execution that published this metadata.",
        default=None,
    )

    @model_validator(mode="after")
    def validate_values_keys(self) -> "RunMetadataRequest":
        """Validates if the keys in the metadata are properly defined.

        Returns:
            self

        Raises:
            ValueError: if one of the keys in the metadata contains `:`
        """
        invalid_keys = [key for key in self.values.keys() if ":" in key]
        if invalid_keys:
            raise ValueError(
                "You can not use colons (`:`) in the key names when you "
                "are creating metadata for your ZenML objects. Please change "
                f"the following keys: {invalid_keys}"
            )
        return self

validate_values_keys()

Validates if the keys in the metadata are properly defined.

Returns:

Type Description
RunMetadataRequest

self

Raises:

Type Description
ValueError

if one of the keys in the metadata contains a colon (:)

Source code in src/zenml/models/v2/core/run_metadata.py
@model_validator(mode="after")
def validate_values_keys(self) -> "RunMetadataRequest":
    """Validates if the keys in the metadata are properly defined.

    Returns:
        self

    Raises:
        ValueError: if one of the keys in the metadata contains `:`
    """
    invalid_keys = [key for key in self.values.keys() if ":" in key]
    if invalid_keys:
        raise ValueError(
            "You can not use colons (`:`) in the key names when you "
            "are creating metadata for your ZenML objects. Please change "
            f"the following keys: {invalid_keys}"
        )
    return self

RunMetadataResource

Bases: BaseModel

Utility class to help identify resources to tag metadata to.

Source code in src/zenml/models/v2/misc/run_metadata.py
class RunMetadataResource(BaseModel):
    """Utility class to help identify resources to tag metadata to."""

    id: UUID = Field(title="The ID of the resource.")
    type: MetadataResourceTypes = Field(title="The type of the resource.")

RunTemplateFilter

Bases: ProjectScopedFilter, TaggableFilter

Model for filtering of run templates.

Source code in src/zenml/models/v2/core/run_template.py
class RunTemplateFilter(ProjectScopedFilter, TaggableFilter):
    """Model for filtering of run templates."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.FILTER_EXCLUDE_FIELDS,
        *TaggableFilter.FILTER_EXCLUDE_FIELDS,
        "code_repository_id",
        "stack_id",
        "build_id",
        "pipeline_id",
        "pipeline",
        "stack",
        "hidden",
    ]
    CUSTOM_SORTING_OPTIONS = [
        *ProjectScopedFilter.CUSTOM_SORTING_OPTIONS,
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CLI_EXCLUDE_FIELDS,
        *TaggableFilter.CLI_EXCLUDE_FIELDS,
    ]

    name: Optional[str] = Field(
        default=None,
        description="Name of the run template.",
    )
    hidden: Optional[bool] = Field(
        default=None,
        description="Whether the run template is hidden.",
    )
    pipeline_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Pipeline associated with the template.",
        union_mode="left_to_right",
    )
    build_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Build associated with the template.",
        union_mode="left_to_right",
    )
    stack_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Stack associated with the template.",
        union_mode="left_to_right",
    )
    code_repository_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Code repository associated with the template.",
        union_mode="left_to_right",
    )
    pipeline: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the pipeline associated with the template.",
    )
    stack: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the stack associated with the template.",
    )

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_, col

        from zenml.zen_stores.schemas import (
            CodeReferenceSchema,
            PipelineDeploymentSchema,
            PipelineSchema,
            RunTemplateSchema,
            StackSchema,
        )

        if self.hidden is not None:
            custom_filters.append(
                col(RunTemplateSchema.hidden).is_(self.hidden)
            )

        if self.code_repository_id:
            code_repo_filter = and_(
                RunTemplateSchema.source_deployment_id
                == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.code_reference_id
                == CodeReferenceSchema.id,
                CodeReferenceSchema.code_repository_id
                == self.code_repository_id,
            )
            custom_filters.append(code_repo_filter)

        if self.stack_id:
            stack_filter = and_(
                RunTemplateSchema.source_deployment_id
                == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.stack_id == self.stack_id,
            )
            custom_filters.append(stack_filter)

        if self.build_id:
            build_filter = and_(
                RunTemplateSchema.source_deployment_id
                == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.build_id == self.build_id,
            )
            custom_filters.append(build_filter)

        if self.pipeline_id:
            pipeline_filter = and_(
                RunTemplateSchema.source_deployment_id
                == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.pipeline_id == self.pipeline_id,
            )
            custom_filters.append(pipeline_filter)

        if self.pipeline:
            pipeline_filter = and_(
                RunTemplateSchema.source_deployment_id
                == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.pipeline_id == PipelineSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.pipeline,
                    table=PipelineSchema,
                ),
            )
            custom_filters.append(pipeline_filter)

        if self.stack:
            stack_filter = and_(
                RunTemplateSchema.source_deployment_id
                == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.stack_id == StackSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.stack,
                    table=StackSchema,
                ),
            )
            custom_filters.append(stack_filter)

        return custom_filters
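
As a hedged sketch, this filter is usually driven through the client; the helper name and keyword arguments below are assumptions derived from the filter fields above:

from zenml.client import Client

# Assumed client helper built on top of RunTemplateFilter.
templates = Client().list_run_templates(
    name="contains:deploy",
    hidden=False,
)
for template in templates:
    print(template.name)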

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/core/run_template.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_, col

    from zenml.zen_stores.schemas import (
        CodeReferenceSchema,
        PipelineDeploymentSchema,
        PipelineSchema,
        RunTemplateSchema,
        StackSchema,
    )

    if self.hidden is not None:
        custom_filters.append(
            col(RunTemplateSchema.hidden).is_(self.hidden)
        )

    if self.code_repository_id:
        code_repo_filter = and_(
            RunTemplateSchema.source_deployment_id
            == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.code_reference_id
            == CodeReferenceSchema.id,
            CodeReferenceSchema.code_repository_id
            == self.code_repository_id,
        )
        custom_filters.append(code_repo_filter)

    if self.stack_id:
        stack_filter = and_(
            RunTemplateSchema.source_deployment_id
            == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.stack_id == self.stack_id,
        )
        custom_filters.append(stack_filter)

    if self.build_id:
        build_filter = and_(
            RunTemplateSchema.source_deployment_id
            == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.build_id == self.build_id,
        )
        custom_filters.append(build_filter)

    if self.pipeline_id:
        pipeline_filter = and_(
            RunTemplateSchema.source_deployment_id
            == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.pipeline_id == self.pipeline_id,
        )
        custom_filters.append(pipeline_filter)

    if self.pipeline:
        pipeline_filter = and_(
            RunTemplateSchema.source_deployment_id
            == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.pipeline_id == PipelineSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.pipeline,
                table=PipelineSchema,
            ),
        )
        custom_filters.append(pipeline_filter)

    if self.stack:
        stack_filter = and_(
            RunTemplateSchema.source_deployment_id
            == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.stack_id == StackSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.stack,
                table=StackSchema,
            ),
        )
        custom_filters.append(stack_filter)

    return custom_filters

RunTemplateRequest

Bases: ProjectScopedRequest

Request model for run templates.

Source code in src/zenml/models/v2/core/run_template.py
class RunTemplateRequest(ProjectScopedRequest):
    """Request model for run templates."""

    name: str = Field(
        title="The name of the run template.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    description: Optional[str] = Field(
        default=None,
        title="The description of the run template.",
        max_length=TEXT_FIELD_MAX_LENGTH,
    )
    source_deployment_id: UUID = Field(
        title="The deployment that should be the base of the created template."
    )
    hidden: bool = Field(
        default=False,
        title="Whether the run template is hidden.",
    )
    tags: Optional[List[str]] = Field(
        default=None,
        title="Tags of the run template.",
    )
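
A minimal sketch of building such a request; in practice run templates are usually created through higher-level client helpers, and the IDs below are placeholders only:

from uuid import uuid4

from zenml.models import RunTemplateRequest  # assumed import path

request = RunTemplateRequest(
    project=uuid4(),               # placeholder project ID
    name="nightly-training",
    description="Template for the nightly training run.",
    source_deployment_id=uuid4(),  # placeholder: ID of an existing deployment
    tags=["training", "nightly"],
)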

RunTemplateResponse

Bases: ProjectScopedResponse[RunTemplateResponseBody, RunTemplateResponseMetadata, RunTemplateResponseResources]

Response model for run templates.

Source code in src/zenml/models/v2/core/run_template.py
class RunTemplateResponse(
    ProjectScopedResponse[
        RunTemplateResponseBody,
        RunTemplateResponseMetadata,
        RunTemplateResponseResources,
    ]
):
    """Response model for run templates."""

    name: str = Field(
        title="The name of the run template.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "RunTemplateResponse":
        """Return the hydrated version of this run template.

        Returns:
            The hydrated run template.
        """
        from zenml.client import Client

        return Client().zen_store.get_run_template(
            template_id=self.id, hydrate=True
        )

    # Body and metadata properties
    @property
    def runnable(self) -> bool:
        """The `runnable` property.

        Returns:
            the value of the property.
        """
        return self.get_body().runnable

    @property
    def hidden(self) -> bool:
        """The `hidden` property.

        Returns:
            the value of the property.
        """
        return self.get_body().hidden

    @property
    def latest_run_id(self) -> Optional[UUID]:
        """The `latest_run_id` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().latest_run_id

    @property
    def latest_run_status(self) -> Optional[ExecutionStatus]:
        """The `latest_run_status` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().latest_run_status

    @property
    def description(self) -> Optional[str]:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

    @property
    def pipeline_spec(self) -> Optional[PipelineSpec]:
        """The `pipeline_spec` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().pipeline_spec

    @property
    def config_template(self) -> Optional[Dict[str, Any]]:
        """The `config_template` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().config_template

    @property
    def config_schema(self) -> Optional[Dict[str, Any]]:
        """The `config_schema` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().config_schema

    @property
    def source_deployment(self) -> Optional[PipelineDeploymentResponse]:
        """The `source_deployment` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().source_deployment

    @property
    def pipeline(self) -> Optional[PipelineResponse]:
        """The `pipeline` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().pipeline

    @property
    def build(self) -> Optional[PipelineBuildResponse]:
        """The `build` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().build

    @property
    def code_reference(self) -> Optional[CodeReferenceResponse]:
        """The `code_reference` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().code_reference

    @property
    def tags(self) -> List[TagResponse]:
        """The `tags` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().tags
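
For illustration, a short hedged sketch of reading these properties from a fetched template; the Client lookup helper is an assumption:

from zenml.client import Client

template = Client().get_run_template("nightly-training")  # assumed helper
if template.runnable and not template.hidden:
    print(template.name, template.latest_run_status)
    print(template.config_schema)  # hydrated lazily via the metadata/resources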

build property

The build property.

Returns:

Type Description
Optional[PipelineBuildResponse]

the value of the property.

code_reference property

The code_reference property.

Returns:

Type Description
Optional[CodeReferenceResponse]

the value of the property.

config_schema property

The config_schema property.

Returns:

Type Description
Optional[Dict[str, Any]]

the value of the property.

config_template property

The config_template property.

Returns:

Type Description
Optional[Dict[str, Any]]

the value of the property.

description property

The description property.

Returns:

Type Description
Optional[str]

the value of the property.

hidden property

The hidden property.

Returns:

Type Description
bool

the value of the property.

latest_run_id property

The latest_run_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

latest_run_status property

The latest_run_status property.

Returns:

Type Description
Optional[ExecutionStatus]

the value of the property.

pipeline property

The pipeline property.

Returns:

Type Description
Optional[PipelineResponse]

the value of the property.

pipeline_spec property

The pipeline_spec property.

Returns:

Type Description
Optional[PipelineSpec]

the value of the property.

runnable property

The runnable property.

Returns:

Type Description
bool

the value of the property.

source_deployment property

The source_deployment property.

Returns:

Type Description
Optional[PipelineDeploymentResponse]

the value of the property.

tags property

The tags property.

Returns:

Type Description
List[TagResponse]

the value of the property.

get_hydrated_version()

Return the hydrated version of this run template.

Returns:

Type Description
RunTemplateResponse

The hydrated run template.

Source code in src/zenml/models/v2/core/run_template.py
def get_hydrated_version(self) -> "RunTemplateResponse":
    """Return the hydrated version of this run template.

    Returns:
        The hydrated run template.
    """
    from zenml.client import Client

    return Client().zen_store.get_run_template(
        template_id=self.id, hydrate=True
    )

RunTemplateResponseBody

Bases: ProjectScopedResponseBody

Response body for run templates.

Source code in src/zenml/models/v2/core/run_template.py
class RunTemplateResponseBody(ProjectScopedResponseBody):
    """Response body for run templates."""

    runnable: bool = Field(
        title="If a run can be started from the template.",
    )
    hidden: bool = Field(
        default=False,
        title="Whether the run template is hidden.",
    )

RunTemplateResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for run templates.

Source code in src/zenml/models/v2/core/run_template.py
class RunTemplateResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for run templates."""

    description: Optional[str] = Field(
        default=None,
        title="The description of the run template.",
    )
    pipeline_spec: Optional[PipelineSpec] = Field(
        default=None, title="The spec of the pipeline."
    )
    config_template: Optional[Dict[str, Any]] = Field(
        default=None, title="Run configuration template."
    )
    config_schema: Optional[Dict[str, Any]] = Field(
        default=None, title="Run configuration schema."
    )

RunTemplateResponseResources

Bases: ProjectScopedResponseResources

All resource models associated with the run template.

Source code in src/zenml/models/v2/core/run_template.py
class RunTemplateResponseResources(ProjectScopedResponseResources):
    """All resource models associated with the run template."""

    source_deployment: Optional[PipelineDeploymentResponse] = Field(
        default=None,
        title="The deployment that is the source of the template.",
    )
    pipeline: Optional[PipelineResponse] = Field(
        default=None, title="The pipeline associated with the template."
    )
    build: Optional[PipelineBuildResponse] = Field(
        default=None,
        title="The pipeline build associated with the template.",
    )
    code_reference: Optional[CodeReferenceResponse] = Field(
        default=None,
        title="The code reference associated with the template.",
    )
    tags: List[TagResponse] = Field(
        title="Tags associated with the run template.",
    )
    latest_run_id: Optional[UUID] = Field(
        default=None,
        title="The ID of the latest run of the run template.",
    )
    latest_run_status: Optional[ExecutionStatus] = Field(
        default=None,
        title="The status of the latest run of the run template.",
    )

RunTemplateUpdate

Bases: BaseUpdate

Run template update model.

Source code in src/zenml/models/v2/core/run_template.py
class RunTemplateUpdate(BaseUpdate):
    """Run template update model."""

    name: Optional[str] = Field(
        default=None,
        title="The name of the run template.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    description: Optional[str] = Field(
        default=None,
        title="The description of the run template.",
        max_length=TEXT_FIELD_MAX_LENGTH,
    )
    hidden: Optional[bool] = Field(
        default=None,
        title="Whether the run template is hidden.",
    )
    add_tags: Optional[List[str]] = Field(
        default=None, title="New tags to add to the run template."
    )
    remove_tags: Optional[List[str]] = Field(
        default=None, title="Tags to remove from the run template."
    )
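
A hedged sketch of an update payload; only the fields that should change are set, and everything left as None is ignored (the import path is an assumption):

from zenml.models import RunTemplateUpdate

update = RunTemplateUpdate(
    description="Deprecated in favor of the v2 template.",
    hidden=True,
    add_tags=["deprecated"],
    remove_tags=["production"],
)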

ScheduleFilter

Bases: ProjectScopedFilter

Model to enable advanced filtering of schedules.

Source code in src/zenml/models/v2/core/schedule.py
class ScheduleFilter(ProjectScopedFilter):
    """Model to enable advanced filtering of all Users."""

    pipeline_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Pipeline that the schedule is attached to.",
        union_mode="left_to_right",
    )
    orchestrator_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Orchestrator that the schedule is attached to.",
        union_mode="left_to_right",
    )
    active: Optional[bool] = Field(
        default=None,
        description="If the schedule is active",
    )
    cron_expression: Optional[str] = Field(
        default=None,
        description="The cron expression, describing the schedule",
    )
    start_time: Optional[Union[datetime, str]] = Field(
        default=None, description="Start time", union_mode="left_to_right"
    )
    end_time: Optional[Union[datetime, str]] = Field(
        default=None, description="End time", union_mode="left_to_right"
    )
    interval_second: Optional[Optional[float]] = Field(
        default=None,
        description="The repetition interval in seconds",
    )
    catchup: Optional[bool] = Field(
        default=None,
        description="Whether or not the schedule is set to catchup past missed "
        "events",
    )
    name: Optional[str] = Field(
        default=None,
        description="Name of the schedule",
    )
    run_once_start_time: Optional[Union[datetime, str]] = Field(
        default=None,
        description="The time at which the schedule should run once",
        union_mode="left_to_right",
    )
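
As a hedged example, listing schedules through the client with some of the filter fields above; the helper name and keyword support are assumptions:

from zenml.client import Client

# Assumed client helper built on top of ScheduleFilter.
schedules = Client().list_schedules(
    active=True,
    cron_expression="contains:0 6",
)
for schedule in schedules:
    print(schedule.name, schedule.cron_expression)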

ScheduleRequest

Bases: ProjectScopedRequest

Request model for schedules.

Source code in src/zenml/models/v2/core/schedule.py
class ScheduleRequest(ProjectScopedRequest):
    """Request model for schedules."""

    name: str
    active: bool

    cron_expression: Optional[str] = None
    start_time: Optional[datetime] = None
    end_time: Optional[datetime] = None
    interval_second: Optional[timedelta] = None
    catchup: bool = False
    run_once_start_time: Optional[datetime] = None

    orchestrator_id: Optional[UUID]
    pipeline_id: Optional[UUID]

    @field_validator(
        "start_time", "end_time", "run_once_start_time", mode="after"
    )
    @classmethod
    def _ensure_tzunaware_utc(
        cls, value: Optional[datetime]
    ) -> Optional[datetime]:
        """Ensures that all datetimes are timezone unaware and in UTC time.

        Args:
            value: The datetime.

        Returns:
            The datetime in UTC time without timezone.
        """
        if value and value.tzinfo:
            value = value.astimezone(timezone.utc)
            value = value.replace(tzinfo=None)

        return value

    @model_validator(mode="after")
    def _ensure_cron_or_periodic_schedule_configured(
        self,
    ) -> "ScheduleRequest":
        """Ensures that the cron expression or start time + interval are set.

        Returns:
            All schedule attributes.

        Raises:
            ValueError: If no cron expression or start time + interval were
                provided.
        """
        cron_expression = self.cron_expression
        periodic_schedule = self.start_time and self.interval_second
        run_once_starts_at = self.run_once_start_time

        if cron_expression and periodic_schedule:
            logger.warning(
                "This schedule was created with a cron expression as well as "
                "values for `start_time` and `interval_seconds`. The resulting "
                "behavior depends on the concrete orchestrator implementation "
                "but will usually ignore the interval and use the cron "
                "expression."
            )
            return self
        elif cron_expression and run_once_starts_at:
            logger.warning(
                "This schedule was created with a cron expression as well as "
                "a value for `run_once_start_time`. The resulting behavior "
                "depends on the concrete orchestrator implementation but will "
                "usually ignore the `run_once_start_time`."
            )
            return self
        elif cron_expression or periodic_schedule or run_once_starts_at:
            return self
        else:
            raise ValueError(
                "Either a cron expression, a start time and interval seconds "
                "or a run once start time "
                "need to be set for a valid schedule."
            )
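
The same either/or rule applies when configuring a schedule for a pipeline; a minimal sketch using the user-facing Schedule configuration class (the pipeline referenced in the final comment is hypothetical):

from datetime import datetime, timedelta

from zenml.config.schedule import Schedule

# Option 1: cron expression.
cron_schedule = Schedule(cron_expression="0 6 * * *")

# Option 2: start time plus repetition interval.
periodic_schedule = Schedule(
    start_time=datetime(2024, 1, 1),
    interval_second=timedelta(hours=6),
    catchup=False,
)

# Attach to a pipeline, e.g.:
# training_pipeline.with_options(schedule=cron_schedule)()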

ScheduleResponse

Bases: ProjectScopedResponse[ScheduleResponseBody, ScheduleResponseMetadata, ScheduleResponseResources]

Response model for schedules.

Source code in src/zenml/models/v2/core/schedule.py
class ScheduleResponse(
    ProjectScopedResponse[
        ScheduleResponseBody,
        ScheduleResponseMetadata,
        ScheduleResponseResources,
    ],
):
    """Response model for schedules."""

    name: str = Field(
        title="Name of this schedule.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "ScheduleResponse":
        """Get the hydrated version of this schedule.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_schedule(self.id)

    # Helper methods
    @property
    def utc_start_time(self) -> Optional[str]:
        """Optional ISO-formatted string of the UTC start time.

        Returns:
            Optional ISO-formatted string of the UTC start time.
        """
        if not self.start_time:
            return None

        return to_utc_timezone(self.start_time).isoformat()

    @property
    def utc_end_time(self) -> Optional[str]:
        """Optional ISO-formatted string of the UTC end time.

        Returns:
            Optional ISO-formatted string of the UTC end time.
        """
        if not self.end_time:
            return None

        return to_utc_timezone(self.end_time).isoformat()

    # Body and metadata properties
    @property
    def active(self) -> bool:
        """The `active` property.

        Returns:
            the value of the property.
        """
        return self.get_body().active

    @property
    def cron_expression(self) -> Optional[str]:
        """The `cron_expression` property.

        Returns:
            the value of the property.
        """
        return self.get_body().cron_expression

    @property
    def start_time(self) -> Optional[datetime]:
        """The `start_time` property.

        Returns:
            the value of the property.
        """
        return self.get_body().start_time

    @property
    def end_time(self) -> Optional[datetime]:
        """The `end_time` property.

        Returns:
            the value of the property.
        """
        return self.get_body().end_time

    @property
    def run_once_start_time(self) -> Optional[datetime]:
        """The `run_once_start_time` property.

        Returns:
            the value of the property.
        """
        return self.get_body().run_once_start_time

    @property
    def interval_second(self) -> Optional[timedelta]:
        """The `interval_second` property.

        Returns:
            the value of the property.
        """
        return self.get_body().interval_second

    @property
    def catchup(self) -> bool:
        """The `catchup` property.

        Returns:
            the value of the property.
        """
        return self.get_body().catchup

    @property
    def orchestrator_id(self) -> Optional[UUID]:
        """The `orchestrator_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().orchestrator_id

    @property
    def pipeline_id(self) -> Optional[UUID]:
        """The `pipeline_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().pipeline_id

    @property
    def run_metadata(self) -> Dict[str, MetadataType]:
        """The `run_metadata` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().run_metadata
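
A brief hedged sketch of reading the helper properties on a fetched schedule; the Client lookup helper is an assumption:

from zenml.client import Client

schedule = Client().get_schedule("daily-training")  # assumed helper
print(schedule.utc_start_time)  # ISO-formatted string or None
print(schedule.utc_end_time)    # ISO-formatted string or None
print(schedule.active, schedule.catchup)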

active property

The active property.

Returns:

Type Description
bool

the value of the property.

catchup property

The catchup property.

Returns:

Type Description
bool

the value of the property.

cron_expression property

The cron_expression property.

Returns:

Type Description
Optional[str]

the value of the property.

end_time property

The end_time property.

Returns:

Type Description
Optional[datetime]

the value of the property.

interval_second property

The interval_second property.

Returns:

Type Description
Optional[timedelta]

the value of the property.

orchestrator_id property

The orchestrator_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

pipeline_id property

The pipeline_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

run_metadata property

The run_metadata property.

Returns:

Type Description
Dict[str, MetadataType]

the value of the property.

run_once_start_time property

The run_once_start_time property.

Returns:

Type Description
Optional[datetime]

the value of the property.

start_time property

The start_time property.

Returns:

Type Description
Optional[datetime]

the value of the property.

utc_end_time property

Optional ISO-formatted string of the UTC end time.

Returns:

Type Description
Optional[str]

Optional ISO-formatted string of the UTC end time.

utc_start_time property

Optional ISO-formatted string of the UTC start time.

Returns:

Type Description
Optional[str]

Optional ISO-formatted string of the UTC start time.

get_hydrated_version()

Get the hydrated version of this schedule.

Returns:

Type Description
ScheduleResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/schedule.py
def get_hydrated_version(self) -> "ScheduleResponse":
    """Get the hydrated version of this schedule.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_schedule(self.id)

ScheduleResponseBody

Bases: ProjectScopedResponseBody

Response body for schedules.

Source code in src/zenml/models/v2/core/schedule.py
class ScheduleResponseBody(ProjectScopedResponseBody):
    """Response body for schedules."""

    active: bool
    cron_expression: Optional[str] = None
    start_time: Optional[datetime] = None
    end_time: Optional[datetime] = None
    interval_second: Optional[timedelta] = None
    catchup: bool = False
    run_once_start_time: Optional[datetime] = None

ScheduleResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for schedules.

Source code in src/zenml/models/v2/core/schedule.py
class ScheduleResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for schedules."""

    orchestrator_id: Optional[UUID]
    pipeline_id: Optional[UUID]

    run_metadata: Dict[str, MetadataType] = Field(
        title="Metadata associated with this schedule.",
        default={},
    )

ScheduleResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the schedule entity.

Source code in src/zenml/models/v2/core/schedule.py
class ScheduleResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the schedule entity."""

ScheduleUpdate

Bases: BaseUpdate

Update model for schedules.

Source code in src/zenml/models/v2/core/schedule.py
class ScheduleUpdate(BaseUpdate):
    """Update model for schedules."""

    name: Optional[str] = None

SecretFilter

Bases: UserScopedFilter

Model to enable advanced secret filtering.

Source code in src/zenml/models/v2/core/secret.py
class SecretFilter(UserScopedFilter):
    """Model to enable advanced secret filtering."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *UserScopedFilter.FILTER_EXCLUDE_FIELDS,
        "values",
    ]

    name: Optional[str] = Field(
        default=None,
        description="Name of the secret",
    )
    private: Optional[bool] = Field(
        default=None,
        description="Whether to filter secrets by private status",
    )

    def apply_filter(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Applies the filter to a query.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        # The secret user scoping works a bit differently than the other
        # scoped filters. We have to filter out all private secrets that are
        # not owned by the current user.
        if not self.scope_user:
            return super().apply_filter(query=query, table=table)

        scope_user = self.scope_user

        # First we apply the inherited filters without the user scoping
        # applied.
        self.scope_user = None
        query = super().apply_filter(query=query, table=table)
        self.scope_user = scope_user

        # Then we apply the user scoping filter.
        if self.scope_user:
            from sqlmodel import and_, or_

            query = query.where(
                or_(
                    and_(
                        getattr(table, "user_id") == self.scope_user,
                        getattr(table, "private") == True,  # noqa: E712
                    ),
                    getattr(table, "private") == False,  # noqa: E712
                )
            )

        else:
            query = query.where(getattr(table, "private") == False)  # noqa: E712

        return query
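
For illustration, a hedged sketch of listing secrets with this filter via the client; private secrets owned by other users are excluded by the scoping logic shown above (the helper name and keywords are assumptions):

from zenml.client import Client

secrets = Client().list_secrets(name="contains:aws")  # assumed helper
for secret in secrets:
    print(secret.name)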

apply_filter(query, table)

Applies the filter to a query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the filter.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with filter applied.

Source code in src/zenml/models/v2/core/secret.py
def apply_filter(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Applies the filter to a query.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    # The secret user scoping works a bit differently than the other
    # scoped filters. We have to filter out all private secrets that are
    # not owned by the current user.
    if not self.scope_user:
        return super().apply_filter(query=query, table=table)

    scope_user = self.scope_user

    # First we apply the inherited filters without the user scoping
    # applied.
    self.scope_user = None
    query = super().apply_filter(query=query, table=table)
    self.scope_user = scope_user

    # Then we apply the user scoping filter.
    if self.scope_user:
        from sqlmodel import and_, or_

        query = query.where(
            or_(
                and_(
                    getattr(table, "user_id") == self.scope_user,
                    getattr(table, "private") == True,  # noqa: E712
                ),
                getattr(table, "private") == False,  # noqa: E712
            )
        )

    else:
        query = query.where(getattr(table, "private") == False)  # noqa: E712

    return query

SecretRequest

Bases: UserScopedRequest

Request model for secrets.

Source code in src/zenml/models/v2/core/secret.py
class SecretRequest(UserScopedRequest):
    """Request model for secrets."""

    ANALYTICS_FIELDS: ClassVar[List[str]] = ["private"]

    name: str = Field(
        title="The name of the secret.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    private: bool = Field(
        False,
        title="Whether the secret is private. A private secret is only "
        "accessible to the user who created it.",
    )
    values: Dict[str, Optional[PlainSerializedSecretStr]] = Field(
        default_factory=dict, title="The values stored in this secret."
    )

    @property
    def secret_values(self) -> Dict[str, str]:
        """A dictionary with all un-obfuscated values stored in this secret.

        The values are returned as strings, not SecretStr. If a value is
        None, it is not included in the returned dictionary. This is to enable
        the use of None values in the update model to indicate that a secret
        value should be deleted.

        Returns:
            A dictionary containing the secret's values.
        """
        return {
            k: v.get_secret_value()
            for k, v in self.values.items()
            if v is not None
        }
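
In practice secrets are usually created through the client rather than by constructing the request model directly; a hedged sketch with placeholder values:

from zenml.client import Client

secret = Client().create_secret(  # assumed helper
    name="aws_credentials",
    values={
        "aws_access_key_id": "<access-key-id>",
        "aws_secret_access_key": "<secret-access-key>",
    },
)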

secret_values property

A dictionary with all un-obfuscated values stored in this secret.

The values are returned as strings, not SecretStr. If a value is None, it is not included in the returned dictionary. This is to enable the use of None values in the update model to indicate that a secret value should be deleted.

Returns:

Type Description
Dict[str, str]

A dictionary containing the secret's values.

SecretResponse

Bases: UserScopedResponse[SecretResponseBody, SecretResponseMetadata, SecretResponseResources]

Response model for secrets.

Source code in src/zenml/models/v2/core/secret.py
class SecretResponse(
    UserScopedResponse[
        SecretResponseBody,
        SecretResponseMetadata,
        SecretResponseResources,
    ]
):
    """Response model for secrets."""

    ANALYTICS_FIELDS: ClassVar[List[str]] = ["private"]

    name: str = Field(
        title="The name of the secret.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "SecretResponse":
        """Get the hydrated version of this secret.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_secret(self.id)

    # Body and metadata properties

    @property
    def private(self) -> bool:
        """The `private` property.

        Returns:
            the value of the property.
        """
        return self.get_body().private

    @property
    def values(self) -> Dict[str, Optional[SecretStr]]:
        """The `values` property.

        Returns:
            the value of the property.
        """
        return self.get_body().values

    # Helper methods
    @property
    def secret_values(self) -> Dict[str, str]:
        """A dictionary with all un-obfuscated values stored in this secret.

        The values are returned as strings, not SecretStr. If a value is
        None, it is not included in the returned dictionary. This is to enable
        the use of None values in the update model to indicate that a secret
        value should be deleted.

        Returns:
            A dictionary containing the secret's values.
        """
        return {
            k: v.get_secret_value()
            for k, v in self.values.items()
            if v is not None
        }

    @property
    def has_missing_values(self) -> bool:
        """Returns True if the secret has missing values (i.e. None).

        Values can be missing from a secret, for example, if the user
        retrieves a secret but does not have permission to view the secret
        values.

        Returns:
            True if the secret has any values set to None.
        """
        return any(v is None for v in self.values.values())

    def add_secret(self, key: str, value: str) -> None:
        """Adds a secret value to the secret.

        Args:
            key: The key of the secret value.
            value: The secret value.
        """
        self.get_body().values[key] = SecretStr(value)

    def remove_secret(self, key: str) -> None:
        """Removes a secret value from the secret.

        Args:
            key: The key of the secret value.
        """
        del self.get_body().values[key]

    def remove_secrets(self) -> None:
        """Removes all secret values from the secret but keep the keys."""
        self.get_body().values = {k: None for k in self.values.keys()}

    def set_secrets(self, values: Dict[str, str]) -> None:
        """Sets the secret values of the secret.

        Args:
            values: The secret values to set.
        """
        self.get_body().values = {k: SecretStr(v) for k, v in values.items()}
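
A brief hedged sketch of working with the value helpers on a fetched secret; the Client lookup helper is an assumption:

from zenml.client import Client

secret = Client().get_secret("aws_credentials")  # assumed helper
if secret.has_missing_values:
    print("Some values were withheld (e.g. insufficient permissions).")
else:
    plain_values = secret.secret_values  # un-obfuscated string values
    print(sorted(plain_values))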

has_missing_values property

Returns True if the secret has missing values (i.e. None).

Values can be missing from a secret, for example, if the user retrieves a secret but does not have permission to view the secret values.

Returns:

Type Description
bool

True if the secret has any values set to None.

private property

The private property.

Returns:

Type Description
bool

the value of the property.

secret_values property

A dictionary with all un-obfuscated values stored in this secret.

The values are returned as strings, not SecretStr. If a value is None, it is not included in the returned dictionary. This is to enable the use of None values in the update model to indicate that a secret value should be deleted.

Returns:

Type Description
Dict[str, str]

A dictionary containing the secret's values.

values property

The values property.

Returns:

Type Description
Dict[str, Optional[SecretStr]]

the value of the property.

add_secret(key, value)

Adds a secret value to the secret.

Parameters:

Name Type Description Default
key str

The key of the secret value.

required
value str

The secret value.

required
Source code in src/zenml/models/v2/core/secret.py
def add_secret(self, key: str, value: str) -> None:
    """Adds a secret value to the secret.

    Args:
        key: The key of the secret value.
        value: The secret value.
    """
    self.get_body().values[key] = SecretStr(value)

get_hydrated_version()

Get the hydrated version of this secret.

Returns:

Type Description
SecretResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/secret.py
def get_hydrated_version(self) -> "SecretResponse":
    """Get the hydrated version of this secret.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_secret(self.id)

remove_secret(key)

Removes a secret value from the secret.

Parameters:

Name Type Description Default
key str

The key of the secret value.

required
Source code in src/zenml/models/v2/core/secret.py
def remove_secret(self, key: str) -> None:
    """Removes a secret value from the secret.

    Args:
        key: The key of the secret value.
    """
    del self.get_body().values[key]

remove_secrets()

Removes all secret values from the secret but keeps the keys.

Source code in src/zenml/models/v2/core/secret.py
def remove_secrets(self) -> None:
    """Removes all secret values from the secret but keep the keys."""
    self.get_body().values = {k: None for k in self.values.keys()}

set_secrets(values)

Sets the secret values of the secret.

Parameters:

Name Type Description Default
values Dict[str, str]

The secret values to set.

required
Source code in src/zenml/models/v2/core/secret.py
def set_secrets(self, values: Dict[str, str]) -> None:
    """Sets the secret values of the secret.

    Args:
        values: The secret values to set.
    """
    self.get_body().values = {k: SecretStr(v) for k, v in values.items()}

SecretResponseBody

Bases: UserScopedResponseBody

Response body for secrets.

Source code in src/zenml/models/v2/core/secret.py
class SecretResponseBody(UserScopedResponseBody):
    """Response body for secrets."""

    private: bool = Field(
        False,
        title="Whether the secret is private. A private secret is only "
        "accessible to the user who created it.",
    )
    values: Dict[str, Optional[PlainSerializedSecretStr]] = Field(
        default_factory=dict, title="The values stored in this secret."
    )

SecretResponseMetadata

Bases: UserScopedResponseMetadata

Response metadata for secrets.

Source code in src/zenml/models/v2/core/secret.py
class SecretResponseMetadata(UserScopedResponseMetadata):
    """Response metadata for secrets."""

SecretResponseResources

Bases: UserScopedResponseResources

Response resources for secrets.

Source code in src/zenml/models/v2/core/secret.py
class SecretResponseResources(UserScopedResponseResources):
    """Response resources for secrets."""

SecretUpdate

Bases: BaseUpdate

Update model for secrets.

Source code in src/zenml/models/v2/core/secret.py
class SecretUpdate(BaseUpdate):
    """Update model for secrets."""

    ANALYTICS_FIELDS: ClassVar[List[str]] = ["private"]

    name: Optional[str] = Field(
        title="The name of the secret.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    private: Optional[bool] = Field(
        default=None,
        title="Whether the secret is private. A private secret is only "
        "accessible to the user who created it.",
    )
    values: Optional[Dict[str, Optional[PlainSerializedSecretStr]]] = Field(
        title="The values stored in this secret.",
        default=None,
    )

    def get_secret_values_update(self) -> Dict[str, Optional[str]]:
        """Returns a dictionary with the secret values to update.

        Returns:
            A dictionary with the secret values to update.
        """
        if self.values is not None:
            return {
                k: v.get_secret_value() if v is not None else None
                for k, v in self.values.items()
            }

        return {}
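
A minimal sketch of an update payload; setting a value to None marks that key for deletion, as described above (the import path is an assumption):

from zenml.models import SecretUpdate

update = SecretUpdate(
    values={
        "aws_access_key_id": "<rotated-key-id>",  # overwrite this value
        "aws_session_token": None,                # delete this key
    },
)
print(update.get_secret_values_update())
# {'aws_access_key_id': '<rotated-key-id>', 'aws_session_token': None}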

get_secret_values_update()

Returns a dictionary with the secret values to update.

Returns:

Type Description
Dict[str, Optional[str]]

A dictionary with the secret values to update.

Source code in src/zenml/models/v2/core/secret.py
def get_secret_values_update(self) -> Dict[str, Optional[str]]:
    """Returns a dictionary with the secret values to update.

    Returns:
        A dictionary with the secret values to update.
    """
    if self.values is not None:
        return {
            k: v.get_secret_value() if v is not None else None
            for k, v in self.values.items()
        }

    return {}

ServerActivationRequest

Bases: ServerSettingsUpdate

Model for activating the server.

Source code in src/zenml/models/v2/core/server_settings.py
class ServerActivationRequest(ServerSettingsUpdate):
    """Model for activating the server."""

    admin_username: Optional[str] = Field(
        default=None,
        title="The username of the default admin account to create. Leave "
        "empty to skip creating the default admin account.",
    )

    admin_password: Optional[str] = Field(
        default=None,
        title="The password of the default admin account to create. Leave "
        "empty to skip creating the default admin account.",
    )

ServerDatabaseType

Bases: StrEnum

Enum for server database types.

Source code in src/zenml/models/v2/misc/server_models.py
class ServerDatabaseType(StrEnum):
    """Enum for server database types."""

    SQLITE = "sqlite"
    MYSQL = "mysql"
    OTHER = "other"

ServerDeploymentType

Bases: StrEnum

Enum for server deployment types.

Source code in src/zenml/models/v2/misc/server_models.py
class ServerDeploymentType(StrEnum):
    """Enum for server deployment types."""

    LOCAL = "local"
    DOCKER = "docker"
    KUBERNETES = "kubernetes"
    AWS = "aws"
    GCP = "gcp"
    AZURE = "azure"
    ALPHA = "alpha"
    OTHER = "other"
    HF_SPACES = "hf_spaces"
    SANDBOX = "sandbox"
    CLOUD = "cloud"

ServerLoadInfo

Bases: BaseModel

Domain model for ZenML server load information.

Source code in src/zenml/models/v2/misc/server_models.py
class ServerLoadInfo(BaseModel):
    """Domain model for ZenML server load information."""

    threads: int = Field(
        title="Number of threads that the server is currently using."
    )

    db_connections_total: int = Field(
        title="Total number of database connections (active and idle) that the "
        "server currently has established."
    )

    db_connections_active: int = Field(
        title="Number of database connections that the server is currently "
        "actively using to make queries or transactions."
    )

    db_connections_overflow: int = Field(
        title="Number of overflow database connections that the server is "
        "currently actively using to make queries or transactions."
    )

ServerModel

Bases: BaseModel

Domain model for ZenML servers.

Source code in src/zenml/models/v2/misc/server_models.py
class ServerModel(BaseModel):
    """Domain model for ZenML servers."""

    id: UUID = Field(default_factory=uuid4, title="The unique server id.")

    name: Optional[str] = Field(None, title="The name of the ZenML server.")

    version: str = Field(
        title="The ZenML version that the server is running.",
    )

    active: bool = Field(
        True, title="Flag to indicate whether the server is active."
    )

    debug: bool = Field(
        False, title="Flag to indicate whether ZenML is running on debug mode."
    )

    deployment_type: ServerDeploymentType = Field(
        ServerDeploymentType.OTHER,
        title="The ZenML server deployment type.",
    )
    database_type: ServerDatabaseType = Field(
        ServerDatabaseType.OTHER,
        title="The database type that the server is using.",
    )
    secrets_store_type: SecretsStoreType = Field(
        SecretsStoreType.NONE,
        title="The type of secrets store that the server is using.",
    )
    auth_scheme: AuthScheme = Field(
        title="The authentication scheme that the server is using.",
    )
    server_url: str = Field(
        "",
        title="The URL where the ZenML server API is reachable. If not "
        "specified, the clients will use the same URL used to connect them to "
        "the ZenML server.",
    )
    dashboard_url: str = Field(
        "",
        title="The URL where the ZenML dashboard is reachable. If "
        "not specified, the `server_url` value will be used instead.",
    )
    analytics_enabled: bool = Field(
        default=True,  # We set a default for migrations from < 0.57.0
        title="Enable server-side analytics.",
    )

    metadata: Dict[str, str] = Field(
        {},
        title="The metadata associated with the server.",
    )

    last_user_activity: Optional[datetime] = Field(
        None,
        title="Timestamp of latest user activity traced on the server.",
    )

    pro_dashboard_url: Optional[str] = Field(
        None,
        title="The base URL of the ZenML Pro dashboard to which the server "
        "is connected. Only set if the server is a ZenML Pro server.",
    )

    pro_api_url: Optional[str] = Field(
        None,
        title="The base URL of the ZenML Pro API to which the server is "
        "connected. Only set if the server is a ZenML Pro server.",
    )

    pro_organization_id: Optional[UUID] = Field(
        None,
        title="The ID of the ZenML Pro organization to which the server is "
        "connected. Only set if the server is a ZenML Pro server.",
    )

    pro_organization_name: Optional[str] = Field(
        None,
        title="The name of the ZenML Pro organization to which the server is "
        "connected. Only set if the server is a ZenML Pro server.",
    )

    pro_workspace_id: Optional[UUID] = Field(
        None,
        title="The ID of the ZenML Pro workspace to which the server is "
        "connected. Only set if the server is a ZenML Pro server.",
    )

    pro_workspace_name: Optional[str] = Field(
        None,
        title="The name of the ZenML Pro workspace to which the server is "
        "connected. Only set if the server is a ZenML Pro server.",
    )

    def is_local(self) -> bool:
        """Return whether the server is running locally.

        Returns:
            True if the server is running locally, False otherwise.
        """
        from zenml.config.global_config import GlobalConfiguration

        # Local ZenML servers are identifiable by the fact that their
        # server ID is the same as the local client (user) ID.
        return self.id == GlobalConfiguration().user_id

    def is_pro_server(self) -> bool:
        """Return whether the server is a ZenML Pro server.

        Returns:
            True if the server is a ZenML Pro server, False otherwise.
        """
        return self.deployment_type == ServerDeploymentType.CLOUD
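
For illustration, a hedged sketch of obtaining this model from a connected client and using the helper methods; the zen_store accessor is an assumption:

from zenml.client import Client

server = Client().zen_store.get_store_info()  # assumed accessor
print(server.version, server.deployment_type)
if server.is_local():
    print("Running against a local ZenML server.")
if server.is_pro_server():
    print(f"Connected to ZenML Pro organization: {server.pro_organization_name}")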

is_local()

Return whether the server is running locally.

Returns:

Type Description
bool

True if the server is running locally, False otherwise.

Source code in src/zenml/models/v2/misc/server_models.py
def is_local(self) -> bool:
    """Return whether the server is running locally.

    Returns:
        True if the server is running locally, False otherwise.
    """
    from zenml.config.global_config import GlobalConfiguration

    # Local ZenML servers are identifiable by the fact that their
    # server ID is the same as the local client (user) ID.
    return self.id == GlobalConfiguration().user_id

is_pro_server()

Return whether the server is a ZenML Pro server.

Returns:

Type Description
bool

True if the server is a ZenML Pro server, False otherwise.

Source code in src/zenml/models/v2/misc/server_models.py
def is_pro_server(self) -> bool:
    """Return whether the server is a ZenML Pro server.

    Returns:
        True if the server is a ZenML Pro server, False otherwise.
    """
    return self.deployment_type == ServerDeploymentType.CLOUD
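
For orientation, a minimal usage sketch of the two helpers above. The `Client().zen_store.get_store_info()` accessor is an assumption used here to obtain a server model; the helper calls themselves come from the source shown.

from zenml.client import Client

# Assumed accessor for the server model; `is_local` / `is_pro_server` are from the source above.
server_info = Client().zen_store.get_store_info()

if server_info.is_local():
    print("Connected to a local ZenML server.")
elif server_info.is_pro_server():
    print(f"Connected to a ZenML Pro server: {server_info.pro_dashboard_url}")
else:
    print(f"Connected to a self-hosted server at {server_info.server_url}")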

ServerSettingsResponse

Bases: BaseResponse[ServerSettingsResponseBody, ServerSettingsResponseMetadata, ServerSettingsResponseResources]

Response model for server settings.

Source code in src/zenml/models/v2/core/server_settings.py
class ServerSettingsResponse(
    BaseResponse[
        ServerSettingsResponseBody,
        ServerSettingsResponseMetadata,
        ServerSettingsResponseResources,
    ]
):
    """Response model for server settings."""

    def get_hydrated_version(self) -> "ServerSettingsResponse":
        """Get the hydrated version of the server settings.

        Returns:
            An instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_server_settings(hydrate=True)

    # Body and metadata properties

    @property
    def server_id(self) -> UUID:
        """The `server_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().server_id

    @property
    def server_name(self) -> str:
        """The `server_name` property.

        Returns:
            the value of the property.
        """
        return self.get_body().server_name

    @property
    def logo_url(self) -> Optional[str]:
        """The `logo_url` property.

        Returns:
            the value of the property.
        """
        return self.get_body().logo_url

    @property
    def enable_analytics(self) -> bool:
        """The `enable_analytics` property.

        Returns:
            the value of the property.
        """
        return self.get_body().enable_analytics

    @property
    def display_announcements(self) -> Optional[bool]:
        """The `display_announcements` property.

        Returns:
            the value of the property.
        """
        return self.get_body().display_announcements

    @property
    def display_updates(self) -> Optional[bool]:
        """The `display_updates` property.

        Returns:
            the value of the property.
        """
        return self.get_body().display_updates

    @property
    def active(self) -> bool:
        """The `active` property.

        Returns:
            the value of the property.
        """
        return self.get_body().active

    @property
    def last_user_activity(self) -> datetime:
        """The `last_user_activity` property.

        Returns:
            the value of the property.
        """
        return self.get_body().last_user_activity

    @property
    def updated(self) -> datetime:
        """The `updated` property.

        Returns:
            the value of the property.
        """
        return self.get_body().updated

active property

The active property.

Returns:

Type Description
bool

the value of the property.

display_announcements property

The display_announcements property.

Returns:

Type Description
Optional[bool]

the value of the property.

display_updates property

The display_updates property.

Returns:

Type Description
Optional[bool]

the value of the property.

enable_analytics property

The enable_analytics property.

Returns:

Type Description
bool

the value of the property.

last_user_activity property

The last_user_activity property.

Returns:

Type Description
datetime

the value of the property.

logo_url property

The logo_url property.

Returns:

Type Description
Optional[str]

the value of the property.

server_id property

The server_id property.

Returns:

Type Description
UUID

the value of the property.

server_name property

The server_name property.

Returns:

Type Description
str

the value of the property.

updated property

The updated property.

Returns:

Type Description
datetime

the value of the property.

get_hydrated_version()

Get the hydrated version of the server settings.

Returns:

Type Description
ServerSettingsResponse

An instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/server_settings.py
def get_hydrated_version(self) -> "ServerSettingsResponse":
    """Get the hydrated version of the server settings.

    Returns:
        An instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_server_settings(hydrate=True)
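
As a quick sketch, the hydrated settings can be read through the client. Only the `get_server_settings(hydrate=True)` call is taken from the source above; the printed attributes are the body properties listed earlier.

from zenml.client import Client

settings = Client().zen_store.get_server_settings(hydrate=True)

# Each property below is resolved from the response body via `get_body()`.
print(settings.server_name)
print(settings.enable_analytics)
print(settings.last_user_activity)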

ServerSettingsResponseBody

Bases: BaseResponseBody

Response body for server settings.

Source code in src/zenml/models/v2/core/server_settings.py
class ServerSettingsResponseBody(BaseResponseBody):
    """Response body for server settings."""

    server_id: UUID = Field(
        title="The unique server id.",
    )
    server_name: str = Field(title="The name of the server.")
    logo_url: Optional[str] = Field(
        default=None, title="The logo URL of the server."
    )
    active: bool = Field(
        title="Whether the server has been activated or not.",
    )
    enable_analytics: bool = Field(
        title="Whether analytics are enabled for the server.",
    )
    display_announcements: Optional[bool] = Field(
        title="Whether to display announcements about ZenML in the dashboard.",
    )
    display_updates: Optional[bool] = Field(
        title="Whether to display notifications about ZenML updates in the dashboard.",
    )
    last_user_activity: datetime = Field(
        title="The timestamp when the last user activity was detected.",
    )
    updated: datetime = Field(
        title="The timestamp when this resource was last updated."
    )

ServerSettingsResponseMetadata

Bases: BaseResponseMetadata

Response metadata for server settings.

Source code in src/zenml/models/v2/core/server_settings.py
class ServerSettingsResponseMetadata(BaseResponseMetadata):
    """Response metadata for server settings."""

ServerSettingsResponseResources

Bases: BaseResponseResources

Response resources for server settings.

Source code in src/zenml/models/v2/core/server_settings.py
class ServerSettingsResponseResources(BaseResponseResources):
    """Response resources for server settings."""

ServerSettingsUpdate

Bases: BaseUpdate

Model for updating server settings.

Source code in src/zenml/models/v2/core/server_settings.py
class ServerSettingsUpdate(BaseUpdate):
    """Model for updating server settings."""

    server_name: Optional[str] = Field(
        default=None, title="The name of the server."
    )
    logo_url: Optional[str] = Field(
        default=None, title="The logo URL of the server."
    )
    enable_analytics: Optional[bool] = Field(
        default=None,
        title="Whether to enable analytics for the server.",
    )
    display_announcements: Optional[bool] = Field(
        default=None,
        title="Whether to display announcements about ZenML in the dashboard.",
    )
    display_updates: Optional[bool] = Field(
        default=None,
        title="Whether to display notifications about ZenML updates in the dashboard.",
    )
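
A hedged sketch of how an update model might be applied: the `update_server_settings` store method and the field values are illustrative assumptions, while the field names come from the model above.

from zenml.client import Client
from zenml.models import ServerSettingsUpdate

# Hypothetical example: rename the server and silence update notifications.
update = ServerSettingsUpdate(
    server_name="staging-zenml",
    display_updates=False,
)
Client().zen_store.update_server_settings(update)  # assumed store method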

ServerStatistics

Bases: BaseZenModel

Server statistics.

Source code in src/zenml/models/v2/misc/statistics.py
class ServerStatistics(BaseZenModel):
    """Server statistics."""

    stacks: int = Field(
        title="The number of stacks.",
    )
    components: int = Field(
        title="The number of components.",
    )
    projects: int = Field(
        title="The number of projects.",
    )

ServiceAccountFilter

Bases: BaseFilter

Model to enable advanced filtering of service accounts.

Source code in src/zenml/models/v2/core/service_account.py
class ServiceAccountFilter(BaseFilter):
    """Model to enable advanced filtering of service accounts."""

    name: Optional[str] = Field(
        default=None,
        description="Name of the user",
    )
    description: Optional[str] = Field(
        default=None,
        title="Filter by the service account description.",
    )
    active: Optional[Union[bool, str]] = Field(
        default=None,
        description="Whether the user is active",
        union_mode="left_to_right",
    )

    def apply_filter(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Override to filter out user accounts from the query.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        query = super().apply_filter(query=query, table=table)
        query = query.where(
            getattr(table, "is_service_account") == True  # noqa: E712
        )

        return query

apply_filter(query, table)

Override to filter out user accounts from the query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the filter.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with filter applied.

Source code in src/zenml/models/v2/core/service_account.py
def apply_filter(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Override to filter out user accounts from the query.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    query = super().apply_filter(query=query, table=table)
    query = query.where(
        getattr(table, "is_service_account") == True  # noqa: E712
    )

    return query
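
A small sketch of building the filter. The `contains:` operator and the `list_service_accounts` store call are assumptions for illustration; the field names match the model above, and user accounts are excluded automatically by `apply_filter`.

from zenml.client import Client
from zenml.models import ServiceAccountFilter

# Hypothetical: only active service accounts whose name contains "ci".
sa_filter = ServiceAccountFilter(name="contains:ci", active=True)
accounts = Client().zen_store.list_service_accounts(filter_model=sa_filter)
for account in accounts.items:
    print(account.name)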

ServiceAccountRequest

Bases: BaseRequest

Request model for service accounts.

Source code in src/zenml/models/v2/core/service_account.py
class ServiceAccountRequest(BaseRequest):
    """Request model for service accounts."""

    ANALYTICS_FIELDS: ClassVar[List[str]] = [
        "name",
        "active",
    ]

    name: str = Field(
        title="The unique name for the service account.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    description: Optional[str] = Field(
        default=None,
        title="A description of the service account.",
        max_length=TEXT_FIELD_MAX_LENGTH,
    )
    active: bool = Field(title="Whether the service account is active or not.")
    model_config = ConfigDict(validate_assignment=True, extra="ignore")

ServiceAccountResponse

Bases: BaseIdentifiedResponse[ServiceAccountResponseBody, ServiceAccountResponseMetadata, ServiceAccountResponseResources]

Response model for service accounts.

Source code in src/zenml/models/v2/core/service_account.py
class ServiceAccountResponse(
    BaseIdentifiedResponse[
        ServiceAccountResponseBody,
        ServiceAccountResponseMetadata,
        ServiceAccountResponseResources,
    ]
):
    """Response model for service accounts."""

    ANALYTICS_FIELDS: ClassVar[List[str]] = [
        "name",
        "active",
    ]

    name: str = Field(
        title="The unique username for the account.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "ServiceAccountResponse":
        """Get the hydrated version of this service account.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_service_account(self.id)

    def to_user_model(self) -> "UserResponse":
        """Converts the service account to a user model.

        For now, a lot of code still relies on the active user and resource
        owners being a UserResponse object, which is a superset of the
        ServiceAccountResponse object. We need this method to convert the
        service account to a user.

        Returns:
            The user model.
        """
        from zenml.models.v2.core.user import (
            UserResponse,
            UserResponseBody,
            UserResponseMetadata,
        )

        return UserResponse(
            id=self.id,
            name=self.name,
            body=UserResponseBody(
                active=self.active,
                is_service_account=True,
                email_opted_in=False,
                created=self.created,
                updated=self.updated,
                is_admin=False,
            ),
            metadata=UserResponseMetadata(
                description=self.description,
            ),
        )

    # Body and metadata properties
    @property
    def active(self) -> bool:
        """The `active` property.

        Returns:
            the value of the property.
        """
        return self.get_body().active

    @property
    def description(self) -> str:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

active property

The active property.

Returns:

Type Description
bool

the value of the property.

description property

The description property.

Returns:

Type Description
str

the value of the property.

get_hydrated_version()

Get the hydrated version of this service account.

Returns:

Type Description
ServiceAccountResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/service_account.py
def get_hydrated_version(self) -> "ServiceAccountResponse":
    """Get the hydrated version of this service account.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_service_account(self.id)

to_user_model()

Converts the service account to a user model.

For now, a lot of code still relies on the active user and resource owners being a UserResponse object, which is a superset of the ServiceAccountResponse object. We need this method to convert the service account to a user.

Returns:

Type Description
UserResponse

The user model.

Source code in src/zenml/models/v2/core/service_account.py
def to_user_model(self) -> "UserResponse":
    """Converts the service account to a user model.

    For now, a lot of code still relies on the active user and resource
    owners being a UserResponse object, which is a superset of the
    ServiceAccountResponse object. We need this method to convert the
    service account to a user.

    Returns:
        The user model.
    """
    from zenml.models.v2.core.user import (
        UserResponse,
        UserResponseBody,
        UserResponseMetadata,
    )

    return UserResponse(
        id=self.id,
        name=self.name,
        body=UserResponseBody(
            active=self.active,
            is_service_account=True,
            email_opted_in=False,
            created=self.created,
            updated=self.updated,
            is_admin=False,
        ),
        metadata=UserResponseMetadata(
            description=self.description,
        ),
    )

ServiceAccountResponseBody

Bases: BaseDatedResponseBody

Response body for service accounts.

Source code in src/zenml/models/v2/core/service_account.py
class ServiceAccountResponseBody(BaseDatedResponseBody):
    """Response body for service accounts."""

    active: bool = Field(default=False, title="Whether the account is active.")

ServiceAccountResponseMetadata

Bases: BaseResponseMetadata

Response metadata for service accounts.

Source code in src/zenml/models/v2/core/service_account.py
class ServiceAccountResponseMetadata(BaseResponseMetadata):
    """Response metadata for service accounts."""

    description: str = Field(
        default="",
        title="A description of the service account.",
        max_length=TEXT_FIELD_MAX_LENGTH,
    )

ServiceAccountUpdate

Bases: BaseUpdate

Update model for service accounts.

Source code in src/zenml/models/v2/core/service_account.py
class ServiceAccountUpdate(BaseUpdate):
    """Update model for service accounts."""

    ANALYTICS_FIELDS: ClassVar[List[str]] = ["name", "active"]

    name: Optional[str] = Field(
        title="The unique name for the service account.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    description: Optional[str] = Field(
        title="A description of the service account.",
        max_length=TEXT_FIELD_MAX_LENGTH,
        default=None,
    )
    active: Optional[bool] = Field(
        title="Whether the service account is active or not.",
        default=None,
    )

    model_config = ConfigDict(validate_assignment=True)

ServiceConnectorFilter

Bases: UserScopedFilter

Model to enable advanced filtering of service connectors.

Source code in src/zenml/models/v2/core/service_connector.py
class ServiceConnectorFilter(UserScopedFilter):
    """Model to enable advanced filtering of service connectors."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *UserScopedFilter.FILTER_EXCLUDE_FIELDS,
        "resource_type",
        "labels_str",
        "labels",
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *UserScopedFilter.CLI_EXCLUDE_FIELDS,
        "labels_str",
        "labels",
    ]
    name: Optional[str] = Field(
        default=None,
        description="The name to filter by",
    )
    connector_type: Optional[str] = Field(
        default=None,
        description="The type of service connector to filter by",
    )
    auth_method: Optional[str] = Field(
        default=None,
        title="Filter by the authentication method configured for the "
        "connector",
    )
    resource_type: Optional[str] = Field(
        default=None,
        title="Filter by the type of resource that the connector can be used "
        "to access",
    )
    resource_id: Optional[str] = Field(
        default=None,
        title="Filter by the ID of the resource instance that the connector "
        "is configured to access",
    )
    labels_str: Optional[str] = Field(
        default=None,
        title="Filter by one or more labels. This field can be either a JSON "
        "formatted dictionary of label names and values, where the values are "
        'optional and can be set to None (e.g. `{"label1":"value1", "label2": '
        "null}` ), or a comma-separated list of label names and values (e.g "
        "`label1=value1,label2=`. If a label name is specified without a "
        "value, the filter will match all service connectors that have that "
        "label present, regardless of value.",
    )
    secret_id: Optional[Union[UUID, str]] = Field(
        default=None,
        title="Filter by the ID of the secret that contains the service "
        "connector's credentials",
        union_mode="left_to_right",
    )

    # Use this internally to configure and access the labels as a dictionary
    labels: Optional[Dict[str, Optional[str]]] = Field(
        default=None,
        title="The labels to filter by, as a dictionary",
        exclude=True,
    )

    @model_validator(mode="after")
    def validate_labels(self) -> "ServiceConnectorFilter":
        """Parse the labels string into a label dictionary and vice-versa.

        Returns:
            The validated values.
        """
        if self.labels_str is not None:
            try:
                self.labels = json.loads(self.labels_str)
            except json.JSONDecodeError:
                # Interpret as comma-separated values instead
                self.labels = {
                    label.split("=", 1)[0]: label.split("=", 1)[1]
                    if "=" in label
                    else None
                    for label in self.labels_str.split(",")
                }
        elif self.labels is not None:
            self.labels_str = json.dumps(self.labels)

        return self

validate_labels()

Parse the labels string into a label dictionary and vice-versa.

Returns:

Type Description
ServiceConnectorFilter

The validated values.

Source code in src/zenml/models/v2/core/service_connector.py
@model_validator(mode="after")
def validate_labels(self) -> "ServiceConnectorFilter":
    """Parse the labels string into a label dictionary and vice-versa.

    Returns:
        The validated values.
    """
    if self.labels_str is not None:
        try:
            self.labels = json.loads(self.labels_str)
        except json.JSONDecodeError:
            # Interpret as comma-separated values instead
            self.labels = {
                label.split("=", 1)[0]: label.split("=", 1)[1]
                if "=" in label
                else None
                for label in self.labels_str.split(",")
            }
    elif self.labels is not None:
        self.labels_str = json.dumps(self.labels)

    return self
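
Based only on the parsing logic shown above, labels can be supplied either as a JSON dictionary or as a comma-separated list; the two filters below end up with the same `labels` dictionary (the import path is assumed).

from zenml.models import ServiceConnectorFilter

# JSON form: a null value matches any value of the label.
f1 = ServiceConnectorFilter(labels_str='{"env": "prod", "team": null}')

# Comma-separated form: a name without "=" also matches any value.
f2 = ServiceConnectorFilter(labels_str="env=prod,team")

assert f1.labels == f2.labels == {"env": "prod", "team": None}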

ServiceConnectorInfo

Bases: BaseModel

Information about the service connector when creating a full stack.

Source code in src/zenml/models/v2/misc/info_models.py
class ServiceConnectorInfo(BaseModel):
    """Information about the service connector when creating a full stack."""

    type: str
    auth_method: str
    configuration: Dict[str, Any] = {}

ServiceConnectorRequest

Bases: UserScopedRequest

Request model for service connectors.

Source code in src/zenml/models/v2/core/service_connector.py
class ServiceConnectorRequest(UserScopedRequest):
    """Request model for service connectors."""

    name: str = Field(
        title="The service connector name.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    connector_type: Union[str, "ServiceConnectorTypeModel"] = Field(
        title="The type of service connector.",
        union_mode="left_to_right",
    )
    description: str = Field(
        default="",
        title="The service connector instance description.",
    )
    auth_method: str = Field(
        title="The authentication method that the connector instance uses to "
        "access the resources.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    resource_types: List[str] = Field(
        default_factory=list,
        title="The type(s) of resource that the connector instance can be used "
        "to gain access to.",
    )
    resource_id: Optional[str] = Field(
        default=None,
        title="Uniquely identifies a specific resource instance that the "
        "connector instance can be used to access. If omitted, the connector "
        "instance can be used to access any and all resource instances that "
        "the authentication method and resource type(s) allow.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    supports_instances: bool = Field(
        default=False,
        title="Indicates whether the connector instance can be used to access "
        "multiple instances of the configured resource type.",
    )
    expires_at: Optional[datetime] = Field(
        default=None,
        title="Time when the authentication credentials configured for the "
        "connector expire. If omitted, the credentials do not expire.",
    )
    expires_skew_tolerance: Optional[int] = Field(
        default=None,
        title="The number of seconds of tolerance to apply when checking "
        "whether the authentication credentials configured for the connector "
        "have expired. If omitted, no tolerance is applied.",
    )
    expiration_seconds: Optional[int] = Field(
        default=None,
        title="The duration, in seconds, that the temporary credentials "
        "generated by this connector should remain valid. Only applicable for "
        "connectors and authentication methods that involve generating "
        "temporary credentials from the ones configured in the connector.",
    )
    configuration: Dict[str, Any] = Field(
        default_factory=dict,
        title="The service connector configuration, not including secrets.",
    )
    secrets: Dict[str, Optional[PlainSerializedSecretStr]] = Field(
        default_factory=dict,
        title="The service connector secrets.",
    )
    labels: Dict[str, str] = Field(
        default_factory=dict,
        title="Service connector labels.",
    )

    # Analytics
    ANALYTICS_FIELDS: ClassVar[List[str]] = [
        "connector_type",
        "auth_method",
        "resource_types",
    ]

    def get_analytics_metadata(self) -> Dict[str, Any]:
        """Format the resource types in the analytics metadata.

        Returns:
            Dict of analytics metadata.
        """
        metadata = super().get_analytics_metadata()
        if len(self.resource_types) == 1:
            metadata["resource_types"] = self.resource_types[0]
        else:
            metadata["resource_types"] = ", ".join(self.resource_types)
        metadata["connector_type"] = self.type
        return metadata

    # Helper methods
    @property
    def type(self) -> str:
        """Get the connector type.

        Returns:
            The connector type.
        """
        if isinstance(self.connector_type, str):
            return self.connector_type
        return self.connector_type.connector_type

    @property
    def emojified_connector_type(self) -> str:
        """Get the emojified connector type.

        Returns:
            The emojified connector type.
        """
        if not isinstance(self.connector_type, str):
            return self.connector_type.emojified_connector_type

        return self.connector_type

    @property
    def emojified_resource_types(self) -> List[str]:
        """Get the emojified connector type.

        Returns:
            The emojified connector type.
        """
        if not isinstance(self.connector_type, str):
            return [
                self.connector_type.resource_type_dict[
                    resource_type
                ].emojified_resource_type
                for resource_type in self.resource_types
            ]

        return self.resource_types

    def validate_and_configure_resources(
        self,
        connector_type: "ServiceConnectorTypeModel",
        resource_types: Optional[Union[str, List[str]]] = None,
        resource_id: Optional[str] = None,
        configuration: Optional[Dict[str, Any]] = None,
        secrets: Optional[Dict[str, Optional[SecretStr]]] = None,
    ) -> None:
        """Validate and configure the resources that the connector can be used to access.

        Args:
            connector_type: The connector type specification used to validate
                the connector configuration.
            resource_types: The type(s) of resource that the connector instance
                can be used to access. If omitted, a multi-type connector is
                configured.
            resource_id: Uniquely identifies a specific resource instance that
                the connector instance can be used to access.
            configuration: The connector configuration.
            secrets: The connector secrets.
        """
        _validate_and_configure_resources(
            connector=self,
            connector_type=connector_type,
            resource_types=resource_types,
            resource_id=resource_id,
            configuration=configuration,
            secrets=secrets,
        )

emojified_connector_type property

Get the emojified connector type.

Returns:

Type Description
str

The emojified connector type.

emojified_resource_types property

Get the emojified resource types.

Returns:

Type Description
List[str]

The emojified resource types.

type property

Get the connector type.

Returns:

Type Description
str

The connector type.

get_analytics_metadata()

Format the resource types in the analytics metadata.

Returns:

Type Description
Dict[str, Any]

Dict of analytics metadata.

Source code in src/zenml/models/v2/core/service_connector.py
def get_analytics_metadata(self) -> Dict[str, Any]:
    """Format the resource types in the analytics metadata.

    Returns:
        Dict of analytics metadata.
    """
    metadata = super().get_analytics_metadata()
    if len(self.resource_types) == 1:
        metadata["resource_types"] = self.resource_types[0]
    else:
        metadata["resource_types"] = ", ".join(self.resource_types)
    metadata["connector_type"] = self.type
    return metadata

validate_and_configure_resources(connector_type, resource_types=None, resource_id=None, configuration=None, secrets=None)

Validate and configure the resources that the connector can be used to access.

Parameters:

Name Type Description Default
connector_type ServiceConnectorTypeModel

The connector type specification used to validate the connector configuration.

required
resource_types Optional[Union[str, List[str]]]

The type(s) of resource that the connector instance can be used to access. If omitted, a multi-type connector is configured.

None
resource_id Optional[str]

Uniquely identifies a specific resource instance that the connector instance can be used to access.

None
configuration Optional[Dict[str, Any]]

The connector configuration.

None
secrets Optional[Dict[str, Optional[SecretStr]]]

The connector secrets.

None
Source code in src/zenml/models/v2/core/service_connector.py
def validate_and_configure_resources(
    self,
    connector_type: "ServiceConnectorTypeModel",
    resource_types: Optional[Union[str, List[str]]] = None,
    resource_id: Optional[str] = None,
    configuration: Optional[Dict[str, Any]] = None,
    secrets: Optional[Dict[str, Optional[SecretStr]]] = None,
) -> None:
    """Validate and configure the resources that the connector can be used to access.

    Args:
        connector_type: The connector type specification used to validate
            the connector configuration.
        resource_types: The type(s) of resource that the connector instance
            can be used to access. If omitted, a multi-type connector is
            configured.
        resource_id: Uniquely identifies a specific resource instance that
            the connector instance can be used to access.
        configuration: The connector configuration.
        secrets: The connector secrets.
    """
    _validate_and_configure_resources(
        connector=self,
        connector_type=connector_type,
        resource_types=resource_types,
        resource_id=resource_id,
        configuration=configuration,
        secrets=secrets,
    )

ServiceConnectorRequirements

Bases: BaseModel

Service connector requirements.

Describes requirements that a service connector consumer has for a service connector instance that it needs in order to access a resource.

Attributes:

Name Type Description
connector_type Optional[str]

The type of service connector that is required. If omitted, any service connector type can be used.

resource_type str

The type of resource that the service connector instance must be able to access.

resource_id_attr Optional[str]

The name of an attribute in the stack component configuration that contains the resource ID of the resource that the service connector instance must be able to access.

Source code in src/zenml/models/v2/misc/service_connector_type.py
class ServiceConnectorRequirements(BaseModel):
    """Service connector requirements.

    Describes requirements that a service connector consumer has for a
    service connector instance that it needs in order to access a resource.

    Attributes:
        connector_type: The type of service connector that is required. If
            omitted, any service connector type can be used.
        resource_type: The type of resource that the service connector instance
            must be able to access.
        resource_id_attr: The name of an attribute in the stack component
            configuration that contains the resource ID of the resource that
            the service connector instance must be able to access.
    """

    connector_type: Optional[str] = None
    resource_type: str
    resource_id_attr: Optional[str] = None

    def is_satisfied_by(
        self,
        connector: Union[
            "ServiceConnectorResponse", "ServiceConnectorRequest"
        ],
        component: Union["ComponentResponse", "ComponentBase"],
    ) -> Tuple[bool, str]:
        """Check if the requirements are satisfied by a connector.

        Args:
            connector: The connector to check.
            component: The stack component that the connector is associated
                with.

        Returns:
            True if the requirements are satisfied, False otherwise, and a
            message describing the reason for the failure.
        """
        if self.connector_type and self.connector_type != connector.type:
            return (
                False,
                f"connector type '{connector.type}' does not match the "
                f"'{self.connector_type}' connector type specified in the "
                "stack component requirements",
            )
        if self.resource_type not in connector.resource_types:
            return False, (
                f"connector does not provide the '{self.resource_type}' "
                "resource type specified in the stack component requirements. "
                "Only the following resource types are supported: "
                f"{', '.join(connector.resource_types)}"
            )
        if self.resource_id_attr:
            resource_id = component.configuration.get(self.resource_id_attr)
            if not resource_id:
                return (
                    False,
                    f"the '{self.resource_id_attr}' stack component "
                    f"configuration attribute plays the role of resource "
                    f"identifier, but the stack component does not contain a "
                    f"'{self.resource_id_attr}' attribute. Please add the "
                    f"'{self.resource_id_attr}' attribute to the stack "
                    "component configuration and try again.",
                )

        return True, ""

is_satisfied_by(connector, component)

Check if the requirements are satisfied by a connector.

Parameters:

Name Type Description Default
connector Union[ServiceConnectorResponse, ServiceConnectorRequest]

The connector to check.

required
component Union[ComponentResponse, ComponentBase]

The stack component that the connector is associated with.

required

Returns:

Type Description
bool

True if the requirements are satisfied, False otherwise, and a

str

message describing the reason for the failure.

Source code in src/zenml/models/v2/misc/service_connector_type.py
def is_satisfied_by(
    self,
    connector: Union[
        "ServiceConnectorResponse", "ServiceConnectorRequest"
    ],
    component: Union["ComponentResponse", "ComponentBase"],
) -> Tuple[bool, str]:
    """Check if the requirements are satisfied by a connector.

    Args:
        connector: The connector to check.
        component: The stack component that the connector is associated
            with.

    Returns:
        True if the requirements are satisfied, False otherwise, and a
        message describing the reason for the failure.
    """
    if self.connector_type and self.connector_type != connector.type:
        return (
            False,
            f"connector type '{connector.type}' does not match the "
            f"'{self.connector_type}' connector type specified in the "
            "stack component requirements",
        )
    if self.resource_type not in connector.resource_types:
        return False, (
            f"connector does not provide the '{self.resource_type}' "
            "resource type specified in the stack component requirements. "
            "Only the following resource types are supported: "
            f"{', '.join(connector.resource_types)}"
        )
    if self.resource_id_attr:
        resource_id = component.configuration.get(self.resource_id_attr)
        if not resource_id:
            return (
                False,
                f"the '{self.resource_id_attr}' stack component "
                f"configuration attribute plays the role of resource "
                f"identifier, but the stack component does not contain a "
                f"'{self.resource_id_attr}' attribute. Please add the "
                f"'{self.resource_id_attr}' attribute to the stack "
                "component configuration and try again.",
            )

    return True, ""

ServiceConnectorResourcesInfo

Bases: BaseModel

Information about the service connector resources needed for CLI and UI.

Source code in src/zenml/models/v2/misc/info_models.py
class ServiceConnectorResourcesInfo(BaseModel):
    """Information about the service connector resources needed for CLI and UI."""

    connector_type: str

    components_resources_info: Dict[StackComponentType, List[ResourcesInfo]]

ServiceConnectorResourcesModel

Bases: BaseModel

Service connector resources list.

Lists the resource types and resource instances that a service connector can provide access to.

Source code in src/zenml/models/v2/misc/service_connector_type.py
class ServiceConnectorResourcesModel(BaseModel):
    """Service connector resources list.

    Lists the resource types and resource instances that a service connector
    can provide access to.
    """

    id: Optional[UUID] = Field(
        default=None,
        title="The ID of the service connector instance providing this "
        "resource.",
    )

    name: Optional[str] = Field(
        default=None,
        title="The name of the service connector instance providing this "
        "resource.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    connector_type: Union[str, "ServiceConnectorTypeModel"] = Field(
        title="The type of service connector.", union_mode="left_to_right"
    )

    resources: List[ServiceConnectorTypedResourcesModel] = Field(
        default_factory=list,
        title="The list of resources that the service connector instance can "
        "give access to. Contains one entry for every resource type "
        "that the connector is configured for.",
    )

    error: Optional[str] = Field(
        default=None,
        title="A global error message describing why the service connector "
        "instance could not authenticate to the remote service.",
    )

    @property
    def resources_dict(self) -> Dict[str, ServiceConnectorTypedResourcesModel]:
        """Get the resources as a dictionary indexed by resource type.

        Returns:
            The resources as a dictionary indexed by resource type.
        """
        return {
            resource.resource_type: resource for resource in self.resources
        }

    @property
    def resource_types(self) -> List[str]:
        """Get the resource types.

        Returns:
            The resource types.
        """
        return [resource.resource_type for resource in self.resources]

    def set_error(
        self, error: str, resource_type: Optional[str] = None
    ) -> None:
        """Set a global error message or an error for a single resource type.

        Args:
            error: The error message.
            resource_type: The resource type to set the error message for. If
                omitted, or if there is only one resource type involved, the
                error message is (also) set globally.

        Raises:
            KeyError: If the resource type is not found in the resources list.
        """
        if resource_type:
            resource = self.resources_dict.get(resource_type)
            if not resource:
                raise KeyError(
                    f"resource type '{resource_type}' not found in "
                    "service connector resources list"
                )
            resource.error = error
            resource.resource_ids = None
            if len(self.resources) == 1:
                # If there is only one resource type involved, set the global
                # error message as well.
                self.error = error
        else:
            self.error = error
            for resource in self.resources:
                resource.error = error
                resource.resource_ids = None

    def set_resource_ids(
        self, resource_type: str, resource_ids: List[str]
    ) -> None:
        """Set the resource IDs for a resource type.

        Args:
            resource_type: The resource type to set the resource IDs for.
            resource_ids: The resource IDs to set.

        Raises:
            KeyError: If the resource type is not found in the resources list.
        """
        resource = self.resources_dict.get(resource_type)
        if not resource:
            raise KeyError(
                f"resource type '{resource_type}' not found in "
                "service connector resources list"
            )
        resource.resource_ids = resource_ids
        resource.error = None

    @property
    def type(self) -> str:
        """Get the connector type.

        Returns:
            The connector type.
        """
        if isinstance(self.connector_type, str):
            return self.connector_type
        return self.connector_type.connector_type

    @property
    def emojified_connector_type(self) -> str:
        """Get the emojified connector type.

        Returns:
            The emojified connector type.
        """
        if not isinstance(self.connector_type, str):
            return self.connector_type.emojified_connector_type

        return self.connector_type

    def get_emojified_resource_types(
        self, resource_type: Optional[str] = None
    ) -> List[str]:
        """Get the emojified resource type.

        Args:
            resource_type: The resource type to get the emojified resource type
                for. If omitted, the emojified resource type for all resource
                types is returned.


        Returns:
            The list of emojified resource types.
        """
        if not isinstance(self.connector_type, str):
            if resource_type:
                return [
                    self.connector_type.resource_type_dict[
                        resource_type
                    ].emojified_resource_type
                ]
            return [
                self.connector_type.resource_type_dict[
                    resource_type
                ].emojified_resource_type
                for resource_type in self.resources_dict.keys()
            ]
        if resource_type:
            return [resource_type]
        return list(self.resources_dict.keys())

    def get_default_resource_id(self) -> Optional[str]:
        """Get the default resource ID, if included in the resource list.

        The default resource ID is a resource ID supplied by the connector
        implementation only for resource types that do not support multiple
        instances.

        Returns:
            The default resource ID, or None if no resource ID is set.
        """
        if len(self.resources) != 1:
            # multi-type connectors do not have a default resource ID
            return None

        if isinstance(self.connector_type, str):
            # can't determine default resource ID for unknown connector types
            return None

        resource_type_spec = self.connector_type.resource_type_dict[
            self.resources[0].resource_type
        ]
        if resource_type_spec.supports_instances:
            # resource types that support multiple instances do not have a
            # default resource ID
            return None

        resource_ids = self.resources[0].resource_ids

        if not resource_ids or len(resource_ids) != 1:
            return None

        return resource_ids[0]

    @classmethod
    def from_connector_model(
        cls,
        connector_model: "ServiceConnectorResponse",
        resource_type: Optional[str] = None,
    ) -> "ServiceConnectorResourcesModel":
        """Initialize a resource model from a connector model.

        Args:
            connector_model: The connector model.
            resource_type: The resource type to set on the resource model. If
                omitted, the resource type is set according to the connector
                model.

        Returns:
            A resource list model instance.
        """
        resources = cls(
            id=connector_model.id,
            name=connector_model.name,
            connector_type=connector_model.type,
        )

        resource_types = resource_type or connector_model.resource_types
        for resource_type in resource_types:
            resources.resources.append(
                ServiceConnectorTypedResourcesModel(
                    resource_type=resource_type,
                    resource_ids=[connector_model.resource_id]
                    if connector_model.resource_id
                    else None,
                )
            )

        return resources

emojified_connector_type property

Get the emojified connector type.

Returns:

Type Description
str

The emojified connector type.

resource_types property

Get the resource types.

Returns:

Type Description
List[str]

The resource types.

resources_dict property

Get the resources as a dictionary indexed by resource type.

Returns:

Type Description
Dict[str, ServiceConnectorTypedResourcesModel]

The resources as a dictionary indexed by resource type.

type property

Get the connector type.

Returns:

Type Description
str

The connector type.

from_connector_model(connector_model, resource_type=None) classmethod

Initialize a resource model from a connector model.

Parameters:

Name Type Description Default
connector_model ServiceConnectorResponse

The connector model.

required
resource_type Optional[str]

The resource type to set on the resource model. If omitted, the resource type is set according to the connector model.

None

Returns:

Type Description
ServiceConnectorResourcesModel

A resource list model instance.

Source code in src/zenml/models/v2/misc/service_connector_type.py
@classmethod
def from_connector_model(
    cls,
    connector_model: "ServiceConnectorResponse",
    resource_type: Optional[str] = None,
) -> "ServiceConnectorResourcesModel":
    """Initialize a resource model from a connector model.

    Args:
        connector_model: The connector model.
        resource_type: The resource type to set on the resource model. If
            omitted, the resource type is set according to the connector
            model.

    Returns:
        A resource list model instance.
    """
    resources = cls(
        id=connector_model.id,
        name=connector_model.name,
        connector_type=connector_model.type,
    )

    resource_types = resource_type or connector_model.resource_types
    for resource_type in resource_types:
        resources.resources.append(
            ServiceConnectorTypedResourcesModel(
                resource_type=resource_type,
                resource_ids=[connector_model.resource_id]
                if connector_model.resource_id
                else None,
            )
        )

    return resources

get_default_resource_id()

Get the default resource ID, if included in the resource list.

The default resource ID is a resource ID supplied by the connector implementation only for resource types that do not support multiple instances.

Returns:

Type Description
Optional[str]

The default resource ID, or None if no resource ID is set.

Source code in src/zenml/models/v2/misc/service_connector_type.py
def get_default_resource_id(self) -> Optional[str]:
    """Get the default resource ID, if included in the resource list.

    The default resource ID is a resource ID supplied by the connector
    implementation only for resource types that do not support multiple
    instances.

    Returns:
        The default resource ID, or None if no resource ID is set.
    """
    if len(self.resources) != 1:
        # multi-type connectors do not have a default resource ID
        return None

    if isinstance(self.connector_type, str):
        # can't determine default resource ID for unknown connector types
        return None

    resource_type_spec = self.connector_type.resource_type_dict[
        self.resources[0].resource_type
    ]
    if resource_type_spec.supports_instances:
        # resource types that support multiple instances do not have a
        # default resource ID
        return None

    resource_ids = self.resources[0].resource_ids

    if not resource_ids or len(resource_ids) != 1:
        return None

    return resource_ids[0]

get_emojified_resource_types(resource_type=None)

Get the emojified resource types.

Parameters:

Name Type Description Default
resource_type Optional[str]

The resource type to get the emojified resource type for. If omitted, the emojified resource type for all resource types is returned.

None

Returns:

Type Description
List[str]

The list of emojified resource types.

Source code in src/zenml/models/v2/misc/service_connector_type.py
def get_emojified_resource_types(
    self, resource_type: Optional[str] = None
) -> List[str]:
    """Get the emojified resource type.

    Args:
        resource_type: The resource type to get the emojified resource type
            for. If omitted, the emojified resource type for all resource
            types is returned.


    Returns:
        The list of emojified resource types.
    """
    if not isinstance(self.connector_type, str):
        if resource_type:
            return [
                self.connector_type.resource_type_dict[
                    resource_type
                ].emojified_resource_type
            ]
        return [
            self.connector_type.resource_type_dict[
                resource_type
            ].emojified_resource_type
            for resource_type in self.resources_dict.keys()
        ]
    if resource_type:
        return [resource_type]
    return list(self.resources_dict.keys())

set_error(error, resource_type=None)

Set a global error message or an error for a single resource type.

Parameters:

Name Type Description Default
error str

The error message.

required
resource_type Optional[str]

The resource type to set the error message for. If omitted, or if there is only one resource type involved, the error message is (also) set globally.

None

Raises:

Type Description
KeyError

If the resource type is not found in the resources list.

Source code in src/zenml/models/v2/misc/service_connector_type.py
def set_error(
    self, error: str, resource_type: Optional[str] = None
) -> None:
    """Set a global error message or an error for a single resource type.

    Args:
        error: The error message.
        resource_type: The resource type to set the error message for. If
            omitted, or if there is only one resource type involved, the
            error message is (also) set globally.

    Raises:
        KeyError: If the resource type is not found in the resources list.
    """
    if resource_type:
        resource = self.resources_dict.get(resource_type)
        if not resource:
            raise KeyError(
                f"resource type '{resource_type}' not found in "
                "service connector resources list"
            )
        resource.error = error
        resource.resource_ids = None
        if len(self.resources) == 1:
            # If there is only one resource type involved, set the global
            # error message as well.
            self.error = error
    else:
        self.error = error
        for resource in self.resources:
            resource.error = error
            resource.resource_ids = None

set_resource_ids(resource_type, resource_ids)

Set the resource IDs for a resource type.

Parameters:

Name Type Description Default
resource_type str

The resource type to set the resource IDs for.

required
resource_ids List[str]

The resource IDs to set.

required

Raises:

Type Description
KeyError

If the resource type is not found in the resources list.

Source code in src/zenml/models/v2/misc/service_connector_type.py
def set_resource_ids(
    self, resource_type: str, resource_ids: List[str]
) -> None:
    """Set the resource IDs for a resource type.

    Args:
        resource_type: The resource type to set the resource IDs for.
        resource_ids: The resource IDs to set.

    Raises:
        KeyError: If the resource type is not found in the resources list.
    """
    resource = self.resources_dict.get(resource_type)
    if not resource:
        raise KeyError(
            f"resource type '{resource_type}' not found in "
            "service connector resources list"
        )
    resource.resource_ids = resource_ids
    resource.error = None
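
A minimal usage sketch for the two mutators above, assuming the surrounding resources model can be imported as ServiceConnectorResourcesModel from zenml.models (the import path and all resource type strings below are illustrative assumptions):

from zenml.models import ServiceConnectorResourcesModel  # assumed import path


def record_verification_outcome(
    resources: ServiceConnectorResourcesModel,
) -> None:
    """Record per-resource-type verification results (illustrative values)."""
    # Store the discovered resource IDs for one resource type; this also
    # clears any previously recorded error for that type.
    resources.set_resource_ids(
        resource_type="kubernetes-cluster",
        resource_ids=["my-cluster"],
    )

    # Record a failure for another resource type. If only one resource type
    # is involved, the error is also propagated to the global error field.
    try:
        resources.set_error(
            "access denied while listing buckets",
            resource_type="s3-bucket",
        )
    except KeyError:
        # Raised when the resource type is not in the resources list.
        pass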

ServiceConnectorResponse

Bases: UserScopedResponse[ServiceConnectorResponseBody, ServiceConnectorResponseMetadata, ServiceConnectorResponseResources]

Response model for service connectors.

Source code in src/zenml/models/v2/core/service_connector.py
class ServiceConnectorResponse(
    UserScopedResponse[
        ServiceConnectorResponseBody,
        ServiceConnectorResponseMetadata,
        ServiceConnectorResponseResources,
    ]
):
    """Response model for service connectors."""

    # Disable the warning for updating responses, because we update the
    # service connector type in place
    _warn_on_response_updates: bool = False

    name: str = Field(
        title="The service connector name.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_analytics_metadata(self) -> Dict[str, Any]:
        """Add the service connector labels to analytics metadata.

        Returns:
            Dict of analytics metadata.
        """
        metadata = super().get_analytics_metadata()

        metadata.update(
            {
                label[6:]: value
                for label, value in self.labels.items()
                if label.startswith("zenml:")
            }
        )
        return metadata

    def get_hydrated_version(self) -> "ServiceConnectorResponse":
        """Get the hydrated version of this service connector.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_service_connector(self.id)

    # Helper methods
    @property
    def type(self) -> str:
        """Get the connector type.

        Returns:
            The connector type.
        """
        if isinstance(self.connector_type, str):
            return self.connector_type
        return self.connector_type.connector_type

    @property
    def emojified_connector_type(self) -> str:
        """Get the emojified connector type.

        Returns:
            The emojified connector type.
        """
        if not isinstance(self.connector_type, str):
            return self.connector_type.emojified_connector_type

        return self.connector_type

    @property
    def emojified_resource_types(self) -> List[str]:
        """Get the emojified connector type.

        Returns:
            The emojified connector type.
        """
        if not isinstance(self.connector_type, str):
            return [
                self.connector_type.resource_type_dict[
                    resource_type
                ].emojified_resource_type
                for resource_type in self.resource_types
            ]

        return self.resource_types

    @property
    def is_multi_type(self) -> bool:
        """Checks if the connector is multi-type.

        A multi-type connector can be used to access multiple types of
        resources.

        Returns:
            True if the connector is multi-type, False otherwise.
        """
        return len(self.resource_types) > 1

    @property
    def is_multi_instance(self) -> bool:
        """Checks if the connector is multi-instance.

        A multi-instance connector is configured to access multiple instances
        of the configured resource type.

        Returns:
            True if the connector is multi-instance, False otherwise.
        """
        return (
            not self.is_multi_type
            and self.supports_instances
            and not self.resource_id
        )

    @property
    def is_single_instance(self) -> bool:
        """Checks if the connector is single-instance.

        A single-instance connector is configured to access only a single
        instance of the configured resource type or does not support multiple
        resource instances.

        Returns:
            True if the connector is single-instance, False otherwise.
        """
        return not self.is_multi_type and not self.is_multi_instance

    @property
    def full_configuration(self) -> Dict[str, str]:
        """Get the full connector configuration, including secrets.

        Returns:
            The full connector configuration, including secrets.
        """
        config = self.configuration.copy()
        config.update(
            {k: v.get_secret_value() for k, v in self.secrets.items() if v}
        )
        return config

    def set_connector_type(
        self, value: Union[str, "ServiceConnectorTypeModel"]
    ) -> None:
        """Auxiliary method to set the connector type.

        Args:
            value: the new value for the connector type.
        """
        self.get_body().connector_type = value

    def validate_and_configure_resources(
        self,
        connector_type: "ServiceConnectorTypeModel",
        resource_types: Optional[Union[str, List[str]]] = None,
        resource_id: Optional[str] = None,
        configuration: Optional[Dict[str, Any]] = None,
        secrets: Optional[Dict[str, Optional[SecretStr]]] = None,
    ) -> None:
        """Validate and configure the resources that the connector can be used to access.

        Args:
            connector_type: The connector type specification used to validate
                the connector configuration.
            resource_types: The type(s) of resource that the connector instance
                can be used to access. If omitted, a multi-type connector is
                configured.
            resource_id: Uniquely identifies a specific resource instance that
                the connector instance can be used to access.
            configuration: The connector configuration.
            secrets: The connector secrets.
        """
        _validate_and_configure_resources(
            connector=self,
            connector_type=connector_type,
            resource_types=resource_types,
            resource_id=resource_id,
            configuration=configuration,
            secrets=secrets,
        )

    # Body and metadata properties
    @property
    def description(self) -> str:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_body().description

    @property
    def connector_type(self) -> Union[str, "ServiceConnectorTypeModel"]:
        """The `connector_type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().connector_type

    @property
    def auth_method(self) -> str:
        """The `auth_method` property.

        Returns:
            the value of the property.
        """
        return self.get_body().auth_method

    @property
    def resource_types(self) -> List[str]:
        """The `resource_types` property.

        Returns:
            the value of the property.
        """
        return self.get_body().resource_types

    @property
    def resource_id(self) -> Optional[str]:
        """The `resource_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().resource_id

    @property
    def supports_instances(self) -> bool:
        """The `supports_instances` property.

        Returns:
            the value of the property.
        """
        return self.get_body().supports_instances

    @property
    def expires_at(self) -> Optional[datetime]:
        """The `expires_at` property.

        Returns:
            the value of the property.
        """
        return self.get_body().expires_at

    @property
    def expires_skew_tolerance(self) -> Optional[int]:
        """The `expires_skew_tolerance` property.

        Returns:
            the value of the property.
        """
        return self.get_body().expires_skew_tolerance

    @property
    def configuration(self) -> Dict[str, Any]:
        """The `configuration` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().configuration

    @property
    def secret_id(self) -> Optional[UUID]:
        """The `secret_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().secret_id

    @property
    def expiration_seconds(self) -> Optional[int]:
        """The `expiration_seconds` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().expiration_seconds

    @property
    def secrets(self) -> Dict[str, Optional[SecretStr]]:
        """The `secrets` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().secrets

    @property
    def labels(self) -> Dict[str, str]:
        """The `labels` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().labels
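
A minimal sketch showing how the helper properties above can be used to tell multi-type, multi-instance and single-instance connectors apart. The connector name and the Client.get_service_connector lookup are illustrative assumptions:

from zenml.client import Client

# Fetch a registered connector by name (the name is illustrative and the
# Client.get_service_connector lookup is assumed to be available).
connector = Client().get_service_connector("my-aws-connector")

if connector.is_multi_type:
    print("accesses several resource types:", connector.resource_types)
elif connector.is_multi_instance:
    print("accesses several instances of:", connector.resource_types[0])
else:
    print("bound to a single resource:", connector.resource_id)

# Full configuration including secret values; handle with care.
full_config = connector.full_configuration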

auth_method property

The auth_method property.

Returns:

Type Description
str

the value of the property.

configuration property

The configuration property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

connector_type property

The connector_type property.

Returns:

Type Description
Union[str, ServiceConnectorTypeModel]

the value of the property.

description property

The description property.

Returns:

Type Description
str

the value of the property.

emojified_connector_type property

Get the emojified connector type.

Returns:

Type Description
str

The emojified connector type.

emojified_resource_types property

Get the emojified resource types.

Returns:

Type Description
List[str]

The emojified resource types.

expiration_seconds property

The expiration_seconds property.

Returns:

Type Description
Optional[int]

the value of the property.

expires_at property

The expires_at property.

Returns:

Type Description
Optional[datetime]

the value of the property.

expires_skew_tolerance property

The expires_skew_tolerance property.

Returns:

Type Description
Optional[int]

the value of the property.

full_configuration property

Get the full connector configuration, including secrets.

Returns:

Type Description
Dict[str, str]

The full connector configuration, including secrets.

is_multi_instance property

Checks if the connector is multi-instance.

A multi-instance connector is configured to access multiple instances of the configured resource type.

Returns:

Type Description
bool

True if the connector is multi-instance, False otherwise.

is_multi_type property

Checks if the connector is multi-type.

A multi-type connector can be used to access multiple types of resources.

Returns:

Type Description
bool

True if the connector is multi-type, False otherwise.

is_single_instance property

Checks if the connector is single-instance.

A single-instance connector is configured to access only a single instance of the configured resource type or does not support multiple resource instances.

Returns:

Type Description
bool

True if the connector is single-instance, False otherwise.

labels property

The labels property.

Returns:

Type Description
Dict[str, str]

the value of the property.

resource_id property

The resource_id property.

Returns:

Type Description
Optional[str]

the value of the property.

resource_types property

The resource_types property.

Returns:

Type Description
List[str]

the value of the property.

secret_id property

The secret_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

secrets property

The secrets property.

Returns:

Type Description
Dict[str, Optional[SecretStr]]

the value of the property.

supports_instances property

The supports_instances property.

Returns:

Type Description
bool

the value of the property.

type property

Get the connector type.

Returns:

Type Description
str

The connector type.

get_analytics_metadata()

Add the service connector labels to analytics metadata.

Returns:

Type Description
Dict[str, Any]

Dict of analytics metadata.

Source code in src/zenml/models/v2/core/service_connector.py
def get_analytics_metadata(self) -> Dict[str, Any]:
    """Add the service connector labels to analytics metadata.

    Returns:
        Dict of analytics metadata.
    """
    metadata = super().get_analytics_metadata()

    metadata.update(
        {
            label[6:]: value
            for label, value in self.labels.items()
            if label.startswith("zenml:")
        }
    )
    return metadata

get_hydrated_version()

Get the hydrated version of this service connector.

Returns:

Type Description
ServiceConnectorResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/service_connector.py
def get_hydrated_version(self) -> "ServiceConnectorResponse":
    """Get the hydrated version of this service connector.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_service_connector(self.id)

set_connector_type(value)

Auxiliary method to set the connector type.

Parameters:

Name Type Description Default
value Union[str, ServiceConnectorTypeModel]

the new value for the connector type.

required
Source code in src/zenml/models/v2/core/service_connector.py
def set_connector_type(
    self, value: Union[str, "ServiceConnectorTypeModel"]
) -> None:
    """Auxiliary method to set the connector type.

    Args:
        value: the new value for the connector type.
    """
    self.get_body().connector_type = value

validate_and_configure_resources(connector_type, resource_types=None, resource_id=None, configuration=None, secrets=None)

Validate and configure the resources that the connector can be used to access.

Parameters:

Name Type Description Default
connector_type ServiceConnectorTypeModel

The connector type specification used to validate the connector configuration.

required
resource_types Optional[Union[str, List[str]]]

The type(s) of resource that the connector instance can be used to access. If omitted, a multi-type connector is configured.

None
resource_id Optional[str]

Uniquely identifies a specific resource instance that the connector instance can be used to access.

None
configuration Optional[Dict[str, Any]]

The connector configuration.

None
secrets Optional[Dict[str, Optional[SecretStr]]]

The connector secrets.

None
Source code in src/zenml/models/v2/core/service_connector.py
def validate_and_configure_resources(
    self,
    connector_type: "ServiceConnectorTypeModel",
    resource_types: Optional[Union[str, List[str]]] = None,
    resource_id: Optional[str] = None,
    configuration: Optional[Dict[str, Any]] = None,
    secrets: Optional[Dict[str, Optional[SecretStr]]] = None,
) -> None:
    """Validate and configure the resources that the connector can be used to access.

    Args:
        connector_type: The connector type specification used to validate
            the connector configuration.
        resource_types: The type(s) of resource that the connector instance
            can be used to access. If omitted, a multi-type connector is
            configured.
        resource_id: Uniquely identifies a specific resource instance that
            the connector instance can be used to access.
        configuration: The connector configuration.
        secrets: The connector secrets.
    """
    _validate_and_configure_resources(
        connector=self,
        connector_type=connector_type,
        resource_types=resource_types,
        resource_id=resource_id,
        configuration=configuration,
        secrets=secrets,
    )

ServiceConnectorResponseBody

Bases: UserScopedResponseBody

Response body for service connectors.

Source code in src/zenml/models/v2/core/service_connector.py
class ServiceConnectorResponseBody(UserScopedResponseBody):
    """Response body for service connectors."""

    description: str = Field(
        default="",
        title="The service connector instance description.",
    )
    connector_type: Union[str, "ServiceConnectorTypeModel"] = Field(
        title="The type of service connector.", union_mode="left_to_right"
    )
    auth_method: str = Field(
        title="The authentication method that the connector instance uses to "
        "access the resources.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    resource_types: List[str] = Field(
        default_factory=list,
        title="The type(s) of resource that the connector instance can be used "
        "to gain access to.",
    )
    resource_id: Optional[str] = Field(
        default=None,
        title="Uniquely identifies a specific resource instance that the "
        "connector instance can be used to access. If omitted, the connector "
        "instance can be used to access any and all resource instances that "
        "the authentication method and resource type(s) allow.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    supports_instances: bool = Field(
        default=False,
        title="Indicates whether the connector instance can be used to access "
        "multiple instances of the configured resource type.",
    )
    expires_at: Optional[datetime] = Field(
        default=None,
        title="Time when the authentication credentials configured for the "
        "connector expire. If omitted, the credentials do not expire.",
    )
    expires_skew_tolerance: Optional[int] = Field(
        default=None,
        title="The number of seconds of tolerance to apply when checking "
        "whether the authentication credentials configured for the connector "
        "have expired. If omitted, no tolerance is applied.",
    )

ServiceConnectorResponseMetadata

Bases: UserScopedResponseMetadata

Response metadata for service connectors.

Source code in src/zenml/models/v2/core/service_connector.py
class ServiceConnectorResponseMetadata(UserScopedResponseMetadata):
    """Response metadata for service connectors."""

    configuration: Dict[str, Any] = Field(
        default_factory=dict,
        title="The service connector configuration, not including secrets.",
    )
    secret_id: Optional[UUID] = Field(
        default=None,
        title="The ID of the secret that contains the service connector "
        "secret configuration values.",
    )
    expiration_seconds: Optional[int] = Field(
        default=None,
        title="The duration, in seconds, that the temporary credentials "
        "generated by this connector should remain valid. Only applicable for "
        "connectors and authentication methods that involve generating "
        "temporary credentials from the ones configured in the connector.",
    )
    secrets: Dict[str, Optional[PlainSerializedSecretStr]] = Field(
        default_factory=dict,
        title="The service connector secrets.",
    )
    labels: Dict[str, str] = Field(
        default_factory=dict,
        title="Service connector labels.",
    )

ServiceConnectorResponseResources

Bases: UserScopedResponseResources

Class for all resource models associated with the service connector entity.

Source code in src/zenml/models/v2/core/service_connector.py
class ServiceConnectorResponseResources(UserScopedResponseResources):
    """Class for all resource models associated with the service connector entity."""

ServiceConnectorTypeModel

Bases: BaseModel

Service connector type specification.

Describes the types of resources to which the service connector can be used to gain access and the authentication methods that are supported by the service connector.

The connector type, resource types, resource IDs and authentication methods can all be used as search criteria to lookup and filter service connector instances that are compatible with the requirements of a consumer (e.g. a stack component).

Source code in src/zenml/models/v2/misc/service_connector_type.py
class ServiceConnectorTypeModel(BaseModel):
    """Service connector type specification.

    Describes the types of resources to which the service connector can be used
    to gain access and the authentication methods that are supported by the
    service connector.

    The connector type, resource types, resource IDs and authentication
    methods can all be used as search criteria to lookup and filter service
    connector instances that are compatible with the requirements of a consumer
    (e.g. a stack component).
    """

    name: str = Field(
        title="User readable name for the service connector type.",
    )
    connector_type: str = Field(
        title="The type of service connector. It can be used to represent a "
        "generic resource (e.g. Docker, Kubernetes) or a group of different "
        "resources accessible through a common interface or point of access "
        "and authentication (e.g. a cloud provider or a platform).",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    description: str = Field(
        default="",
        title="A description of the service connector.",
    )
    resource_types: List[ResourceTypeModel] = Field(
        title="A list of resource types that the connector can be used to "
        "access.",
    )
    auth_methods: List[AuthenticationMethodModel] = Field(
        title="A list of specifications describing the authentication "
        "methods that are supported by the service connector, along with the "
        "configuration and secrets attributes that need to be configured for "
        "them.",
    )
    supports_auto_configuration: bool = Field(
        default=False,
        title="Models if the connector can be configured automatically based "
        "on information extracted from a local environment.",
    )
    logo_url: Optional[str] = Field(
        default=None,
        title="Optionally, a URL pointing to a png,"
        "svg or jpg can be attached.",
    )
    emoji: Optional[str] = Field(
        default=None,
        title="Optionally, a python-rich emoji can be attached.",
    )
    docs_url: Optional[str] = Field(
        default=None,
        title="Optionally, a URL pointing to docs, within docs.zenml.io.",
    )
    sdk_docs_url: Optional[str] = Field(
        default=None,
        title="Optionally, a URL pointing to SDK docs,"
        "within sdkdocs.zenml.io.",
    )
    local: bool = Field(
        default=True,
        title="If True, the service connector is available locally.",
    )
    remote: bool = Field(
        default=False,
        title="If True, the service connector is available remotely.",
    )
    _connector_class: Optional[Type["ServiceConnector"]] = None

    @property
    def connector_class(self) -> Optional[Type["ServiceConnector"]]:
        """Get the service connector class.

        Returns:
            The service connector class.
        """
        return self._connector_class

    @property
    def emojified_connector_type(self) -> str:
        """Get the emojified connector type.

        Returns:
            The emojified connector type.
        """
        if not self.emoji:
            return self.connector_type
        return f"{self.emoji} {self.connector_type}"

    @property
    def emojified_resource_types(self) -> List[str]:
        """Get the emojified connector types.

        Returns:
            The emojified connector types.
        """
        return [
            resource_type.emojified_resource_type
            for resource_type in self.resource_types
        ]

    def set_connector_class(
        self, connector_class: Type["ServiceConnector"]
    ) -> None:
        """Set the service connector class.

        Args:
            connector_class: The service connector class.
        """
        self._connector_class = connector_class

    @field_validator("resource_types")
    @classmethod
    def validate_resource_types(
        cls, values: List[ResourceTypeModel]
    ) -> List[ResourceTypeModel]:
        """Validate that the resource types are unique.

        Args:
            values: The list of resource types.

        Returns:
            The list of resource types.

        Raises:
            ValueError: If two or more resource type specifications list the
                same resource type.
        """
        # Gather all resource types from the list of resource type
        # specifications.
        resource_types = [r.resource_type for r in values]
        if len(resource_types) != len(set(resource_types)):
            raise ValueError(
                "Two or more resource type specifications must not list "
                "the same resource type."
            )

        return values

    @field_validator("auth_methods")
    @classmethod
    def validate_auth_methods(
        cls, values: List[AuthenticationMethodModel]
    ) -> List[AuthenticationMethodModel]:
        """Validate that the authentication methods are unique.

        Args:
            values: The list of authentication methods.

        Returns:
            The list of authentication methods.

        Raises:
            ValueError: If two or more authentication method specifications
                share the same authentication method value.
        """
        # Gather all auth methods from the list of auth method
        # specifications.
        auth_methods = [a.auth_method for a in values]
        if len(auth_methods) != len(set(auth_methods)):
            raise ValueError(
                "Two or more authentication method specifications must not "
                "share the same authentication method value."
            )

        return values

    @property
    def resource_type_dict(
        self,
    ) -> Dict[str, ResourceTypeModel]:
        """Returns a map of resource types to resource type specifications.

        Returns:
            A map of resource types to resource type specifications.
        """
        return {r.resource_type: r for r in self.resource_types}

    @property
    def auth_method_dict(
        self,
    ) -> Dict[str, AuthenticationMethodModel]:
        """Returns a map of authentication methods to authentication method specifications.

        Returns:
            A map of authentication methods to authentication method
            specifications.
        """
        return {a.auth_method: a for a in self.auth_methods}

    def find_resource_specifications(
        self,
        auth_method: str,
        resource_type: Optional[str] = None,
    ) -> Tuple[AuthenticationMethodModel, Optional[ResourceTypeModel]]:
        """Find the specifications for a configurable resource.

        Validate the supplied connector configuration parameters against the
        connector specification and return the matching authentication method
        specification and resource specification.

        Args:
            auth_method: The name of the authentication method.
            resource_type: The type of resource being configured.

        Returns:
            The authentication method specification and resource specification
            for the specified authentication method and resource type.

        Raises:
            KeyError: If the authentication method is not supported by the
                connector for the specified resource type and ID.
        """
        # Verify the authentication method
        auth_method_dict = self.auth_method_dict
        if auth_method in auth_method_dict:
            # A match was found for the authentication method
            auth_method_spec = auth_method_dict[auth_method]
        else:
            # No match was found for the authentication method
            raise KeyError(
                f"connector type '{self.connector_type}' does not support the "
                f"'{auth_method}' authentication method. Supported "
                f"authentication methods are: {list(auth_method_dict.keys())}."
            )

        if resource_type is None:
            # No resource type was specified, so no resource type
            # specification can be returned.
            return auth_method_spec, None

        # Verify the resource type
        resource_type_dict = self.resource_type_dict
        if resource_type in resource_type_dict:
            resource_type_spec = resource_type_dict[resource_type]
        else:
            raise KeyError(
                f"connector type '{self.connector_type}' does not support "
                f"resource type '{resource_type}'. Supported resource types "
                f"are: {list(resource_type_dict.keys())}."
            )

        if auth_method not in resource_type_spec.auth_methods:
            raise KeyError(
                f"the '{self.connector_type}' connector type does not support "
                f"the '{auth_method}' authentication method for the "
                f"'{resource_type}' resource type. Supported authentication "
                f"methods are: {resource_type_spec.auth_methods}."
            )

        return auth_method_spec, resource_type_spec

auth_method_dict property

Returns a map of authentication methods to authentication method specifications.

Returns:

Type Description
Dict[str, AuthenticationMethodModel]

A map of authentication methods to authentication method specifications.

connector_class property

Get the service connector class.

Returns:

Type Description
Optional[Type[ServiceConnector]]

The service connector class.

emojified_connector_type property

Get the emojified connector type.

Returns:

Type Description
str

The emojified connector type.

emojified_resource_types property

Get the emojified resource types.

Returns:

Type Description
List[str]

The emojified resource types.

resource_type_dict property

Returns a map of resource types to resource type specifications.

Returns:

Type Description
Dict[str, ResourceTypeModel]

A map of resource types to resource type specifications.

find_resource_specifications(auth_method, resource_type=None)

Find the specifications for a configurable resource.

Validate the supplied connector configuration parameters against the connector specification and return the matching authentication method specification and resource specification.

Parameters:

Name Type Description Default
auth_method str

The name of the authentication method.

required
resource_type Optional[str]

The type of resource being configured.

None

Returns:

Type Description
Tuple[AuthenticationMethodModel, Optional[ResourceTypeModel]]

The authentication method specification and resource specification for the specified authentication method and resource type.

Raises:

Type Description
KeyError

If the authentication method is not supported by the connector for the specified resource type and ID.

Source code in src/zenml/models/v2/misc/service_connector_type.py
def find_resource_specifications(
    self,
    auth_method: str,
    resource_type: Optional[str] = None,
) -> Tuple[AuthenticationMethodModel, Optional[ResourceTypeModel]]:
    """Find the specifications for a configurable resource.

    Validate the supplied connector configuration parameters against the
    connector specification and return the matching authentication method
    specification and resource specification.

    Args:
        auth_method: The name of the authentication method.
        resource_type: The type of resource being configured.

    Returns:
        The authentication method specification and resource specification
        for the specified authentication method and resource type.

    Raises:
        KeyError: If the authentication method is not supported by the
            connector for the specified resource type and ID.
    """
    # Verify the authentication method
    auth_method_dict = self.auth_method_dict
    if auth_method in auth_method_dict:
        # A match was found for the authentication method
        auth_method_spec = auth_method_dict[auth_method]
    else:
        # No match was found for the authentication method
        raise KeyError(
            f"connector type '{self.connector_type}' does not support the "
            f"'{auth_method}' authentication method. Supported "
            f"authentication methods are: {list(auth_method_dict.keys())}."
        )

    if resource_type is None:
        # No resource type was specified, so no resource type
        # specification can be returned.
        return auth_method_spec, None

    # Verify the resource type
    resource_type_dict = self.resource_type_dict
    if resource_type in resource_type_dict:
        resource_type_spec = resource_type_dict[resource_type]
    else:
        raise KeyError(
            f"connector type '{self.connector_type}' does not support "
            f"resource type '{resource_type}'. Supported resource types "
            f"are: {list(resource_type_dict.keys())}."
        )

    if auth_method not in resource_type_spec.auth_methods:
        raise KeyError(
            f"the '{self.connector_type}' connector type does not support "
            f"the '{auth_method}' authentication method for the "
            f"'{resource_type}' resource type. Supported authentication "
            f"methods are: {resource_type_spec.auth_methods}."
        )

    return auth_method_spec, resource_type_spec
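
A minimal sketch of looking up specifications on a connector type. The Client.get_service_connector_type call, the "aws" connector type, and the auth method / resource type strings are illustrative assumptions:

from zenml.client import Client

# Look up a connector type specification. Both the lookup method used here
# and the "aws" connector type are assumptions for illustration.
aws_type = Client().get_service_connector_type("aws")

print(list(aws_type.resource_type_dict))  # supported resource types
print(list(aws_type.auth_method_dict))    # supported auth methods

# Validate an auth method / resource type combination. A KeyError is raised
# if the combination is not supported by this connector type.
auth_spec, resource_spec = aws_type.find_resource_specifications(
    auth_method="secret-key",     # illustrative auth method name
    resource_type="s3-bucket",    # illustrative resource type
)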

set_connector_class(connector_class)

Set the service connector class.

Parameters:

Name Type Description Default
connector_class Type[ServiceConnector]

The service connector class.

required
Source code in src/zenml/models/v2/misc/service_connector_type.py
def set_connector_class(
    self, connector_class: Type["ServiceConnector"]
) -> None:
    """Set the service connector class.

    Args:
        connector_class: The service connector class.
    """
    self._connector_class = connector_class

validate_auth_methods(values) classmethod

Validate that the authentication methods are unique.

Parameters:

Name Type Description Default
values List[AuthenticationMethodModel]

The list of authentication methods.

required

Returns:

Type Description
List[AuthenticationMethodModel]

The list of authentication methods.

Raises:

Type Description
ValueError

If two or more authentication method specifications share the same authentication method value.

Source code in src/zenml/models/v2/misc/service_connector_type.py
@field_validator("auth_methods")
@classmethod
def validate_auth_methods(
    cls, values: List[AuthenticationMethodModel]
) -> List[AuthenticationMethodModel]:
    """Validate that the authentication methods are unique.

    Args:
        values: The list of authentication methods.

    Returns:
        The list of authentication methods.

    Raises:
        ValueError: If two or more authentication method specifications
            share the same authentication method value.
    """
    # Gather all auth methods from the list of auth method
    # specifications.
    auth_methods = [a.auth_method for a in values]
    if len(auth_methods) != len(set(auth_methods)):
        raise ValueError(
            "Two or more authentication method specifications must not "
            "share the same authentication method value."
        )

    return values

validate_resource_types(values) classmethod

Validate that the resource types are unique.

Parameters:

Name Type Description Default
values List[ResourceTypeModel]

The list of resource types.

required

Returns:

Type Description
List[ResourceTypeModel]

The list of resource types.

Raises:

Type Description
ValueError

If two or more resource type specifications list the same resource type.

Source code in src/zenml/models/v2/misc/service_connector_type.py
@field_validator("resource_types")
@classmethod
def validate_resource_types(
    cls, values: List[ResourceTypeModel]
) -> List[ResourceTypeModel]:
    """Validate that the resource types are unique.

    Args:
        values: The list of resource types.

    Returns:
        The list of resource types.

    Raises:
        ValueError: If two or more resource type specifications list the
            same resource type.
    """
    # Gather all resource types from the list of resource type
    # specifications.
    resource_types = [r.resource_type for r in values]
    if len(resource_types) != len(set(resource_types)):
        raise ValueError(
            "Two or more resource type specifications must not list "
            "the same resource type."
        )

    return values

ServiceConnectorTypedResourcesModel

Bases: BaseModel

Service connector typed resources list.

Lists the resource instances that a service connector can provide access to.

Source code in src/zenml/models/v2/misc/service_connector_type.py
class ServiceConnectorTypedResourcesModel(BaseModel):
    """Service connector typed resources list.

    Lists the resource instances that a service connector can provide
    access to.
    """

    resource_type: str = Field(
        title="The type of resource that the service connector instance can "
        "be used to access.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    resource_ids: Optional[List[str]] = Field(
        default=None,
        title="The resource IDs of all resource instances that the service "
        "connector instance can be used to access. Omitted (set to None) for "
        "multi-type service connectors that didn't explicitly request to "
        "fetch resources for all resource types. Also omitted if an error "
        "occurred while listing the resource instances or if no resources are "
        "listed due to authorization issues or lack of permissions (in both "
        "cases the 'error' field is set to an error message). For resource "
        "types that do not support multiple instances, a single resource ID is "
        "listed.",
    )

    error: Optional[str] = Field(
        default=None,
        title="An error message describing why the service connector instance "
        "could not list the resources that it is configured to access.",
    )
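
A minimal construction sketch, assuming the model can be imported from zenml.models; the resource type strings and messages are illustrative:

from zenml.models import ServiceConnectorTypedResourcesModel  # assumed import path

# One resource type that was listed successfully...
listed = ServiceConnectorTypedResourcesModel(
    resource_type="kubernetes-cluster",
    resource_ids=["my-cluster"],
)

# ...and one that could not be listed, carrying an error instead of IDs.
failed = ServiceConnectorTypedResourcesModel(
    resource_type="s3-bucket",
    error="could not list buckets: access denied",
)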

ServiceConnectorUpdate

Bases: BaseUpdate

Model used for service connector updates.

Most fields in the update model are optional and will not be updated if omitted. However, the following fields are "special" and leaving them out will also cause the corresponding value to be removed from the service connector in the database:

  • the resource_id field
  • the expiration_seconds field

In addition to the above exceptions, the following rules apply:

  • the configuration and secrets fields together represent a full valid configuration update, not just a partial update. If either is set (i.e. not None) in the update, their values are merged together and will replace the existing configuration and secrets values.
  • the labels field is also a full labels update: if set (i.e. not None), all existing labels are removed and replaced by the new labels in the update.

NOTE: the attributes here override the ones in the base class, so they have a None default value.
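
A minimal sketch of the replacement semantics described above, assuming the model can be imported from zenml.models (values are illustrative; the update would then be passed to a client or store update call, not shown here):

from zenml.models import ServiceConnectorUpdate  # assumed import path

update = ServiceConnectorUpdate(
    description="Connector for the staging AWS account",
    # `configuration` and `secrets` together form a complete replacement of
    # the stored configuration, not a per-key merge.
    configuration={"region": "eu-west-1"},
    # `labels`, when set, replaces all existing labels.
    labels={"team": "ml-platform"},
)
# Fields left as None are not updated, except for `resource_id` and
# `expiration_seconds`, which are removed when omitted.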

Source code in src/zenml/models/v2/core/service_connector.py
class ServiceConnectorUpdate(BaseUpdate):
    """Model used for service connector updates.

    Most fields in the update model are optional and will not be updated if
    omitted. However, the following fields are "special" and leaving them out
    will also cause the corresponding value to be removed from the service
    connector in the database:

    * the `resource_id` field
    * the `expiration_seconds` field

    In addition to the above exceptions, the following rules apply:

    * the `configuration` and `secrets` fields together represent a full
    valid configuration update, not just a partial update. If either is
    set (i.e. not None) in the update, their values are merged together and
    will replace the existing configuration and secrets values.
    * the `labels` field is also a full labels update: if set (i.e. not
    `None`), all existing labels are removed and replaced by the new labels
    in the update.

    NOTE: the attributes here override the ones in the base class, so they
    have a None default value.
    """

    name: Optional[str] = Field(
        title="The service connector name.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    connector_type: Optional[Union[str, "ServiceConnectorTypeModel"]] = Field(
        title="The type of service connector.",
        default=None,
        union_mode="left_to_right",
    )
    description: Optional[str] = Field(
        title="The service connector instance description.",
        default=None,
    )
    auth_method: Optional[str] = Field(
        title="The authentication method that the connector instance uses to "
        "access the resources.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    resource_types: Optional[List[str]] = Field(
        title="The type(s) of resource that the connector instance can be used "
        "to gain access to.",
        default=None,
    )
    resource_id: Optional[str] = Field(
        title="Uniquely identifies a specific resource instance that the "
        "connector instance can be used to access. If omitted, the "
        "connector instance can be used to access any and all resource "
        "instances that the authentication method and resource type(s) "
        "allow.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    supports_instances: Optional[bool] = Field(
        title="Indicates whether the connector instance can be used to access "
        "multiple instances of the configured resource type.",
        default=None,
    )
    expires_at: Optional[datetime] = Field(
        title="Time when the authentication credentials configured for the "
        "connector expire. If omitted, the credentials do not expire.",
        default=None,
    )
    expires_skew_tolerance: Optional[int] = Field(
        title="The number of seconds of tolerance to apply when checking "
        "whether the authentication credentials configured for the "
        "connector have expired. If omitted, no tolerance is applied.",
        default=None,
    )
    expiration_seconds: Optional[int] = Field(
        title="The duration, in seconds, that the temporary credentials "
        "generated by this connector should remain valid. Only "
        "applicable for connectors and authentication methods that "
        "involve generating temporary credentials from the ones "
        "configured in the connector.",
        default=None,
    )
    configuration: Optional[Dict[str, Any]] = Field(
        title="The service connector configuration, not including secrets.",
        default=None,
    )
    secrets: Optional[Dict[str, Optional[PlainSerializedSecretStr]]] = Field(
        title="The service connector secrets.",
        default=None,
    )
    labels: Optional[Dict[str, str]] = Field(
        title="Service connector labels.",
        default=None,
    )

    # Analytics
    ANALYTICS_FIELDS: ClassVar[List[str]] = [
        "connector_type",
        "auth_method",
        "resource_types",
    ]

    def get_analytics_metadata(self) -> Dict[str, Any]:
        """Format the resource types in the analytics metadata.

        Returns:
            Dict of analytics metadata.
        """
        metadata = super().get_analytics_metadata()

        if self.resource_types is not None:
            if len(self.resource_types) == 1:
                metadata["resource_types"] = self.resource_types[0]
            else:
                metadata["resource_types"] = ", ".join(self.resource_types)

        if self.connector_type is not None:
            metadata["connector_type"] = self.type

        return metadata

    # Helper methods
    @property
    def type(self) -> Optional[str]:
        """Get the connector type.

        Returns:
            The connector type.
        """
        if self.connector_type is not None:
            if isinstance(self.connector_type, str):
                return self.connector_type
            return self.connector_type.connector_type
        return None

    def validate_and_configure_resources(
        self,
        connector_type: "ServiceConnectorTypeModel",
        resource_types: Optional[Union[str, List[str]]] = None,
        resource_id: Optional[str] = None,
        configuration: Optional[Dict[str, Any]] = None,
        secrets: Optional[Dict[str, Optional[SecretStr]]] = None,
    ) -> None:
        """Validate and configure the resources that the connector can be used to access.

        Args:
            connector_type: The connector type specification used to validate
                the connector configuration.
            resource_types: The type(s) of resource that the connector instance
                can be used to access. If omitted, a multi-type connector is
                configured.
            resource_id: Uniquely identifies a specific resource instance that
                the connector instance can be used to access.
            configuration: The connector configuration.
            secrets: The connector secrets.
        """
        _validate_and_configure_resources(
            connector=self,
            connector_type=connector_type,
            resource_types=resource_types,
            resource_id=resource_id,
            configuration=configuration,
            secrets=secrets,
        )

    def convert_to_request(self) -> "ServiceConnectorRequest":
        """Method to generate a service connector request object from self.

        For certain operations, the service connector update model needs to
        adhere to the limitations set by the request model. In order to use
        update models in such situations, we need to be able to convert an
        update model into a request model.

        Returns:
            The equivalent request model

        Raises:
            RuntimeError: if the model can not be converted to a request model.
        """
        try:
            return ServiceConnectorRequest.model_validate(self.model_dump())
        except ValidationError as e:
            raise RuntimeError(
                "The service connector update model can not be converted into "
                f"an equivalent request model: {e}"
            )

type property

Get the connector type.

Returns:

Type Description
Optional[str]

The connector type.

convert_to_request()

Method to generate a service connector request object from self.

For certain operations, the service connector update model needs to adhere to the limitations set by the request model. In order to use update models in such situations, we need to be able to convert an update model into a request model.

Returns:

Type Description
ServiceConnectorRequest

The equivalent request model

Raises:

Type Description
RuntimeError

if the model can not be converted to a request model.

Source code in src/zenml/models/v2/core/service_connector.py
def convert_to_request(self) -> "ServiceConnectorRequest":
    """Method to generate a service connector request object from self.

    For certain operations, the service connector update model needs to
    adhere to the limitations set by the request model. In order to use
    update models in such situations, we need to be able to convert an
    update model into a request model.

    Returns:
        The equivalent request model

    Raises:
        RuntimeError: if the model can not be converted to a request model.
    """
    try:
        return ServiceConnectorRequest.model_validate(self.model_dump())
    except ValidationError as e:
        raise RuntimeError(
            "The service connector update model can not be converted into "
            f"an equivalent request model: {e}"
        )
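
A minimal sketch of converting an update into a request, assuming the model can be imported from zenml.models; whether the conversion succeeds depends on which fields the request model requires (values are illustrative):

from zenml.models import ServiceConnectorUpdate  # assumed import path

update = ServiceConnectorUpdate(
    name="my-aws-connector",
    connector_type="aws",          # illustrative connector type
    auth_method="secret-key",      # illustrative auth method
    configuration={"region": "eu-west-1"},
)

try:
    request = update.convert_to_request()
except RuntimeError:
    # Raised when the update lacks fields required by the request model.
    ...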

get_analytics_metadata()

Format the resource types in the analytics metadata.

Returns:

Type Description
Dict[str, Any]

Dict of analytics metadata.

Source code in src/zenml/models/v2/core/service_connector.py
def get_analytics_metadata(self) -> Dict[str, Any]:
    """Format the resource types in the analytics metadata.

    Returns:
        Dict of analytics metadata.
    """
    metadata = super().get_analytics_metadata()

    if self.resource_types is not None:
        if len(self.resource_types) == 1:
            metadata["resource_types"] = self.resource_types[0]
        else:
            metadata["resource_types"] = ", ".join(self.resource_types)

    if self.connector_type is not None:
        metadata["connector_type"] = self.type

    return metadata

validate_and_configure_resources(connector_type, resource_types=None, resource_id=None, configuration=None, secrets=None)

Validate and configure the resources that the connector can be used to access.

Parameters:

Name Type Description Default
connector_type ServiceConnectorTypeModel

The connector type specification used to validate the connector configuration.

required
resource_types Optional[Union[str, List[str]]]

The type(s) of resource that the connector instance can be used to access. If omitted, a multi-type connector is configured.

None
resource_id Optional[str]

Uniquely identifies a specific resource instance that the connector instance can be used to access.

None
configuration Optional[Dict[str, Any]]

The connector configuration.

None
secrets Optional[Dict[str, Optional[SecretStr]]]

The connector secrets.

None
Source code in src/zenml/models/v2/core/service_connector.py
def validate_and_configure_resources(
    self,
    connector_type: "ServiceConnectorTypeModel",
    resource_types: Optional[Union[str, List[str]]] = None,
    resource_id: Optional[str] = None,
    configuration: Optional[Dict[str, Any]] = None,
    secrets: Optional[Dict[str, Optional[SecretStr]]] = None,
) -> None:
    """Validate and configure the resources that the connector can be used to access.

    Args:
        connector_type: The connector type specification used to validate
            the connector configuration.
        resource_types: The type(s) of resource that the connector instance
            can be used to access. If omitted, a multi-type connector is
            configured.
        resource_id: Uniquely identifies a specific resource instance that
            the connector instance can be used to access.
        configuration: The connector configuration.
        secrets: The connector secrets.
    """
    _validate_and_configure_resources(
        connector=self,
        connector_type=connector_type,
        resource_types=resource_types,
        resource_id=resource_id,
        configuration=configuration,
        secrets=secrets,
    )

ServiceFilter

Bases: ProjectScopedFilter

Model to enable advanced filtering of services.
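
A minimal construction sketch, assuming the model can be imported from zenml.models; the field values are illustrative and the listing call that consumes the filter is not shown:

from zenml.models import ServiceFilter  # assumed import path

# Narrow a services listing to running services deployed by a specific
# pipeline; the type and flavor values are illustrative.
service_filter = ServiceFilter(
    running=True,
    pipeline_name="training_pipeline",
)
service_filter.set_type("model-serving")
service_filter.set_flavor("mlflow")

# The filter is then passed to whatever listing call accepts a
# ServiceFilter (for example a client or store method, not shown here).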

Source code in src/zenml/models/v2/core/service.py
class ServiceFilter(ProjectScopedFilter):
    """Model to enable advanced filtering of services."""

    name: Optional[str] = Field(
        default=None,
        description="Name of the service. Use this to filter services by "
        "their name.",
    )
    type: Optional[str] = Field(
        default=None,
        description="Type of the service. Filter services by their type.",
    )
    flavor: Optional[str] = Field(
        default=None,
        description="Flavor of the service. Use this to filter services by "
        "their flavor.",
    )
    config: Optional[bytes] = Field(
        default=None,
        description="Config of the service. Use this to filter services by "
        "their config.",
    )
    pipeline_name: Optional[str] = Field(
        default=None,
        description="Pipeline name responsible for deploying the service",
    )
    pipeline_step_name: Optional[str] = Field(
        default=None,
        description="Pipeline step name responsible for deploying the service",
    )
    running: Optional[bool] = Field(
        default=None, description="Whether the service is running"
    )
    model_version_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="By the model version this service is attached to.",
        union_mode="left_to_right",
    )
    pipeline_run_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="By the pipeline run this service is attached to.",
        union_mode="left_to_right",
    )

    # TODO: In Pydantic v2, the `model_` prefix is a protected namespace for
    #  all fields defined under base models. If not handled, this raises a
    #  warning. It is possible to suppress this warning message with the
    #  following configuration; however, the ultimate solution is to rename
    #  these fields. Even though they do not cause any problems right now, if
    #  we are not careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

    def set_type(self, type: str) -> None:
        """Set the type of the service.

        Args:
            type: The type of the service.
        """
        self.type = type

    def set_flavor(self, flavor: str) -> None:
        """Set the flavor of the service.

        Args:
            flavor: The flavor of the service.
        """
        self.flavor = flavor

    # The following fields are excluded from the generic filtering logic and
    # are handled separately (see `generate_filter`).
    FILTER_EXCLUDE_FIELDS = [
        *ProjectScopedFilter.FILTER_EXCLUDE_FIELDS,
        "flavor",
        "type",
        "pipeline_step_name",
        "running",
        "pipeline_name",
        "config",
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CLI_EXCLUDE_FIELDS,
        "flavor",
        "type",
        "pipeline_step_name",
        "running",
        "pipeline_name",
    ]

    def generate_filter(
        self, table: Type["AnySchema"]
    ) -> Union["ColumnElement[bool]"]:
        """Generate the filter for the query.

        Services can be scoped by type to narrow the search.

        Args:
            table: The Table that is being queried from.

        Returns:
            The filter expression for the query.
        """
        from sqlmodel import and_

        base_filter = super().generate_filter(table)

        if self.type:
            type_filter = getattr(table, "type") == self.type
            base_filter = and_(base_filter, type_filter)

        if self.flavor:
            flavor_filter = getattr(table, "flavor") == self.flavor
            base_filter = and_(base_filter, flavor_filter)

        if self.pipeline_name:
            pipeline_name_filter = (
                getattr(table, "pipeline_name") == self.pipeline_name
            )
            base_filter = and_(base_filter, pipeline_name_filter)

        if self.pipeline_step_name:
            pipeline_step_name_filter = (
                getattr(table, "pipeline_step_name") == self.pipeline_step_name
            )
            base_filter = and_(base_filter, pipeline_step_name_filter)

        return base_filter

generate_filter(table)

Generate the filter for the query.

Services can be scoped by type to narrow the search.

Parameters:

Name Type Description Default
table Type[AnySchema]

The Table that is being queried from.

required

Returns:

Type Description
Union[ColumnElement[bool]]

The filter expression for the query.

Source code in src/zenml/models/v2/core/service.py
def generate_filter(
    self, table: Type["AnySchema"]
) -> Union["ColumnElement[bool]"]:
    """Generate the filter for the query.

    Services can be scoped by type to narrow the search.

    Args:
        table: The Table that is being queried from.

    Returns:
        The filter expression for the query.
    """
    from sqlmodel import and_

    base_filter = super().generate_filter(table)

    if self.type:
        type_filter = getattr(table, "type") == self.type
        base_filter = and_(base_filter, type_filter)

    if self.flavor:
        flavor_filter = getattr(table, "flavor") == self.flavor
        base_filter = and_(base_filter, flavor_filter)

    if self.pipeline_name:
        pipeline_name_filter = (
            getattr(table, "pipeline_name") == self.pipeline_name
        )
        base_filter = and_(base_filter, pipeline_name_filter)

    if self.pipeline_step_name:
        pipeline_step_name_filter = (
            getattr(table, "pipeline_step_name") == self.pipeline_step_name
        )
        base_filter = and_(base_filter, pipeline_step_name_filter)

    return base_filter

set_flavor(flavor)

Set the flavor of the service.

Parameters:

Name Type Description Default
flavor str

The flavor of the service.

required
Source code in src/zenml/models/v2/core/service.py
def set_flavor(self, flavor: str) -> None:
    """Set the flavor of the service.

    Args:
        flavor: The flavor of the service.
    """
    self.flavor = flavor

set_type(type)

Set the type of the service.

Parameters:

Name Type Description Default
type str

The type of the service.

required
Source code in src/zenml/models/v2/core/service.py
def set_type(self, type: str) -> None:
    """Set the type of the service.

    Args:
        type: The type of the service.
    """
    self.type = type
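
A minimal sketch of populating the filter, assuming ServiceFilter is re-exported from zenml.models; the "contains:" operator syntax and the type/flavor values are illustrative.

from zenml.models import ServiceFilter

# Filter for running MLflow model-serving services whose name contains "deploy".
service_filter = ServiceFilter(name="contains:deploy", running=True)
service_filter.set_type("model-serving")  # helper documented above
service_filter.set_flavor("mlflow")       # helper documented above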

ServiceRequest

Bases: ProjectScopedRequest

Request model for services.

Source code in src/zenml/models/v2/core/service.py
class ServiceRequest(ProjectScopedRequest):
    """Request model for services."""

    name: str = Field(
        title="The name of the service.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    service_type: ServiceType = Field(
        title="The type of the service.",
    )
    service_source: Optional[str] = Field(
        title="The class of the service.",
        description="The fully qualified class name of the service "
        "implementation.",
        default=None,
    )
    admin_state: Optional[ServiceState] = Field(
        title="The admin state of the service.",
        description="The administrative state of the service, e.g., ACTIVE, "
        "INACTIVE.",
        default=None,
    )
    config: Dict[str, Any] = Field(
        title="The service config.",
        description="A dictionary containing configuration parameters for the "
        "service.",
    )
    labels: Optional[Dict[str, str]] = Field(
        default=None,
        title="The service labels.",
    )
    status: Optional[Dict[str, Any]] = Field(
        default=None,
        title="The status of the service.",
    )
    endpoint: Optional[Dict[str, Any]] = Field(
        default=None,
        title="The service endpoint.",
    )
    prediction_url: Optional[str] = Field(
        default=None,
        title="The service endpoint URL.",
    )
    health_check_url: Optional[str] = Field(
        default=None,
        title="The service health check URL.",
    )
    model_version_id: Optional[UUID] = Field(
        default=None,
        title="The model version id linked to the service.",
    )
    pipeline_run_id: Optional[UUID] = Field(
        default=None,
        title="The pipeline run id linked to the service.",
    )

    # TODO: In Pydantic v2, the `model_` prefix is a protected namespace for
    #  all fields defined under base models. If not handled, this raises a
    #  warning. It is possible to suppress this warning message with the
    #  following configuration; however, the ultimate solution is to rename
    #  these fields. Even though they do not cause any problems right now, if
    #  we are not careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())
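
A hedged construction sketch using only the fields documented above; the import path and the `project` scoping field inherited from ProjectScopedRequest are assumptions and may differ between ZenML versions.

from uuid import uuid4
from zenml.models import ServiceRequest, ServiceType

service_request = ServiceRequest(
    project=uuid4(),  # assumed scoping field inherited from ProjectScopedRequest
    name="sklearn-deployment",
    service_type=ServiceType(type="model-serving", flavor="mlflow"),
    config={"model_uri": "s3://my-bucket/model", "workers": 1},  # illustrative config
    labels={"team": "ml-platform"},
)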

ServiceResponse

Bases: ProjectScopedResponse[ServiceResponseBody, ServiceResponseMetadata, ServiceResponseResources]

Response model for services.

Source code in src/zenml/models/v2/core/service.py
class ServiceResponse(
    ProjectScopedResponse[
        ServiceResponseBody, ServiceResponseMetadata, ServiceResponseResources
    ]
):
    """Response model for services."""

    name: str = Field(
        title="The name of the service.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "ServiceResponse":
        """Get the hydrated version of this artifact.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_service(self.id)

    # Body and metadata properties

    @property
    def service_type(self) -> ServiceType:
        """The `service_type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().service_type

    @property
    def labels(self) -> Optional[Dict[str, str]]:
        """The `labels` property.

        Returns:
            the value of the property.
        """
        return self.get_body().labels

    @property
    def service_source(self) -> Optional[str]:
        """The `service_source` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().service_source

    @property
    def config(self) -> Dict[str, Any]:
        """The `config` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().config

    @property
    def status(self) -> Optional[Dict[str, Any]]:
        """The `status` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().status

    @property
    def endpoint(self) -> Optional[Dict[str, Any]]:
        """The `endpoint` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().endpoint

    @property
    def created(self) -> datetime:
        """The `created` property.

        Returns:
            the value of the property.
        """
        return self.get_body().created

    @property
    def updated(self) -> datetime:
        """The `updated` property.

        Returns:
            the value of the property.
        """
        return self.get_body().updated

    @property
    def admin_state(self) -> Optional[ServiceState]:
        """The `admin_state` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().admin_state

    @property
    def prediction_url(self) -> Optional[str]:
        """The `prediction_url` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().prediction_url

    @property
    def health_check_url(self) -> Optional[str]:
        """The `health_check_url` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().health_check_url

    @property
    def state(self) -> Optional[ServiceState]:
        """The `state` property.

        Returns:
            the value of the property.
        """
        return self.get_body().state

    @property
    def pipeline_run(self) -> Optional["PipelineRunResponse"]:
        """The `pipeline_run` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().pipeline_run

    @property
    def model_version(self) -> Optional["ModelVersionResponse"]:
        """The `model_version` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().model_version

admin_state property

The admin_state property.

Returns:

Type Description
Optional[ServiceState]

the value of the property.

config property

The config property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

created property

The created property.

Returns:

Type Description
datetime

the value of the property.

endpoint property

The endpoint property.

Returns:

Type Description
Optional[Dict[str, Any]]

the value of the property.

health_check_url property

The health_check_url property.

Returns:

Type Description
Optional[str]

the value of the property.

labels property

The labels property.

Returns:

Type Description
Optional[Dict[str, str]]

the value of the property.

model_version property

The model_version property.

Returns:

Type Description
Optional[ModelVersionResponse]

the value of the property.

pipeline_run property

The pipeline_run property.

Returns:

Type Description
Optional[PipelineRunResponse]

the value of the property.

prediction_url property

The prediction_url property.

Returns:

Type Description
Optional[str]

the value of the property.

service_source property

The service_source property.

Returns:

Type Description
Optional[str]

the value of the property.

service_type property

The service_type property.

Returns:

Type Description
ServiceType

the value of the property.

state property

The state property.

Returns:

Type Description
Optional[ServiceState]

the value of the property.

status property

The status property.

Returns:

Type Description
Optional[Dict[str, Any]]

the value of the property.

updated property

The updated property.

Returns:

Type Description
datetime

the value of the property.

get_hydrated_version()

Get the hydrated version of this service.

Returns:

Type Description
ServiceResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/service.py
def get_hydrated_version(self) -> "ServiceResponse":
    """Get the hydrated version of this artifact.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_service(self.id)
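
A sketch of typical read access; the properties and get_hydrated_version() are documented above, while fetching the response through Client().get_service(...) is an assumption about the client API.

from zenml.client import Client

service = Client().get_service("sklearn-deployment")  # assumed client call
service = service.get_hydrated_version()              # attach the metadata fields

print(service.service_type.type, service.service_type.flavor)
print(service.state)           # current operational state, if reported
print(service.prediction_url)  # None if the service exposes no endpoint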

ServiceResponseBody

Bases: ProjectScopedResponseBody

Response body for services.

Source code in src/zenml/models/v2/core/service.py
class ServiceResponseBody(ProjectScopedResponseBody):
    """Response body for services."""

    service_type: ServiceType = Field(
        title="The type of the service.",
    )
    labels: Optional[Dict[str, str]] = Field(
        default=None,
        title="The service labels.",
    )
    created: datetime = Field(
        title="The timestamp when this component was created."
    )
    updated: datetime = Field(
        title="The timestamp when this component was last updated.",
    )
    state: Optional[ServiceState] = Field(
        default=None,
        title="The current state of the service.",
    )

ServiceResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for services.

Source code in src/zenml/models/v2/core/service.py
class ServiceResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for services."""

    service_source: Optional[str] = Field(
        title="The class of the service.",
    )
    admin_state: Optional[ServiceState] = Field(
        title="The admin state of the service.",
    )
    config: Dict[str, Any] = Field(
        title="The service config.",
    )
    status: Optional[Dict[str, Any]] = Field(
        title="The status of the service.",
    )
    endpoint: Optional[Dict[str, Any]] = Field(
        default=None,
        title="The service endpoint.",
    )
    prediction_url: Optional[str] = Field(
        default=None,
        title="The service endpoint URL.",
    )
    health_check_url: Optional[str] = Field(
        default=None,
        title="The service health check URL.",
    )

ServiceResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the service entity.

Source code in src/zenml/models/v2/core/service.py
class ServiceResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the service entity."""

    pipeline_run: Optional["PipelineRunResponse"] = Field(
        default=None,
        title="The pipeline run associated with the service.",
    )
    model_version: Optional["ModelVersionResponse"] = Field(
        default=None,
        title="The model version associated with the service.",
    )

    # TODO: In Pydantic v2, the `model_` prefix is a protected namespace for
    #  all fields defined under base models. If not handled, this raises a
    #  warning. It is possible to suppress this warning message with the
    #  following configuration; however, the ultimate solution is to rename
    #  these fields. Even though they do not cause any problems right now, if
    #  we are not careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

ServiceType

Bases: BaseModel

Service type descriptor.

Attributes:

Name Type Description
type str

service type

flavor str

service flavor

name str

name of the service type

description str

description of the service type

logo_url str

logo of the service type

Source code in src/zenml/models/v2/misc/service.py
class ServiceType(BaseModel):
    """Service type descriptor.

    Attributes:
        type: service type
        flavor: service flavor
        name: name of the service type
        description: description of the service type
        logo_url: logo of the service type
    """

    type: str
    flavor: str
    name: str = ""
    description: str = ""
    logo_url: str = ""

    model_config = ConfigDict(
        # make the service type immutable and hashable
        frozen=True
    )
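
Because the model config sets frozen=True, instances are immutable and hashable, so they can be used as dictionary keys or set members; a small sketch (import path assumed, values illustrative):

from zenml.models import ServiceType

serving = ServiceType(type="model-serving", flavor="mlflow")
monitoring = ServiceType(type="model-monitoring", flavor="evidently")

# frozen=True makes instances hashable, so they work as dictionary keys.
registry = {serving: "MLflow deployment", monitoring: "Evidently monitor"}
assert serving in registry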

ServiceUpdate

Bases: BaseUpdate

Update model for services.

Source code in src/zenml/models/v2/core/service.py
class ServiceUpdate(BaseUpdate):
    """Update model for stack components."""

    name: Optional[str] = Field(
        None,
        title="The name of the service.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    admin_state: Optional[ServiceState] = Field(
        None,
        title="The admin state of the service.",
        description="The administrative state of the service, e.g., ACTIVE, "
        "INACTIVE.",
    )
    service_source: Optional[str] = Field(
        None,
        title="The class of the service.",
        description="The fully qualified class name of the service "
        "implementation.",
    )
    status: Optional[Dict[str, Any]] = Field(
        None,
        title="The status of the service.",
    )
    endpoint: Optional[Dict[str, Any]] = Field(
        None,
        title="The service endpoint.",
    )
    prediction_url: Optional[str] = Field(
        None,
        title="The service endpoint URL.",
    )
    health_check_url: Optional[str] = Field(
        None,
        title="The service health check URL.",
    )
    labels: Optional[Dict[str, str]] = Field(
        default=None,
        title="The service labels.",
    )
    model_version_id: Optional[UUID] = Field(
        default=None,
        title="The model version id linked to the service.",
    )

    # TODO: In Pydantic v2, the `model_` prefix is a protected namespace for
    #  all fields defined under base models. If not handled, this raises a
    #  warning. It is possible to suppress this warning message with the
    #  following configuration; however, the ultimate solution is to rename
    #  these fields. Even though they do not cause any problems right now, if
    #  we are not careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())
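
A hedged sketch of a partial update: only the fields that should change are set, and everything left at None is ignored when the update is applied (import path assumed, values illustrative).

from zenml.models import ServiceUpdate

service_update = ServiceUpdate(
    name="sklearn-deployment-v2",
    labels={"team": "ml-platform", "stage": "staging"},
    prediction_url="http://10.0.0.12:8001/invocations",
)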

StackDeploymentConfig

Bases: BaseModel

Configuration about a stack deployment.

Source code in src/zenml/models/v2/misc/stack_deployment.py
class StackDeploymentConfig(BaseModel):
    """Configuration about a stack deployment."""

    deployment_url: str = Field(
        title="The cloud provider console URL where the stack will be deployed.",
    )
    deployment_url_text: str = Field(
        title="A textual description for the cloud provider console URL.",
    )
    configuration: Optional[str] = Field(
        default=None,
        title="Configuration for the stack deployment that the user must "
        "manually configure into the cloud provider console.",
    )
    instructions: Optional[str] = Field(
        default=None,
        title="Instructions for deploying the stack.",
    )

StackDeploymentInfo

Bases: BaseModel

Information about a stack deployment.

Source code in src/zenml/models/v2/misc/stack_deployment.py
class StackDeploymentInfo(BaseModel):
    """Information about a stack deployment."""

    provider: StackDeploymentProvider = Field(
        title="The provider of the stack deployment."
    )
    description: str = Field(
        title="The description of the stack deployment.",
        description="The description of the stack deployment.",
    )
    instructions: str = Field(
        title="The instructions for deploying the stack.",
        description="The instructions for deploying the stack.",
    )
    post_deploy_instructions: str = Field(
        title="The instructions for post-deployment.",
        description="The instructions for post-deployment.",
    )
    integrations: List[str] = Field(
        title="ZenML integrations required for the stack.",
        description="The list of ZenML integrations that need to be installed "
        "for the stack to be usable.",
    )
    permissions: Dict[str, List[str]] = Field(
        title="The permissions granted to ZenML to access the cloud resources.",
        description="The permissions granted to ZenML to access the cloud "
        "resources, as a dictionary grouping permissions by resource.",
    )
    locations: Dict[str, str] = Field(
        title="The locations where the stack can be deployed.",
        description="The locations where the stack can be deployed, as a "
        "dictionary mapping location names to descriptions.",
    )
    skypilot_default_regions: Dict[str, str] = Field(
        title="The locations where the Skypilot clusters can be deployed by default.",
        description="The locations where the Skypilot clusters can be deployed by default, as a "
        "dictionary mapping location names to descriptions.",
    )

StackFilter

Bases: UserScopedFilter

Model to enable advanced stack filtering.

Source code in src/zenml/models/v2/core/stack.py
class StackFilter(UserScopedFilter):
    """Model to enable advanced stack filtering."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *UserScopedFilter.FILTER_EXCLUDE_FIELDS,
        "component_id",
        "component",
    ]

    name: Optional[str] = Field(
        default=None,
        description="Name of the stack",
    )
    description: Optional[str] = Field(
        default=None, description="Description of the stack"
    )
    component_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Component in the stack",
        union_mode="left_to_right",
    )
    component: Optional[Union[UUID, str]] = Field(
        default=None, description="Name/ID of a component in the stack."
    )

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from zenml.zen_stores.schemas import (
            StackComponentSchema,
            StackCompositionSchema,
            StackSchema,
        )

        if self.component_id:
            component_id_filter = and_(
                StackCompositionSchema.stack_id == StackSchema.id,
                StackCompositionSchema.component_id == self.component_id,
            )
            custom_filters.append(component_id_filter)

        if self.component:
            component_filter = and_(
                StackCompositionSchema.stack_id == StackSchema.id,
                StackCompositionSchema.component_id == StackComponentSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.component,
                    table=StackComponentSchema,
                ),
            )
            custom_filters.append(component_filter)

        return custom_filters

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/core/stack.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from zenml.zen_stores.schemas import (
        StackComponentSchema,
        StackCompositionSchema,
        StackSchema,
    )

    if self.component_id:
        component_id_filter = and_(
            StackCompositionSchema.stack_id == StackSchema.id,
            StackCompositionSchema.component_id == self.component_id,
        )
        custom_filters.append(component_id_filter)

    if self.component:
        component_filter = and_(
            StackCompositionSchema.stack_id == StackSchema.id,
            StackCompositionSchema.component_id == StackComponentSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.component,
                table=StackComponentSchema,
            ),
        )
        custom_filters.append(component_filter)

    return custom_filters
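
A hedged sketch of filtering stacks by a contained component; the `component` field takes a component name or ID as documented above, while the import path and the "startswith:" operator syntax are assumptions.

from zenml.models import StackFilter

stack_filter = StackFilter(
    name="startswith:prod",         # assumed string-filter operator syntax
    component="s3_artifact_store",  # name or ID of a component in the stack
)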

StackRequest

Bases: UserScopedRequest

Request model for stack creation.

Source code in src/zenml/models/v2/core/stack.py
class StackRequest(UserScopedRequest):
    """Request model for stack creation."""

    name: str = Field(
        title="The name of the stack.", max_length=STR_FIELD_MAX_LENGTH
    )
    description: str = Field(
        default="",
        title="The description of the stack",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    stack_spec_path: Optional[str] = Field(
        default=None,
        title="The path to the stack spec used for mlstacks deployments.",
    )
    components: Dict[StackComponentType, List[Union[UUID, ComponentInfo]]] = (
        Field(
            title="The mapping for the components of the full stack registration.",
            description="The mapping from component types to either UUIDs of "
            "existing components or request information for brand new "
            "components.",
        )
    )
    labels: Optional[Dict[str, Any]] = Field(
        default=None,
        title="The stack labels.",
    )
    service_connectors: List[Union[UUID, ServiceConnectorInfo]] = Field(
        default=[],
        title="The service connectors dictionary for the full stack "
        "registration.",
        description="The UUID of an already existing service connector or "
        "request information to create a service connector from "
        "scratch.",
    )

    @field_validator("components")
    def _validate_components(
        cls, value: Dict[StackComponentType, List[Union[UUID, ComponentInfo]]]
    ) -> Dict[StackComponentType, List[Union[UUID, ComponentInfo]]]:
        """Validate the components of the stack.

        Args:
            value: The components of the stack.

        Raises:
            ValueError: If the stack does not contain an orchestrator and
                artifact store.

        Returns:
            The components of the stack.
        """
        if value:
            artifact_stores = value.get(StackComponentType.ARTIFACT_STORE, [])
            orchestrators = value.get(StackComponentType.ORCHESTRATOR, [])

            if orchestrators and artifact_stores:
                return value

        raise ValueError(
            "Stack must contain at least an orchestrator and artifact store."
        )

    @model_validator(mode="after")
    def _validate_indexes_in_components(self) -> "StackRequest":
        for components in self.components.values():
            for component in components:
                if isinstance(component, ComponentInfo):
                    if component.service_connector_index is not None:
                        if (
                            component.service_connector_index < 0
                            or component.service_connector_index
                            >= len(self.service_connectors)
                        ):
                            raise ValueError(
                                f"Service connector index "
                                f"{component.service_connector_index} "
                                "is out of range. Please provide a valid index "
                                "referring to the position in the list of service "
                                "connectors."
                            )
        return self
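
A hedged construction sketch illustrating the components validator: both an orchestrator and an artifact store must be present or a ValueError is raised. The component IDs are random placeholders, and any user-scoping fields inherited from UserScopedRequest are omitted here and may be required depending on the ZenML version.

from uuid import uuid4
from zenml.enums import StackComponentType
from zenml.models import StackRequest

stack_request = StackRequest(
    name="local-dev-stack",
    description="Local orchestrator plus local artifact store",
    components={
        # Both keys are mandatory; omitting either fails validation.
        StackComponentType.ORCHESTRATOR: [uuid4()],
        StackComponentType.ARTIFACT_STORE: [uuid4()],
    },
)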

StackResponse

Bases: UserScopedResponse[StackResponseBody, StackResponseMetadata, StackResponseResources]

Response model for stacks.

Source code in src/zenml/models/v2/core/stack.py
class StackResponse(
    UserScopedResponse[
        StackResponseBody,
        StackResponseMetadata,
        StackResponseResources,
    ]
):
    """Response model for stacks."""

    name: str = Field(
        title="The name of the stack.", max_length=STR_FIELD_MAX_LENGTH
    )

    def get_hydrated_version(self) -> "StackResponse":
        """Get the hydrated version of this stack.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_stack(self.id)

    # Helper methods
    @property
    def is_valid(self) -> bool:
        """Check if the stack is valid.

        Returns:
            True if the stack is valid, False otherwise.
        """
        return (
            StackComponentType.ARTIFACT_STORE in self.components
            and StackComponentType.ORCHESTRATOR in self.components
        )

    def to_yaml(self) -> Dict[str, Any]:
        """Create yaml representation of the Stack Model.

        Returns:
            The yaml representation of the Stack Model.
        """
        component_data = {}
        for component_type, components_list in self.components.items():
            component = components_list[0]
            component_dict = dict(
                name=component.name,
                type=str(component.type),
                flavor=component.flavor_name,
            )
            configuration = json.loads(
                component.get_metadata().model_dump_json(
                    include={"configuration"}
                )
            )
            component_dict.update(configuration)

            component_data[component_type.value] = component_dict

        # write zenml version and stack dict to YAML
        yaml_data = {
            "stack_name": self.name,
            "components": component_data,
        }

        return yaml_data

    # Analytics
    def get_analytics_metadata(self) -> Dict[str, Any]:
        """Add the stack components to the stack analytics metadata.

        Returns:
            Dict of analytics metadata.
        """
        metadata = super().get_analytics_metadata()
        metadata.update(
            {ct: c[0].flavor_name for ct, c in self.components.items()}
        )

        if self.labels is not None:
            metadata.update(
                {
                    label[6:]: value
                    for label, value in self.labels.items()
                    if label.startswith("zenml:")
                }
            )
        return metadata

    @property
    def description(self) -> Optional[str]:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

    @property
    def stack_spec_path(self) -> Optional[str]:
        """The `stack_spec_path` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().stack_spec_path

    @property
    def components(
        self,
    ) -> Dict[StackComponentType, List["ComponentResponse"]]:
        """The `components` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().components

    @property
    def labels(self) -> Optional[Dict[str, Any]]:
        """The `labels` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().labels

components property

The components property.

Returns:

Type Description
Dict[StackComponentType, List[ComponentResponse]]

the value of the property.

description property

The description property.

Returns:

Type Description
Optional[str]

the value of the property.

is_valid property

Check if the stack is valid.

Returns:

Type Description
bool

True if the stack is valid, False otherwise.

labels property

The labels property.

Returns:

Type Description
Optional[Dict[str, Any]]

the value of the property.

stack_spec_path property

The stack_spec_path property.

Returns:

Type Description
Optional[str]

the value of the property.

get_analytics_metadata()

Add the stack components to the stack analytics metadata.

Returns:

Type Description
Dict[str, Any]

Dict of analytics metadata.

Source code in src/zenml/models/v2/core/stack.py
def get_analytics_metadata(self) -> Dict[str, Any]:
    """Add the stack components to the stack analytics metadata.

    Returns:
        Dict of analytics metadata.
    """
    metadata = super().get_analytics_metadata()
    metadata.update(
        {ct: c[0].flavor_name for ct, c in self.components.items()}
    )

    if self.labels is not None:
        metadata.update(
            {
                label[6:]: value
                for label, value in self.labels.items()
                if label.startswith("zenml:")
            }
        )
    return metadata

get_hydrated_version()

Get the hydrated version of this stack.

Returns:

Type Description
StackResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/stack.py
def get_hydrated_version(self) -> "StackResponse":
    """Get the hydrated version of this stack.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_stack(self.id)

to_yaml()

Create yaml representation of the Stack Model.

Returns:

Type Description
Dict[str, Any]

The yaml representation of the Stack Model.

Source code in src/zenml/models/v2/core/stack.py
def to_yaml(self) -> Dict[str, Any]:
    """Create yaml representation of the Stack Model.

    Returns:
        The yaml representation of the Stack Model.
    """
    component_data = {}
    for component_type, components_list in self.components.items():
        component = components_list[0]
        component_dict = dict(
            name=component.name,
            type=str(component.type),
            flavor=component.flavor_name,
        )
        configuration = json.loads(
            component.get_metadata().model_dump_json(
                include={"configuration"}
            )
        )
        component_dict.update(configuration)

        component_data[component_type.value] = component_dict

    # write zenml version and stack dict to YAML
    yaml_data = {
        "stack_name": self.name,
        "components": component_data,
    }

    return yaml_data
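
A sketch of the helper methods documented above; fetching the stack via Client().get_stack(...) is an assumption about the client API.

from zenml.client import Client

stack = Client().get_stack("local-dev-stack")  # assumed client call
stack = stack.get_hydrated_version()

if stack.is_valid:           # orchestrator and artifact store both present
    spec = stack.to_yaml()   # {"stack_name": ..., "components": {...}}
    print(sorted(spec["components"]))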

StackResponseBody

Bases: UserScopedResponseBody

Response body for stacks.

Source code in src/zenml/models/v2/core/stack.py
class StackResponseBody(UserScopedResponseBody):
    """Response body for stacks."""

StackResponseMetadata

Bases: UserScopedResponseMetadata

Response metadata for stacks.

Source code in src/zenml/models/v2/core/stack.py
class StackResponseMetadata(UserScopedResponseMetadata):
    """Response metadata for stacks."""

    components: Dict[StackComponentType, List["ComponentResponse"]] = Field(
        title="A mapping of stack component types to the actual"
        "instances of components of this type."
    )
    description: Optional[str] = Field(
        default="",
        title="The description of the stack",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    stack_spec_path: Optional[str] = Field(
        default=None,
        title="The path to the stack spec used for mlstacks deployments.",
    )
    labels: Optional[Dict[str, Any]] = Field(
        default=None,
        title="The stack labels.",
    )

StackResponseResources

Bases: UserScopedResponseResources

Response resources for stacks.

Source code in src/zenml/models/v2/core/stack.py
class StackResponseResources(UserScopedResponseResources):
    """Response resources for stacks."""

StackUpdate

Bases: BaseUpdate

Update model for stacks.

Source code in src/zenml/models/v2/core/stack.py
class StackUpdate(BaseUpdate):
    """Update model for stacks."""

    name: Optional[str] = Field(
        title="The name of the stack.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    description: Optional[str] = Field(
        title="The description of the stack",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    stack_spec_path: Optional[str] = Field(
        title="The path to the stack spec used for mlstacks deployments.",
        default=None,
    )
    components: Optional[Dict[StackComponentType, List[UUID]]] = Field(
        title="A mapping of stack component types to the actual"
        "instances of components of this type.",
        default=None,
    )
    labels: Optional[Dict[str, Any]] = Field(
        default=None,
        title="The stack labels.",
    )

    @field_validator("components")
    def _validate_components(
        cls,
        value: Optional[
            Dict[StackComponentType, List[Union[UUID, ComponentInfo]]]
        ],
    ) -> Optional[Dict[StackComponentType, List[Union[UUID, ComponentInfo]]]]:
        """Validate the components of the stack.

        Args:
            value: The components of the stack.

        Raises:
            ValueError: If the stack does not contain an orchestrator and
                artifact store.

        Returns:
            The components of the stack.
        """
        if value is None:
            return None

        if value:
            artifact_stores = value.get(StackComponentType.ARTIFACT_STORE, [])
            orchestrators = value.get(StackComponentType.ORCHESTRATOR, [])

            if orchestrators and artifact_stores:
                return value

        raise ValueError(
            "Stack must contain at least an orchestrator and artifact store."
        )
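
A hedged sketch of a partial stack update; fields left at None are not touched, and if `components` is provided at all it must again contain both an orchestrator and an artifact store (import path assumed).

from zenml.models import StackUpdate

stack_update = StackUpdate(
    description="Promoted to the team-wide default stack",
    labels={"owner": "ml-platform"},
    # components=None leaves the stack composition unchanged.
)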

StepRunFilter

Bases: ProjectScopedFilter, RunMetadataFilterMixin

Model to enable advanced filtering of step runs.

Source code in src/zenml/models/v2/core/step_run.py
class StepRunFilter(ProjectScopedFilter, RunMetadataFilterMixin):
    """Model to enable advanced filtering of step runs."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.FILTER_EXCLUDE_FIELDS,
        *RunMetadataFilterMixin.FILTER_EXCLUDE_FIELDS,
        "model",
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CLI_EXCLUDE_FIELDS,
        *RunMetadataFilterMixin.CLI_EXCLUDE_FIELDS,
    ]
    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *ProjectScopedFilter.CUSTOM_SORTING_OPTIONS,
        *RunMetadataFilterMixin.CUSTOM_SORTING_OPTIONS,
    ]
    API_MULTI_INPUT_PARAMS: ClassVar[List[str]] = [
        *ProjectScopedFilter.API_MULTI_INPUT_PARAMS,
        *RunMetadataFilterMixin.API_MULTI_INPUT_PARAMS,
    ]

    name: Optional[str] = Field(
        default=None,
        description="Name of the step run",
    )
    code_hash: Optional[str] = Field(
        default=None,
        description="Code hash for this step run",
    )
    cache_key: Optional[str] = Field(
        default=None,
        description="Cache key for this step run",
    )
    status: Optional[str] = Field(
        default=None,
        description="Status of the Step Run",
    )
    start_time: Optional[Union[datetime, str]] = Field(
        default=None,
        description="Start time for this run",
        union_mode="left_to_right",
    )
    end_time: Optional[Union[datetime, str]] = Field(
        default=None,
        description="End time for this run",
        union_mode="left_to_right",
    )
    pipeline_run_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Pipeline run of this step run",
        union_mode="left_to_right",
    )
    deployment_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Deployment of this step run",
        union_mode="left_to_right",
    )
    original_step_run_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Original id for this step run",
        union_mode="left_to_right",
    )
    model_version_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Model version associated with the step run.",
        union_mode="left_to_right",
    )
    model: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the model associated with the step run.",
    )
    model_config = ConfigDict(protected_namespaces=())

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_

        from zenml.zen_stores.schemas import (
            ModelSchema,
            ModelVersionSchema,
            StepRunSchema,
        )

        if self.model:
            model_filter = and_(
                StepRunSchema.model_version_id == ModelVersionSchema.id,
                ModelVersionSchema.model_id == ModelSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.model, table=ModelSchema
                ),
            )
            custom_filters.append(model_filter)

        return custom_filters

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/core/step_run.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_

    from zenml.zen_stores.schemas import (
        ModelSchema,
        ModelVersionSchema,
        StepRunSchema,
    )

    if self.model:
        model_filter = and_(
            StepRunSchema.model_version_id == ModelVersionSchema.id,
            ModelVersionSchema.model_id == ModelSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.model, table=ModelSchema
            ),
        )
        custom_filters.append(model_filter)

    return custom_filters
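
A hedged sketch of filtering step runs; `model` accepts a model name or ID and is resolved through the custom filter shown above, while the import path and field values are illustrative.

from uuid import uuid4
from zenml.models import StepRunFilter

step_run_filter = StepRunFilter(
    status="completed",
    pipeline_run_id=uuid4(),    # illustrative run ID
    model="churn_classifier",   # model name or ID, handled by get_custom_filters
)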

StepRunRequest

Bases: ProjectScopedRequest

Request model for step runs.

Source code in src/zenml/models/v2/core/step_run.py
class StepRunRequest(ProjectScopedRequest):
    """Request model for step runs."""

    name: str = Field(
        title="The name of the pipeline run step.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    start_time: Optional[datetime] = Field(
        title="The start time of the step run.",
        default=None,
    )
    end_time: Optional[datetime] = Field(
        title="The end time of the step run.",
        default=None,
    )
    status: ExecutionStatus = Field(title="The status of the step.")
    cache_key: Optional[str] = Field(
        title="The cache key of the step run.",
        default=None,
        max_length=STR_FIELD_MAX_LENGTH,
    )
    code_hash: Optional[str] = Field(
        title="The code hash of the step run.",
        default=None,
        max_length=STR_FIELD_MAX_LENGTH,
    )
    docstring: Optional[str] = Field(
        title="The docstring of the step function or class.",
        default=None,
        max_length=TEXT_FIELD_MAX_LENGTH,
    )
    source_code: Optional[str] = Field(
        title="The source code of the step function or class.",
        default=None,
        max_length=TEXT_FIELD_MAX_LENGTH,
    )
    pipeline_run_id: UUID = Field(
        title="The ID of the pipeline run that this step run belongs to.",
    )
    original_step_run_id: Optional[UUID] = Field(
        title="The ID of the original step run if this step was cached.",
        default=None,
    )
    parent_step_ids: List[UUID] = Field(
        title="The IDs of the parent steps of this step run.",
        default_factory=list,
    )
    inputs: Dict[str, List[UUID]] = Field(
        title="The IDs of the input artifact versions of the step run.",
        default_factory=dict,
    )
    outputs: Dict[str, List[UUID]] = Field(
        title="The IDs of the output artifact versions of the step run.",
        default_factory=dict,
    )
    logs: Optional["LogsRequest"] = Field(
        title="Logs associated with this step run.",
        default=None,
    )

    model_config = ConfigDict(protected_namespaces=())
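
Step run requests are normally created by the orchestration machinery rather than by user code; the hedged sketch below only shows how the documented fields fit together, with the import path, the `project` scoping field and all values being assumptions.

from uuid import uuid4
from zenml.enums import ExecutionStatus
from zenml.models import StepRunRequest

step_run_request = StepRunRequest(
    project=uuid4(),           # assumed scoping field inherited from ProjectScopedRequest
    name="trainer",
    status=ExecutionStatus.RUNNING,
    pipeline_run_id=uuid4(),   # the run this step belongs to
    cache_key="0f3a2b",        # illustrative cache key
)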

StepRunResponse

Bases: ProjectScopedResponse[StepRunResponseBody, StepRunResponseMetadata, StepRunResponseResources]

Response model for step runs.

Source code in src/zenml/models/v2/core/step_run.py
class StepRunResponse(
    ProjectScopedResponse[
        StepRunResponseBody, StepRunResponseMetadata, StepRunResponseResources
    ]
):
    """Response model for step runs."""

    name: str = Field(
        title="The name of the pipeline run step.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "StepRunResponse":
        """Get the hydrated version of this step run.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_run_step(self.id)

    # Helper properties
    @property
    def input(self) -> StepRunInputResponse:
        """Returns the input artifact that was used to run this step.

        Returns:
            The input artifact.

        Raises:
            ValueError: If there were zero or multiple inputs to this step.
        """
        if not self.inputs:
            raise ValueError(f"Step {self.name} has no inputs.")
        if len(self.inputs) > 1 or (
            len(self.inputs) == 1 and len(next(iter(self.inputs.values()))) > 1
        ):
            raise ValueError(
                f"Step {self.name} has multiple inputs, so `Step.input` is "
                "ambiguous. Please use `Step.inputs` instead."
            )
        return next(iter(self.inputs.values()))[0]

    @property
    def output(self) -> ArtifactVersionResponse:
        """Returns the output artifact that was written by this step.

        Returns:
            The output artifact.

        Raises:
            ValueError: If there were zero or multiple step outputs.
        """
        if not self.outputs:
            raise ValueError(f"Step {self.name} has no outputs.")
        if len(self.outputs) > 1 or (
            len(self.outputs) == 1
            and len(next(iter(self.outputs.values()))) > 1
        ):
            raise ValueError(
                f"Step {self.name} has multiple outputs, so `Step.output` is "
                "ambiguous. Please use `Step.outputs` instead."
            )
        return next(iter(self.outputs.values()))[0]

    @property
    def regular_inputs(self) -> Dict[str, StepRunInputResponse]:
        """Returns the regular step inputs of the step run.

        Regular step inputs are the inputs that are defined in the step function
        signature, and are not manually loaded during the step execution.

        Raises:
            ValueError: If there were multiple regular input artifacts for the
                same input name.

        Returns:
            The regular step inputs.
        """
        result = {}

        for input_name, input_artifacts in self.inputs.items():
            filtered = [
                input_artifact
                for input_artifact in input_artifacts
                if input_artifact.input_type != StepRunInputArtifactType.MANUAL
            ]
            if len(filtered) > 1:
                raise ValueError(
                    f"Expected 1 regular input artifact for {input_name}, got "
                    f"{len(filtered)}."
                )
            if filtered:
                result[input_name] = filtered[0]

        return result

    @property
    def regular_outputs(self) -> Dict[str, ArtifactVersionResponse]:
        """Returns the regular step outputs of the step run.

        Regular step outputs are the outputs that are defined in the step
        function signature, and are not manually saved during the step
        execution.

        Raises:
            ValueError: If there were multiple regular output artifacts for the
                same output name.

        Returns:
            The regular step outputs.
        """
        result = {}

        for output_name, output_artifacts in self.outputs.items():
            filtered = [
                output_artifact
                for output_artifact in output_artifacts
                if output_artifact.save_type == ArtifactSaveType.STEP_OUTPUT
            ]
            if len(filtered) > 1:
                raise ValueError(
                    f"Expected 1 regular output artifact for {output_name}, "
                    f"got {len(filtered)}."
                )
            if filtered:
                result[output_name] = filtered[0]

        return result

    # Body and metadata properties
    @property
    def status(self) -> ExecutionStatus:
        """The `status` property.

        Returns:
            the value of the property.
        """
        return self.get_body().status

    @property
    def inputs(self) -> Dict[str, List[StepRunInputResponse]]:
        """The `inputs` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().inputs

    @property
    def outputs(self) -> Dict[str, List[ArtifactVersionResponse]]:
        """The `outputs` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().outputs

    @property
    def model_version_id(self) -> Optional[UUID]:
        """The `model_version_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().model_version_id

    @property
    def substitutions(self) -> Dict[str, str]:
        """The `substitutions` property.

        Returns:
            the value of the property.
        """
        return self.get_body().substitutions

    @property
    def config(self) -> "StepConfiguration":
        """The `config` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().config

    @property
    def spec(self) -> "StepSpec":
        """The `spec` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().spec

    @property
    def cache_key(self) -> Optional[str]:
        """The `cache_key` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().cache_key

    @property
    def code_hash(self) -> Optional[str]:
        """The `code_hash` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().code_hash

    @property
    def docstring(self) -> Optional[str]:
        """The `docstring` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().docstring

    @property
    def source_code(self) -> Optional[str]:
        """The `source_code` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().source_code

    @property
    def start_time(self) -> Optional[datetime]:
        """The `start_time` property.

        Returns:
            the value of the property.
        """
        return self.get_body().start_time

    @property
    def end_time(self) -> Optional[datetime]:
        """The `end_time` property.

        Returns:
            the value of the property.
        """
        return self.get_body().end_time

    @property
    def logs(self) -> Optional["LogsResponse"]:
        """The `logs` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().logs

    @property
    def deployment_id(self) -> UUID:
        """The `deployment_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().deployment_id

    @property
    def pipeline_run_id(self) -> UUID:
        """The `pipeline_run_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().pipeline_run_id

    @property
    def original_step_run_id(self) -> Optional[UUID]:
        """The `original_step_run_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().original_step_run_id

    @property
    def parent_step_ids(self) -> List[UUID]:
        """The `parent_step_ids` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().parent_step_ids

    @property
    def run_metadata(self) -> Dict[str, MetadataType]:
        """The `run_metadata` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().run_metadata

    @property
    def model_version(self) -> Optional[ModelVersionResponse]:
        """The `model_version` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().model_version

cache_key property

The cache_key property.

Returns:

Type Description
Optional[str]

the value of the property.

code_hash property

The code_hash property.

Returns:

Type Description
Optional[str]

the value of the property.

config property

The config property.

Returns:

Type Description
StepConfiguration

the value of the property.

deployment_id property

The deployment_id property.

Returns:

Type Description
UUID

the value of the property.

docstring property

The docstring property.

Returns:

Type Description
Optional[str]

the value of the property.

end_time property

The end_time property.

Returns:

Type Description
Optional[datetime]

the value of the property.

input property

Returns the input artifact that was used to run this step.

Returns:

Type Description
StepRunInputResponse

The input artifact.

Raises:

Type Description
ValueError

If there were zero or multiple inputs to this step.

inputs property

The inputs property.

Returns:

Type Description
Dict[str, List[StepRunInputResponse]]

the value of the property.

logs property

The logs property.

Returns:

Type Description
Optional[LogsResponse]

the value of the property.

model_version property

The model_version property.

Returns:

Type Description
Optional[ModelVersionResponse]

the value of the property.

model_version_id property

The model_version_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

original_step_run_id property

The original_step_run_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

output property

Returns the output artifact that was written by this step.

Returns:

Type Description
ArtifactVersionResponse

The output artifact.

Raises:

Type Description
ValueError

If there were zero or multiple step outputs.

outputs property

The outputs property.

Returns:

Type Description
Dict[str, List[ArtifactVersionResponse]]

the value of the property.

parent_step_ids property

The parent_step_ids property.

Returns:

Type Description
List[UUID]

the value of the property.

pipeline_run_id property

The pipeline_run_id property.

Returns:

Type Description
UUID

the value of the property.

regular_inputs property

Returns the regular step inputs of the step run.

Regular step inputs are the inputs that are defined in the step function signature, and are not manually loaded during the step execution.

Raises:

Type Description
ValueError

If there were multiple regular input artifacts for the same input name.

Returns:

Type Description
Dict[str, StepRunInputResponse]

The regular step inputs.

regular_outputs property

Returns the regular step outputs of the step run.

Regular step outputs are the outputs that are defined in the step function signature, and are not manually saved during the step execution.

Raises:

Type Description
ValueError

If there were multiple regular output artifacts for the same output name.

Returns:

Type Description
Dict[str, ArtifactVersionResponse]

The regular step outputs.

run_metadata property

The run_metadata property.

Returns:

Type Description
Dict[str, MetadataType]

the value of the property.

source_code property

The source_code property.

Returns:

Type Description
Optional[str]

the value of the property.

spec property

The spec property.

Returns:

Type Description
StepSpec

the value of the property.

start_time property

The start_time property.

Returns:

Type Description
Optional[datetime]

the value of the property.

status property

The status property.

Returns:

Type Description
ExecutionStatus

the value of the property.

substitutions property

The substitutions property.

Returns:

Type Description
Dict[str, str]

the value of the property.

get_hydrated_version()

Get the hydrated version of this step run.

Returns:

Type Description
StepRunResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/step_run.py, lines 293-301
def get_hydrated_version(self) -> "StepRunResponse":
    """Get the hydrated version of this step run.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_run_step(self.id)
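
For orientation, here is a minimal sketch of how these step run properties are typically read from a finished pipeline run. The pipeline and step names are placeholders, and the `last_run` and `steps` accessors are assumed to behave as in recent ZenML releases:

from zenml.client import Client

# Placeholder names; assumes such a pipeline and step exist on the server.
run = Client().get_pipeline("training_pipeline").last_run
step = run.steps["trainer"]

print(step.status)           # ExecutionStatus of this step run
print(step.regular_outputs)  # only outputs declared in the step signature

# `output` is a convenience shortcut; it raises a ValueError when the step
# produced zero output artifacts or more than one.
if len(step.outputs) == 1:
    artifact = step.output
    print(artifact.id)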

StepRunResponseBody

Bases: ProjectScopedResponseBody

Response body for step runs.

Source code in src/zenml/models/v2/core/step_run.py, lines 175-196
class StepRunResponseBody(ProjectScopedResponseBody):
    """Response body for step runs."""

    status: ExecutionStatus = Field(title="The status of the step.")
    start_time: Optional[datetime] = Field(
        title="The start time of the step run.",
        default=None,
    )
    end_time: Optional[datetime] = Field(
        title="The end time of the step run.",
        default=None,
    )
    model_version_id: Optional[UUID] = Field(
        title="The ID of the model version that was "
        "configured by this step run explicitly.",
        default=None,
    )
    substitutions: Dict[str, str] = Field(
        title="The substitutions of the step run.",
        default={},
    )
    model_config = ConfigDict(protected_namespaces=())

StepRunResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for step runs.

Source code in src/zenml/models/v2/core/step_run.py, lines 199-256
class StepRunResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for step runs."""

    __zenml_skip_dehydration__: ClassVar[List[str]] = [
        "config",
        "spec",
        "metadata",
    ]

    # Configuration
    config: "StepConfiguration" = Field(title="The configuration of the step.")
    spec: "StepSpec" = Field(title="The spec of the step.")

    # Code related fields
    cache_key: Optional[str] = Field(
        title="The cache key of the step run.",
        default=None,
        max_length=STR_FIELD_MAX_LENGTH,
    )
    code_hash: Optional[str] = Field(
        title="The code hash of the step run.",
        default=None,
        max_length=STR_FIELD_MAX_LENGTH,
    )
    docstring: Optional[str] = Field(
        title="The docstring of the step function or class.",
        default=None,
        max_length=TEXT_FIELD_MAX_LENGTH,
    )
    source_code: Optional[str] = Field(
        title="The source code of the step function or class.",
        default=None,
        max_length=TEXT_FIELD_MAX_LENGTH,
    )

    # References
    logs: Optional["LogsResponse"] = Field(
        title="Logs associated with this step run.",
        default=None,
    )
    deployment_id: UUID = Field(
        title="The deployment associated with the step run."
    )
    pipeline_run_id: UUID = Field(
        title="The ID of the pipeline run that this step run belongs to.",
    )
    original_step_run_id: Optional[UUID] = Field(
        title="The ID of the original step run if this step was cached.",
        default=None,
    )
    parent_step_ids: List[UUID] = Field(
        title="The IDs of the parent steps of this step run.",
        default_factory=list,
    )
    run_metadata: Dict[str, MetadataType] = Field(
        title="Metadata associated with this step run.",
        default={},
    )

StepRunResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the step run entity.

Source code in src/zenml/models/v2/core/step_run.py, lines 259-278
class StepRunResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the step run entity."""

    model_version: Optional[ModelVersionResponse] = None
    inputs: Dict[str, List[StepRunInputResponse]] = Field(
        title="The input artifact versions of the step run.",
        default_factory=dict,
    )
    outputs: Dict[str, List[ArtifactVersionResponse]] = Field(
        title="The output artifact versions of the step run.",
        default_factory=dict,
    )

    # TODO: In Pydantic v2, the `model_` prefix is a protected namespace for all
    #  fields defined under base models. If not handled, this raises a warning.
    #  It is possible to suppress this warning message with the following
    #  configuration, however the ultimate solution is to rename these fields.
    #  Even though they do not cause any problems right now, if we are not
    #  careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

StepRunUpdate

Bases: BaseUpdate

Update model for step runs.

Source code in src/zenml/models/v2/core/step_run.py, lines 152-171
class StepRunUpdate(BaseUpdate):
    """Update model for step runs."""

    outputs: Dict[str, List[UUID]] = Field(
        title="The IDs of the output artifact versions of the step run.",
        default={},
    )
    loaded_artifact_versions: Dict[str, UUID] = Field(
        title="The IDs of artifact versions that were loaded by this step run.",
        default={},
    )
    status: Optional[ExecutionStatus] = Field(
        title="The status of the step.",
        default=None,
    )
    end_time: Optional[datetime] = Field(
        title="The end time of the step run.",
        default=None,
    )
    model_config = ConfigDict(protected_namespaces=())
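
As a quick illustration, a hedged sketch of building a `StepRunUpdate`, for example to mark a step as finished. The import path assumes the model is re-exported from `zenml.models`; in practice such updates are applied through a zen store call rather than constructed by hand:

from datetime import datetime, timezone

from zenml.enums import ExecutionStatus
from zenml.models import StepRunUpdate  # assumed re-export location

update = StepRunUpdate(
    status=ExecutionStatus.COMPLETED,
    end_time=datetime.now(timezone.utc),
)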

StrFilter

Bases: Filter

Filter for all string fields.

Source code in src/zenml/models/v2/base/filter.py, lines 171-388
class StrFilter(Filter):
    """Filter for all string fields."""

    ALLOWED_OPS: ClassVar[List[str]] = [
        GenericFilterOps.EQUALS,
        GenericFilterOps.NOT_EQUALS,
        GenericFilterOps.STARTSWITH,
        GenericFilterOps.CONTAINS,
        GenericFilterOps.ENDSWITH,
        GenericFilterOps.ONEOF,
        GenericFilterOps.GT,
        GenericFilterOps.GTE,
        GenericFilterOps.LT,
        GenericFilterOps.LTE,
    ]

    @model_validator(mode="after")
    def check_value_if_operation_oneof(self) -> "StrFilter":
        """Validator to check if value is a list if oneof operation is used.

        Raises:
            ValueError: If the value is not a list

        Returns:
            self
        """
        if self.operation == GenericFilterOps.ONEOF:
            if not isinstance(self.value, list):
                raise ValueError(ONEOF_ERROR)
        return self

    def _check_if_column_is_json_encoded(self, column: Any) -> bool:
        """Check if the column is json encoded.

        Args:
            column: The column of an SQLModel table on which to filter.

        Returns:
            True if the column is json encoded, False otherwise.
        """
        from zenml.zen_stores.schemas import RunMetadataSchema

        JSON_ENCODED_COLUMNS = [RunMetadataSchema.value]

        if column in JSON_ENCODED_COLUMNS:
            return True
        return False

    def generate_query_conditions_from_column(self, column: Any) -> Any:
        """Generate query conditions for a string column.

        Args:
            column: The string column of an SQLModel table on which to filter.

        Returns:
            A list of query conditions.
        """
        # Handle numeric comparisons (GT, LT, GTE, LTE)
        if self.operation in {
            GenericFilterOps.GT,
            GenericFilterOps.LT,
            GenericFilterOps.GTE,
            GenericFilterOps.LTE,
        }:
            return self._handle_numeric_comparison(column)

        # Handle operations that need special treatment for JSON-encoded columns
        is_json_encoded = self._check_if_column_is_json_encoded(column)

        # Handle list operations
        if self.operation == GenericFilterOps.ONEOF:
            assert isinstance(self.value, list)
            return self._handle_oneof(column, is_json_encoded)

        # Handle pattern matching operations
        if self.operation == GenericFilterOps.CONTAINS:
            return column.like(f"%{self.value}%")

        if self.operation == GenericFilterOps.STARTSWITH:
            return self._handle_startswith(column, is_json_encoded)

        if self.operation == GenericFilterOps.ENDSWITH:
            return self._handle_endswith(column, is_json_encoded)

        if self.operation == GenericFilterOps.NOT_EQUALS:
            return self._handle_not_equals(column, is_json_encoded)

        # Default case (EQUALS)
        return self._handle_equals(column, is_json_encoded)

    def _handle_numeric_comparison(self, column: Any) -> Any:
        """Handle numeric comparison operations.

        Args:
            column: The column to compare.

        Returns:
            The query condition.

        Raises:
            ValueError: If the comparison fails.
        """
        try:
            numeric_column = cast(column, Float)
            assert self.value is not None

            operations = {
                GenericFilterOps.GT: lambda col, val: and_(
                    col, col > float(val)
                ),
                GenericFilterOps.LT: lambda col, val: and_(
                    col, col < float(val)
                ),
                GenericFilterOps.GTE: lambda col, val: and_(
                    col, col >= float(val)
                ),
                GenericFilterOps.LTE: lambda col, val: and_(
                    col, col <= float(val)
                ),
            }

            return operations[self.operation](numeric_column, self.value)  # type: ignore[no-untyped-call]
        except Exception as e:
            raise ValueError(
                f"Failed to compare the column '{column}' to the "
                f"value '{self.value}' (must be numeric): {e}"
            )

    def _handle_oneof(self, column: Any, is_json_encoded: bool) -> Any:
        """Handle the ONEOF operation.

        Args:
            column: The column to check.
            is_json_encoded: Whether the column is JSON encoded.

        Returns:
            The query condition.
        """
        from sqlalchemy import or_

        conditions = []

        assert isinstance(self.value, list)

        for value in self.value:
            if is_json_encoded:
                # For JSON encoded columns, add conditions for both raw and JSON-quoted values
                conditions.append(column == value)
                conditions.append(column == f'"{value}"')
            else:
                conditions.append(column == value)

        return or_(*conditions)

    def _handle_startswith(self, column: Any, is_json_encoded: bool) -> Any:
        """Handle the STARTSWITH operation.

        Args:
            column: The column to check.
            is_json_encoded: Whether the column is JSON encoded.

        Returns:
            The query condition.
        """
        if is_json_encoded:
            return or_(
                column.startswith(self.value),
                column.startswith(f'"{self.value}'),
            )
        else:
            return column.startswith(self.value)

    def _handle_endswith(self, column: Any, is_json_encoded: bool) -> Any:
        """Handle the ENDSWITH operation.

        Args:
            column: The column to check.
            is_json_encoded: Whether the column is JSON encoded.

        Returns:
            The query condition.
        """
        if is_json_encoded:
            return or_(
                column.endswith(self.value), column.endswith(f'{self.value}"')
            )
        else:
            return column.endswith(self.value)

    def _handle_not_equals(self, column: Any, is_json_encoded: bool) -> Any:
        """Handle the NOT_EQUALS operation.

        Args:
            column: The column to check.
            is_json_encoded: Whether the column is JSON encoded.

        Returns:
            The query condition.
        """
        if is_json_encoded:
            return and_(column != self.value, column != f'"{self.value}"')
        else:
            return column != self.value

    def _handle_equals(self, column: Any, is_json_encoded: bool) -> Any:
        """Handle the EQUALS operation (default).

        Args:
            column: The column to check.
            is_json_encoded: Whether the column is JSON encoded.

        Returns:
            The query condition.
        """
        if is_json_encoded:
            return or_(column == self.value, column == f'"{self.value}"')
        else:
            return column == self.value

check_value_if_operation_oneof()

Validator to check if value is a list if oneof operation is used.

Raises:

Type Description
ValueError

If the value is not a list

Returns:

Type Description
StrFilter

self

Source code in src/zenml/models/v2/base/filter.py, lines 187-200
@model_validator(mode="after")
def check_value_if_operation_oneof(self) -> "StrFilter":
    """Validator to check if value is a list if oneof operation is used.

    Raises:
        ValueError: If the value is not a list

    Returns:
        self
    """
    if self.operation == GenericFilterOps.ONEOF:
        if not isinstance(self.value, list):
            raise ValueError(ONEOF_ERROR)
    return self

generate_query_conditions_from_column(column)

Generate query conditions for a string column.

Parameters:

Name Type Description Default
column Any

The string column of an SQLModel table on which to filter.

required

Returns:

Type Description
Any

A list of query conditions.

Source code in src/zenml/models/v2/base/filter.py, lines 219-259
def generate_query_conditions_from_column(self, column: Any) -> Any:
    """Generate query conditions for a string column.

    Args:
        column: The string column of an SQLModel table on which to filter.

    Returns:
        A list of query conditions.
    """
    # Handle numeric comparisons (GT, LT, GTE, LTE)
    if self.operation in {
        GenericFilterOps.GT,
        GenericFilterOps.LT,
        GenericFilterOps.GTE,
        GenericFilterOps.LTE,
    }:
        return self._handle_numeric_comparison(column)

    # Handle operations that need special treatment for JSON-encoded columns
    is_json_encoded = self._check_if_column_is_json_encoded(column)

    # Handle list operations
    if self.operation == GenericFilterOps.ONEOF:
        assert isinstance(self.value, list)
        return self._handle_oneof(column, is_json_encoded)

    # Handle pattern matching operations
    if self.operation == GenericFilterOps.CONTAINS:
        return column.like(f"%{self.value}%")

    if self.operation == GenericFilterOps.STARTSWITH:
        return self._handle_startswith(column, is_json_encoded)

    if self.operation == GenericFilterOps.ENDSWITH:
        return self._handle_endswith(column, is_json_encoded)

    if self.operation == GenericFilterOps.NOT_EQUALS:
        return self._handle_not_equals(column, is_json_encoded)

    # Default case (EQUALS)
    return self._handle_equals(column, is_json_encoded)
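
To make the operations above concrete, here is a minimal sketch of constructing a `StrFilter` directly. Filters like this are normally built internally from string values such as "startswith:train" passed to list endpoints, so the direct construction is illustrative only; the `operation`, `column`, and `value` fields are assumed to come from the base `Filter` class:

from zenml.enums import GenericFilterOps
from zenml.models.v2.base.filter import StrFilter

# Illustrative only: matches rows whose `name` column starts with "train".
name_filter = StrFilter(
    operation=GenericFilterOps.STARTSWITH,
    column="name",
    value="train",
)

# The ONEOF operation requires a list value, enforced by the validator above.
status_filter = StrFilter(
    operation=GenericFilterOps.ONEOF,
    column="status",
    value=["completed", "failed"],
)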

TagFilter

Bases: UserScopedFilter

Model to enable advanced filtering of all tags.

Source code in src/zenml/models/v2/core/tag.py, lines 195-249
class TagFilter(UserScopedFilter):
    """Model to enable advanced filtering of all tags."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *UserScopedFilter.FILTER_EXCLUDE_FIELDS,
        "resource_type",
    ]

    name: Optional[str] = Field(
        description="The unique title of the tag.", default=None
    )
    color: Optional[ColorVariants] = Field(
        description="The color variant assigned to the tag.", default=None
    )
    exclusive: Optional[bool] = Field(
        description="The flag signifying whether the tag is an exclusive tag.",
        default=None,
    )
    resource_type: Optional[TaggableResourceTypes] = Field(
        description="Filter tags associated with a specific resource type.",
        default=None,
    )

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import exists, select

        from zenml.zen_stores.schemas import (
            TagResourceSchema,
            TagSchema,
        )

        if self.resource_type:
            # Filter for tags that have at least one association with the specified resource type
            resource_type_filter = exists(
                select(TagResourceSchema).where(
                    TagResourceSchema.tag_id == TagSchema.id,
                    TagResourceSchema.resource_type
                    == self.resource_type.value,
                )
            )
            custom_filters.append(resource_type_filter)

        return custom_filters

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/core/tag.py, lines 218-249
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import exists, select

    from zenml.zen_stores.schemas import (
        TagResourceSchema,
        TagSchema,
    )

    if self.resource_type:
        # Filter for tags that have at least one association with the specified resource type
        resource_type_filter = exists(
            select(TagResourceSchema).where(
                TagResourceSchema.tag_id == TagSchema.id,
                TagResourceSchema.resource_type
                == self.resource_type.value,
            )
        )
        custom_filters.append(resource_type_filter)

    return custom_filters
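
A hedged sketch of how such a filter might be instantiated, assuming `TagFilter` is re-exported from `zenml.models` like the other models; in practice the filter is usually built from keyword arguments passed to a client or zen store list call:

from zenml.enums import TaggableResourceTypes
from zenml.models import TagFilter  # assumed re-export location

# Match exclusive tags that are attached to at least one pipeline run.
tag_filter = TagFilter(
    exclusive=True,
    resource_type=TaggableResourceTypes.PIPELINE_RUN,
)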

TagRequest

Bases: UserScopedRequest

Request model for tags.

Source code in src/zenml/models/v2/core/tag.py, lines 44-80
class TagRequest(UserScopedRequest):
    """Request model for tags."""

    name: str = Field(
        description="The unique title of the tag.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    exclusive: bool = Field(
        description="The flag signifying whether the tag is an exclusive tag.",
        default=False,
    )
    color: ColorVariants = Field(
        description="The color variant assigned to the tag.",
        default_factory=lambda: random.choice(list(ColorVariants)),
    )

    @field_validator("name")
    @classmethod
    def validate_name_not_uuid(cls, value: str) -> str:
        """Validates that the tag name is not a UUID.

        Args:
            value: The tag name to validate.

        Returns:
            The validated tag name.

        Raises:
            ValueError: If the tag name can be converted
                to a UUID.
        """
        if is_valid_uuid(value):
            raise ValueError(
                "Tag names cannot be UUIDs or strings that "
                "can be converted to UUIDs."
            )
        return value

validate_name_not_uuid(value) classmethod

Validates that the tag name is not a UUID.

Parameters:

Name Type Description Default
value str

The tag name to validate.

required

Returns:

Type Description
str

The validated tag name.

Raises:

Type Description
ValueError

If the tag name can be converted to a UUID.

Source code in src/zenml/models/v2/core/tag.py, lines 60-80
@field_validator("name")
@classmethod
def validate_name_not_uuid(cls, value: str) -> str:
    """Validates that the tag name is not a UUID.

    Args:
        value: The tag name to validate.

    Returns:
        The validated tag name.

    Raises:
        ValueError: If the tag name can be converted
            to a UUID.
    """
    if is_valid_uuid(value):
        raise ValueError(
            "Tag names cannot be UUIDs or strings that "
            "can be converted to UUIDs."
        )
    return value
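
A short sketch of the name validator in action. Depending on the ZenML version, `UserScopedRequest` may require additional scoping fields (for example a user ID), so treat the bare construction as illustrative:

from zenml.models import TagRequest  # assumed re-export location

# Valid: a descriptive name; the color defaults to a random ColorVariants member.
tag_request = TagRequest(name="production")

# Invalid: names that parse as UUIDs are rejected by `validate_name_not_uuid`.
try:
    TagRequest(name="3e4a4b2a-6d2c-4d6e-9d3a-1f2e3d4c5b6a")
except ValueError as err:  # pydantic surfaces this as a ValidationError, a ValueError subclass
    print(err)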

TagResource

Bases: BaseModel

Utility class to help identify resources to tag.

Source code in src/zenml/models/v2/misc/tag.py, lines 23-27
class TagResource(BaseModel):
    """Utility class to help identify resources to tag."""

    id: UUID = Field(title="The ID of the resource.")
    type: TaggableResourceTypes = Field(title="The type of the resource.")

TagResourceRequest

Bases: BaseRequest

Request model for links between tags and resources.

Source code in src/zenml/models/v2/core/tag_resource.py, lines 30-35
class TagResourceRequest(BaseRequest):
    """Request model for links between tags and resources."""

    tag_id: UUID
    resource_id: UUID
    resource_type: TaggableResourceTypes

TagResourceResponse

Bases: BaseIdentifiedResponse[TagResourceResponseBody, BaseResponseMetadata, TagResourceResponseResources]

Response model for the links between tags and resources.

Source code in src/zenml/models/v2/core/tag_resource.py, lines 57-91
class TagResourceResponse(
    BaseIdentifiedResponse[
        TagResourceResponseBody,
        BaseResponseMetadata,
        TagResourceResponseResources,
    ]
):
    """Response model for the links between tags and resources."""

    @property
    def tag_id(self) -> UUID:
        """The `tag_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().tag_id

    @property
    def resource_id(self) -> UUID:
        """The `resource_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().resource_id

    @property
    def resource_type(self) -> TaggableResourceTypes:
        """The `resource_type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().resource_type

resource_id property

The resource_id property.

Returns:

Type Description
UUID

the value of the property.

resource_type property

The resource_type property.

Returns:

Type Description
TaggableResourceTypes

the value of the property.

tag_id property

The tag_id property.

Returns:

Type Description
UUID

the value of the property.

TagResourceResponseBody

Bases: BaseDatedResponseBody

Response body for the links between tags and resources.

Source code in src/zenml/models/v2/core/tag_resource.py, lines 45-50
class TagResourceResponseBody(BaseDatedResponseBody):
    """Response body for the links between tags and resources."""

    tag_id: UUID
    resource_id: UUID
    resource_type: TaggableResourceTypes

TagResponse

Bases: UserScopedResponse[TagResponseBody, TagResponseMetadata, TagResponseResources]

Response model for tags.

Source code in src/zenml/models/v2/core/tag.py, lines 142-189
class TagResponse(
    UserScopedResponse[
        TagResponseBody, TagResponseMetadata, TagResponseResources
    ]
):
    """Response model for tags."""

    name: str = Field(
        description="The unique title of the tag.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "TagResponse":
        """Get the hydrated version of this tag.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_tag(self.id)

    @property
    def color(self) -> ColorVariants:
        """The `color` property.

        Returns:
            the value of the property.
        """
        return self.get_body().color

    @property
    def exclusive(self) -> bool:
        """The `exclusive` property.

        Returns:
            the value of the property.
        """
        return self.get_body().exclusive

    @property
    def tagged_count(self) -> int:
        """The `tagged_count` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().tagged_count

color property

The color property.

Returns:

Type Description
ColorVariants

the value of the property.

exclusive property

The exclusive property.

Returns:

Type Description
bool

the value of the property.

tagged_count property

The tagged_count property.

Returns:

Type Description
int

the value of the property.

get_hydrated_version()

Get the hydrated version of this tag.

Returns:

Type Description
TagResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/tag.py, lines 154-162
def get_hydrated_version(self) -> "TagResponse":
    """Get the hydrated version of this tag.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_tag(self.id)
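
A hedged usage sketch, assuming a tag named "production" exists and that `Client.get_tag` accepts a tag name; `tagged_count` lives on the metadata, so reading it may hydrate the response via `get_hydrated_version`:

from zenml.client import Client

tag = Client().get_tag("production")  # placeholder tag name

print(tag.name)          # plain field on the response
print(tag.color)         # read from the response body
print(tag.exclusive)     # read from the response body
print(tag.tagged_count)  # metadata property; may trigger hydration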

TagResponseBody

Bases: UserScopedResponseBody

Response body for tags.

Source code in src/zenml/models/v2/core/tag.py, lines 118-127
class TagResponseBody(UserScopedResponseBody):
    """Response body for tags."""

    color: ColorVariants = Field(
        description="The color variant assigned to the tag.",
        default_factory=lambda: random.choice(list(ColorVariants)),
    )
    exclusive: bool = Field(
        description="The flag signifying whether the tag is an exclusive tag."
    )

TagResponseMetadata

Bases: UserScopedResponseMetadata

Response metadata for tags.

Source code in src/zenml/models/v2/core/tag.py, lines 130-135
class TagResponseMetadata(UserScopedResponseMetadata):
    """Response metadata for tags."""

    tagged_count: int = Field(
        description="The count of resources tagged with this tag."
    )

TagResponseResources

Bases: UserScopedResponseResources

Class for all resource models associated with the tag entity.

Source code in src/zenml/models/v2/core/tag.py, lines 138-139
class TagResponseResources(UserScopedResponseResources):
    """Class for all resource models associated with the tag entity."""

TagUpdate

Bases: BaseUpdate

Update model for tags.

Source code in src/zenml/models/v2/core/tag.py, lines 86-112
class TagUpdate(BaseUpdate):
    """Update model for tags."""

    name: Optional[str] = None
    exclusive: Optional[bool] = None
    color: Optional[ColorVariants] = None

    @field_validator("name")
    @classmethod
    def validate_name_not_uuid(cls, value: Optional[str]) -> Optional[str]:
        """Validates that the tag name is not a UUID.

        Args:
            value: The tag name to validate.

        Returns:
            The validated tag name.

        Raises:
            ValueError: If the tag name can be converted to a UUID.
        """
        if value is not None and is_valid_uuid(value):
            raise ValueError(
                "Tag names cannot be UUIDs or strings that "
                "can be converted to UUIDs."
            )
        return value

validate_name_not_uuid(value) classmethod

Validates that the tag name is not a UUID.

Parameters:

Name Type Description Default
value Optional[str]

The tag name to validate.

required

Returns:

Type Description
Optional[str]

The validated tag name.

Raises:

Type Description
ValueError

If the tag name can be converted to a UUID.

Source code in src/zenml/models/v2/core/tag.py, lines 93-112
@field_validator("name")
@classmethod
def validate_name_not_uuid(cls, value: Optional[str]) -> Optional[str]:
    """Validates that the tag name is not a UUID.

    Args:
        value: The tag name to validate.

    Returns:
        The validated tag name.

    Raises:
        ValueError: If the tag name can be converted to a UUID.
    """
    if value is not None and is_valid_uuid(value):
        raise ValueError(
            "Tag names cannot be UUIDs or strings that "
            "can be converted to UUIDs."
        )
    return value

TaggableFilter

Bases: BaseFilter

Model to enable filtering and sorting by tags.

Source code in src/zenml/models/v2/base/scoped.py, lines 428-610
class TaggableFilter(BaseFilter):
    """Model to enable filtering and sorting by tags."""

    tag: Optional[str] = Field(
        description="Tag to apply to the filter query.", default=None
    )
    tags: Optional[List[str]] = Field(
        description="Tags to apply to the filter query.", default=None
    )

    CLI_EXCLUDE_FIELDS = [
        *BaseFilter.CLI_EXCLUDE_FIELDS,
    ]
    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *BaseFilter.FILTER_EXCLUDE_FIELDS,
        "tag",
        "tags",
    ]
    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *BaseFilter.CUSTOM_SORTING_OPTIONS,
        "tags",
    ]
    API_MULTI_INPUT_PARAMS: ClassVar[List[str]] = [
        *BaseFilter.API_MULTI_INPUT_PARAMS,
        "tags",
    ]

    @model_validator(mode="after")
    def add_tag_to_tags(self) -> "TaggableFilter":
        """Deprecated the tag attribute in favor of the tags attribute.

        Returns:
            self
        """
        if self.tag is not None:
            logger.warning(
                "The `tag` attribute is deprecated in favor of the `tags` attribute. "
                "Please update your code to use the `tags` attribute instead."
            )
            if self.tags is not None:
                self.tags.append(self.tag)
            else:
                self.tags = [self.tag]

            self.tag = None

        return self

    def apply_filter(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Applies the filter to a query.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        from zenml.zen_stores.schemas import TagResourceSchema, TagSchema

        query = super().apply_filter(query=query, table=table)

        if self.tags:
            query = query.join(
                TagResourceSchema,
                TagResourceSchema.resource_id == getattr(table, "id"),
            ).join(TagSchema, TagSchema.id == TagResourceSchema.tag_id)

        return query

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom tag filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        if self.tags:
            from sqlmodel import exists, select

            from zenml.zen_stores.schemas import TagResourceSchema, TagSchema

            for tag in self.tags:
                condition = self.generate_custom_query_conditions_for_column(
                    value=tag, table=TagSchema, column="name"
                )
                exists_subquery = exists(
                    select(TagResourceSchema)
                    .join(TagSchema, TagSchema.id == TagResourceSchema.tag_id)  # type: ignore[arg-type]
                    .where(
                        TagResourceSchema.resource_id == table.id, condition
                    )
                )
                custom_filters.append(exists_subquery)

        return custom_filters

    def apply_sorting(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Apply sorting to the query.

        Args:
            query: The query to which to apply the sorting.
            table: The query table.

        Returns:
            The query with sorting applied.
        """
        sort_by, operand = self.sorting_params

        if sort_by == "tags":
            from sqlmodel import asc, desc, func, select

            from zenml.enums import SorterOps, TaggableResourceTypes
            from zenml.zen_stores.schemas import (
                ArtifactSchema,
                ArtifactVersionSchema,
                ModelSchema,
                ModelVersionSchema,
                PipelineRunSchema,
                PipelineSchema,
                RunTemplateSchema,
                TagResourceSchema,
                TagSchema,
            )

            resource_type_mapping = {
                ArtifactSchema: TaggableResourceTypes.ARTIFACT,
                ArtifactVersionSchema: TaggableResourceTypes.ARTIFACT_VERSION,
                ModelSchema: TaggableResourceTypes.MODEL,
                ModelVersionSchema: TaggableResourceTypes.MODEL_VERSION,
                PipelineSchema: TaggableResourceTypes.PIPELINE,
                PipelineRunSchema: TaggableResourceTypes.PIPELINE_RUN,
                RunTemplateSchema: TaggableResourceTypes.RUN_TEMPLATE,
            }

            sorted_tags = (
                select(TagResourceSchema.resource_id, TagSchema.name)
                .join(TagSchema, TagResourceSchema.tag_id == TagSchema.id)  # type: ignore[arg-type]
                .filter(
                    TagResourceSchema.resource_type  # type: ignore[arg-type]
                    == resource_type_mapping[table]
                )
                .order_by(
                    asc(TagResourceSchema.resource_id), asc(TagSchema.name)
                )
            ).alias("sorted_tags")

            tags_subquery = (
                select(
                    sorted_tags.c.resource_id,
                    func.group_concat(sorted_tags.c.name, ", ").label(
                        "tags_list"
                    ),
                ).group_by(sorted_tags.c.resource_id)
            ).alias("tags_subquery")

            query = query.add_columns(tags_subquery.c.tags_list).outerjoin(
                tags_subquery, table.id == tags_subquery.c.resource_id
            )

            # Apply ordering based on the tags list
            if operand == SorterOps.ASCENDING:
                query = query.order_by(asc("tags_list"))
            else:
                query = query.order_by(desc("tags_list"))

            return query

        return super().apply_sorting(query=query, table=table)

add_tag_to_tags()

Deprecates the tag attribute in favor of the tags attribute.

Returns:

Type Description
TaggableFilter

self

Source code in src/zenml/models/v2/base/scoped.py, lines 455-474
@model_validator(mode="after")
def add_tag_to_tags(self) -> "TaggableFilter":
    """Deprecated the tag attribute in favor of the tags attribute.

    Returns:
        self
    """
    if self.tag is not None:
        logger.warning(
            "The `tag` attribute is deprecated in favor of the `tags` attribute. "
            "Please update your code to use the `tags` attribute instead."
        )
        if self.tags is not None:
            self.tags.append(self.tag)
        else:
            self.tags = [self.tag]

        self.tag = None

    return self

apply_filter(query, table)

Applies the filter to a query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the filter.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with filter applied.

Source code in src/zenml/models/v2/base/scoped.py, lines 476-500
def apply_filter(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Applies the filter to a query.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    from zenml.zen_stores.schemas import TagResourceSchema, TagSchema

    query = super().apply_filter(query=query, table=table)

    if self.tags:
        query = query.join(
            TagResourceSchema,
            TagResourceSchema.resource_id == getattr(table, "id"),
        ).join(TagSchema, TagSchema.id == TagResourceSchema.tag_id)

    return query

apply_sorting(query, table)

Apply sorting to the query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the sorting.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with sorting applied.

Source code in src/zenml/models/v2/base/scoped.py, lines 535-610
def apply_sorting(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Apply sorting to the query.

    Args:
        query: The query to which to apply the sorting.
        table: The query table.

    Returns:
        The query with sorting applied.
    """
    sort_by, operand = self.sorting_params

    if sort_by == "tags":
        from sqlmodel import asc, desc, func, select

        from zenml.enums import SorterOps, TaggableResourceTypes
        from zenml.zen_stores.schemas import (
            ArtifactSchema,
            ArtifactVersionSchema,
            ModelSchema,
            ModelVersionSchema,
            PipelineRunSchema,
            PipelineSchema,
            RunTemplateSchema,
            TagResourceSchema,
            TagSchema,
        )

        resource_type_mapping = {
            ArtifactSchema: TaggableResourceTypes.ARTIFACT,
            ArtifactVersionSchema: TaggableResourceTypes.ARTIFACT_VERSION,
            ModelSchema: TaggableResourceTypes.MODEL,
            ModelVersionSchema: TaggableResourceTypes.MODEL_VERSION,
            PipelineSchema: TaggableResourceTypes.PIPELINE,
            PipelineRunSchema: TaggableResourceTypes.PIPELINE_RUN,
            RunTemplateSchema: TaggableResourceTypes.RUN_TEMPLATE,
        }

        sorted_tags = (
            select(TagResourceSchema.resource_id, TagSchema.name)
            .join(TagSchema, TagResourceSchema.tag_id == TagSchema.id)  # type: ignore[arg-type]
            .filter(
                TagResourceSchema.resource_type  # type: ignore[arg-type]
                == resource_type_mapping[table]
            )
            .order_by(
                asc(TagResourceSchema.resource_id), asc(TagSchema.name)
            )
        ).alias("sorted_tags")

        tags_subquery = (
            select(
                sorted_tags.c.resource_id,
                func.group_concat(sorted_tags.c.name, ", ").label(
                    "tags_list"
                ),
            ).group_by(sorted_tags.c.resource_id)
        ).alias("tags_subquery")

        query = query.add_columns(tags_subquery.c.tags_list).outerjoin(
            tags_subquery, table.id == tags_subquery.c.resource_id
        )

        # Apply ordering based on the tags list
        if operand == SorterOps.ASCENDING:
            query = query.order_by(asc("tags_list"))
        else:
            query = query.order_by(desc("tags_list"))

        return query

    return super().apply_sorting(query=query, table=table)

get_custom_filters(table)

Get custom tag filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/base/scoped.py, lines 502-533
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom tag filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    if self.tags:
        from sqlmodel import exists, select

        from zenml.zen_stores.schemas import TagResourceSchema, TagSchema

        for tag in self.tags:
            condition = self.generate_custom_query_conditions_for_column(
                value=tag, table=TagSchema, column="name"
            )
            exists_subquery = exists(
                select(TagResourceSchema)
                .join(TagSchema, TagSchema.id == TagResourceSchema.tag_id)  # type: ignore[arg-type]
                .where(
                    TagResourceSchema.resource_id == table.id, condition
                )
            )
            custom_filters.append(exists_subquery)

    return custom_filters
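
Filters that inherit from `TaggableFilter` (for example the pipeline run filter) expose the `tags` field and the `tags` sorting option, so a client list call can narrow and order results by tag. A hedged sketch with placeholder tag names, assuming the run filter exposes `tags` and the client forwards the keyword arguments into the filter model:

from zenml.client import Client

# Placeholder tag name; assumes runs tagged "production" exist.
runs = Client().list_pipeline_runs(tags=["production"], sort_by="asc:tags")
for run in runs:
    print(run.name)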

TriggerExecutionFilter

Bases: ProjectScopedFilter

Model to enable advanced filtering of all trigger executions.

Source code in src/zenml/models/v2/core/trigger_execution.py, lines 112-119
class TriggerExecutionFilter(ProjectScopedFilter):
    """Model to enable advanced filtering of all trigger executions."""

    trigger_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="ID of the trigger of the execution.",
        union_mode="left_to_right",
    )

TriggerExecutionRequest

Bases: BaseRequest

Model for creating a new Trigger execution.

Source code in src/zenml/models/v2/core/trigger_execution.py, lines 38-42
class TriggerExecutionRequest(BaseRequest):
    """Model for creating a new Trigger execution."""

    trigger: UUID
    event_metadata: Dict[str, Any] = {}
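
A minimal sketch of constructing such a request; the trigger ID is a random placeholder and the import path is assumed. In practice these requests are created server-side when an event source activates a trigger:

from uuid import uuid4

from zenml.models import TriggerExecutionRequest  # assumed re-export location

request = TriggerExecutionRequest(
    trigger=uuid4(),  # placeholder trigger ID
    event_metadata={"source": "example-event"},
)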

TriggerExecutionResponse

Bases: BaseIdentifiedResponse[TriggerExecutionResponseBody, TriggerExecutionResponseMetadata, TriggerExecutionResponseResources]

Response model for trigger executions.

Source code in src/zenml/models/v2/core/trigger_execution.py, lines 69-106
class TriggerExecutionResponse(
    BaseIdentifiedResponse[
        TriggerExecutionResponseBody,
        TriggerExecutionResponseMetadata,
        TriggerExecutionResponseResources,
    ]
):
    """Response model for trigger executions."""

    def get_hydrated_version(self) -> "TriggerExecutionResponse":
        """Get the hydrated version of this trigger execution.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_trigger_execution(self.id)

    # Body and metadata properties

    @property
    def trigger(self) -> "TriggerResponse":
        """The `trigger` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().trigger

    @property
    def event_metadata(self) -> Dict[str, Any]:
        """The `event_metadata` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().event_metadata

event_metadata property

The event_metadata property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

trigger property

The trigger property.

Returns:

Type Description
TriggerResponse

the value of the property.

get_hydrated_version()

Get the hydrated version of this trigger execution.

Returns:

Type Description
TriggerExecutionResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/trigger_execution.py, lines 78-86
def get_hydrated_version(self) -> "TriggerExecutionResponse":
    """Get the hydrated version of this trigger execution.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_trigger_execution(self.id)

TriggerExecutionResponseBody

Bases: BaseDatedResponseBody

Response body for trigger executions.

Source code in src/zenml/models/v2/core/trigger_execution.py, lines 51-52
class TriggerExecutionResponseBody(BaseDatedResponseBody):
    """Response body for trigger executions."""

TriggerExecutionResponseMetadata

Bases: BaseResponseMetadata

Response metadata for trigger executions.

Source code in src/zenml/models/v2/core/trigger_execution.py, lines 55-58
class TriggerExecutionResponseMetadata(BaseResponseMetadata):
    """Response metadata for trigger executions."""

    event_metadata: Dict[str, Any] = {}

TriggerExecutionResponseResources

Bases: BaseResponseResources

Class for all resource models associated with the trigger entity.

Source code in src/zenml/models/v2/core/trigger_execution.py, lines 61-66
class TriggerExecutionResponseResources(BaseResponseResources):
    """Class for all resource models associated with the trigger entity."""

    trigger: "TriggerResponse" = Field(
        title="The event source that activates this trigger.",
    )

TriggerFilter

Bases: ProjectScopedFilter

Model to enable advanced filtering of all triggers.

Source code in src/zenml/models/v2/core/trigger.py, lines 327-422
class TriggerFilter(ProjectScopedFilter):
    """Model to enable advanced filtering of all triggers."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *ProjectScopedFilter.FILTER_EXCLUDE_FIELDS,
        "action_flavor",
        "action_subtype",
        "event_source_flavor",
        "event_source_subtype",
    ]

    name: Optional[str] = Field(
        default=None,
        description="Name of the trigger.",
    )
    event_source_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="The event source this trigger is attached to.",
        union_mode="left_to_right",
    )
    action_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="The action this trigger is attached to.",
        union_mode="left_to_right",
    )
    is_active: Optional[bool] = Field(
        default=None,
        description="Whether the trigger is active.",
    )
    action_flavor: Optional[str] = Field(
        default=None,
        title="The flavor of the action that is executed by this trigger.",
    )
    action_subtype: Optional[str] = Field(
        default=None,
        title="The subtype of the action that is executed by this trigger.",
    )
    event_source_flavor: Optional[str] = Field(
        default=None,
        title="The flavor of the event source that activates this trigger.",
    )
    event_source_subtype: Optional[str] = Field(
        default=None,
        title="The subtype of the event source that activates this trigger.",
    )

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        from sqlmodel import and_

        from zenml.zen_stores.schemas import (
            ActionSchema,
            EventSourceSchema,
            TriggerSchema,
        )

        custom_filters = super().get_custom_filters(table)

        if self.event_source_flavor:
            event_source_flavor_filter = and_(
                EventSourceSchema.id == TriggerSchema.event_source_id,
                EventSourceSchema.flavor == self.event_source_flavor,
            )
            custom_filters.append(event_source_flavor_filter)

        if self.event_source_subtype:
            event_source_subtype_filter = and_(
                EventSourceSchema.id == TriggerSchema.event_source_id,
                EventSourceSchema.plugin_subtype == self.event_source_subtype,
            )
            custom_filters.append(event_source_subtype_filter)

        if self.action_flavor:
            action_flavor_filter = and_(
                ActionSchema.id == TriggerSchema.action_id,
                ActionSchema.flavor == self.action_flavor,
            )
            custom_filters.append(action_flavor_filter)

        if self.action_subtype:
            action_subtype_filter = and_(
                ActionSchema.id == TriggerSchema.action_id,
                ActionSchema.plugin_subtype == self.action_subtype,
            )
            custom_filters.append(action_subtype_filter)

        return custom_filters

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/core/trigger.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    from sqlmodel import and_

    from zenml.zen_stores.schemas import (
        ActionSchema,
        EventSourceSchema,
        TriggerSchema,
    )

    custom_filters = super().get_custom_filters(table)

    if self.event_source_flavor:
        event_source_flavor_filter = and_(
            EventSourceSchema.id == TriggerSchema.event_source_id,
            EventSourceSchema.flavor == self.event_source_flavor,
        )
        custom_filters.append(event_source_flavor_filter)

    if self.event_source_subtype:
        event_source_subtype_filter = and_(
            EventSourceSchema.id == TriggerSchema.event_source_id,
            EventSourceSchema.plugin_subtype == self.event_source_subtype,
        )
        custom_filters.append(event_source_subtype_filter)

    if self.action_flavor:
        action_flavor_filter = and_(
            ActionSchema.id == TriggerSchema.action_id,
            ActionSchema.flavor == self.action_flavor,
        )
        custom_filters.append(action_flavor_filter)

    if self.action_subtype:
        action_subtype_filter = and_(
            ActionSchema.id == TriggerSchema.action_id,
            ActionSchema.plugin_subtype == self.action_subtype,
        )
        custom_filters.append(action_subtype_filter)

    return custom_filters
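
A hedged sketch of how these fields are typically combined; the flavor value and the `zenml.models` import path are assumptions, not shown in this excerpt:

from zenml.models import TriggerFilter

trigger_filter = TriggerFilter(
    is_active=True,
    event_source_flavor="github",  # hypothetical flavor value
)
# The flavor/subtype fields are excluded from generic filtering; at query
# time get_custom_filters() turns them into the join conditions above.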

TriggerRequest

Bases: ProjectScopedRequest

Model for creating a new trigger.

Source code in src/zenml/models/v2/core/trigger.py
class TriggerRequest(ProjectScopedRequest):
    """Model for creating a new trigger."""

    name: str = Field(
        title="The name of the trigger.", max_length=STR_FIELD_MAX_LENGTH
    )
    description: str = Field(
        default="",
        title="The description of the trigger",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    action_id: UUID = Field(
        title="The action that is executed by this trigger.",
    )
    schedule: Optional[Schedule] = Field(
        default=None,
        title="The schedule for the trigger. Either a schedule or an event "
        "source is required.",
    )
    event_source_id: Optional[UUID] = Field(
        default=None,
        title="The event source that activates this trigger. Either a schedule "
        "or an event source is required.",
    )
    event_filter: Optional[Dict[str, Any]] = Field(
        default=None,
        title="Filter applied to events that activate this trigger. Only "
        "set if the trigger is activated by an event source.",
    )

    @model_validator(mode="after")
    def _validate_schedule_or_event_source(self) -> "TriggerRequest":
        """Validate that either a schedule or an event source is provided.

        Returns:
            The validated request.

        Raises:
            ValueError: If neither a schedule nor an event source is provided,
                or if both are provided.
        """
        if not self.schedule and not self.event_source_id:
            raise ValueError(
                "Either a schedule or an event source is required."
            )

        if self.schedule and self.event_source_id:
            raise ValueError("Only a schedule or an event source is allowed.")

        return self
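
A short sketch of the rule enforced by this validator, assuming placeholder UUIDs and that the project field inherited from ProjectScopedRequest is named `project` (not shown in this excerpt):

from uuid import uuid4

from zenml.models import TriggerRequest

# Valid: exactly one activation mechanism (an event source) is provided.
request = TriggerRequest(
    name="retrain-on-push",
    project=uuid4(),                  # assumed ProjectScopedRequest field
    action_id=uuid4(),
    event_source_id=uuid4(),
    event_filter={"branch": "main"},  # hypothetical filter payload
)

# Omitting both the schedule and the event source raises
# ValueError("Either a schedule or an event source is required."),
# while providing both raises
# ValueError("Only a schedule or an event source is allowed.").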

TriggerResponse

Bases: ProjectScopedResponse[TriggerResponseBody, TriggerResponseMetadata, TriggerResponseResources]

Response model for triggers.

Source code in src/zenml/models/v2/core/trigger.py
class TriggerResponse(
    ProjectScopedResponse[
        TriggerResponseBody, TriggerResponseMetadata, TriggerResponseResources
    ]
):
    """Response model for models."""

    name: str = Field(
        title="The name of the trigger",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "TriggerResponse":
        """Get the hydrated version of this trigger.

        Returns:
            An instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_trigger(self.id)

    @property
    def action_flavor(self) -> str:
        """The `action_flavor` property.

        Returns:
            the value of the property.
        """
        return self.get_body().action_flavor

    @property
    def action_subtype(self) -> str:
        """The `action_subtype` property.

        Returns:
            the value of the property.
        """
        return self.get_body().action_subtype

    @property
    def event_source_flavor(self) -> Optional[str]:
        """The `event_source_flavor` property.

        Returns:
            the value of the property.
        """
        return self.get_body().event_source_flavor

    @property
    def event_source_subtype(self) -> Optional[str]:
        """The `event_source_subtype` property.

        Returns:
            the value of the property.
        """
        return self.get_body().event_source_subtype

    @property
    def is_active(self) -> bool:
        """The `is_active` property.

        Returns:
            the value of the property.
        """
        return self.get_body().is_active

    @property
    def event_filter(self) -> Optional[Dict[str, Any]]:
        """The `event_filter` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().event_filter

    @property
    def description(self) -> str:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

    @property
    def action(self) -> "ActionResponse":
        """The `action` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().action

    @property
    def event_source(self) -> Optional["EventSourceResponse"]:
        """The `event_source` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().event_source

    @property
    def executions(self) -> Page[TriggerExecutionResponse]:
        """The `event_source` property.

        Returns:
            the value of the property.
        """
        return self.get_resources().executions
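
A hedged usage sketch: the properties above delegate to the body, metadata and resources sub-models, which the base response class loads lazily via get_hydrated_version() when they were not fetched up front. A reachable server is assumed and the UUID is a placeholder:

from uuid import UUID

from zenml.client import Client

trigger_id = UUID("00000000-0000-0000-0000-000000000000")  # placeholder ID
trigger = Client().zen_store.get_trigger(trigger_id)

print(trigger.is_active)    # read from the response body
print(trigger.description)  # metadata; may hydrate on first access
print(trigger.action.name)  # resources; may hydrate on first access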

action property

The action property.

Returns:

Type Description
ActionResponse

the value of the property.

action_flavor property

The action_flavor property.

Returns:

Type Description
str

the value of the property.

action_subtype property

The action_subtype property.

Returns:

Type Description
str

the value of the property.

description property

The description property.

Returns:

Type Description
str

the value of the property.

event_filter property

The event_filter property.

Returns:

Type Description
Optional[Dict[str, Any]]

the value of the property.

event_source property

The event_source property.

Returns:

Type Description
Optional[EventSourceResponse]

the value of the property.

event_source_flavor property

The event_source_flavor property.

Returns:

Type Description
Optional[str]

the value of the property.

event_source_subtype property

The event_source_subtype property.

Returns:

Type Description
Optional[str]

the value of the property.

executions property

The executions property.

Returns:

Type Description
Page[TriggerExecutionResponse]

the value of the property.

is_active property

The is_active property.

Returns:

Type Description
bool

the value of the property.

get_hydrated_version()

Get the hydrated version of this trigger.

Returns:

Type Description
TriggerResponse

An instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/trigger.py
def get_hydrated_version(self) -> "TriggerResponse":
    """Get the hydrated version of this trigger.

    Returns:
        An instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_trigger(self.id)

TriggerResponseBody

Bases: ProjectScopedResponseBody

Response body for triggers.

Source code in src/zenml/models/v2/core/trigger.py
class TriggerResponseBody(ProjectScopedResponseBody):
    """Response body for triggers."""

    action_flavor: str = Field(
        title="The flavor of the action that is executed by this trigger.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    action_subtype: str = Field(
        title="The subtype of the action that is executed by this trigger.",
    )
    event_source_flavor: Optional[str] = Field(
        default=None,
        title="The flavor of the event source that activates this trigger. Not "
        "set if the trigger is activated by a schedule.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    event_source_subtype: Optional[str] = Field(
        default=None,
        title="The subtype of the event source that activates this trigger. "
        "Not set if the trigger is activated by a schedule.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    is_active: bool = Field(
        title="Whether the trigger is active.",
    )

TriggerResponseMetadata

Bases: ProjectScopedResponseMetadata

Response metadata for triggers.

Source code in src/zenml/models/v2/core/trigger.py
class TriggerResponseMetadata(ProjectScopedResponseMetadata):
    """Response metadata for triggers."""

    description: str = Field(
        default="",
        title="The description of the trigger.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    event_filter: Optional[Dict[str, Any]] = Field(
        default=None,
        title="The event that activates this trigger. Not set if the trigger "
        "is activated by a schedule.",
    )
    schedule: Optional[Schedule] = Field(
        default=None,
        title="The schedule that activates this trigger. Not set if the "
        "trigger is activated by an event source.",
    )

TriggerResponseResources

Bases: ProjectScopedResponseResources

Class for all resource models associated with the trigger entity.

Source code in src/zenml/models/v2/core/trigger.py
class TriggerResponseResources(ProjectScopedResponseResources):
    """Class for all resource models associated with the trigger entity."""

    action: "ActionResponse" = Field(
        title="The action that is executed by this trigger.",
    )
    event_source: Optional["EventSourceResponse"] = Field(
        default=None,
        title="The event source that activates this trigger. Not set if the "
        "trigger is activated by a schedule.",
    )
    executions: Page[TriggerExecutionResponse] = Field(
        title="The executions of this trigger.",
    )

TriggerUpdate

Bases: BaseUpdate

Update model for triggers.

Source code in src/zenml/models/v2/core/trigger.py
class TriggerUpdate(BaseUpdate):
    """Update model for triggers."""

    name: Optional[str] = Field(
        default=None,
        title="The new name for the trigger.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    description: Optional[str] = Field(
        default=None,
        title="The new description for the trigger.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    event_filter: Optional[Dict[str, Any]] = Field(
        default=None,
        title="New filter applied to events that activate this trigger. Only "
        "valid if the trigger is already configured to be activated by an "
        "event source.",
    )
    schedule: Optional[Schedule] = Field(
        default=None,
        title="The updated schedule for the trigger. Only valid if the trigger "
        "is already configured to be activated by a schedule.",
    )
    is_active: Optional[bool] = Field(
        default=None,
        title="The new status of the trigger.",
    )
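
A small sketch of a partial update payload; that unset fields are left unchanged on the server side is assumed from the BaseUpdate semantics, which are not shown here:

from zenml.models import TriggerUpdate

update = TriggerUpdate(name="retrain-on-push-paused", is_active=False)
# Only the fields that were explicitly set are applied; the description,
# schedule and event filter of the trigger remain as they were.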

UUIDFilter

Bases: StrFilter

Filter for all uuid fields which are mostly treated like strings.

Source code in src/zenml/models/v2/base/filter.py
class UUIDFilter(StrFilter):
    """Filter for all uuid fields which are mostly treated like strings."""

    @field_validator("value", mode="before")
    @classmethod
    def _remove_hyphens_from_value(cls, value: Any) -> Any:
        """Remove hyphens from the value to enable string comparisons.

        Args:
            value: The filter value.

        Returns:
            The filter value with removed hyphens.
        """
        if isinstance(value, str):
            return value.replace("-", "")

        if isinstance(value, list):
            return [str(v).replace("-", "") for v in value]

        return value

    def generate_query_conditions_from_column(self, column: Any) -> Any:
        """Generate query conditions for a UUID column.

        Args:
            column: The UUID column of an SQLModel table on which to filter.

        Returns:
            A list of query conditions.
        """
        import sqlalchemy
        from sqlalchemy_utils.functions import cast_if

        from zenml.utils import uuid_utils

        # For equality checks, compare the UUID directly
        if self.operation == GenericFilterOps.EQUALS:
            if not uuid_utils.is_valid_uuid(self.value):
                return False

            return column == self.value

        if self.operation == GenericFilterOps.NOT_EQUALS:
            if not uuid_utils.is_valid_uuid(self.value):
                return True

            return column != self.value

        # For all other operations, cast and handle the column as string
        return super().generate_query_conditions_from_column(
            column=cast_if(column, sqlalchemy.String)
        )

generate_query_conditions_from_column(column)

Generate query conditions for a UUID column.

Parameters:

Name Type Description Default
column Any

The UUID column of an SQLModel table on which to filter.

required

Returns:

Type Description
Any

A list of query conditions.

Source code in src/zenml/models/v2/base/filter.py
def generate_query_conditions_from_column(self, column: Any) -> Any:
    """Generate query conditions for a UUID column.

    Args:
        column: The UUID column of an SQLModel table on which to filter.

    Returns:
        A list of query conditions.
    """
    import sqlalchemy
    from sqlalchemy_utils.functions import cast_if

    from zenml.utils import uuid_utils

    # For equality checks, compare the UUID directly
    if self.operation == GenericFilterOps.EQUALS:
        if not uuid_utils.is_valid_uuid(self.value):
            return False

        return column == self.value

    if self.operation == GenericFilterOps.NOT_EQUALS:
        if not uuid_utils.is_valid_uuid(self.value):
            return True

        return column != self.value

    # For all other operations, cast and handle the column as string
    return super().generate_query_conditions_from_column(
        column=cast_if(column, sqlalchemy.String)
    )
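
A hedged sketch of the hyphen normalisation, assuming the operation, column and value fields defined on the generic filter base (they are not shown in this excerpt):

from zenml.enums import GenericFilterOps
from zenml.models.v2.base.filter import UUIDFilter

uuid_filter = UUIDFilter(
    column="id",
    operation=GenericFilterOps.CONTAINS,
    value="4ca7-8a0b",
)
# Hyphens are stripped before comparison, so the column is cast to a
# string and checked for the substring "4ca78a0b".
print(uuid_filter.value)  # "4ca78a0b"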

UserAuthModel

Bases: BaseZenModel

Authentication Model for the User.

This model is only used server-side. The server endpoints can use this model to authenticate the user credentials (Token, Password).

Source code in src/zenml/models/v2/misc/user_auth.py
class UserAuthModel(BaseZenModel):
    """Authentication Model for the User.

    This model is only used server-side. The server endpoints can use this model
    to authenticate the user credentials (Token, Password).
    """

    id: UUID = Field(title="The unique resource id.")

    created: datetime = Field(title="Time when this resource was created.")
    updated: datetime = Field(
        title="Time when this resource was last updated."
    )

    active: bool = Field(default=False, title="Active account.")
    is_service_account: bool = Field(
        title="Indicates whether this is a service account or a regular user "
        "account."
    )

    activation_token: Optional[PlainSerializedSecretStr] = Field(
        default=None, exclude=True
    )
    password: Optional[PlainSerializedSecretStr] = Field(
        default=None, exclude=True
    )
    name: str = Field(
        title="The unique username for the account.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    full_name: str = Field(
        default="",
        title="The full name for the account owner. Only relevant for user "
        "accounts.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    email_opted_in: Optional[bool] = Field(
        default=None,
        title="Whether the user agreed to share their email. Only relevant for "
        "user accounts",
        description="`null` if not answered, `true` if agreed, "
        "`false` if skipped.",
    )

    @classmethod
    def _get_crypt_context(cls) -> "CryptContext":
        """Returns the password encryption context.

        Returns:
            The password encryption context.
        """
        from passlib.context import CryptContext

        return CryptContext(schemes=["bcrypt"], deprecated="auto")

    @classmethod
    def _is_hashed_secret(cls, secret: SecretStr) -> bool:
        """Checks if a secret value is already hashed.

        Args:
            secret: The secret value to check.

        Returns:
            True if the secret value is hashed, otherwise False.
        """
        return (
            re.match(r"^\$2[ayb]\$.{56}$", secret.get_secret_value())
            is not None
        )

    @classmethod
    def _get_hashed_secret(cls, secret: Optional[SecretStr]) -> Optional[str]:
        """Hashes the input secret and returns the hash value.

        Only applied if supplied and if not already hashed.

        Args:
            secret: The secret value to hash.

        Returns:
            The secret hash value, or None if no secret was supplied.
        """
        if secret is None:
            return None
        if cls._is_hashed_secret(secret):
            return secret.get_secret_value()
        pwd_context = cls._get_crypt_context()
        return pwd_context.hash(secret.get_secret_value())

    def get_password(self) -> Optional[str]:
        """Get the password.

        Returns:
            The password as a plain string, if it exists.
        """
        if self.password is None:
            return None
        return self.password.get_secret_value()

    def get_hashed_password(self) -> Optional[str]:
        """Returns the hashed password, if configured.

        Returns:
            The hashed password.
        """
        return self._get_hashed_secret(self.password)

    def get_hashed_activation_token(self) -> Optional[str]:
        """Returns the hashed activation token, if configured.

        Returns:
            The hashed activation token.
        """
        return self._get_hashed_secret(self.activation_token)

    @classmethod
    def verify_password(
        cls, plain_password: str, user: Optional["UserAuthModel"] = None
    ) -> bool:
        """Verifies a given plain password against the stored password.

        Args:
            plain_password: Input password to be verified.
            user: User for which the password is to be verified.

        Returns:
            True if the passwords match.
        """
        # even when the user or password is not set, we still want to execute
        # the password hash verification to protect against response discrepancy
        # attacks (https://cwe.mitre.org/data/definitions/204.html)
        password_hash: Optional[str] = None
        if (
            user is not None
            # Disable password verification for service accounts as an extra
            # security measure. Service accounts should only be used with API
            # keys.
            and not user.is_service_account
            and user.password is not None
        ):  # and user.active:
            password_hash = user.get_hashed_password()
        pwd_context = cls._get_crypt_context()
        return pwd_context.verify(plain_password, password_hash)

    @classmethod
    def verify_activation_token(
        cls, activation_token: str, user: Optional["UserAuthModel"] = None
    ) -> bool:
        """Verifies a given activation token against the stored token.

        Args:
            activation_token: Input activation token to be verified.
            user: User for which the activation token is to be verified.

        Returns:
            True if the token is valid.
        """
        # even when the user or token is not set, we still want to execute the
        # token hash verification to protect against response discrepancy
        # attacks (https://cwe.mitre.org/data/definitions/204.html)
        token_hash: str = ""
        if (
            user is not None
            # Disable activation tokens for service accounts as an extra
            # security measure. Service accounts should only be used with API
            # keys.
            and not user.is_service_account
            and user.activation_token is not None
            and not user.active
        ):
            token_hash = user.get_hashed_activation_token() or ""
        pwd_context = cls._get_crypt_context()
        return pwd_context.verify(activation_token, token_hash)
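
An illustrative sketch of the passlib primitives used above (plain passlib, not ZenML API); the bcrypt hashes it produces are the 60-character "$2b$..." strings matched by _is_hashed_secret:

from passlib.context import CryptContext

pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")

hashed = pwd_context.hash("s3cret")          # e.g. "$2b$12$..." (60 chars)
print(pwd_context.verify("s3cret", hashed))  # True
print(pwd_context.verify("wrong", hashed))   # False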

get_hashed_activation_token()

Returns the hashed activation token, if configured.

Returns:

Type Description
Optional[str]

The hashed activation token.

Source code in src/zenml/models/v2/misc/user_auth.py
def get_hashed_activation_token(self) -> Optional[str]:
    """Returns the hashed activation token, if configured.

    Returns:
        The hashed activation token.
    """
    return self._get_hashed_secret(self.activation_token)

get_hashed_password()

Returns the hashed password, if configured.

Returns:

Type Description
Optional[str]

The hashed password.

Source code in src/zenml/models/v2/misc/user_auth.py
def get_hashed_password(self) -> Optional[str]:
    """Returns the hashed password, if configured.

    Returns:
        The hashed password.
    """
    return self._get_hashed_secret(self.password)

get_password()

Get the password.

Returns:

Type Description
Optional[str]

The password as a plain string, if it exists.

Source code in src/zenml/models/v2/misc/user_auth.py
def get_password(self) -> Optional[str]:
    """Get the password.

    Returns:
        The password as a plain string, if it exists.
    """
    if self.password is None:
        return None
    return self.password.get_secret_value()

verify_activation_token(activation_token, user=None) classmethod

Verifies a given activation token against the stored token.

Parameters:

Name Type Description Default
activation_token str

Input activation token to be verified.

required
user Optional[UserAuthModel]

User for which the activation token is to be verified.

None

Returns:

Type Description
bool

True if the token is valid.

Source code in src/zenml/models/v2/misc/user_auth.py
@classmethod
def verify_activation_token(
    cls, activation_token: str, user: Optional["UserAuthModel"] = None
) -> bool:
    """Verifies a given activation token against the stored token.

    Args:
        activation_token: Input activation token to be verified.
        user: User for which the activation token is to be verified.

    Returns:
        True if the token is valid.
    """
    # even when the user or token is not set, we still want to execute the
    # token hash verification to protect against response discrepancy
    # attacks (https://cwe.mitre.org/data/definitions/204.html)
    token_hash: str = ""
    if (
        user is not None
        # Disable activation tokens for service accounts as an extra
        # security measure. Service accounts should only be used with API
        # keys.
        and not user.is_service_account
        and user.activation_token is not None
        and not user.active
    ):
        token_hash = user.get_hashed_activation_token() or ""
    pwd_context = cls._get_crypt_context()
    return pwd_context.verify(activation_token, token_hash)

verify_password(plain_password, user=None) classmethod

Verifies a given plain password against the stored password.

Parameters:

Name Type Description Default
plain_password str

Input password to be verified.

required
user Optional[UserAuthModel]

User for which the password is to be verified.

None

Returns:

Type Description
bool

True if the passwords match.

Source code in src/zenml/models/v2/misc/user_auth.py
@classmethod
def verify_password(
    cls, plain_password: str, user: Optional["UserAuthModel"] = None
) -> bool:
    """Verifies a given plain password against the stored password.

    Args:
        plain_password: Input password to be verified.
        user: User for which the password is to be verified.

    Returns:
        True if the passwords match.
    """
    # even when the user or password is not set, we still want to execute
    # the password hash verification to protect against response discrepancy
    # attacks (https://cwe.mitre.org/data/definitions/204.html)
    password_hash: Optional[str] = None
    if (
        user is not None
        # Disable password verification for service accounts as an extra
        # security measure. Service accounts should only be used with API
        # keys.
        and not user.is_service_account
        and user.password is not None
    ):  # and user.active:
        password_hash = user.get_hashed_password()
    pwd_context = cls._get_crypt_context()
    return pwd_context.verify(plain_password, password_hash)

UserFilter

Bases: BaseFilter

Model to enable advanced filtering of all Users.

Source code in src/zenml/models/v2/core/user.py
class UserFilter(BaseFilter):
    """Model to enable advanced filtering of all Users."""

    name: Optional[str] = Field(
        default=None,
        description="Name of the user",
    )
    full_name: Optional[str] = Field(
        default=None,
        description="Full Name of the user",
    )
    email: Optional[str] = Field(
        default=None,
        description="Email of the user",
    )
    active: Optional[Union[bool, str]] = Field(
        default=None,
        description="Whether the user is active",
        union_mode="left_to_right",
    )
    email_opted_in: Optional[Union[bool, str]] = Field(
        default=None,
        description="Whether the user has opted in to emails",
        union_mode="left_to_right",
    )
    external_user_id: Optional[Union[UUID, str]] = Field(
        default=None,
        title="The external user ID associated with the account.",
        union_mode="left_to_right",
    )

    def apply_filter(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Override to filter out service accounts from the query.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        query = super().apply_filter(query=query, table=table)
        query = query.where(
            getattr(table, "is_service_account") != True  # noqa: E712
        )

        return query

apply_filter(query, table)

Override to filter out service accounts from the query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the filter.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with filter applied.

Source code in src/zenml/models/v2/core/user.py
def apply_filter(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Override to filter out service accounts from the query.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    query = super().apply_filter(query=query, table=table)
    query = query.where(
        getattr(table, "is_service_account") != True  # noqa: E712
    )

    return query
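
A hedged sketch of a typical user filter; the "startswith:" operator prefix follows the generic string-filter grammar, which is assumed rather than shown in this excerpt:

from zenml.models import UserFilter

user_filter = UserFilter(name="startswith:ml", active=True)
# apply_filter() adds `is_service_account != True` on top of these
# conditions, so service accounts never appear in user listings.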

UserRequest

Bases: UserBase, BaseRequest

Request model for users.

Source code in src/zenml/models/v2/core/user.py
class UserRequest(UserBase, BaseRequest):
    """Request model for users."""

    # Analytics fields for user request models
    ANALYTICS_FIELDS: ClassVar[List[str]] = [
        "name",
        "full_name",
        "active",
        "email_opted_in",
    ]

    name: str = Field(
        title="The unique username for the account.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    full_name: str = Field(
        default="",
        title="The full name for the account owner. Only relevant for user "
        "accounts.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    is_admin: bool = Field(
        title="Whether the account is an administrator.",
    )
    active: bool = Field(default=False, title="Whether the account is active.")

    model_config = ConfigDict(
        # Validate attributes when assigning them
        validate_assignment=True,
        # Ignore extra attributes to prevent unexpected behavior
        extra="ignore",
    )

UserResponse

Bases: BaseIdentifiedResponse[UserResponseBody, UserResponseMetadata, UserResponseResources]

Response model for user and service accounts.

This returns the activation_token that is required for the user-invitation-flow of the frontend. The email is returned optionally as well for use by the analytics on the client-side.

Source code in src/zenml/models/v2/core/user.py
class UserResponse(
    BaseIdentifiedResponse[
        UserResponseBody, UserResponseMetadata, UserResponseResources
    ]
):
    """Response model for user and service accounts.

    This returns the activation_token that is required for the
    user-invitation-flow of the frontend. The email is returned optionally as
    well for use by the analytics on the client-side.
    """

    ANALYTICS_FIELDS: ClassVar[List[str]] = [
        "name",
        "full_name",
        "active",
        "email_opted_in",
        "is_service_account",
    ]

    name: str = Field(
        title="The unique username for the account.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "UserResponse":
        """Get the hydrated version of this user.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_user(self.id)

    # Body and metadata properties
    @property
    def active(self) -> bool:
        """The `active` property.

        Returns:
            the value of the property.
        """
        return self.get_body().active

    @property
    def activation_token(self) -> Optional[str]:
        """The `activation_token` property.

        Returns:
            the value of the property.
        """
        return self.get_body().activation_token

    @property
    def full_name(self) -> str:
        """The `full_name` property.

        Returns:
            the value of the property.
        """
        return self.get_body().full_name

    @property
    def email_opted_in(self) -> Optional[bool]:
        """The `email_opted_in` property.

        Returns:
            the value of the property.
        """
        return self.get_body().email_opted_in

    @property
    def is_service_account(self) -> bool:
        """The `is_service_account` property.

        Returns:
            the value of the property.
        """
        return self.get_body().is_service_account

    @property
    def is_admin(self) -> bool:
        """The `is_admin` property.

        Returns:
            Whether the user is an admin.
        """
        return self.get_body().is_admin

    @property
    def email(self) -> Optional[str]:
        """The `email` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().email

    @property
    def external_user_id(self) -> Optional[UUID]:
        """The `external_user_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().external_user_id

    @property
    def user_metadata(self) -> Dict[str, Any]:
        """The `user_metadata` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().user_metadata

    @property
    def default_project_id(self) -> Optional[UUID]:
        """The `default_project_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().default_project_id

    # Helper methods
    @classmethod
    def _get_crypt_context(cls) -> "CryptContext":
        """Returns the password encryption context.

        Returns:
            The password encryption context.
        """
        from passlib.context import CryptContext

        return CryptContext(schemes=["bcrypt"], deprecated="auto")

activation_token property

The activation_token property.

Returns:

Type Description
Optional[str]

the value of the property.

active property

The active property.

Returns:

Type Description
bool

the value of the property.

default_project_id property

The default_project_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

email property

The email property.

Returns:

Type Description
Optional[str]

the value of the property.

email_opted_in property

The email_opted_in property.

Returns:

Type Description
Optional[bool]

the value of the property.

external_user_id property

The external_user_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

full_name property

The full_name property.

Returns:

Type Description
str

the value of the property.

is_admin property

The is_admin property.

Returns:

Type Description
bool

Whether the user is an admin.

is_service_account property

The is_service_account property.

Returns:

Type Description
bool

the value of the property.

user_metadata property

The user_metadata property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

get_hydrated_version()

Get the hydrated version of this user.

Returns:

Type Description
UserResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/user.py
def get_hydrated_version(self) -> "UserResponse":
    """Get the hydrated version of this user.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_user(self.id)

UserResponseBody

Bases: BaseDatedResponseBody

Response body for users.

Source code in src/zenml/models/v2/core/user.py
class UserResponseBody(BaseDatedResponseBody):
    """Response body for users."""

    active: bool = Field(default=False, title="Whether the account is active.")
    activation_token: Optional[str] = Field(
        default=None,
        max_length=STR_FIELD_MAX_LENGTH,
        title="The activation token for the user. Only relevant for user "
        "accounts.",
    )
    full_name: str = Field(
        default="",
        title="The full name for the account owner. Only relevant for user "
        "accounts.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    email_opted_in: Optional[bool] = Field(
        default=None,
        title="Whether the user agreed to share their email. Only relevant for "
        "user accounts",
        description="`null` if not answered, `true` if agreed, "
        "`false` if skipped.",
    )
    is_service_account: bool = Field(
        title="Indicates whether this is a service account or a user account."
    )
    is_admin: bool = Field(
        title="Whether the account is an administrator.",
    )
    default_project_id: Optional[UUID] = Field(
        default=None,
        title="The default project ID for the user.",
    )

UserResponseMetadata

Bases: BaseResponseMetadata

Response metadata for users.

Source code in src/zenml/models/v2/core/user.py
class UserResponseMetadata(BaseResponseMetadata):
    """Response metadata for users."""

    email: Optional[str] = Field(
        default="",
        title="The email address associated with the account. Only relevant "
        "for user accounts.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    external_user_id: Optional[UUID] = Field(
        default=None,
        title="The external user ID associated with the account. Only relevant "
        "for user accounts.",
    )
    user_metadata: Dict[str, Any] = Field(
        default={},
        title="The metadata associated with the user.",
    )

UserScopedFilter

Bases: BaseFilter

Model to enable advanced user-based scoping.

Source code in src/zenml/models/v2/base/scoped.py
class UserScopedFilter(BaseFilter):
    """Model to enable advanced user-based scoping."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *BaseFilter.FILTER_EXCLUDE_FIELDS,
        "user",
        "scope_user",
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *BaseFilter.CLI_EXCLUDE_FIELDS,
        "scope_user",
    ]
    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *BaseFilter.CUSTOM_SORTING_OPTIONS,
        "user",
    ]

    scope_user: Optional[UUID] = Field(
        default=None,
        description="The user to scope this query to.",
    )
    user: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the user that created the entity.",
        union_mode="left_to_right",
    )

    def set_scope_user(self, user_id: UUID) -> None:
        """Set the user that is performing the filtering to scope the response.

        Args:
            user_id: The user ID to scope the response to.
        """
        self.scope_user = user_id

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_

        from zenml.zen_stores.schemas import UserSchema

        if self.user:
            user_filter = and_(
                getattr(table, "user_id") == UserSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.user,
                    table=UserSchema,
                    additional_columns=["full_name"],
                ),
            )
            custom_filters.append(user_filter)

        return custom_filters

    def apply_sorting(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Apply sorting to the query.

        Args:
            query: The query to which to apply the sorting.
            table: The query table.

        Returns:
            The query with sorting applied.
        """
        from sqlmodel import asc, desc

        from zenml.enums import SorterOps
        from zenml.zen_stores.schemas import UserSchema

        sort_by, operand = self.sorting_params

        if sort_by == "user":
            column = UserSchema.name

            query = query.outerjoin(
                UserSchema,
                getattr(table, "user_id") == UserSchema.id,
            )

            query = query.add_columns(UserSchema.name)

            if operand == SorterOps.ASCENDING:
                query = query.order_by(asc(column))
            else:
                query = query.order_by(desc(column))

            return query

        return super().apply_sorting(query=query, table=table)

    def apply_filter(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Applies the filter to a query.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        query = super().apply_filter(query=query, table=table)

        if self.scope_user:
            query = query.where(getattr(table, "user_id") == self.scope_user)

        return query

apply_filter(query, table)

Applies the filter to a query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the filter.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with filter applied.

Source code in src/zenml/models/v2/base/scoped.py
def apply_filter(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Applies the filter to a query.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    query = super().apply_filter(query=query, table=table)

    if self.scope_user:
        query = query.where(getattr(table, "user_id") == self.scope_user)

    return query

apply_sorting(query, table)

Apply sorting to the query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the sorting.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with sorting applied.

Source code in src/zenml/models/v2/base/scoped.py
def apply_sorting(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Apply sorting to the query.

    Args:
        query: The query to which to apply the sorting.
        table: The query table.

    Returns:
        The query with sorting applied.
    """
    from sqlmodel import asc, desc

    from zenml.enums import SorterOps
    from zenml.zen_stores.schemas import UserSchema

    sort_by, operand = self.sorting_params

    if sort_by == "user":
        column = UserSchema.name

        query = query.outerjoin(
            UserSchema,
            getattr(table, "user_id") == UserSchema.id,
        )

        query = query.add_columns(UserSchema.name)

        if operand == SorterOps.ASCENDING:
            query = query.order_by(asc(column))
        else:
            query = query.order_by(desc(column))

        return query

    return super().apply_sorting(query=query, table=table)
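
A hedged sketch of sorting by the custom "user" option; the "desc:" prefix for sort_by comes from the base filter's sorting grammar, which is not shown in this excerpt:

from zenml.models.v2.base.scoped import UserScopedFilter

scoped_filter = UserScopedFilter(sort_by="desc:user")
# "user" is listed in CUSTOM_SORTING_OPTIONS, so apply_sorting() outer-joins
# UserSchema and orders the results by UserSchema.name in descending order.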

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/base/scoped.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_

    from zenml.zen_stores.schemas import UserSchema

    if self.user:
        user_filter = and_(
            getattr(table, "user_id") == UserSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.user,
                table=UserSchema,
                additional_columns=["full_name"],
            ),
        )
        custom_filters.append(user_filter)

    return custom_filters

set_scope_user(user_id)

Set the user that is performing the filtering to scope the response.

Parameters:

Name Type Description Default
user_id UUID

The user ID to scope the response to.

required
Source code in src/zenml/models/v2/base/scoped.py
def set_scope_user(self, user_id: UUID) -> None:
    """Set the user that is performing the filtering to scope the response.

    Args:
        user_id: The user ID to scope the response to.
    """
    self.scope_user = user_id

UserScopedRequest

Bases: BaseRequest

Base user-owned request model.

Used as a base class for all domain models that are "owned" by a user.

Source code in src/zenml/models/v2/base/scoped.py
class UserScopedRequest(BaseRequest):
    """Base user-owned request model.

    Used as a base class for all domain models that are "owned" by a user.
    """

    user: Optional[UUID] = Field(
        default=None,
        title="The id of the user that created this resource. Set "
        "automatically by the server.",
        # This field is set automatically by the server, so the client doesn't
        # need to set it and it will not be serialized.
        exclude=True,
    )

    def get_analytics_metadata(self) -> Dict[str, Any]:
        """Fetches the analytics metadata for user scoped models.

        Returns:
            The analytics metadata.
        """
        metadata = super().get_analytics_metadata()
        metadata["user_id"] = self.user
        return metadata

get_analytics_metadata()

Fetches the analytics metadata for user scoped models.

Returns:

Type Description
Dict[str, Any]

The analytics metadata.

Source code in src/zenml/models/v2/base/scoped.py
def get_analytics_metadata(self) -> Dict[str, Any]:
    """Fetches the analytics metadata for user scoped models.

    Returns:
        The analytics metadata.
    """
    metadata = super().get_analytics_metadata()
    metadata["user_id"] = self.user
    return metadata

UserScopedResponse

Bases: BaseIdentifiedResponse[UserBody, UserMetadata, UserResources], Generic[UserBody, UserMetadata, UserResources]

Base user-owned model.

Used as a base class for all domain models that are "owned" by a user.

Source code in src/zenml/models/v2/base/scoped.py
class UserScopedResponse(
    BaseIdentifiedResponse[UserBody, UserMetadata, UserResources],
    Generic[UserBody, UserMetadata, UserResources],
):
    """Base user-owned model.

    Used as a base class for all domain models that are "owned" by a user.
    """

    # Analytics
    def get_analytics_metadata(self) -> Dict[str, Any]:
        """Fetches the analytics metadata for user scoped models.

        Returns:
            The analytics metadata.
        """
        metadata = super().get_analytics_metadata()
        if user_id := self.user_id:
            metadata["user_id"] = user_id
        return metadata

    # Body and metadata properties
    @property
    def user_id(self) -> Optional[UUID]:
        """The user ID property.

        Returns:
            the value of the property.
        """
        return self.get_body().user_id

    @property
    def user(self) -> Optional["UserResponse"]:
        """The user property.

        Returns:
            the value of the property.
        """
        return self.get_resources().user

user property

The user property.

Returns:

Type Description
Optional[UserResponse]

the value of the property.

user_id property

The user ID property.

Returns:

Type Description
Optional[UUID]

the value of the property.

get_analytics_metadata()

Fetches the analytics metadata for user scoped models.

Returns:

Type Description
Dict[str, Any]

The analytics metadata.

Source code in src/zenml/models/v2/base/scoped.py
139
140
141
142
143
144
145
146
147
148
def get_analytics_metadata(self) -> Dict[str, Any]:
    """Fetches the analytics metadata for user scoped models.

    Returns:
        The analytics metadata.
    """
    metadata = super().get_analytics_metadata()
    if user_id := self.user_id:
        metadata["user_id"] = user_id
    return metadata

UserScopedResponseBody

Bases: BaseDatedResponseBody

Base user-owned body.

Source code in src/zenml/models/v2/base/scoped.py
class UserScopedResponseBody(BaseDatedResponseBody):
    """Base user-owned body."""

    user_id: Optional[UUID] = Field(title="The user id.", default=None)

UserScopedResponseMetadata

Bases: BaseResponseMetadata

Base user-owned metadata.

Source code in src/zenml/models/v2/base/scoped.py
class UserScopedResponseMetadata(BaseResponseMetadata):
    """Base user-owned metadata."""

UserUpdate

Bases: UserBase, BaseUpdate

Update model for users.

Source code in src/zenml/models/v2/core/user.py
class UserUpdate(UserBase, BaseUpdate):
    """Update model for users."""

    name: Optional[str] = Field(
        title="The unique username for the account.",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )
    full_name: Optional[str] = Field(
        default=None,
        title="The full name for the account owner. Only relevant for user "
        "accounts.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    is_admin: Optional[bool] = Field(
        default=None,
        title="Whether the account is an administrator.",
    )
    active: Optional[bool] = Field(
        default=None, title="Whether the account is active."
    )
    old_password: Optional[str] = Field(
        default=None,
        title="The previous password for the user. Only relevant for user "
        "accounts. Required when updating the password.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    default_project_id: Optional[UUID] = Field(
        default=None,
        title="The default project ID for the user.",
    )

    @model_validator(mode="after")
    def user_email_updates(self) -> "UserUpdate":
        """Validate that the UserUpdateModel conforms to the email-opt-in-flow.

        Returns:
            The validated values.

        Raises:
            ValueError: If the email was not provided when the email_opted_in
                field was set to True.
        """
        # When someone sets the email, or updates the email and hasn't
        #  before explicitly opted out, they are opted in
        if self.email is not None:
            if self.email_opted_in is None:
                self.email_opted_in = True

        # It should not be possible to do opt in without an email
        if self.email_opted_in is True:
            if self.email is None:
                raise ValueError(
                    "Please provide an email, when you are opting-in with "
                    "your email."
                )
        return self

    def create_copy(self, exclude: AbstractSet[str]) -> "UserUpdate":
        """Create a copy of the current instance.

        Args:
            exclude: Fields to exclude from the copy.

        Returns:
            A copy of the current instance.
        """
        return UserUpdate(
            **self.model_dump(
                exclude=set(exclude),
                exclude_unset=True,
            )
        )

create_copy(exclude)

Create a copy of the current instance.

Parameters:

Name Type Description Default
exclude AbstractSet[str]

Fields to exclude from the copy.

required

Returns:

Type Description
UserUpdate

A copy of the current instance.

Source code in src/zenml/models/v2/core/user.py
def create_copy(self, exclude: AbstractSet[str]) -> "UserUpdate":
    """Create a copy of the current instance.

    Args:
        exclude: Fields to exclude from the copy.

    Returns:
        A copy of the current instance.
    """
    return UserUpdate(
        **self.model_dump(
            exclude=set(exclude),
            exclude_unset=True,
        )
    )
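
For illustration, a small usage sketch (assuming UserUpdate is re-exported from zenml.models; the field values are made up). Because the copy is built from model_dump(exclude_unset=True), only explicitly set fields survive, minus the excluded ones:

from zenml.models import UserUpdate

update = UserUpdate(name="ada", full_name="Ada Lovelace")
partial = update.create_copy(exclude={"full_name"})  # drops full_name, keeps name
assert partial.name == "ada"
assert partial.full_name is None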

user_email_updates()

Validate that the UserUpdateModel conforms to the email-opt-in-flow.

Returns:

Type Description
UserUpdate

The validated values.

Raises:

Type Description
ValueError

If the email was not provided when the email_opted_in field was set to True.

Source code in src/zenml/models/v2/core/user.py
@model_validator(mode="after")
def user_email_updates(self) -> "UserUpdate":
    """Validate that the UserUpdateModel conforms to the email-opt-in-flow.

    Returns:
        The validated values.

    Raises:
        ValueError: If the email was not provided when the email_opted_in
            field was set to True.
    """
    # When someone sets the email, or updates the email and hasn't
    #  before explicitly opted out, they are opted in
    if self.email is not None:
        if self.email_opted_in is None:
            self.email_opted_in = True

    # It should not be possible to do opt in without an email
    if self.email_opted_in is True:
        if self.email is None:
            raise ValueError(
                "Please provide an email, when you are opting-in with "
                "your email."
            )
    return self
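
A minimal sketch of the opt-in behaviour (assuming UserUpdate is re-exported from zenml.models; the email address is made up):

from pydantic import ValidationError

from zenml.models import UserUpdate

# Setting an email implies opting in, unless the user explicitly opted out before.
update = UserUpdate(email="ada@example.com")
assert update.email_opted_in is True

# Opting in without providing an email is rejected by the validator.
try:
    UserUpdate(email_opted_in=True)
except ValidationError as error:
    print(error)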

Orchestrators

Initialization for ZenML orchestrators.

An orchestrator is a special kind of backend that manages the running of each step of the pipeline. Orchestrators administer the actual pipeline runs. You can think of it as the 'root' of any pipeline job that you run during your experimentation.

ZenML supports a local orchestrator out of the box which allows you to run your pipelines in a local environment. We also support using Apache Airflow as the orchestrator to handle the steps of your pipeline.

BaseOrchestrator

Bases: StackComponent, ABC

Base class for all orchestrators.

In order to implement an orchestrator you will need to subclass from this class.

How it works:

The run(...) method is the entrypoint that is executed when the pipeline's run method is called within the user code (pipeline_instance.run(...)).

This method will do some internal preparation and then call the prepare_or_run_pipeline(...) method. BaseOrchestrator subclasses must implement this method and either run the pipeline steps directly or deploy the pipeline to some remote infrastructure.
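
As a rough sketch only (class name made up, typing and error handling kept minimal), a simple-case subclass of the base class shown below could look like this:

from typing import Optional
from uuid import uuid4

from zenml.orchestrators import BaseOrchestrator


class SequentialOrchestrator(BaseOrchestrator):
    """Hypothetical orchestrator that executes all steps in-process, one by one."""

    _run_id: Optional[str] = None

    def get_orchestrator_run_id(self) -> str:
        # Must return the same unique value for every step of one pipeline run.
        if not self._run_id:
            raise RuntimeError("No active orchestrator run.")
        return self._run_id

    def prepare_or_run_pipeline(
        self, deployment, stack, environment, placeholder_run=None
    ):
        self._run_id = str(uuid4())
        for step in deployment.step_configurations.values():
            self.run_step(step=step)
        self._run_id = None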

Source code in src/zenml/orchestrators/base_orchestrator.py
class BaseOrchestrator(StackComponent, ABC):
    """Base class for all orchestrators.

    In order to implement an orchestrator you will need to subclass from this
    class.

    How it works:
    -------------
    The `run(...)` method is the entrypoint that is executed when the
    pipeline's run method is called within the user code
    (`pipeline_instance.run(...)`).

    This method will do some internal preparation and then call the
    `prepare_or_run_pipeline(...)` method. BaseOrchestrator subclasses must
    implement this method and either run the pipeline steps directly or deploy
    the pipeline to some remote infrastructure.
    """

    _active_deployment: Optional["PipelineDeploymentResponse"] = None

    @property
    def config(self) -> BaseOrchestratorConfig:
        """Returns the `BaseOrchestratorConfig` config.

        Returns:
            The configuration.
        """
        return cast(BaseOrchestratorConfig, self._config)

    @abstractmethod
    def get_orchestrator_run_id(self) -> str:
        """Returns the run id of the active orchestrator run.

        Important: This needs to be a unique ID and return the same value for
        all steps of a pipeline run.

        Returns:
            The orchestrator run id.
        """

    @abstractmethod
    def prepare_or_run_pipeline(
        self,
        deployment: "PipelineDeploymentResponse",
        stack: "Stack",
        environment: Dict[str, str],
        placeholder_run: Optional["PipelineRunResponse"] = None,
    ) -> Optional[Iterator[Dict[str, MetadataType]]]:
        """The method needs to be implemented by the respective orchestrator.

        Depending on the type of orchestrator you'll have to perform slightly
        different operations.

        Simple Case:
        ------------
        The Steps are run directly from within the same environment in which
        the orchestrator code is executed. In this case you will need to
        deal with implementation-specific runtime configurations (like the
        schedule) and then iterate through the steps and finally call
        `self.run_step(...)` to execute each step.

        Advanced Case:
        --------------
        Most orchestrators will not run the steps directly. Instead, they
        build some intermediate representation of the pipeline that is then
        used to create and run the pipeline and its steps on the target
        environment. For such orchestrators this method will have to build
        this representation and deploy it.

        Regardless of the implementation details, the orchestrator will need
        to run each step in the target environment. For this the
        `self.run_step(...)` method should be used.

        The easiest way to make this work is by using an entrypoint
        configuration to run single steps (`zenml.entrypoints.step_entrypoint_configuration.StepEntrypointConfiguration`)
        or entire pipelines (`zenml.entrypoints.pipeline_entrypoint_configuration.PipelineEntrypointConfiguration`).

        Args:
            deployment: The pipeline deployment to prepare or run.
            stack: The stack the pipeline will run on.
            environment: Environment variables to set in the orchestration
                environment. These don't need to be set if running locally.
            placeholder_run: An optional placeholder run for the deployment.

        Yields:
            Metadata for the pipeline run.
        """

    def run(
        self,
        deployment: "PipelineDeploymentResponse",
        stack: "Stack",
        placeholder_run: Optional["PipelineRunResponse"] = None,
    ) -> None:
        """Runs a pipeline on a stack.

        Args:
            deployment: The pipeline deployment.
            stack: The stack on which to run the pipeline.
            placeholder_run: An optional placeholder run for the deployment.
                This will be deleted in case the pipeline deployment failed.
        """
        self._prepare_run(deployment=deployment)

        pipeline_run_id: Optional[UUID] = None
        schedule_id: Optional[UUID] = None
        if deployment.schedule:
            schedule_id = deployment.schedule.id
        if placeholder_run:
            pipeline_run_id = placeholder_run.id

        environment = get_config_environment_vars(
            schedule_id=schedule_id,
            pipeline_run_id=pipeline_run_id,
        )

        prevent_client_side_caching = handle_bool_env_var(
            ENV_ZENML_PREVENT_CLIENT_SIDE_CACHING, default=False
        )

        if (
            placeholder_run
            and self.config.supports_client_side_caching
            and not deployment.schedule
            and not prevent_client_side_caching
        ):
            from zenml.orchestrators import cache_utils

            run_required = (
                cache_utils.create_cached_step_runs_and_prune_deployment(
                    deployment=deployment,
                    pipeline_run=placeholder_run,
                    stack=stack,
                )
            )

            if not run_required:
                self._cleanup_run()
                return
        else:
            logger.debug("Skipping client-side caching.")

        try:
            if metadata_iterator := self.prepare_or_run_pipeline(
                deployment=deployment,
                stack=stack,
                environment=environment,
                placeholder_run=placeholder_run,
            ):
                for metadata_dict in metadata_iterator:
                    try:
                        if placeholder_run:
                            publish_pipeline_run_metadata(
                                pipeline_run_id=placeholder_run.id,
                                pipeline_run_metadata={self.id: metadata_dict},
                            )
                    except Exception as e:
                        logger.debug(
                            "Something went went wrong trying to publish the"
                            f"run metadata: {e}"
                        )
        finally:
            self._cleanup_run()

    def run_step(self, step: "Step") -> None:
        """Runs the given step.

        Args:
            step: The step to run.
        """
        assert self._active_deployment
        launcher = StepLauncher(
            deployment=self._active_deployment,
            step=step,
            orchestrator_run_id=self.get_orchestrator_run_id(),
        )
        launcher.launch()

    @staticmethod
    def requires_resources_in_orchestration_environment(
        step: "Step",
    ) -> bool:
        """Checks if the orchestrator should run this step on special resources.

        Args:
            step: The step that will be checked.

        Returns:
            True if the step requires special resources in the orchestration
            environment, False otherwise.
        """
        # If the step requires custom resources and doesn't run with a step
        # operator, it would need these requirements in the orchestrator
        # environment
        if step.config.step_operator:
            return False

        return not step.config.resource_settings.empty

    def _prepare_run(self, deployment: "PipelineDeploymentResponse") -> None:
        """Prepares a run.

        Args:
            deployment: The deployment to prepare.
        """
        self._active_deployment = deployment

    def _cleanup_run(self) -> None:
        """Cleans up the active run."""
        self._active_deployment = None

    def fetch_status(self, run: "PipelineRunResponse") -> ExecutionStatus:
        """Refreshes the status of a specific pipeline run.

        Args:
            run: A pipeline run response to fetch its status.

        Raises:
            NotImplementedError: If any orchestrator inheriting from the base
                class does not implement this logic.
        """
        raise NotImplementedError(
            "The fetch status functionality is not implemented for the "
            f"'{self.__class__.__name__}' orchestrator."
        )

config property

Returns the BaseOrchestratorConfig config.

Returns:

Type Description
BaseOrchestratorConfig

The configuration.

fetch_status(run)

Refreshes the status of a specific pipeline run.

Parameters:

Name Type Description Default
run PipelineRunResponse

A pipeline run response to fetch its status.

required

Raises:

Type Description
NotImplementedError

If any orchestrator inheriting from the base class does not implement this logic.

Source code in src/zenml/orchestrators/base_orchestrator.py
def fetch_status(self, run: "PipelineRunResponse") -> ExecutionStatus:
    """Refreshes the status of a specific pipeline run.

    Args:
        run: A pipeline run response to fetch its status.

    Raises:
        NotImplementedError: If any orchestrator inheriting from the base
            class does not implement this logic.
    """
    raise NotImplementedError(
        "The fetch status functionality is not implemented for the "
        f"'{self.__class__.__name__}' orchestrator."
    )

get_orchestrator_run_id() abstractmethod

Returns the run id of the active orchestrator run.

Important: This needs to be a unique ID and return the same value for all steps of a pipeline run.

Returns:

Type Description
str

The orchestrator run id.

Source code in src/zenml/orchestrators/base_orchestrator.py
@abstractmethod
def get_orchestrator_run_id(self) -> str:
    """Returns the run id of the active orchestrator run.

    Important: This needs to be a unique ID and return the same value for
    all steps of a pipeline run.

    Returns:
        The orchestrator run id.
    """

prepare_or_run_pipeline(deployment, stack, environment, placeholder_run=None) abstractmethod

The method needs to be implemented by the respective orchestrator.

Depending on the type of orchestrator you'll have to perform slightly different operations.

Simple Case:

The Steps are run directly from within the same environment in which the orchestrator code is executed. In this case you will need to deal with implementation-specific runtime configurations (like the schedule) and then iterate through the steps and finally call self.run_step(...) to execute each step.

Advanced Case:

Most orchestrators will not run the steps directly. Instead, they build some intermediate representation of the pipeline that is then used to create and run the pipeline and its steps on the target environment. For such orchestrators this method will have to build this representation and deploy it.

Regardless of the implementation details, the orchestrator will need to run each step in the target environment. For this the self.run_step(...) method should be used.

The easiest way to make this work is by using an entrypoint configuration to run single steps (zenml.entrypoints.step_entrypoint_configuration.StepEntrypointConfiguration) or entire pipelines (zenml.entrypoints.pipeline_entrypoint_configuration.PipelineEntrypointConfiguration).
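
Schematically, inside prepare_or_run_pipeline the command a remote container should execute for each step can be assembled like this (a sketch only; deployment is the method's argument and the actual hand-off to the backend is omitted):

from zenml.entrypoints.step_entrypoint_configuration import (
    StepEntrypointConfiguration,
)

command = StepEntrypointConfiguration.get_entrypoint_command()
for step_name in deployment.step_configurations:
    arguments = StepEntrypointConfiguration.get_entrypoint_arguments(
        step_name=step_name, deployment_id=deployment.id
    )
    # `command + arguments` is what the container for this step should run.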

Parameters:

Name Type Description Default
deployment PipelineDeploymentResponse

The pipeline deployment to prepare or run.

required
stack Stack

The stack the pipeline will run on.

required
environment Dict[str, str]

Environment variables to set in the orchestration environment. These don't need to be set if running locally.

required
placeholder_run Optional[PipelineRunResponse]

An optional placeholder run for the deployment.

None

Yields:

Type Description
Optional[Iterator[Dict[str, MetadataType]]]

Metadata for the pipeline run.

Source code in src/zenml/orchestrators/base_orchestrator.py
@abstractmethod
def prepare_or_run_pipeline(
    self,
    deployment: "PipelineDeploymentResponse",
    stack: "Stack",
    environment: Dict[str, str],
    placeholder_run: Optional["PipelineRunResponse"] = None,
) -> Optional[Iterator[Dict[str, MetadataType]]]:
    """The method needs to be implemented by the respective orchestrator.

    Depending on the type of orchestrator you'll have to perform slightly
    different operations.

    Simple Case:
    ------------
    The Steps are run directly from within the same environment in which
    the orchestrator code is executed. In this case you will need to
    deal with implementation-specific runtime configurations (like the
    schedule) and then iterate through the steps and finally call
    `self.run_step(...)` to execute each step.

    Advanced Case:
    --------------
    Most orchestrators will not run the steps directly. Instead, they
    build some intermediate representation of the pipeline that is then
    used to create and run the pipeline and its steps on the target
    environment. For such orchestrators this method will have to build
    this representation and deploy it.

    Regardless of the implementation details, the orchestrator will need
    to run each step in the target environment. For this the
    `self.run_step(...)` method should be used.

    The easiest way to make this work is by using an entrypoint
    configuration to run single steps (`zenml.entrypoints.step_entrypoint_configuration.StepEntrypointConfiguration`)
    or entire pipelines (`zenml.entrypoints.pipeline_entrypoint_configuration.PipelineEntrypointConfiguration`).

    Args:
        deployment: The pipeline deployment to prepare or run.
        stack: The stack the pipeline will run on.
        environment: Environment variables to set in the orchestration
            environment. These don't need to be set if running locally.
        placeholder_run: An optional placeholder run for the deployment.

    Yields:
        Metadata for the pipeline run.
    """

requires_resources_in_orchestration_environment(step) staticmethod

Checks if the orchestrator should run this step on special resources.

Parameters:

Name Type Description Default
step Step

The step that will be checked.

required

Returns:

Type Description
bool

True if the step requires special resources in the orchestration environment, False otherwise.

Source code in src/zenml/orchestrators/base_orchestrator.py
@staticmethod
def requires_resources_in_orchestration_environment(
    step: "Step",
) -> bool:
    """Checks if the orchestrator should run this step on special resources.

    Args:
        step: The step that will be checked.

    Returns:
        True if the step requires special resources in the orchestration
        environment, False otherwise.
    """
    # If the step requires custom resources and doesn't run with a step
    # operator, it would need these requirements in the orchestrator
    # environment
    if step.config.step_operator:
        return False

    return not step.config.resource_settings.empty

run(deployment, stack, placeholder_run=None)

Runs a pipeline on a stack.

Parameters:

Name Type Description Default
deployment PipelineDeploymentResponse

The pipeline deployment.

required
stack Stack

The stack on which to run the pipeline.

required
placeholder_run Optional[PipelineRunResponse]

An optional placeholder run for the deployment. This will be deleted in case the pipeline deployment failed.

None
Source code in src/zenml/orchestrators/base_orchestrator.py
def run(
    self,
    deployment: "PipelineDeploymentResponse",
    stack: "Stack",
    placeholder_run: Optional["PipelineRunResponse"] = None,
) -> None:
    """Runs a pipeline on a stack.

    Args:
        deployment: The pipeline deployment.
        stack: The stack on which to run the pipeline.
        placeholder_run: An optional placeholder run for the deployment.
            This will be deleted in case the pipeline deployment failed.
    """
    self._prepare_run(deployment=deployment)

    pipeline_run_id: Optional[UUID] = None
    schedule_id: Optional[UUID] = None
    if deployment.schedule:
        schedule_id = deployment.schedule.id
    if placeholder_run:
        pipeline_run_id = placeholder_run.id

    environment = get_config_environment_vars(
        schedule_id=schedule_id,
        pipeline_run_id=pipeline_run_id,
    )

    prevent_client_side_caching = handle_bool_env_var(
        ENV_ZENML_PREVENT_CLIENT_SIDE_CACHING, default=False
    )

    if (
        placeholder_run
        and self.config.supports_client_side_caching
        and not deployment.schedule
        and not prevent_client_side_caching
    ):
        from zenml.orchestrators import cache_utils

        run_required = (
            cache_utils.create_cached_step_runs_and_prune_deployment(
                deployment=deployment,
                pipeline_run=placeholder_run,
                stack=stack,
            )
        )

        if not run_required:
            self._cleanup_run()
            return
    else:
        logger.debug("Skipping client-side caching.")

    try:
        if metadata_iterator := self.prepare_or_run_pipeline(
            deployment=deployment,
            stack=stack,
            environment=environment,
            placeholder_run=placeholder_run,
        ):
            for metadata_dict in metadata_iterator:
                try:
                    if placeholder_run:
                        publish_pipeline_run_metadata(
                            pipeline_run_id=placeholder_run.id,
                            pipeline_run_metadata={self.id: metadata_dict},
                        )
                except Exception as e:
                    logger.debug(
                        "Something went went wrong trying to publish the"
                        f"run metadata: {e}"
                    )
    finally:
        self._cleanup_run()

run_step(step)

Runs the given step.

Parameters:

Name Type Description Default
step Step

The step to run.

required
Source code in src/zenml/orchestrators/base_orchestrator.py
def run_step(self, step: "Step") -> None:
    """Runs the given step.

    Args:
        step: The step to run.
    """
    assert self._active_deployment
    launcher = StepLauncher(
        deployment=self._active_deployment,
        step=step,
        orchestrator_run_id=self.get_orchestrator_run_id(),
    )
    launcher.launch()

BaseOrchestratorConfig

Bases: StackComponentConfig

Base orchestrator config.

Source code in src/zenml/orchestrators/base_orchestrator.py
class BaseOrchestratorConfig(StackComponentConfig):
    """Base orchestrator config."""

    @model_validator(mode="before")
    @classmethod
    @before_validator_handler
    def _deprecations(cls, data: Dict[str, Any]) -> Dict[str, Any]:
        """Validate and/or remove deprecated fields.

        Args:
            data: The values to validate.

        Returns:
            The validated values.
        """
        if "custom_docker_base_image_name" in data:
            image_name = data.pop("custom_docker_base_image_name", None)
            if image_name:
                logger.warning(
                    "The 'custom_docker_base_image_name' field has been "
                    "deprecated. To use a custom base container image with your "
                    "orchestrators, please use the DockerSettings in your "
                    "pipeline (see https://docs.zenml.io/concepts/containerization)."
                )

        return data

    @property
    def is_synchronous(self) -> bool:
        """Whether the orchestrator runs synchronous or not.

        Returns:
            Whether the orchestrator runs synchronous or not.
        """
        return False

    @property
    def is_schedulable(self) -> bool:
        """Whether the orchestrator is schedulable or not.

        Returns:
            Whether the orchestrator is schedulable or not.
        """
        return False

    @property
    def supports_client_side_caching(self) -> bool:
        """Whether the orchestrator supports client side caching.

        Returns:
            Whether the orchestrator supports client side caching.
        """
        return True

is_schedulable property

Whether the orchestrator is schedulable or not.

Returns:

Type Description
bool

Whether the orchestrator is schedulable or not.

is_synchronous property

Whether the orchestrator runs synchronously or not.

Returns:

Type Description
bool

Whether the orchestrator runs synchronously or not.

supports_client_side_caching property

Whether the orchestrator supports client side caching.

Returns:

Type Description
bool

Whether the orchestrator supports client side caching.
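
A custom flavor would typically override these properties on its own config class; a minimal sketch (the class name and the synchronous field are illustrative, not part of the base class):

from zenml.orchestrators import BaseOrchestratorConfig


class MyOrchestratorConfig(BaseOrchestratorConfig):
    """Hypothetical config that exposes a user-facing synchronous option."""

    synchronous: bool = True

    @property
    def is_synchronous(self) -> bool:
        return self.synchronous

    @property
    def is_schedulable(self) -> bool:
        return True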

BaseOrchestratorFlavor

Bases: Flavor

Base orchestrator flavor class.

Source code in src/zenml/orchestrators/base_orchestrator.py
class BaseOrchestratorFlavor(Flavor):
    """Base orchestrator flavor class."""

    @property
    def type(self) -> StackComponentType:
        """Returns the flavor type.

        Returns:
            The flavor type.
        """
        return StackComponentType.ORCHESTRATOR

    @property
    def config_class(self) -> Type[BaseOrchestratorConfig]:
        """Config class for the base orchestrator flavor.

        Returns:
            The config class.
        """
        return BaseOrchestratorConfig

    @property
    @abstractmethod
    def implementation_class(self) -> Type["BaseOrchestrator"]:
        """Implementation class for this flavor.

        Returns:
            The implementation class.
        """

config_class property

Config class for the base orchestrator flavor.

Returns:

Type Description
Type[BaseOrchestratorConfig]

The config class.

implementation_class abstractmethod property

Implementation class for this flavor.

Returns:

Type Description
Type[BaseOrchestrator]

The implementation class.

type property

Returns the flavor type.

Returns:

Type Description
StackComponentType

The flavor type.

ContainerizedOrchestrator

Bases: BaseOrchestrator, ABC

Base class for containerized orchestrators.

Source code in src/zenml/orchestrators/containerized_orchestrator.py
class ContainerizedOrchestrator(BaseOrchestrator, ABC):
    """Base class for containerized orchestrators."""

    @staticmethod
    def get_image(
        deployment: "PipelineDeploymentResponse",
        step_name: Optional[str] = None,
    ) -> str:
        """Gets the Docker image for the pipeline/a step.

        Args:
            deployment: The deployment from which to get the image.
            step_name: Pipeline step name for which to get the image. If not
                given the generic pipeline image will be returned.

        Raises:
            RuntimeError: If the deployment does not have an associated build.

        Returns:
            The image name or digest.
        """
        if not deployment.build:
            raise RuntimeError(
                f"Missing build for deployment {deployment.id}. This is "
                "probably because the build was manually deleted."
            )

        return deployment.build.get_image(
            component_key=ORCHESTRATOR_DOCKER_IMAGE_KEY, step=step_name
        )

    def get_docker_builds(
        self, deployment: "PipelineDeploymentBase"
    ) -> List["BuildConfiguration"]:
        """Gets the Docker builds required for the component.

        Args:
            deployment: The pipeline deployment for which to get the builds.

        Returns:
            The required Docker builds.
        """
        pipeline_settings = deployment.pipeline_configuration.docker_settings

        included_pipeline_build = False
        builds = []

        for name, step in deployment.step_configurations.items():
            step_settings = step.config.docker_settings

            if step_settings != pipeline_settings:
                build = BuildConfiguration(
                    key=ORCHESTRATOR_DOCKER_IMAGE_KEY,
                    settings=step_settings,
                    step_name=name,
                )
                builds.append(build)
            elif not included_pipeline_build:
                pipeline_build = BuildConfiguration(
                    key=ORCHESTRATOR_DOCKER_IMAGE_KEY,
                    settings=pipeline_settings,
                )
                builds.append(pipeline_build)
                included_pipeline_build = True

        return builds

get_docker_builds(deployment)

Gets the Docker builds required for the component.

Parameters:

Name Type Description Default
deployment PipelineDeploymentBase

The pipeline deployment for which to get the builds.

required

Returns:

Type Description
List[BuildConfiguration]

The required Docker builds.

Source code in src/zenml/orchestrators/containerized_orchestrator.py
def get_docker_builds(
    self, deployment: "PipelineDeploymentBase"
) -> List["BuildConfiguration"]:
    """Gets the Docker builds required for the component.

    Args:
        deployment: The pipeline deployment for which to get the builds.

    Returns:
        The required Docker builds.
    """
    pipeline_settings = deployment.pipeline_configuration.docker_settings

    included_pipeline_build = False
    builds = []

    for name, step in deployment.step_configurations.items():
        step_settings = step.config.docker_settings

        if step_settings != pipeline_settings:
            build = BuildConfiguration(
                key=ORCHESTRATOR_DOCKER_IMAGE_KEY,
                settings=step_settings,
                step_name=name,
            )
            builds.append(build)
        elif not included_pipeline_build:
            pipeline_build = BuildConfiguration(
                key=ORCHESTRATOR_DOCKER_IMAGE_KEY,
                settings=pipeline_settings,
            )
            builds.append(pipeline_build)
            included_pipeline_build = True

    return builds

get_image(deployment, step_name=None) staticmethod

Gets the Docker image for the pipeline/a step.

Parameters:

Name Type Description Default
deployment PipelineDeploymentResponse

The deployment from which to get the image.

required
step_name Optional[str]

Pipeline step name for which to get the image. If not given the generic pipeline image will be returned.

None

Raises:

Type Description
RuntimeError

If the deployment does not have an associated build.

Returns:

Type Description
str

The image name or digest.

Source code in src/zenml/orchestrators/containerized_orchestrator.py
@staticmethod
def get_image(
    deployment: "PipelineDeploymentResponse",
    step_name: Optional[str] = None,
) -> str:
    """Gets the Docker image for the pipeline/a step.

    Args:
        deployment: The deployment from which to get the image.
        step_name: Pipeline step name for which to get the image. If not
            given the generic pipeline image will be returned.

    Raises:
        RuntimeError: If the deployment does not have an associated build.

    Returns:
        The image name or digest.
    """
    if not deployment.build:
        raise RuntimeError(
            f"Missing build for deployment {deployment.id}. This is "
            "probably because the build was manually deleted."
        )

    return deployment.build.get_image(
        component_key=ORCHESTRATOR_DOCKER_IMAGE_KEY, step=step_name
    )

LocalDockerOrchestrator

Bases: ContainerizedOrchestrator

Orchestrator responsible for running pipelines locally using Docker.

This orchestrator does not allow for concurrent execution of steps and also does not support running on a schedule.
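
Docker run options can be forwarded to the step containers through LocalDockerOrchestratorSettings (read via self.get_settings(step) in the code below). A hedged sketch; the "orchestrator.local_docker" settings key and the cpu_count run argument are assumptions that may differ across ZenML and docker-py versions:

from zenml import pipeline
from zenml.orchestrators.local_docker.local_docker_orchestrator import (
    LocalDockerOrchestratorSettings,
)

docker_settings = LocalDockerOrchestratorSettings(
    run_args={"cpu_count": 3},  # passed through to docker_client.containers.run(...)
)


@pipeline(settings={"orchestrator.local_docker": docker_settings})
def my_pipeline() -> None:
    ...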

Source code in src/zenml/orchestrators/local_docker/local_docker_orchestrator.py
class LocalDockerOrchestrator(ContainerizedOrchestrator):
    """Orchestrator responsible for running pipelines locally using Docker.

    This orchestrator does not allow for concurrent execution of steps and also
    does not support running on a schedule.
    """

    @property
    def settings_class(self) -> Optional[Type["BaseSettings"]]:
        """Settings class for the Local Docker orchestrator.

        Returns:
            The settings class.
        """
        return LocalDockerOrchestratorSettings

    @property
    def validator(self) -> Optional[StackValidator]:
        """Ensures there is an image builder in the stack.

        Returns:
            A `StackValidator` instance.
        """
        return StackValidator(
            required_components={StackComponentType.IMAGE_BUILDER}
        )

    def get_orchestrator_run_id(self) -> str:
        """Returns the active orchestrator run id.

        Raises:
            RuntimeError: If the environment variable specifying the run id
                is not set.

        Returns:
            The orchestrator run id.
        """
        try:
            return os.environ[ENV_ZENML_DOCKER_ORCHESTRATOR_RUN_ID]
        except KeyError:
            raise RuntimeError(
                "Unable to read run id from environment variable "
                f"{ENV_ZENML_DOCKER_ORCHESTRATOR_RUN_ID}."
            )

    def prepare_or_run_pipeline(
        self,
        deployment: "PipelineDeploymentResponse",
        stack: "Stack",
        environment: Dict[str, str],
        placeholder_run: Optional["PipelineRunResponse"] = None,
    ) -> Any:
        """Sequentially runs all pipeline steps in local Docker containers.

        Args:
            deployment: The pipeline deployment to prepare or run.
            stack: The stack the pipeline will run on.
            environment: Environment variables to set in the orchestration
                environment.
            placeholder_run: An optional placeholder run for the deployment.

        Raises:
            RuntimeError: If a step fails.
        """
        if deployment.schedule:
            logger.warning(
                "Local Docker Orchestrator currently does not support the "
                "use of schedules. The `schedule` will be ignored "
                "and the pipeline will be run immediately."
            )

        docker_client = docker_utils._try_get_docker_client_from_env()

        entrypoint = StepEntrypointConfiguration.get_entrypoint_command()

        # Add the local stores path as a volume mount
        stack.check_local_paths()
        local_stores_path = GlobalConfiguration().local_stores_path
        volumes = {
            local_stores_path: {
                "bind": local_stores_path,
                "mode": "rw",
            }
        }
        orchestrator_run_id = str(uuid4())
        environment[ENV_ZENML_DOCKER_ORCHESTRATOR_RUN_ID] = orchestrator_run_id
        environment[ENV_ZENML_LOCAL_STORES_PATH] = local_stores_path
        start_time = time.time()

        # Run each step
        for step_name, step in deployment.step_configurations.items():
            if self.requires_resources_in_orchestration_environment(step):
                logger.warning(
                    "Specifying step resources is not supported for the local "
                    "Docker orchestrator, ignoring resource configuration for "
                    "step %s.",
                    step_name,
                )

            arguments = StepEntrypointConfiguration.get_entrypoint_arguments(
                step_name=step_name, deployment_id=deployment.id
            )

            settings = cast(
                LocalDockerOrchestratorSettings,
                self.get_settings(step),
            )
            image = self.get_image(deployment=deployment, step_name=step_name)

            user = None
            if sys.platform != "win32":
                user = os.getuid()
            logger.info("Running step `%s` in Docker:", step_name)

            run_args = copy.deepcopy(settings.run_args)
            docker_environment = run_args.pop("environment", {})
            docker_environment.update(environment)

            docker_volumes = run_args.pop("volumes", {})
            docker_volumes.update(volumes)

            extra_hosts = run_args.pop("extra_hosts", {})
            extra_hosts["host.docker.internal"] = "host-gateway"

            try:
                logs = docker_client.containers.run(
                    image=image,
                    entrypoint=entrypoint,
                    command=arguments,
                    user=user,
                    volumes=docker_volumes,
                    environment=docker_environment,
                    stream=True,
                    extra_hosts=extra_hosts,
                    **run_args,
                )

                for line in logs:
                    logger.info(line.strip().decode())
            except ContainerError as e:
                error_message = e.stderr.decode()
                raise RuntimeError(error_message)

        run_duration = time.time() - start_time
        logger.info(
            "Pipeline run has finished in `%s`.",
            string_utils.get_human_readable_time(run_duration),
        )

settings_class property

Settings class for the Local Docker orchestrator.

Returns:

Type Description
Optional[Type[BaseSettings]]

The settings class.

validator property

Ensures there is an image builder in the stack.

Returns:

Type Description
Optional[StackValidator]

A StackValidator instance.

get_orchestrator_run_id()

Returns the active orchestrator run id.

Raises:

Type Description
RuntimeError

If the environment variable specifying the run id is not set.

Returns:

Type Description
str

The orchestrator run id.

Source code in src/zenml/orchestrators/local_docker/local_docker_orchestrator.py
def get_orchestrator_run_id(self) -> str:
    """Returns the active orchestrator run id.

    Raises:
        RuntimeError: If the environment variable specifying the run id
            is not set.

    Returns:
        The orchestrator run id.
    """
    try:
        return os.environ[ENV_ZENML_DOCKER_ORCHESTRATOR_RUN_ID]
    except KeyError:
        raise RuntimeError(
            "Unable to read run id from environment variable "
            f"{ENV_ZENML_DOCKER_ORCHESTRATOR_RUN_ID}."
        )

prepare_or_run_pipeline(deployment, stack, environment, placeholder_run=None)

Sequentially runs all pipeline steps in local Docker containers.

Parameters:

Name Type Description Default
deployment PipelineDeploymentResponse

The pipeline deployment to prepare or run.

required
stack Stack

The stack the pipeline will run on.

required
environment Dict[str, str]

Environment variables to set in the orchestration environment.

required
placeholder_run Optional[PipelineRunResponse]

An optional placeholder run for the deployment.

None

Raises:

Type Description
RuntimeError

If a step fails.

Source code in src/zenml/orchestrators/local_docker/local_docker_orchestrator.py
def prepare_or_run_pipeline(
    self,
    deployment: "PipelineDeploymentResponse",
    stack: "Stack",
    environment: Dict[str, str],
    placeholder_run: Optional["PipelineRunResponse"] = None,
) -> Any:
    """Sequentially runs all pipeline steps in local Docker containers.

    Args:
        deployment: The pipeline deployment to prepare or run.
        stack: The stack the pipeline will run on.
        environment: Environment variables to set in the orchestration
            environment.
        placeholder_run: An optional placeholder run for the deployment.

    Raises:
        RuntimeError: If a step fails.
    """
    if deployment.schedule:
        logger.warning(
            "Local Docker Orchestrator currently does not support the "
            "use of schedules. The `schedule` will be ignored "
            "and the pipeline will be run immediately."
        )

    docker_client = docker_utils._try_get_docker_client_from_env()

    entrypoint = StepEntrypointConfiguration.get_entrypoint_command()

    # Add the local stores path as a volume mount
    stack.check_local_paths()
    local_stores_path = GlobalConfiguration().local_stores_path
    volumes = {
        local_stores_path: {
            "bind": local_stores_path,
            "mode": "rw",
        }
    }
    orchestrator_run_id = str(uuid4())
    environment[ENV_ZENML_DOCKER_ORCHESTRATOR_RUN_ID] = orchestrator_run_id
    environment[ENV_ZENML_LOCAL_STORES_PATH] = local_stores_path
    start_time = time.time()

    # Run each step
    for step_name, step in deployment.step_configurations.items():
        if self.requires_resources_in_orchestration_environment(step):
            logger.warning(
                "Specifying step resources is not supported for the local "
                "Docker orchestrator, ignoring resource configuration for "
                "step %s.",
                step_name,
            )

        arguments = StepEntrypointConfiguration.get_entrypoint_arguments(
            step_name=step_name, deployment_id=deployment.id
        )

        settings = cast(
            LocalDockerOrchestratorSettings,
            self.get_settings(step),
        )
        image = self.get_image(deployment=deployment, step_name=step_name)

        user = None
        if sys.platform != "win32":
            user = os.getuid()
        logger.info("Running step `%s` in Docker:", step_name)

        run_args = copy.deepcopy(settings.run_args)
        docker_environment = run_args.pop("environment", {})
        docker_environment.update(environment)

        docker_volumes = run_args.pop("volumes", {})
        docker_volumes.update(volumes)

        extra_hosts = run_args.pop("extra_hosts", {})
        extra_hosts["host.docker.internal"] = "host-gateway"

        try:
            logs = docker_client.containers.run(
                image=image,
                entrypoint=entrypoint,
                command=arguments,
                user=user,
                volumes=docker_volumes,
                environment=docker_environment,
                stream=True,
                extra_hosts=extra_hosts,
                **run_args,
            )

            for line in logs:
                logger.info(line.strip().decode())
        except ContainerError as e:
            error_message = e.stderr.decode()
            raise RuntimeError(error_message)

    run_duration = time.time() - start_time
    logger.info(
        "Pipeline run has finished in `%s`.",
        string_utils.get_human_readable_time(run_duration),
    )

LocalDockerOrchestratorFlavor

Bases: BaseOrchestratorFlavor

Flavor for the local Docker orchestrator.

Source code in src/zenml/orchestrators/local_docker/local_docker_orchestrator.py
class LocalDockerOrchestratorFlavor(BaseOrchestratorFlavor):
    """Flavor for the local Docker orchestrator."""

    @property
    def name(self) -> str:
        """Name of the orchestrator flavor.

        Returns:
            Name of the orchestrator flavor.
        """
        return "local_docker"

    @property
    def docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_docs_url()

    @property
    def sdk_docs_url(self) -> Optional[str]:
        """A url to point at SDK docs explaining this flavor.

        Returns:
            A flavor SDK docs url.
        """
        return self.generate_default_sdk_docs_url()

    @property
    def logo_url(self) -> str:
        """A url to represent the flavor in the dashboard.

        Returns:
            The flavor logo.
        """
        return "https://public-flavor-logos.s3.eu-central-1.amazonaws.com/orchestrator/docker.png"

    @property
    def config_class(self) -> Type[BaseOrchestratorConfig]:
        """Config class for the base orchestrator flavor.

        Returns:
            The config class.
        """
        return LocalDockerOrchestratorConfig

    @property
    def implementation_class(self) -> Type["LocalDockerOrchestrator"]:
        """Implementation class for this flavor.

        Returns:
            Implementation class for this flavor.
        """
        return LocalDockerOrchestrator

config_class property

Config class for the base orchestrator flavor.

Returns:

Type Description
Type[BaseOrchestratorConfig]

The config class.

docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

implementation_class property

Implementation class for this flavor.

Returns:

Type Description
Type[LocalDockerOrchestrator]

Implementation class for this flavor.

logo_url property

A url to represent the flavor in the dashboard.

Returns:

Type Description
str

The flavor logo.

name property

Name of the orchestrator flavor.

Returns:

Type Description
str

Name of the orchestrator flavor.

sdk_docs_url property

A url to point at SDK docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor SDK docs url.

LocalOrchestrator

Bases: BaseOrchestrator

Orchestrator responsible for running pipelines locally.

This orchestrator does not allow for concurrent execution of steps and also does not support running on a schedule.
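
With a default (local) stack, no extra configuration is needed; executing a pipeline script simply uses this orchestrator. A minimal, self-contained example:

from zenml import pipeline, step


@step
def say_hello() -> str:
    return "hello"


@pipeline
def hello_pipeline() -> None:
    say_hello()


if __name__ == "__main__":
    hello_pipeline()  # steps are executed sequentially by the LocalOrchestrator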

Source code in src/zenml/orchestrators/local/local_orchestrator.py
class LocalOrchestrator(BaseOrchestrator):
    """Orchestrator responsible for running pipelines locally.

    This orchestrator does not allow for concurrent execution of steps and also
    does not support running on a schedule.
    """

    _orchestrator_run_id: Optional[str] = None

    def prepare_or_run_pipeline(
        self,
        deployment: "PipelineDeploymentResponse",
        stack: "Stack",
        environment: Dict[str, str],
        placeholder_run: Optional["PipelineRunResponse"] = None,
    ) -> Any:
        """Iterates through all steps and executes them sequentially.

        Args:
            deployment: The pipeline deployment to prepare or run.
            stack: The stack on which the pipeline is deployed.
            environment: Environment variables to set in the orchestration
                environment.
            placeholder_run: An optional placeholder run for the deployment.
        """
        if deployment.schedule:
            logger.warning(
                "Local Orchestrator currently does not support the "
                "use of schedules. The `schedule` will be ignored "
                "and the pipeline will be run immediately."
            )

        self._orchestrator_run_id = str(uuid4())
        start_time = time.time()

        # Run each step
        for step_name, step in deployment.step_configurations.items():
            if self.requires_resources_in_orchestration_environment(step):
                logger.warning(
                    "Specifying step resources is not supported for the local "
                    "orchestrator, ignoring resource configuration for "
                    "step %s.",
                    step_name,
                )

            self.run_step(
                step=step,
            )

        run_duration = time.time() - start_time
        logger.info(
            "Pipeline run has finished in `%s`.",
            string_utils.get_human_readable_time(run_duration),
        )
        self._orchestrator_run_id = None

    def get_orchestrator_run_id(self) -> str:
        """Returns the active orchestrator run id.

        Raises:
            RuntimeError: If no run id exists. This happens when this method
                gets called while the orchestrator is not running a pipeline.

        Returns:
            The orchestrator run id.
        """
        if not self._orchestrator_run_id:
            raise RuntimeError("No run id set.")

        return self._orchestrator_run_id

get_orchestrator_run_id()

Returns the active orchestrator run id.

Raises:

Type Description
RuntimeError

If no run id exists. This happens when this method gets called while the orchestrator is not running a pipeline.

Returns:

Type Description
str

The orchestrator run id.

Source code in src/zenml/orchestrators/local/local_orchestrator.py
def get_orchestrator_run_id(self) -> str:
    """Returns the active orchestrator run id.

    Raises:
        RuntimeError: If no run id exists. This happens when this method
            gets called while the orchestrator is not running a pipeline.

    Returns:
        The orchestrator run id.
    """
    if not self._orchestrator_run_id:
        raise RuntimeError("No run id set.")

    return self._orchestrator_run_id

prepare_or_run_pipeline(deployment, stack, environment, placeholder_run=None)

Iterates through all steps and executes them sequentially.

Parameters:

Name Type Description Default
deployment PipelineDeploymentResponse

The pipeline deployment to prepare or run.

required
stack Stack

The stack on which the pipeline is deployed.

required
environment Dict[str, str]

Environment variables to set in the orchestration environment.

required
placeholder_run Optional[PipelineRunResponse]

An optional placeholder run for the deployment.

None
Source code in src/zenml/orchestrators/local/local_orchestrator.py
def prepare_or_run_pipeline(
    self,
    deployment: "PipelineDeploymentResponse",
    stack: "Stack",
    environment: Dict[str, str],
    placeholder_run: Optional["PipelineRunResponse"] = None,
) -> Any:
    """Iterates through all steps and executes them sequentially.

    Args:
        deployment: The pipeline deployment to prepare or run.
        stack: The stack on which the pipeline is deployed.
        environment: Environment variables to set in the orchestration
            environment.
        placeholder_run: An optional placeholder run for the deployment.
    """
    if deployment.schedule:
        logger.warning(
            "Local Orchestrator currently does not support the "
            "use of schedules. The `schedule` will be ignored "
            "and the pipeline will be run immediately."
        )

    self._orchestrator_run_id = str(uuid4())
    start_time = time.time()

    # Run each step
    for step_name, step in deployment.step_configurations.items():
        if self.requires_resources_in_orchestration_environment(step):
            logger.warning(
                "Specifying step resources is not supported for the local "
                "orchestrator, ignoring resource configuration for "
                "step %s.",
                step_name,
            )

        self.run_step(
            step=step,
        )

    run_duration = time.time() - start_time
    logger.info(
        "Pipeline run has finished in `%s`.",
        string_utils.get_human_readable_time(run_duration),
    )
    self._orchestrator_run_id = None

LocalOrchestratorFlavor

Bases: BaseOrchestratorFlavor

Class for the LocalOrchestratorFlavor.

Source code in src/zenml/orchestrators/local/local_orchestrator.py
class LocalOrchestratorFlavor(BaseOrchestratorFlavor):
    """Class for the `LocalOrchestratorFlavor`."""

    @property
    def name(self) -> str:
        """The flavor name.

        Returns:
            The flavor name.
        """
        return "local"

    @property
    def docs_url(self) -> Optional[str]:
        """A URL to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return self.generate_default_docs_url()

    @property
    def sdk_docs_url(self) -> Optional[str]:
        """A URL to point at SDK docs explaining this flavor.

        Returns:
            A flavor SDK docs url.
        """
        return self.generate_default_sdk_docs_url()

    @property
    def logo_url(self) -> str:
        """A URL to represent the flavor in the dashboard.

        Returns:
            The flavor logo.
        """
        return "https://public-flavor-logos.s3.eu-central-1.amazonaws.com/orchestrator/local.png"

    @property
    def config_class(self) -> Type[BaseOrchestratorConfig]:
        """Config class for the base orchestrator flavor.

        Returns:
            The config class.
        """
        return LocalOrchestratorConfig

    @property
    def implementation_class(self) -> Type[LocalOrchestrator]:
        """Implementation class for this flavor.

        Returns:
            The implementation class for this flavor.
        """
        return LocalOrchestrator

config_class property

Config class for the base orchestrator flavor.

Returns:

Type Description
Type[BaseOrchestratorConfig]

The config class.

docs_url property

A URL to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

implementation_class property

Implementation class for this flavor.

Returns:

Type Description
Type[LocalOrchestrator]

The implementation class for this flavor.

logo_url property

A URL to represent the flavor in the dashboard.

Returns:

Type Description
str

The flavor logo.

name property

The flavor name.

Returns:

Type Description
str

The flavor name.

sdk_docs_url property

A URL to point at SDK docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor SDK docs url.
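
The flavor class is a thin mapping from the `local` flavor name to its configuration and implementation classes. As a quick sketch (using the import path shown above; the printed output comments are illustrative), its properties can be inspected directly:

```python
from zenml.orchestrators.local.local_orchestrator import LocalOrchestratorFlavor

flavor = LocalOrchestratorFlavor()
print(flavor.name)                   # "local"
print(flavor.config_class.__name__)  # "LocalOrchestratorConfig"
print(flavor.implementation_class)   # the LocalOrchestrator class
```

In normal use you do not instantiate the flavor yourself; ZenML resolves it when an orchestrator with this flavor is registered on a stack (for example `zenml orchestrator register my_local --flavor=local`, assuming the standard ZenML CLI).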

WheeledOrchestrator

Bases: BaseOrchestrator, ABC

Base class for wheeled orchestrators.

Source code in src/zenml/orchestrators/wheeled_orchestrator.py
class WheeledOrchestrator(BaseOrchestrator, ABC):
    """Base class for wheeled orchestrators."""

    package_name = DEFAULT_PACKAGE_NAME
    package_version = __version__

    def copy_repository_to_temp_dir_and_add_setup_py(self) -> str:
        """Copy the repository to a temporary directory and add a setup.py file.

        Returns:
            Path to the temporary directory containing the copied repository.
        """
        repo_path = get_source_root()

        self.package_name = f"{DEFAULT_PACKAGE_NAME}_{self.sanitize_name(os.path.basename(repo_path))}"

        # Create a temporary folder
        temp_dir = tempfile.mkdtemp(prefix="zenml-temp-")

        # Create a folder within the temporary directory
        temp_repo_path = os.path.join(temp_dir, self.package_name)
        fileio.mkdir(temp_repo_path)

        # Copy the repository to the temporary directory
        copy_dir(repo_path, temp_repo_path)

        # Create init file in the copied directory
        init_file_path = os.path.join(temp_repo_path, "__init__.py")
        with fileio.open(init_file_path, "w") as f:
            f.write("")

        # Create a setup.py file
        setup_py_content = f"""
from setuptools import setup, find_packages

setup(
    name="{self.package_name}",
    version="{self.package_version}",
    packages=find_packages(),
)
"""
        setup_py_path = os.path.join(temp_dir, "setup.py")
        with fileio.open(setup_py_path, "w") as f:
            f.write(setup_py_content)

        return temp_dir

    def create_wheel(self, temp_dir: str) -> str:
        """Create a wheel for the package in the given temporary directory.

        Args:
            temp_dir (str): Path to the temporary directory containing the package.

        Raises:
            RuntimeError: If the wheel file could not be created.

        Returns:
            str: Path to the created wheel file.
        """
        # Change to the temporary directory
        original_dir = os.getcwd()
        os.chdir(temp_dir)

        try:
            # Run the `pip wheel` command to create the wheel
            result = subprocess.run(
                ["pip", "wheel", "."], check=True, capture_output=True
            )
            logger.debug(f"Wheel creation stdout: {result.stdout.decode()}")
            logger.debug(f"Wheel creation stderr: {result.stderr.decode()}")

            # Find the created wheel file
            wheel_file = next(
                (
                    file
                    for file in os.listdir(temp_dir)
                    if file.endswith(".whl")
                ),
                None,
            )

            if wheel_file is None:
                raise RuntimeError("Failed to create wheel file.")

            wheel_path = os.path.join(temp_dir, wheel_file)

            # Verify the wheel file is a valid zip file
            import zipfile

            if not zipfile.is_zipfile(wheel_path):
                raise RuntimeError(
                    f"The file {wheel_path} is not a valid zip file."
                )

            return wheel_path
        finally:
            # Change back to the original directory
            os.chdir(original_dir)

    def sanitize_name(self, name: str) -> str:
        """Sanitize the value to be used in a cluster name.

        Args:
            name: Arbitrary input cluster name.

        Returns:
            Sanitized cluster name.
        """
        name = re.sub(
            r"[^a-z0-9-]", "-", name.lower()
        )  # replaces any character that is not a lowercase letter, digit, or hyphen with a hyphen
        name = re.sub(r"^[-]+", "", name)  # trim leading hyphens
        name = re.sub(r"[-]+$", "", name)  # trim trailing hyphens
        return name

copy_repository_to_temp_dir_and_add_setup_py()

Copy the repository to a temporary directory and add a setup.py file.

Returns:

Type Description
str

Path to the temporary directory containing the copied repository.

Source code in src/zenml/orchestrators/wheeled_orchestrator.py
    def copy_repository_to_temp_dir_and_add_setup_py(self) -> str:
        """Copy the repository to a temporary directory and add a setup.py file.

        Returns:
            Path to the temporary directory containing the copied repository.
        """
        repo_path = get_source_root()

        self.package_name = f"{DEFAULT_PACKAGE_NAME}_{self.sanitize_name(os.path.basename(repo_path))}"

        # Create a temporary folder
        temp_dir = tempfile.mkdtemp(prefix="zenml-temp-")

        # Create a folder within the temporary directory
        temp_repo_path = os.path.join(temp_dir, self.package_name)
        fileio.mkdir(temp_repo_path)

        # Copy the repository to the temporary directory
        copy_dir(repo_path, temp_repo_path)

        # Create init file in the copied directory
        init_file_path = os.path.join(temp_repo_path, "__init__.py")
        with fileio.open(init_file_path, "w") as f:
            f.write("")

        # Create a setup.py file
        setup_py_content = f"""
from setuptools import setup, find_packages

setup(
    name="{self.package_name}",
    version="{self.package_version}",
    packages=find_packages(),
)
"""
        setup_py_path = os.path.join(temp_dir, "setup.py")
        with fileio.open(setup_py_path, "w") as f:
            f.write(setup_py_content)

        return temp_dir

create_wheel(temp_dir)

Create a wheel for the package in the given temporary directory.

Parameters:

Name Type Description Default
temp_dir str

Path to the temporary directory containing the package.

required

Raises:

Type Description
RuntimeError

If the wheel file could not be created.

Returns:

Name Type Description
str str

Path to the created wheel file.

Source code in src/zenml/orchestrators/wheeled_orchestrator.py
def create_wheel(self, temp_dir: str) -> str:
    """Create a wheel for the package in the given temporary directory.

    Args:
        temp_dir (str): Path to the temporary directory containing the package.

    Raises:
        RuntimeError: If the wheel file could not be created.

    Returns:
        str: Path to the created wheel file.
    """
    # Change to the temporary directory
    original_dir = os.getcwd()
    os.chdir(temp_dir)

    try:
        # Run the `pip wheel` command to create the wheel
        result = subprocess.run(
            ["pip", "wheel", "."], check=True, capture_output=True
        )
        logger.debug(f"Wheel creation stdout: {result.stdout.decode()}")
        logger.debug(f"Wheel creation stderr: {result.stderr.decode()}")

        # Find the created wheel file
        wheel_file = next(
            (
                file
                for file in os.listdir(temp_dir)
                if file.endswith(".whl")
            ),
            None,
        )

        if wheel_file is None:
            raise RuntimeError("Failed to create wheel file.")

        wheel_path = os.path.join(temp_dir, wheel_file)

        # Verify the wheel file is a valid zip file
        import zipfile

        if not zipfile.is_zipfile(wheel_path):
            raise RuntimeError(
                f"The file {wheel_path} is not a valid zip file."
            )

        return wheel_path
    finally:
        # Change back to the original directory
        os.chdir(original_dir)

sanitize_name(name)

Sanitize the value to be used in a cluster name.

Parameters:

Name Type Description Default
name str

Arbitrary input cluster name.

required

Returns:

Type Description
str

Sanitized cluster name.

Source code in src/zenml/orchestrators/wheeled_orchestrator.py
def sanitize_name(self, name: str) -> str:
    """Sanitize the value to be used in a cluster name.

    Args:
        name: Arbitrary input cluster name.

    Returns:
        Sanitized cluster name.
    """
    name = re.sub(
        r"[^a-z0-9-]", "-", name.lower()
    )  # replaces any character that is not a lowercase letter, digit, or hyphen with a hyphen
    name = re.sub(r"^[-]+", "", name)  # trim leading hyphens
    name = re.sub(r"[-]+$", "", name)  # trim trailing hyphens
    return name
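
To get a feel for what the name sanitization produces, the same three substitutions can be reproduced standalone (this is an illustrative copy of the logic above, not an import from ZenML):

```python
import re

def sanitize_name(name: str) -> str:
    """Standalone reproduction of WheeledOrchestrator.sanitize_name."""
    name = re.sub(r"[^a-z0-9-]", "-", name.lower())  # non [a-z0-9-] chars -> "-"
    name = re.sub(r"^[-]+", "", name)                # trim leading hyphens
    name = re.sub(r"[-]+$", "", name)                # trim trailing hyphens
    return name

print(sanitize_name("My_ZenML Repo (v2)"))  # my-zenml-repo--v2
```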

Pipelines

PipelineContext

Provides pipeline configuration context.

Usage example:

from zenml import get_pipeline_context

...

@pipeline(
    extra={
        "complex_parameter": [
            ("sklearn.tree", "DecisionTreeClassifier"),
            ("sklearn.ensemble", "RandomForestClassifier"),
        ]
    }
)
def my_pipeline():
    context = get_pipeline_context()

    after = []
    search_steps_prefix = "hp_tuning_search_"
    for i, model_search_configuration in enumerate(
        context.extra["complex_parameter"]
    ):
        step_name = f"{search_steps_prefix}{i}"
        cross_validation(
            model_package=model_search_configuration[0],
            model_class=model_search_configuration[1],
            id=step_name
        )
        after.append(step_name)
    select_best_model(
        search_steps_prefix=search_steps_prefix,
        after=after,
    )
Source code in src/zenml/pipelines/pipeline_context.py
class PipelineContext:
    """Provides pipeline configuration context.

    Usage example:

    ```python
    from zenml import get_pipeline_context

    ...

    @pipeline(
        extra={
            "complex_parameter": [
                ("sklearn.tree", "DecisionTreeClassifier"),
                ("sklearn.ensemble", "RandomForestClassifier"),
            ]
        }
    )
    def my_pipeline():
        context = get_pipeline_context()

        after = []
        search_steps_prefix = "hp_tuning_search_"
        for i, model_search_configuration in enumerate(
            context.extra["complex_parameter"]
        ):
            step_name = f"{search_steps_prefix}{i}"
            cross_validation(
                model_package=model_search_configuration[0],
                model_class=model_search_configuration[1],
                id=step_name
            )
            after.append(step_name)
        select_best_model(
            search_steps_prefix=search_steps_prefix,
            after=after,
        )
    ```
    """

    def __init__(self, pipeline_configuration: "PipelineConfiguration"):
        """Initialize the context of the current pipeline.

        Args:
            pipeline_configuration: The configuration of the pipeline derived
                from Pipeline class.
        """
        self.name = pipeline_configuration.name
        self.enable_cache = pipeline_configuration.enable_cache
        self.enable_artifact_metadata = (
            pipeline_configuration.enable_artifact_metadata
        )
        self.enable_artifact_visualization = (
            pipeline_configuration.enable_artifact_visualization
        )
        self.enable_step_logs = pipeline_configuration.enable_step_logs
        self.enable_pipeline_logs = pipeline_configuration.enable_pipeline_logs
        self.settings = pipeline_configuration.settings
        self.extra = pipeline_configuration.extra
        self.model = pipeline_configuration.model

__init__(pipeline_configuration)

Initialize the context of the current pipeline.

Parameters:

Name Type Description Default
pipeline_configuration PipelineConfiguration

The configuration of the pipeline derived from the Pipeline class.

required
Source code in src/zenml/pipelines/pipeline_context.py
def __init__(self, pipeline_configuration: "PipelineConfiguration"):
    """Initialize the context of the current pipeline.

    Args:
        pipeline_configuration: The configuration of the pipeline derived
            from Pipeline class.
    """
    self.name = pipeline_configuration.name
    self.enable_cache = pipeline_configuration.enable_cache
    self.enable_artifact_metadata = (
        pipeline_configuration.enable_artifact_metadata
    )
    self.enable_artifact_visualization = (
        pipeline_configuration.enable_artifact_visualization
    )
    self.enable_step_logs = pipeline_configuration.enable_step_logs
    self.enable_pipeline_logs = pipeline_configuration.enable_pipeline_logs
    self.settings = pipeline_configuration.settings
    self.extra = pipeline_configuration.extra
    self.model = pipeline_configuration.model

Schedule

Bases: BaseModel

Class for defining a pipeline schedule.

Attributes:

Name Type Description
name Optional[str]

Optional name to give to the schedule. If not set, a default name will be generated based on the pipeline name and the current date and time.

cron_expression Optional[str]

Cron expression for the pipeline schedule. If a value for this is set it takes precedence over the start time + interval.

start_time Optional[datetime]

When the schedule should start. If this is a datetime object without any timezone, it is treated as a datetime in the local timezone.

end_time Optional[datetime]

When the schedule should end. If this is a datetime object without any timezone, it is treated as a datetime in the local timezone.

interval_second Optional[timedelta]

datetime timedelta indicating the seconds between two recurring runs for a periodic schedule.

catchup bool

Whether the recurring run should catch up if behind schedule. For example, if the recurring run is paused for a while and re-enabled afterward. If catchup=True, the scheduler will catch up on (backfill) each missed interval. Otherwise, it only schedules the latest interval if more than one interval is ready to be scheduled. Usually, if your pipeline handles backfill internally, you should turn catchup off to avoid duplicate backfill.

run_once_start_time Optional[datetime]

When to run the pipeline once. If this is a datetime object without any timezone, it is treated as a datetime in the local timezone.

Source code in src/zenml/config/schedule.py
class Schedule(BaseModel):
    """Class for defining a pipeline schedule.

    Attributes:
        name: Optional name to give to the schedule. If not set, a default name
            will be generated based on the pipeline name and the current date
            and time.
        cron_expression: Cron expression for the pipeline schedule. If a value
            for this is set it takes precedence over the start time + interval.
        start_time: When the schedule should start. If this is a datetime object
            without any timezone, it is treated as a datetime in the local
            timezone.
        end_time: When the schedule should end. If this is a datetime object
            without any timezone, it is treated as a datetime in the local
            timezone.
        interval_second: datetime timedelta indicating the seconds between two
            recurring runs for a periodic schedule.
        catchup: Whether the recurring run should catch up if behind schedule.
            For example, if the recurring run is paused for a while and
            re-enabled afterward. If catchup=True, the scheduler will catch
            up on (backfill) each missed interval. Otherwise, it only
            schedules the latest interval if more than one interval is ready to
            be scheduled. Usually, if your pipeline handles backfill
            internally, you should turn catchup off to avoid duplicate backfill.
        run_once_start_time: When to run the pipeline once. If this is a
            datetime object without any timezone, it is treated as a datetime
            in the local timezone.
    """

    name: Optional[str] = None
    cron_expression: Optional[str] = None
    start_time: Optional[datetime] = None
    end_time: Optional[datetime] = None
    interval_second: Optional[timedelta] = None
    catchup: bool = False
    run_once_start_time: Optional[datetime] = None

    @field_validator(
        "start_time", "end_time", "run_once_start_time", mode="after"
    )
    @classmethod
    def _ensure_timezone(
        cls, value: Optional[datetime], info: ValidationInfo
    ) -> Optional[datetime]:
        """Ensures that all datetimes are timezone aware.

        Args:
            value: The datetime.
            info: The validation info.

        Returns:
            A timezone aware datetime or None.
        """
        if value and value.tzinfo is None:
            assert info.field_name
            logger.warning(
                "Your schedule `%s` is missing a timezone. It will be treated "
                "as a datetime in your local timezone.",
                info.field_name,
            )
            value = value.astimezone()

        return value

    @model_validator(mode="after")
    def _ensure_cron_or_periodic_schedule_configured(self) -> "Schedule":
        """Ensures that the cron expression or start time + interval are set.

        Returns:
            All schedule attributes.

        Raises:
            ValueError: If no cron expression or start time + interval were
                provided.
        """
        periodic_schedule = self.start_time and self.interval_second

        if self.cron_expression and periodic_schedule:
            logger.warning(
                "This schedule was created with a cron expression as well as "
                "values for `start_time` and `interval_second`. The resulting "
                "behavior depends on the concrete orchestrator implementation "
                "but will usually ignore the interval and use the cron "
                "expression."
            )

            return self
        elif self.cron_expression and self.run_once_start_time:
            logger.warning(
                "This schedule was created with a cron expression as well as "
                "a value for `run_once_start_time`. The resulting behavior "
                "depends on the concrete orchestrator implementation but will "
                "usually ignore the `run_once_start_time`."
            )
            return self
        elif (
            self.cron_expression
            or periodic_schedule
            or self.run_once_start_time
        ):
            return self
        else:
            raise ValueError(
                "Either a cron expression, a start time and interval seconds "
                "or a run once start time "
                "need to be set for a valid schedule."
            )
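
Taken together, a schedule is valid if it defines a cron expression, a start time plus interval, or a one-off start time. A brief sketch of the three variants (using timezone-aware datetimes to avoid the warning emitted by the validator above):

```python
from datetime import datetime, timedelta, timezone

from zenml.config.schedule import Schedule

# Cron-based: run at 09:00 UTC every day.
cron = Schedule(cron_expression="0 9 * * *")

# Periodic: start now and repeat every 30 minutes.
periodic = Schedule(
    start_time=datetime.now(timezone.utc),
    interval_second=timedelta(minutes=30),
)

# One-off: run exactly once at a fixed point in time.
once = Schedule(
    run_once_start_time=datetime(2025, 1, 1, 9, 0, tzinfo=timezone.utc)
)
```

Whether a given variant is honored depends on the orchestrator; as noted earlier, the local orchestrator ignores schedules entirely and runs the pipeline immediately.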

get_pipeline_context()

Get the context of the current pipeline.

Returns:

Type Description
PipelineContext

The context of the current pipeline.

Raises:

Type Description
RuntimeError

If no active pipeline is found.

RuntimeError

If inside a running step.

Source code in src/zenml/pipelines/pipeline_context.py
def get_pipeline_context() -> "PipelineContext":
    """Get the context of the current pipeline.

    Returns:
        The context of the current pipeline.

    Raises:
        RuntimeError: If no active pipeline is found.
        RuntimeError: If inside a running step.
    """
    from zenml.pipelines.pipeline_definition import Pipeline

    if Pipeline.ACTIVE_PIPELINE is None:
        try:
            from zenml.steps.step_context import get_step_context

            get_step_context()
        except RuntimeError:
            raise RuntimeError("No active pipeline found.")
        else:
            raise RuntimeError(
                "Inside a step use `from zenml import get_step_context` "
                "instead."
            )

    return PipelineContext(
        pipeline_configuration=Pipeline.ACTIVE_PIPELINE.configuration
    )
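
As a quick illustration of where the call is valid and how the two error cases arise (a minimal sketch; the `extra` value is arbitrary):

```python
from zenml import get_pipeline_context, pipeline

@pipeline(extra={"seed": 42})
def my_pipeline():
    # Valid here: the pipeline body is executed while the pipeline is
    # being composed, so an active pipeline exists.
    context = get_pipeline_context()
    print(context.name, context.extra["seed"])

# Calling get_pipeline_context() outside any pipeline raises
# RuntimeError("No active pipeline found."); calling it inside a running
# step raises a RuntimeError pointing you to `zenml.get_step_context` instead.
```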

pipeline(_func=None, *, name=None, enable_cache=None, enable_artifact_metadata=None, enable_step_logs=None, enable_pipeline_logs=None, settings=None, tags=None, extra=None, on_failure=None, on_success=None, model=None, substitutions=None)

pipeline(_func: F) -> Pipeline
pipeline(
    *,
    name: Optional[str] = None,
    enable_cache: Optional[bool] = None,
    enable_artifact_metadata: Optional[bool] = None,
    enable_step_logs: Optional[bool] = None,
    enable_pipeline_logs: Optional[bool] = None,
    settings: Optional[Dict[str, SettingsOrDict]] = None,
    tags: Optional[List[Union[str, Tag]]] = None,
    extra: Optional[Dict[str, Any]] = None,
    on_failure: Optional[HookSpecification] = None,
    on_success: Optional[HookSpecification] = None,
    model: Optional[Model] = None,
    substitutions: Optional[Dict[str, str]] = None,
) -> Callable[[F], Pipeline]

Decorator to create a pipeline.

Parameters:

Name Type Description Default
_func Optional[F]

The decorated function.

None
name Optional[str]

The name of the pipeline. If left empty, the name of the decorated function will be used as a fallback.

None
enable_cache Optional[bool]

Whether to use caching or not.

None
enable_artifact_metadata Optional[bool]

Whether to enable artifact metadata or not.

None
enable_step_logs Optional[bool]

If step logs should be enabled for this pipeline.

None
enable_pipeline_logs Optional[bool]

If pipeline logs should be enabled for this pipeline.

None
settings Optional[Dict[str, SettingsOrDict]]

Settings for this pipeline.

None
tags Optional[List[Union[str, Tag]]]

Tags to apply to runs of the pipeline.

None
extra Optional[Dict[str, Any]]

Extra configurations for this pipeline.

None
on_failure Optional[HookSpecification]

Callback function in event of failure of the step. Can be a function with a single argument of type BaseException, or a source path to such a function (e.g. module.my_function).

None
on_success Optional[HookSpecification]

Callback function in event of success of the step. Can be a function with no arguments, or a source path to such a function (e.g. module.my_function).

None
model Optional[Model]

Configuration of the model in the Model Control Plane.

None
substitutions Optional[Dict[str, str]]

Extra placeholders to use in the name templates.

None

Returns:

Type Description
Union[Pipeline, Callable[[F], Pipeline]]

A pipeline instance.

Source code in src/zenml/pipelines/pipeline_decorator.py
def pipeline(
    _func: Optional["F"] = None,
    *,
    name: Optional[str] = None,
    enable_cache: Optional[bool] = None,
    enable_artifact_metadata: Optional[bool] = None,
    enable_step_logs: Optional[bool] = None,
    enable_pipeline_logs: Optional[bool] = None,
    settings: Optional[Dict[str, "SettingsOrDict"]] = None,
    tags: Optional[List[Union[str, "Tag"]]] = None,
    extra: Optional[Dict[str, Any]] = None,
    on_failure: Optional["HookSpecification"] = None,
    on_success: Optional["HookSpecification"] = None,
    model: Optional["Model"] = None,
    substitutions: Optional[Dict[str, str]] = None,
) -> Union["Pipeline", Callable[["F"], "Pipeline"]]:
    """Decorator to create a pipeline.

    Args:
        _func: The decorated function.
        name: The name of the pipeline. If left empty, the name of the
            decorated function will be used as a fallback.
        enable_cache: Whether to use caching or not.
        enable_artifact_metadata: Whether to enable artifact metadata or not.
        enable_step_logs: If step logs should be enabled for this pipeline.
        enable_pipeline_logs: If pipeline logs should be enabled for this pipeline.
        settings: Settings for this pipeline.
        tags: Tags to apply to runs of the pipeline.
        extra: Extra configurations for this pipeline.
        on_failure: Callback function in event of failure of the step. Can be a
            function with a single argument of type `BaseException`, or a source
            path to such a function (e.g. `module.my_function`).
        on_success: Callback function in event of success of the step. Can be a
            function with no arguments, or a source path to such a function
            (e.g. `module.my_function`).
        model: configuration of the model in the Model Control Plane.
        substitutions: Extra placeholders to use in the name templates.

    Returns:
        A pipeline instance.
    """

    def inner_decorator(func: "F") -> "Pipeline":
        from zenml.pipelines.pipeline_definition import Pipeline

        p = Pipeline(
            name=name or func.__name__,
            enable_cache=enable_cache,
            enable_artifact_metadata=enable_artifact_metadata,
            enable_step_logs=enable_step_logs,
            enable_pipeline_logs=enable_pipeline_logs,
            settings=settings,
            tags=tags,
            extra=extra,
            on_failure=on_failure,
            on_success=on_success,
            model=model,
            entrypoint=func,
            substitutions=substitutions,
        )

        p.__doc__ = func.__doc__
        return p

    return inner_decorator if _func is None else inner_decorator(_func)
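
A minimal, self-contained sketch of the decorator in use (the step definitions are illustrative only):

```python
from zenml import pipeline, step

@step
def load_number() -> int:
    return 42

@step
def double(value: int) -> int:
    return value * 2

@pipeline(enable_cache=False, tags=["example"])
def doubling_pipeline():
    double(load_number())

if __name__ == "__main__":
    doubling_pipeline()  # compiles and runs the pipeline on the active stack
```

Because `_func` is optional, both the bare `@pipeline` form and the parameterized `@pipeline(...)` form work: the former passes the function straight through and returns a `Pipeline` immediately, while the latter returns the inner decorator shown above.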

Plugins

Secret

Initialization of the ZenML Secret module.

A ZenML Secret is a grouping of key-value pairs. These are accessed and administered via the ZenML Secret Store.

BaseSecretSchema

Bases: BaseModel

Base class for all Secret Schemas.

Source code in src/zenml/secret/base_secret.py
class BaseSecretSchema(BaseModel):
    """Base class for all Secret Schemas."""

    @classmethod
    def get_schema_keys(cls) -> List[str]:
        """Get all attributes that are part of the schema.

        These schema keys can be used to define all required key-value pairs of
        a secret schema.

        Returns:
            A list of all attribute names that are part of the schema.
        """
        return list(cls.model_fields.keys())

    def get_values(self) -> Dict[str, Any]:
        """Get all values of the secret schema.

        Returns:
            A dictionary of all attribute names and their corresponding values.
        """
        return self.model_dump(exclude_none=True)

    model_config = ConfigDict(
        # validate attribute assignments
        validate_assignment=True,
        # ignore extra attributes during validation
        extra="ignore",
    )

get_schema_keys() classmethod

Get all attributes that are part of the schema.

These schema keys can be used to define all required key-value pairs of a secret schema.

Returns:

Type Description
List[str]

A list of all attribute names that are part of the schema.

Source code in src/zenml/secret/base_secret.py
@classmethod
def get_schema_keys(cls) -> List[str]:
    """Get all attributes that are part of the schema.

    These schema keys can be used to define all required key-value pairs of
    a secret schema.

    Returns:
        A list of all attribute names that are part of the schema.
    """
    return list(cls.model_fields.keys())

get_values()

Get all values of the secret schema.

Returns:

Type Description
Dict[str, Any]

A dictionary of all attribute names and their corresponding values.

Source code in src/zenml/secret/base_secret.py
def get_values(self) -> Dict[str, Any]:
    """Get all values of the secret schema.

    Returns:
        A dictionary of all attribute names and their corresponding values.
    """
    return self.model_dump(exclude_none=True)
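
Concrete secret schemas subclass `BaseSecretSchema` and declare their keys as model fields. A hypothetical example (the schema name and fields below are made up for illustration):

```python
from typing import Optional

from zenml.secret.base_secret import BaseSecretSchema

class MySQLSecretSchema(BaseSecretSchema):  # hypothetical schema
    user: str
    password: str
    host: Optional[str] = None

print(MySQLSecretSchema.get_schema_keys())   # ['user', 'password', 'host']

secret = MySQLSecretSchema(user="admin", password="hunter2")
print(secret.get_values())                   # {'user': 'admin', 'password': 'hunter2'}
```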

Service Connectors

ZenML Service Connectors.

Services

Initialization of the ZenML services module.

A service is a process or set of processes that outlive a pipeline run.

BaseService

Bases: BaseTypedModel

Base service class.

This class implements generic functionality concerning the life-cycle management and tracking of an external service (e.g. process, container, Kubernetes deployment etc.).

Attributes:

Name Type Description
SERVICE_TYPE ServiceType

a service type descriptor with information describing the service class. Every concrete service class must define this.

admin_state ServiceState

the administrative state of the service.

uuid UUID

unique UUID identifier for the service instance.

config ServiceConfig

service configuration

status ServiceStatus

service status

endpoint Optional[BaseServiceEndpoint]

optional service endpoint

Source code in src/zenml/services/service.py
class BaseService(BaseTypedModel):
    """Base service class.

    This class implements generic functionality concerning the life-cycle
    management and tracking of an external service (e.g. process, container,
    Kubernetes deployment etc.).

    Attributes:
        SERVICE_TYPE: a service type descriptor with information describing
            the service class. Every concrete service class must define this.
        admin_state: the administrative state of the service.
        uuid: unique UUID identifier for the service instance.
        config: service configuration
        status: service status
        endpoint: optional service endpoint
    """

    SERVICE_TYPE: ClassVar[ServiceType]

    uuid: UUID
    admin_state: ServiceState = ServiceState.INACTIVE
    config: ServiceConfig
    status: ServiceStatus
    # TODO [ENG-703]: allow multiple endpoints per service
    endpoint: Optional[BaseServiceEndpoint] = None

    def __init__(
        self,
        **attrs: Any,
    ) -> None:
        """Initialize the service instance.

        Args:
            **attrs: keyword arguments.
        """
        super().__init__(**attrs)
        self.config.name = self.config.name or self.__class__.__name__

    @classmethod
    def from_model(cls, model: "ServiceResponse") -> "BaseService":
        """Loads a service from a model.

        Args:
            model: The ServiceResponse to load from.

        Returns:
            The loaded service object.

        Raises:
            ValueError: if the service source is not found in the model.
        """
        if not model.service_source:
            raise ValueError("Service source not found in the model.")
        class_: Type[BaseService] = source_utils.load_and_validate_class(
            source=model.service_source, expected_class=BaseService
        )
        return class_(
            uuid=model.id,
            admin_state=model.admin_state,
            config=model.config,
            status=model.status,
            service_type=model.service_type.model_dump(),
            endpoint=model.endpoint,
        )

    @classmethod
    def from_json(cls, json_str: str) -> "BaseTypedModel":
        """Loads a service from a JSON string.

        Args:
            json_str: the JSON string to load from.

        Returns:
            The loaded service object.
        """
        service_dict = json.loads(json_str)
        class_: Type[BaseService] = source_utils.load_and_validate_class(
            source=service_dict["type"], expected_class=BaseService
        )
        return class_.from_dict(service_dict)

    @abstractmethod
    def check_status(self) -> Tuple[ServiceState, str]:
        """Check the the current operational state of the external service.

        This method should be overridden by subclasses that implement
        concrete service tracking functionality.

        Returns:
            The operational state of the external service and a message
            providing additional information about that state (e.g. a
            description of the error if one is encountered while checking the
            service status).
        """

    @abstractmethod
    def get_logs(
        self, follow: bool = False, tail: Optional[int] = None
    ) -> Generator[str, bool, None]:
        """Retrieve the service logs.

        This method should be overridden by subclasses that implement
        concrete service tracking functionality.

        Args:
            follow: if True, the logs will be streamed as they are written
            tail: only retrieve the last NUM lines of log output.

        Returns:
            A generator that can be accessed to get the service logs.
        """

    def update_status(self) -> None:
        """Update the status of the service.

        Check the current operational state of the external service
        and update the local operational status information to reflect it.

        This method should be overridden by subclasses that implement
        concrete service status tracking functionality.
        """
        logger.debug(
            "Running status check for service '%s' ...",
            self,
        )
        try:
            state, err = self.check_status()
            logger.debug(
                "Status check results for service '%s': %s [%s]",
                self,
                state.name,
                err,
            )
            self.status.update_state(state, err)

            # don't bother checking the endpoint state if the service is not active
            if self.status.state == ServiceState.INACTIVE:
                return

            if self.endpoint:
                self.endpoint.update_status()
        except Exception as e:
            logger.error(
                f"Failed to update status for service '{self}': {e}",
                exc_info=True,
            )
            self.status.update_state(ServiceState.ERROR, str(e))

    def get_service_status_message(self) -> str:
        """Get a service status message.

        Returns:
            A message providing information about the current operational
            state of the service.
        """
        return (
            f"  Administrative state: `{self.admin_state.value}`\n"
            f"  Operational state: `{self.status.state.value}`\n"
            f"  Last status message: '{self.status.last_error}'\n"
        )

    def poll_service_status(self, timeout: int = 0) -> bool:
        """Polls the external service status.

        It does this until the service operational state matches the
        administrative state, the service enters a failed state, or the timeout
        is reached.

        Args:
            timeout: maximum time to wait for the service operational state
                to match the administrative state, in seconds

        Returns:
            True if the service operational state matches the administrative
            state, False otherwise.
        """
        time_remaining = timeout
        while True:
            if self.admin_state == ServiceState.ACTIVE and self.is_running:
                return True
            if self.admin_state == ServiceState.INACTIVE and self.is_stopped:
                return True
            if self.is_failed:
                return False
            if time_remaining <= 0:
                break
            time.sleep(1)
            time_remaining -= 1

        if timeout > 0:
            logger.error(
                f"Timed out waiting for service {self} to become "
                f"{self.admin_state.value}:\n"
                + self.get_service_status_message()
            )

        return False

    @property
    def is_running(self) -> bool:
        """Check if the service is currently running.

        This method will actively poll the external service to get its status
        and will return the result.

        Returns:
            True if the service is running and active (i.e. the endpoints are
            responsive, if any are configured), otherwise False.
        """
        self.update_status()
        return self.status.state == ServiceState.ACTIVE and (
            not self.endpoint or self.endpoint.is_active()
        )

    @property
    def is_stopped(self) -> bool:
        """Check if the service is currently stopped.

        This method will actively poll the external service to get its status
        and will return the result.

        Returns:
            True if the service is stopped, otherwise False.
        """
        self.update_status()
        return self.status.state == ServiceState.INACTIVE

    @property
    def is_failed(self) -> bool:
        """Check if the service is currently failed.

        This method will actively poll the external service to get its status
        and will return the result.

        Returns:
            True if the service is in a failure state, otherwise False.
        """
        self.update_status()
        return self.status.state == ServiceState.ERROR

    def provision(self) -> None:
        """Provisions resources to run the service.

        Raises:
            NotImplementedError: if the service does not implement provisioning functionality
        """
        raise NotImplementedError(
            f"Provisioning resources not implemented for {self}."
        )

    def deprovision(self, force: bool = False) -> None:
        """Deprovisions all resources used by the service.

        Args:
            force: if True, the service will be deprovisioned even if it is
                in a failed state.

        Raises:
            NotImplementedError: if the service does not implement
                deprovisioning functionality.
        """
        raise NotImplementedError(
            f"Deprovisioning resources not implemented for {self}."
        )

    def update(self, config: ServiceConfig) -> None:
        """Update the service configuration.

        Args:
            config: the new service configuration.
        """
        self.config = config

    @update_service_status(
        pre_status=ServiceState.PENDING_STARTUP,
        post_status=ServiceState.ACTIVE,
    )
    def start(self, timeout: int = 0) -> None:
        """Start the service and optionally wait for it to become active.

        Args:
            timeout: amount of time to wait for the service to become active.
                If set to 0, the method will return immediately after checking
                the service status.
        """
        with console.status(f"Starting service '{self}'.\n"):
            self.admin_state = ServiceState.ACTIVE
            self.provision()
            if timeout > 0 and not self.poll_service_status(timeout):
                logger.error(
                    f"Failed to start service {self}\n"
                    + self.get_service_status_message()
                )

    @update_service_status(
        pre_status=ServiceState.PENDING_SHUTDOWN,
        post_status=ServiceState.INACTIVE,
    )
    def stop(self, timeout: int = 0, force: bool = False) -> None:
        """Stop the service and optionally wait for it to shutdown.

        Args:
            timeout: amount of time to wait for the service to shutdown.
                If set to 0, the method will return immediately after checking
                the service status.
            force: if True, the service will be stopped even if it is not
                currently running.
        """
        with console.status(f"Stopping service '{self}'.\n"):
            self.admin_state = ServiceState.INACTIVE
            self.deprovision(force)
            if timeout > 0:
                self.poll_service_status(timeout)
                if not self.is_stopped:
                    logger.error(
                        f"Failed to stop service {self}. Last state: "
                        f"'{self.status.state.value}'. Last error: "
                        f"'{self.status.last_error}'"
                    )

    def get_prediction_url(self) -> Optional[str]:
        """Gets the prediction URL for the endpoint.

        Returns:
            the prediction URL for the endpoint
        """
        prediction_url = None
        if isinstance(self, BaseDeploymentService) and self.prediction_url:
            prediction_url = self.prediction_url
        elif self.endpoint:
            prediction_url = (
                self.endpoint.status.uri if self.endpoint.status else None
            )
        return prediction_url

    def get_healthcheck_url(self) -> Optional[str]:
        """Gets the healthcheck URL for the endpoint.

        Returns:
            the healthcheck URL for the endpoint
        """
        return (
            self.endpoint.monitor.get_healthcheck_uri(self.endpoint)
            if (self.endpoint and self.endpoint.monitor)
            and isinstance(self.endpoint.monitor, HTTPEndpointHealthMonitor)
            else None
        )

    def __repr__(self) -> str:
        """String representation of the service.

        Returns:
            A string representation of the service.
        """
        return f"{self.__class__.__qualname__}[{self.uuid}] (type: {self.SERVICE_TYPE.type}, flavor: {self.SERVICE_TYPE.flavor})"

    def __str__(self) -> str:
        """String representation of the service.

        Returns:
            A string representation of the service.
        """
        return self.__repr__()

    model_config = ConfigDict(
        # validate attribute assignments
        validate_assignment=True,
    )
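
`BaseService` is abstract, so the sketch below assumes `service` is an instance of some concrete subclass (for example a model-deployment service) and only uses the lifecycle methods documented in this section:

```python
from zenml.services.service import BaseService

def start_and_report(service: BaseService) -> None:
    """Start a service, wait up to 60 seconds, and report where it serves."""
    service.start(timeout=60)
    if service.is_running:
        print("Prediction URL:", service.get_prediction_url())
        print("Healthcheck URL:", service.get_healthcheck_url())
    else:
        print(service.get_service_status_message())

def shut_down(service: BaseService) -> None:
    """Stop a service and wait up to 30 seconds for it to wind down."""
    service.stop(timeout=30)
```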

is_failed property

Check if the service is currently failed.

This method will actively poll the external service to get its status and will return the result.

Returns:

Type Description
bool

True if the service is in a failure state, otherwise False.

is_running property

Check if the service is currently running.

This method will actively poll the external service to get its status and will return the result.

Returns:

Type Description
bool

True if the service is running and active (i.e. the endpoints are responsive, if any are configured), otherwise False.

is_stopped property

Check if the service is currently stopped.

This method will actively poll the external service to get its status and will return the result.

Returns:

Type Description
bool

True if the service is stopped, otherwise False.

__init__(**attrs)

Initialize the service instance.

Parameters:

Name Type Description Default
**attrs Any

keyword arguments.

{}
Source code in src/zenml/services/service.py
def __init__(
    self,
    **attrs: Any,
) -> None:
    """Initialize the service instance.

    Args:
        **attrs: keyword arguments.
    """
    super().__init__(**attrs)
    self.config.name = self.config.name or self.__class__.__name__

__repr__()

String representation of the service.

Returns:

Type Description
str

A string representation of the service.

Source code in src/zenml/services/service.py
def __repr__(self) -> str:
    """String representation of the service.

    Returns:
        A string representation of the service.
    """
    return f"{self.__class__.__qualname__}[{self.uuid}] (type: {self.SERVICE_TYPE.type}, flavor: {self.SERVICE_TYPE.flavor})"

__str__()

String representation of the service.

Returns:

Type Description
str

A string representation of the service.

Source code in src/zenml/services/service.py
def __str__(self) -> str:
    """String representation of the service.

    Returns:
        A string representation of the service.
    """
    return self.__repr__()

check_status() abstractmethod

Check the current operational state of the external service.

This method should be overridden by subclasses that implement concrete service tracking functionality.

Returns:

Type Description
Tuple[ServiceState, str]

The operational state of the external service and a message providing additional information about that state (e.g. a description of the error if one is encountered while checking the service status).

Source code in src/zenml/services/service.py
@abstractmethod
def check_status(self) -> Tuple[ServiceState, str]:
    """Check the the current operational state of the external service.

    This method should be overridden by subclasses that implement
    concrete service tracking functionality.

    Returns:
        The operational state of the external service and a message
        providing additional information about that state (e.g. a
        description of the error if one is encountered while checking the
        service status).
    """

deprovision(force=False)

Deprovisions all resources used by the service.

Parameters:

Name Type Description Default
force bool

if True, the service will be deprovisioned even if it is in a failed state.

False

Raises:

Type Description
NotImplementedError

if the service does not implement deprovisioning functionality.

Source code in src/zenml/services/service.py
def deprovision(self, force: bool = False) -> None:
    """Deprovisions all resources used by the service.

    Args:
        force: if True, the service will be deprovisioned even if it is
            in a failed state.

    Raises:
        NotImplementedError: if the service does not implement
            deprovisioning functionality.
    """
    raise NotImplementedError(
        f"Deprovisioning resources not implemented for {self}."
    )

from_json(json_str) classmethod

Loads a service from a JSON string.

Parameters:

Name Type Description Default
json_str str

the JSON string to load from.

required

Returns:

Type Description
BaseTypedModel

The loaded service object.

Source code in src/zenml/services/service.py
@classmethod
def from_json(cls, json_str: str) -> "BaseTypedModel":
    """Loads a service from a JSON string.

    Args:
        json_str: the JSON string to load from.

    Returns:
        The loaded service object.
    """
    service_dict = json.loads(json_str)
    class_: Type[BaseService] = source_utils.load_and_validate_class(
        source=service_dict["type"], expected_class=BaseService
    )
    return class_.from_dict(service_dict)

from_model(model) classmethod

Loads a service from a model.

Parameters:

Name Type Description Default
model ServiceResponse

The ServiceResponse to load from.

required

Returns:

Type Description
BaseService

The loaded service object.

Raises:

Type Description
ValueError

if the service source is not found in the model.

Source code in src/zenml/services/service.py
@classmethod
def from_model(cls, model: "ServiceResponse") -> "BaseService":
    """Loads a service from a model.

    Args:
        model: The ServiceResponse to load from.

    Returns:
        The loaded service object.

    Raises:
        ValueError: if the service source is not found in the model.
    """
    if not model.service_source:
        raise ValueError("Service source not found in the model.")
    class_: Type[BaseService] = source_utils.load_and_validate_class(
        source=model.service_source, expected_class=BaseService
    )
    return class_(
        uuid=model.id,
        admin_state=model.admin_state,
        config=model.config,
        status=model.status,
        service_type=model.service_type.model_dump(),
        endpoint=model.endpoint,
    )

get_healthcheck_url()

Gets the healthcheck URL for the endpoint.

Returns:

Type Description
Optional[str]

the healthcheck URL for the endpoint

Source code in src/zenml/services/service.py
def get_healthcheck_url(self) -> Optional[str]:
    """Gets the healthcheck URL for the endpoint.

    Returns:
        the healthcheck URL for the endpoint
    """
    return (
        self.endpoint.monitor.get_healthcheck_uri(self.endpoint)
        if (self.endpoint and self.endpoint.monitor)
        and isinstance(self.endpoint.monitor, HTTPEndpointHealthMonitor)
        else None
    )

get_logs(follow=False, tail=None) abstractmethod

Retrieve the service logs.

This method should be overridden by subclasses that implement concrete service tracking functionality.

Parameters:

Name Type Description Default
follow bool

if True, the logs will be streamed as they are written

False
tail Optional[int]

only retrieve the last NUM lines of log output.

None

Returns:

Type Description
Generator[str, bool, None]

A generator that can be accessed to get the service logs.

Source code in src/zenml/services/service.py
@abstractmethod
def get_logs(
    self, follow: bool = False, tail: Optional[int] = None
) -> Generator[str, bool, None]:
    """Retrieve the service logs.

    This method should be overridden by subclasses that implement
    concrete service tracking functionality.

    Args:
        follow: if True, the logs will be streamed as they are written
        tail: only retrieve the last NUM lines of log output.

    Returns:
        A generator that can be accessed to get the service logs.
    """

get_prediction_url()

Gets the prediction URL for the endpoint.

Returns:

Type Description
Optional[str]

the prediction URL for the endpoint

Source code in src/zenml/services/service.py
def get_prediction_url(self) -> Optional[str]:
    """Gets the prediction URL for the endpoint.

    Returns:
        the prediction URL for the endpoint
    """
    prediction_url = None
    if isinstance(self, BaseDeploymentService) and self.prediction_url:
        prediction_url = self.prediction_url
    elif self.endpoint:
        prediction_url = (
            self.endpoint.status.uri if self.endpoint.status else None
        )
    return prediction_url

get_service_status_message()

Get a service status message.

Returns:

Type Description
str

A message providing information about the current operational state of the service.

Source code in src/zenml/services/service.py
def get_service_status_message(self) -> str:
    """Get a service status message.

    Returns:
        A message providing information about the current operational
        state of the service.
    """
    return (
        f"  Administrative state: `{self.admin_state.value}`\n"
        f"  Operational state: `{self.status.state.value}`\n"
        f"  Last status message: '{self.status.last_error}'\n"
    )

poll_service_status(timeout=0)

Polls the external service status.

It does this until the service operational state matches the administrative state, the service enters a failed state, or the timeout is reached.

Parameters:

Name Type Description Default
timeout int

maximum time to wait for the service operational state to match the administrative state, in seconds

0

Returns:

Type Description
bool

True if the service operational state matches the administrative state, False otherwise.

Source code in src/zenml/services/service.py
def poll_service_status(self, timeout: int = 0) -> bool:
    """Polls the external service status.

    It does this until the service operational state matches the
    administrative state, the service enters a failed state, or the timeout
    is reached.

    Args:
        timeout: maximum time to wait for the service operational state
            to match the administrative state, in seconds

    Returns:
        True if the service operational state matches the administrative
        state, False otherwise.
    """
    time_remaining = timeout
    while True:
        if self.admin_state == ServiceState.ACTIVE and self.is_running:
            return True
        if self.admin_state == ServiceState.INACTIVE and self.is_stopped:
            return True
        if self.is_failed:
            return False
        if time_remaining <= 0:
            break
        time.sleep(1)
        time_remaining -= 1

    if timeout > 0:
        logger.error(
            f"Timed out waiting for service {self} to become "
            f"{self.admin_state.value}:\n"
            + self.get_service_status_message()
        )

    return False

provision()

Provisions resources to run the service.

Raises:

Type Description
NotImplementedError

if the service does not implement provisioning functionality

Source code in src/zenml/services/service.py
def provision(self) -> None:
    """Provisions resources to run the service.

    Raises:
        NotImplementedError: if the service does not implement provisioning functionality
    """
    raise NotImplementedError(
        f"Provisioning resources not implemented for {self}."
    )

start(timeout=0)

Start the service and optionally wait for it to become active.

Parameters:

Name Type Description Default
timeout int

amount of time to wait for the service to become active. If set to 0, the method will return immediately after checking the service status.

0
Source code in src/zenml/services/service.py
@update_service_status(
    pre_status=ServiceState.PENDING_STARTUP,
    post_status=ServiceState.ACTIVE,
)
def start(self, timeout: int = 0) -> None:
    """Start the service and optionally wait for it to become active.

    Args:
        timeout: amount of time to wait for the service to become active.
            If set to 0, the method will return immediately after checking
            the service status.
    """
    with console.status(f"Starting service '{self}'.\n"):
        self.admin_state = ServiceState.ACTIVE
        self.provision()
        if timeout > 0 and not self.poll_service_status(timeout):
            logger.error(
                f"Failed to start service {self}\n"
                + self.get_service_status_message()
            )

stop(timeout=0, force=False)

Stop the service and optionally wait for it to shutdown.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `timeout` | `int` | amount of time to wait for the service to shutdown. If set to 0, the method will return immediately after checking the service status. | `0` |
| `force` | `bool` | if True, the service will be stopped even if it is not currently running. | `False` |
Source code in src/zenml/services/service.py
@update_service_status(
    pre_status=ServiceState.PENDING_SHUTDOWN,
    post_status=ServiceState.INACTIVE,
)
def stop(self, timeout: int = 0, force: bool = False) -> None:
    """Stop the service and optionally wait for it to shutdown.

    Args:
        timeout: amount of time to wait for the service to shutdown.
            If set to 0, the method will return immediately after checking
            the service status.
        force: if True, the service will be stopped even if it is not
            currently running.
    """
    with console.status(f"Stopping service '{self}'.\n"):
        self.admin_state = ServiceState.INACTIVE
        self.deprovision(force)
        if timeout > 0:
            self.poll_service_status(timeout)
            if not self.is_stopped:
                logger.error(
                    f"Failed to stop service {self}. Last state: "
                    f"'{self.status.state.value}'. Last error: "
                    f"'{self.status.last_error}'"
                )
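
Taken together with `start`, this gives the typical service life cycle. A minimal sketch, assuming `MyService` is a hypothetical, fully implemented `BaseService` subclass and `my_config` its matching configuration:

```python
# Sketch only: MyService / my_config are illustrative placeholders.
service = MyService(config=my_config)

# start() sets the admin state to ACTIVE, provisions the service and
# polls until it is running (or the timeout expires).
service.start(timeout=60)
if not service.is_running:
    print(service.get_service_status_message())

# ... interact with the running service ...

# stop() sets the admin state to INACTIVE, deprovisions the service and
# waits up to 30 seconds for it to shut down.
service.stop(timeout=30)
```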

update(config)

Update the service configuration.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `config` | `ServiceConfig` | the new service configuration. | required |
Source code in src/zenml/services/service.py
def update(self, config: ServiceConfig) -> None:
    """Update the service configuration.

    Args:
        config: the new service configuration.
    """
    self.config = config
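
Note that, as the source shows, `update` only replaces the in-memory configuration object; whether and how a running service picks up the new values (e.g. by re-provisioning or restarting) is left to the concrete service implementation. A minimal sketch with an illustrative `new_config` value:

```python
# Sketch: swap in a new configuration, then re-provision so the running
# resources reflect it (the re-provision step is an assumption that
# depends on the concrete service implementation).
service.update(new_config)
service.provision()
```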

update_status()

Update the status of the service.

Check the current operational state of the external service and update the local operational status information to reflect it.

This method should be overridden by subclasses that implement concrete service status tracking functionality.

Source code in src/zenml/services/service.py
def update_status(self) -> None:
    """Update the status of the service.

    Check the current operational state of the external service
    and update the local operational status information to reflect it.

    This method should be overridden by subclasses that implement
    concrete service status tracking functionality.
    """
    logger.debug(
        "Running status check for service '%s' ...",
        self,
    )
    try:
        state, err = self.check_status()
        logger.debug(
            "Status check results for service '%s': %s [%s]",
            self,
            state.name,
            err,
        )
        self.status.update_state(state, err)

        # don't bother checking the endpoint state if the service is not active
        if self.status.state == ServiceState.INACTIVE:
            return

        if self.endpoint:
            self.endpoint.update_status()
    except Exception as e:
        logger.error(
            f"Failed to update status for service '{self}': {e}",
            exc_info=True,
        )
        self.status.update_state(ServiceState.ERROR, str(e))
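
The hook that concrete services typically provide is `check_status`, which this method calls; `ContainerService.check_status` further down on this page is a real example. A purely illustrative override (the marker-file probe is an assumption, not ZenML behavior, and the `ServiceState` import path may need adjusting):

```python
import os
from typing import Tuple

from zenml.services import ServiceState  # assumed re-export
from zenml.services.service import BaseService


class FileProbeService(BaseService):
    """Illustrative subclass: reports ACTIVE while a marker file exists.

    Other methods required by BaseService (provision, deprovision, ...)
    are omitted for brevity.
    """

    def check_status(self) -> Tuple[ServiceState, str]:
        # A real service would query its backing process, container or
        # API here instead of looking at a marker file.
        if os.path.exists("/tmp/my-service.alive"):
            return ServiceState.ACTIVE, ""
        return ServiceState.INACTIVE, "marker file not found"
```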

BaseServiceEndpoint

Bases: BaseTypedModel

Base service endpoint class.

This class implements generic functionality concerning the life-cycle management and tracking of an external service endpoint (e.g. a HTTP/HTTPS API or generic TCP endpoint exposed by a service).

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `admin_state` | `ServiceState` | the administrative state of the service endpoint |
| `config` | `ServiceEndpointConfig` | service endpoint configuration |
| `status` | `ServiceEndpointStatus` | service endpoint status |
| `monitor` | `Optional[BaseServiceEndpointHealthMonitor]` | optional service endpoint health monitor |

Source code in src/zenml/services/service_endpoint.py
class BaseServiceEndpoint(BaseTypedModel):
    """Base service class.

    This class implements generic functionality concerning the life-cycle
    management and tracking of an external service endpoint (e.g. a HTTP/HTTPS
    API or generic TCP endpoint exposed by a service).

    Attributes:
        admin_state: the administrative state of the service endpoint
        config: service endpoint configuration
        status: service endpoint status
        monitor: optional service endpoint health monitor
    """

    admin_state: ServiceState = ServiceState.INACTIVE
    config: ServiceEndpointConfig
    status: ServiceEndpointStatus
    # TODO [ENG-701]: allow multiple monitors per endpoint
    monitor: Optional[BaseServiceEndpointHealthMonitor] = None

    def __init__(
        self,
        *args: Any,
        **kwargs: Any,
    ) -> None:
        """Initialize the service endpoint.

        Args:
            *args: positional arguments.
            **kwargs: keyword arguments.
        """
        super().__init__(*args, **kwargs)
        self.config.name = self.config.name or self.__class__.__name__

    def check_status(self) -> Tuple[ServiceState, str]:
        """Check the the current operational state of the external service endpoint.

        Returns:
            The operational state of the external service endpoint and a
            message providing additional information about that state
            (e.g. a description of the error, if one is encountered while
            checking the service status).
        """
        if not self.monitor:
            # no health monitor configured; assume service operational state
            # always matches the admin state
            return self.admin_state, ""
        return self.monitor.check_endpoint_status(self)

    def update_status(self) -> None:
        """Check the the current operational state of the external service endpoint.

        It updates the local operational status information accordingly.
        """
        logger.debug(
            "Running health check for service endpoint '%s' ...",
            self.config.name,
        )
        state, err = self.check_status()
        logger.debug(
            "Health check results for service endpoint '%s': %s [%s]",
            self.config.name,
            state.name,
            err,
        )
        self.status.update_state(state, err)

    def is_active(self) -> bool:
        """Check if the service endpoint is active.

        This means that it is responsive and can receive requests. This method
        will use the configured health monitor to actively check the endpoint
        status and will return the result.

        Returns:
            True if the service endpoint is active, otherwise False.
        """
        self.update_status()
        return self.status.state == ServiceState.ACTIVE

    def is_inactive(self) -> bool:
        """Check if the service endpoint is inactive.

        This means that it is unresponsive and cannot receive requests. This
        method will use the configured health monitor to actively check the
        endpoint status and will return the result.

        Returns:
            True if the service endpoint is inactive, otherwise False.
        """
        self.update_status()
        return self.status.state == ServiceState.INACTIVE

__init__(*args, **kwargs)

Initialize the service endpoint.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `*args` | `Any` | positional arguments. | `()` |
| `**kwargs` | `Any` | keyword arguments. | `{}` |
Source code in src/zenml/services/service_endpoint.py
def __init__(
    self,
    *args: Any,
    **kwargs: Any,
) -> None:
    """Initialize the service endpoint.

    Args:
        *args: positional arguments.
        **kwargs: keyword arguments.
    """
    super().__init__(*args, **kwargs)
    self.config.name = self.config.name or self.__class__.__name__

check_status()

Check the current operational state of the external service endpoint.

Returns:

| Type | Description |
| --- | --- |
| `Tuple[ServiceState, str]` | The operational state of the external service endpoint and a message providing additional information about that state (e.g. a description of the error, if one is encountered while checking the service status). |

Source code in src/zenml/services/service_endpoint.py
def check_status(self) -> Tuple[ServiceState, str]:
    """Check the the current operational state of the external service endpoint.

    Returns:
        The operational state of the external service endpoint and a
        message providing additional information about that state
        (e.g. a description of the error, if one is encountered while
        checking the service status).
    """
    if not self.monitor:
        # no health monitor configured; assume service operational state
        # always matches the admin state
        return self.admin_state, ""
    return self.monitor.check_endpoint_status(self)

is_active()

Check if the service endpoint is active.

This means that it is responsive and can receive requests. This method will use the configured health monitor to actively check the endpoint status and will return the result.

Returns:

| Type | Description |
| --- | --- |
| `bool` | True if the service endpoint is active, otherwise False. |

Source code in src/zenml/services/service_endpoint.py
def is_active(self) -> bool:
    """Check if the service endpoint is active.

    This means that it is responsive and can receive requests. This method
    will use the configured health monitor to actively check the endpoint
    status and will return the result.

    Returns:
        True if the service endpoint is active, otherwise False.
    """
    self.update_status()
    return self.status.state == ServiceState.ACTIVE

is_inactive()

Check if the service endpoint is inactive.

This means that it is unresponsive and cannot receive requests. This method will use the configured health monitor to actively check the endpoint status and will return the result.

Returns:

| Type | Description |
| --- | --- |
| `bool` | True if the service endpoint is inactive, otherwise False. |

Source code in src/zenml/services/service_endpoint.py
def is_inactive(self) -> bool:
    """Check if the service endpoint is inactive.

    This means that it is unresponsive and cannot receive requests. This
    method will use the configured health monitor to actively check the
    endpoint status and will return the result.

    Returns:
        True if the service endpoint is inactive, otherwise False.
    """
    self.update_status()
    return self.status.state == ServiceState.INACTIVE

update_status()

Check the current operational state of the external service endpoint.

It updates the local operational status information accordingly.

Source code in src/zenml/services/service_endpoint.py
def update_status(self) -> None:
    """Check the the current operational state of the external service endpoint.

    It updates the local operational status information accordingly.
    """
    logger.debug(
        "Running health check for service endpoint '%s' ...",
        self.config.name,
    )
    state, err = self.check_status()
    logger.debug(
        "Health check results for service endpoint '%s': %s [%s]",
        self.config.name,
        state.name,
        err,
    )
    self.status.update_state(state, err)

BaseServiceEndpointHealthMonitor

Bases: BaseTypedModel

Base class used for service endpoint health monitors.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `config` | `ServiceEndpointHealthMonitorConfig` | health monitor configuration for endpoint |

Source code in src/zenml/services/service_monitor.py
class BaseServiceEndpointHealthMonitor(BaseTypedModel):
    """Base class used for service endpoint health monitors.

    Attributes:
        config: health monitor configuration for endpoint
    """

    config: ServiceEndpointHealthMonitorConfig = Field(
        default_factory=ServiceEndpointHealthMonitorConfig
    )

    @abstractmethod
    def check_endpoint_status(
        self, endpoint: "BaseServiceEndpoint"
    ) -> Tuple[ServiceState, str]:
        """Check the the current operational state of the external service endpoint.

        Args:
            endpoint: service endpoint to check

        This method should be overridden by subclasses that implement
        concrete service endpoint tracking functionality.

        Returns:
            The operational state of the external service endpoint and an
            optional error message, if an error is encountered while checking
            the service endpoint status.
        """

check_endpoint_status(endpoint) abstractmethod

Check the current operational state of the external service endpoint.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `endpoint` | `BaseServiceEndpoint` | service endpoint to check | required |

This method should be overridden by subclasses that implement concrete service endpoint tracking functionality.

Returns:

| Type | Description |
| --- | --- |
| `Tuple[ServiceState, str]` | The operational state of the external service endpoint and an optional error message, if an error is encountered while checking the service endpoint status. |

Source code in src/zenml/services/service_monitor.py
@abstractmethod
def check_endpoint_status(
    self, endpoint: "BaseServiceEndpoint"
) -> Tuple[ServiceState, str]:
    """Check the the current operational state of the external service endpoint.

    Args:
        endpoint: service endpoint to check

    This method should be overridden by subclasses that implement
    concrete service endpoint tracking functionality.

    Returns:
        The operational state of the external service endpoint and an
        optional error message, if an error is encountered while checking
        the service endpoint status.
    """

ContainerService

Bases: BaseService

A service represented by a containerized process.

This class extends the base service class with functionality concerning the life-cycle management and tracking of external services implemented as docker containers.

To define a containerized service, subclass this class and implement the run method. Upon start, the service will spawn a container that ends up calling the run method.

For example,


from zenml.services import ServiceType, ContainerService, ContainerServiceConfig
import time

class SleepingServiceConfig(ContainerServiceConfig):

    wake_up_after: int

class SleepingService(ContainerService):

    SERVICE_TYPE = ServiceType(
        name="sleeper",
        description="Sleeping container",
        type="container",
        flavor="sleeping",
    )
    config: SleepingServiceConfig

    def run(self) -> None:
        time.sleep(self.config.wake_up_after)

service = SleepingService(config=SleepingServiceConfig(wake_up_after=10))
service.start()

NOTE: the SleepingService class and its parent module have to be discoverable as part of a ZenML Integration, otherwise the daemon will fail with the following error:

TypeError: Cannot load service with unregistered service type:
name='sleeper' type='container' flavor='sleeping' description='Sleeping container'

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `config` | `ContainerServiceConfig` | service configuration |
| `status` | `ContainerServiceStatus` | service status |
| `endpoint` | `Optional[ContainerServiceEndpoint]` | optional service endpoint |

Source code in src/zenml/services/container/container_service.py
class ContainerService(BaseService):
    """A service represented by a containerized process.

    This class extends the base service class with functionality concerning
    the life-cycle management and tracking of external services implemented as
    docker containers.

    To define a containerized service, subclass this class and implement the
    `run` method. Upon `start`, the service will spawn a container that
    ends up calling the `run` method.

    For example,

    ```python

    from zenml.services import ServiceType, ContainerService, ContainerServiceConfig
    import time

    class SleepingServiceConfig(ContainerServiceConfig):

        wake_up_after: int

    class SleepingService(ContainerService):

        SERVICE_TYPE = ServiceType(
            name="sleeper",
            description="Sleeping container",
            type="container",
            flavor="sleeping",
        )
        config: SleepingServiceConfig

        def run(self) -> None:
            time.sleep(self.config.wake_up_after)

    service = SleepingService(config=SleepingServiceConfig(wake_up_after=10))
    service.start()
    ```

    NOTE: the `SleepingService` class and its parent module have to be
    discoverable as part of a ZenML `Integration`, otherwise the daemon will
    fail with the following error:

    ```
    TypeError: Cannot load service with unregistered service type:
    name='sleeper' type='container' flavor='sleeping' description='Sleeping container'
    ```

    Attributes:
        config: service configuration
        status: service status
        endpoint: optional service endpoint
    """

    config: ContainerServiceConfig = Field(
        default_factory=ContainerServiceConfig
    )
    status: ContainerServiceStatus = Field(
        default_factory=ContainerServiceStatus
    )
    # TODO [ENG-705]: allow multiple endpoints per service
    endpoint: Optional[ContainerServiceEndpoint] = None

    _docker_client: Optional[DockerClient] = None

    @property
    def docker_client(self) -> DockerClient:
        """Initialize and/or return the docker client.

        Returns:
            The docker client.
        """
        if self._docker_client is None:
            self._docker_client = (
                docker_utils._try_get_docker_client_from_env()
            )
        return self._docker_client

    @property
    def container_id(self) -> str:
        """Get the ID of the docker container for a service.

        Returns:
            The ID of the docker container for the service.
        """
        return f"zenml-{str(self.uuid)}"

    def get_service_status_message(self) -> str:
        """Get a message about the current operational state of the service.

        Returns:
            A message providing information about the current operational
            state of the service.
        """
        msg = super().get_service_status_message()
        msg += f"  Container ID: `{self.container_id}`\n"
        if self.status.log_file:
            msg += (
                f"For more information on the service status, please see the "
                f"following log file: {self.status.log_file}\n"
            )
        return msg

    def check_status(self) -> Tuple[ServiceState, str]:
        """Check the the current operational state of the docker container.

        Returns:
            The operational state of the docker container and a message
            providing additional information about that state (e.g. a
            description of the error, if one is encountered).
        """
        container: Optional[Container] = None
        try:
            container = self.docker_client.containers.get(self.container_id)
        except docker_errors.NotFound:
            # container doesn't exist yet or was removed
            pass

        if container is None:
            return ServiceState.INACTIVE, "Docker container is not present"
        elif container.status == "running":
            return ServiceState.ACTIVE, "Docker container is running"
        elif container.status == "exited":
            return (
                ServiceState.ERROR,
                "Docker container has exited.",
            )
        else:
            return (
                ServiceState.INACTIVE,
                f"Docker container is {container.status}",
            )

    def _setup_runtime_path(self) -> None:
        """Set up the runtime path for the service.

        This method sets up the runtime path for the service.
        """
        # reuse the config file and logfile location from a previous run,
        # if available
        if not self.status.runtime_path or not os.path.exists(
            self.status.runtime_path
        ):
            if self.config.root_runtime_path:
                if self.config.singleton:
                    self.status.runtime_path = self.config.root_runtime_path
                else:
                    self.status.runtime_path = os.path.join(
                        self.config.root_runtime_path,
                        str(self.uuid),
                    )
                create_dir_recursive_if_not_exists(self.status.runtime_path)
            else:
                self.status.runtime_path = tempfile.mkdtemp(
                    prefix="zenml-service-"
                )

    def _get_container_cmd(self) -> Tuple[List[str], Dict[str, str]]:
        """Get the command to run the service container.

        The default implementation provided by this class is the following:

          * this ContainerService instance and its configuration
          are serialized as JSON and saved to a file
          * the entrypoint.py script is launched as a docker container
          and pointed to the serialized service file
          * the entrypoint script re-creates the ContainerService instance
          from the serialized configuration, then calls the `run`
          method that must be implemented by the subclass

        Subclasses that need a different command to launch the container
        should override this method.

        Returns:
            Command needed to launch the docker container and the environment
            variables to set, in the formats accepted by subprocess.Popen.
        """
        # to avoid circular imports, import here
        from zenml.services.container import entrypoint

        assert self.status.config_file is not None
        assert self.status.log_file is not None

        with open(self.status.config_file, "w") as f:
            f.write(self.model_dump_json(indent=4))
        pathlib.Path(self.status.log_file).touch()

        command = [
            "python",
            "-m",
            entrypoint.__name__,
            "--config-file",
            os.path.join(SERVICE_CONTAINER_PATH, SERVICE_CONFIG_FILE_NAME),
        ]

        command_env = {
            ENV_ZENML_SERVICE_CONTAINER: "true",
        }
        for k, v in os.environ.items():
            if k.startswith("ZENML_"):
                command_env[k] = v
        # the global configuration is mounted into the container at a
        # different location
        command_env[ENV_ZENML_CONFIG_PATH] = (
            SERVICE_CONTAINER_GLOBAL_CONFIG_PATH
        )

        return command, command_env

    def _get_container_volumes(self) -> Dict[str, Dict[str, str]]:
        """Get the volumes to mount into the service container.

        The default implementation provided by this class mounts the
        following directories into the container:

          * the service runtime path
          * the global configuration directory

        Subclasses that need to mount additional volumes should override
        this method.

        Returns:
            A dictionary mapping host paths to dictionaries containing
            the mount options for each volume.
        """
        volumes: Dict[str, Dict[str, str]] = {}

        assert self.status.runtime_path is not None

        volumes[self.status.runtime_path] = {
            "bind": SERVICE_CONTAINER_PATH,
            "mode": "rw",
        }

        volumes[get_global_config_directory()] = {
            "bind": SERVICE_CONTAINER_GLOBAL_CONFIG_PATH,
            "mode": "rw",
        }

        return volumes

    @property
    def container(self) -> Optional[Container]:
        """Get the docker container for the service.

        Returns:
            The docker container for the service, or None if the container
            does not exist.
        """
        try:
            return self.docker_client.containers.get(self.container_id)
        except docker_errors.NotFound:
            # container doesn't exist yet or was removed
            return None

    def _start_container(self) -> None:
        """Start the service docker container associated with this service."""
        container = self.container

        if container:
            # the container exists, check if it is running
            if container.status == "running":
                logger.debug(
                    "Container for service '%s' is already running",
                    self,
                )
                return

            # the container is stopped or in an error state, remove it
            logger.debug(
                "Removing previous container for service '%s'",
                self,
            )
            container.remove(force=True)

        logger.debug("Starting container for service '%s'...", self)

        try:
            self.docker_client.images.get(self.config.image)
        except docker_errors.ImageNotFound:
            logger.debug(
                "Pulling container image '%s' for service '%s'...",
                self.config.image,
                self,
            )
            self.docker_client.images.pull(self.config.image)

        self._setup_runtime_path()

        ports: Dict[int, Optional[int]] = {}
        if self.endpoint:
            self.endpoint.prepare_for_start()
            if self.endpoint.status.port:
                ports[self.endpoint.status.port] = self.endpoint.status.port

        command, env = self._get_container_cmd()
        volumes = self._get_container_volumes()

        try:
            uid_args: Dict[str, Any] = {}
            if sys.platform == "win32":
                # File permissions are not checked on Windows. This if clause
                # prevents mypy from complaining about unused 'type: ignore'
                # statements
                pass
            else:
                # Run the container in the context of the local UID/GID
                # to ensure that the local database can be shared
                # with the container.
                logger.debug(
                    "Setting UID and GID to local user/group in container."
                )
                uid_args = dict(
                    user=os.getuid(),
                    group_add=[os.getgid()],
                )

            container = self.docker_client.containers.run(
                name=self.container_id,
                image=self.config.image,
                entrypoint=command,
                detach=True,
                volumes=volumes,
                environment=env,
                remove=False,
                auto_remove=False,
                ports=ports,
                labels={
                    "zenml-service-uuid": str(self.uuid),
                },
                working_dir=SERVICE_CONTAINER_PATH,
                extra_hosts={"host.docker.internal": "host-gateway"},
                **uid_args,
            )

            logger.debug(
                "Docker container for service '%s' started with ID: %s",
                self,
                self.container_id,
            )
        except docker_errors.DockerException as e:
            logger.error(
                "Docker container for service '%s' failed to start: %s",
                self,
                e,
            )

    def _stop_daemon(self, force: bool = False) -> None:
        """Stop the service docker container associated with this service.

        Args:
            force: if True, the service container will be forcefully stopped
        """
        container = self.container
        if not container:
            # service container is not running
            logger.debug(
                "Docker container for service '%s' no longer running",
                self,
            )
            return

        logger.debug("Stopping container for service '%s' ...", self)
        if force:
            container.kill()
            container.remove(force=True)
        else:
            container.stop()
            container.remove()

    def provision(self) -> None:
        """Provision the service."""
        self._start_container()

    def deprovision(self, force: bool = False) -> None:
        """Deprovision the service.

        Args:
            force: if True, the service container will be forcefully stopped
        """
        self._stop_daemon(force)

    def get_logs(
        self, follow: bool = False, tail: Optional[int] = None
    ) -> Generator[str, bool, None]:
        """Retrieve the service logs.

        Args:
            follow: if True, the logs will be streamed as they are written
            tail: only retrieve the last NUM lines of log output.

        Yields:
            A generator that can be accessed to get the service logs.
        """
        if not self.status.log_file or not os.path.exists(
            self.status.log_file
        ):
            return

        with open(self.status.log_file, "r") as f:
            if tail:
                # TODO[ENG-864]: implement a more efficient tailing mechanism that
                #   doesn't read the entire file
                lines = f.readlines()[-tail:]
                for line in lines:
                    yield line.rstrip("\n")
                if not follow:
                    return
            line = ""
            while True:
                partial_line = f.readline()
                if partial_line:
                    line += partial_line
                    if line.endswith("\n"):
                        stop = yield line.rstrip("\n")
                        if stop:
                            break
                        line = ""
                elif follow:
                    time.sleep(1)
                else:
                    break

    @abstractmethod
    def run(self) -> None:
        """Run the containerized service logic associated with this service.

        Subclasses must implement this method to provide the containerized
        service functionality. This method will be executed in the context of
        the running container, not in the context of the process that calls the
        `start` method.
        """

container property

Get the docker container for the service.

Returns:

| Type | Description |
| --- | --- |
| `Optional[Container]` | The docker container for the service, or None if the container does not exist. |

container_id property

Get the ID of the docker container for a service.

Returns:

| Type | Description |
| --- | --- |
| `str` | The ID of the docker container for the service. |

docker_client property

Initialize and/or return the docker client.

Returns:

| Type | Description |
| --- | --- |
| `DockerClient` | The docker client. |

check_status()

Check the current operational state of the docker container.

Returns:

| Type | Description |
| --- | --- |
| `Tuple[ServiceState, str]` | The operational state of the docker container and a message providing additional information about that state (e.g. a description of the error, if one is encountered). |

Source code in src/zenml/services/container/container_service.py
def check_status(self) -> Tuple[ServiceState, str]:
    """Check the the current operational state of the docker container.

    Returns:
        The operational state of the docker container and a message
        providing additional information about that state (e.g. a
        description of the error, if one is encountered).
    """
    container: Optional[Container] = None
    try:
        container = self.docker_client.containers.get(self.container_id)
    except docker_errors.NotFound:
        # container doesn't exist yet or was removed
        pass

    if container is None:
        return ServiceState.INACTIVE, "Docker container is not present"
    elif container.status == "running":
        return ServiceState.ACTIVE, "Docker container is running"
    elif container.status == "exited":
        return (
            ServiceState.ERROR,
            "Docker container has exited.",
        )
    else:
        return (
            ServiceState.INACTIVE,
            f"Docker container is {container.status}",
        )

deprovision(force=False)

Deprovision the service.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `force` | `bool` | if True, the service container will be forcefully stopped | `False` |
Source code in src/zenml/services/container/container_service.py
def deprovision(self, force: bool = False) -> None:
    """Deprovision the service.

    Args:
        force: if True, the service container will be forcefully stopped
    """
    self._stop_daemon(force)

get_logs(follow=False, tail=None)

Retrieve the service logs.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `follow` | `bool` | if True, the logs will be streamed as they are written | `False` |
| `tail` | `Optional[int]` | only retrieve the last NUM lines of log output. | `None` |

Yields:

| Type | Description |
| --- | --- |
| `str` | A generator that can be accessed to get the service logs. |

Source code in src/zenml/services/container/container_service.py
def get_logs(
    self, follow: bool = False, tail: Optional[int] = None
) -> Generator[str, bool, None]:
    """Retrieve the service logs.

    Args:
        follow: if True, the logs will be streamed as they are written
        tail: only retrieve the last NUM lines of log output.

    Yields:
        A generator that can be accessed to get the service logs.
    """
    if not self.status.log_file or not os.path.exists(
        self.status.log_file
    ):
        return

    with open(self.status.log_file, "r") as f:
        if tail:
            # TODO[ENG-864]: implement a more efficient tailing mechanism that
            #   doesn't read the entire file
            lines = f.readlines()[-tail:]
            for line in lines:
                yield line.rstrip("\n")
            if not follow:
                return
        line = ""
        while True:
            partial_line = f.readline()
            if partial_line:
                line += partial_line
                if line.endswith("\n"):
                    stop = yield line.rstrip("\n")
                    if stop:
                        break
                    line = ""
            elif follow:
                time.sleep(1)
            else:
                break
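
A usage sketch, assuming `service` is a started `ContainerService` instance:

```python
# Print the last 20 lines of the service log.
for line in service.get_logs(tail=20):
    print(line)

# Stream new log lines as they are written (blocks until interrupted).
for line in service.get_logs(follow=True):
    print(line)
```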

get_service_status_message()

Get a message about the current operational state of the service.

Returns:

| Type | Description |
| --- | --- |
| `str` | A message providing information about the current operational state of the service. |

Source code in src/zenml/services/container/container_service.py
def get_service_status_message(self) -> str:
    """Get a message about the current operational state of the service.

    Returns:
        A message providing information about the current operational
        state of the service.
    """
    msg = super().get_service_status_message()
    msg += f"  Container ID: `{self.container_id}`\n"
    if self.status.log_file:
        msg += (
            f"For more information on the service status, please see the "
            f"following log file: {self.status.log_file}\n"
        )
    return msg

provision()

Provision the service.

Source code in src/zenml/services/container/container_service.py
def provision(self) -> None:
    """Provision the service."""
    self._start_container()

run() abstractmethod

Run the containerized service logic associated with this service.

Subclasses must implement this method to provide the containerized service functionality. This method will be executed in the context of the running container, not in the context of the process that calls the start method.

Source code in src/zenml/services/container/container_service.py
@abstractmethod
def run(self) -> None:
    """Run the containerized service logic associated with this service.

    Subclasses must implement this method to provide the containerized
    service functionality. This method will be executed in the context of
    the running container, not in the context of the process that calls the
    `start` method.
    """

ContainerServiceConfig

Bases: ServiceConfig

containerized service configuration.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `root_runtime_path` | `Optional[str]` | the root path where the service stores its files. |
| `singleton` | `bool` | set to True to store the service files directly in the `root_runtime_path` directory instead of creating a subdirectory for each service instance. Only has effect if the `root_runtime_path` is also set. |
| `image` | `str` | the container image to use for the service. |

Source code in src/zenml/services/container/container_service.py
class ContainerServiceConfig(ServiceConfig):
    """containerized service configuration.

    Attributes:
        root_runtime_path: the root path where the service stores its files.
        singleton: set to True to store the service files directly in the
            `root_runtime_path` directory instead of creating a subdirectory for
            each service instance. Only has effect if the `root_runtime_path` is
            also set.
        image: the container image to use for the service.
    """

    root_runtime_path: Optional[str] = None
    singleton: bool = False
    image: str = DOCKER_ZENML_SERVER_DEFAULT_IMAGE
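
As a configuration sketch, reusing the `SleepingService` / `SleepingServiceConfig` example from the `ContainerService` docstring above (the image tag and paths are illustrative placeholders):

```python
# Illustrative values only.
config = SleepingServiceConfig(
    wake_up_after=10,
    image="python:3.11-slim",                # container image to run in
    root_runtime_path="/tmp/zenml-sleeper",  # where config/log files live
    singleton=True,                          # reuse that directory each run
)
service = SleepingService(config=config)
service.start(timeout=30)
```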

ContainerServiceEndpoint

Bases: BaseServiceEndpoint

A service endpoint exposed by a containerized process.

This class extends the base service endpoint class with functionality concerning the life-cycle management and tracking of endpoints exposed by external services implemented as containerized processes.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `config` | `ContainerServiceEndpointConfig` | service endpoint configuration |
| `status` | `ContainerServiceEndpointStatus` | service endpoint status |
| `monitor` | `Optional[Union[HTTPEndpointHealthMonitor, TCPEndpointHealthMonitor]]` | optional service endpoint health monitor |

Source code in src/zenml/services/container/container_service_endpoint.py
class ContainerServiceEndpoint(BaseServiceEndpoint):
    """A service endpoint exposed by a containerized process.

    This class extends the base service endpoint class with functionality
    concerning the life-cycle management and tracking of endpoints exposed
    by external services implemented as containerized processes.

    Attributes:
        config: service endpoint configuration
        status: service endpoint status
        monitor: optional service endpoint health monitor
    """

    config: ContainerServiceEndpointConfig = Field(
        default_factory=ContainerServiceEndpointConfig
    )
    status: ContainerServiceEndpointStatus = Field(
        default_factory=ContainerServiceEndpointStatus
    )
    monitor: Optional[
        Union[HTTPEndpointHealthMonitor, TCPEndpointHealthMonitor]
    ] = Field(..., discriminator="type", union_mode="left_to_right")

    def _lookup_free_port(self) -> int:
        """Search for a free TCP port for the service endpoint.

        If a preferred TCP port value is explicitly requested through the
        endpoint configuration, it will be checked first. If a port was
        previously used the last time the service was running (i.e. as
        indicated in the service endpoint status), it will be checked next for
        availability.

        As a last resort, this call will search for a free TCP port, if
        `allocate_port` is set to True in the endpoint configuration.

        Returns:
            An available TCP port number

        Raises:
            IOError: if the preferred TCP port is busy and `allocate_port` is
                disabled in the endpoint configuration, or if no free TCP port
                could be otherwise allocated.
        """
        # If a port value is explicitly configured, attempt to use it first
        if self.config.port:
            if port_available(self.config.port):
                return self.config.port
            if not self.config.allocate_port:
                raise IOError(f"TCP port {self.config.port} is not available.")

        # Attempt to reuse the port used when the service was last running
        if self.status.port and port_available(self.status.port):
            return self.status.port

        port = scan_for_available_port()
        if port:
            return port
        raise IOError("No free TCP ports found")

    def prepare_for_start(self) -> None:
        """Prepare the service endpoint for starting.

        This method is called before the service is started.
        """
        self.status.protocol = self.config.protocol
        self.status.port = self._lookup_free_port()
        # Container endpoints are always exposed on the local host
        self.status.hostname = DEFAULT_LOCAL_SERVICE_IP_ADDRESS

prepare_for_start()

Prepare the service endpoint for starting.

This method is called before the service is started.

Source code in src/zenml/services/container/container_service_endpoint.py
def prepare_for_start(self) -> None:
    """Prepare the service endpoint for starting.

    This method is called before the service is started.
    """
    self.status.protocol = self.config.protocol
    self.status.port = self._lookup_free_port()
    # Container endpoints are always exposed on the local host
    self.status.hostname = DEFAULT_LOCAL_SERVICE_IP_ADDRESS

ContainerServiceEndpointConfig

Bases: ServiceEndpointConfig

Local daemon service endpoint configuration.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `protocol` | `ServiceEndpointProtocol` | the TCP protocol implemented by the service endpoint |
| `port` | `Optional[int]` | preferred TCP port value for the service endpoint. If the port is in use when the service is started, setting `allocate_port` to True will also try to allocate a new port value, otherwise an exception will be raised. |
| `allocate_port` | `bool` | set to True to allocate a free TCP port for the service endpoint automatically. |

Source code in src/zenml/services/container/container_service_endpoint.py
class ContainerServiceEndpointConfig(ServiceEndpointConfig):
    """Local daemon service endpoint configuration.

    Attributes:
        protocol: the TCP protocol implemented by the service endpoint
        port: preferred TCP port value for the service endpoint. If the port
            is in use when the service is started, setting `allocate_port` to
            True will also try to allocate a new port value, otherwise an
            exception will be raised.
        allocate_port: set to True to allocate a free TCP port for the
            service endpoint automatically.
    """

    protocol: ServiceEndpointProtocol = ServiceEndpointProtocol.TCP
    port: Optional[int] = None
    allocate_port: bool = True

ContainerServiceEndpointStatus

Bases: ServiceEndpointStatus

Local daemon service endpoint status.

Source code in src/zenml/services/container/container_service_endpoint.py
class ContainerServiceEndpointStatus(ServiceEndpointStatus):
    """Local daemon service endpoint status."""

ContainerServiceStatus

Bases: ServiceStatus

containerized service status.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `runtime_path` | `Optional[str]` | the path where the service files (e.g. the configuration file used to start the service daemon and the logfile) are located |

Source code in src/zenml/services/container/container_service.py
class ContainerServiceStatus(ServiceStatus):
    """containerized service status.

    Attributes:
        runtime_path: the path where the service files (e.g. the configuration
            file used to start the service daemon and the logfile) are located
    """

    runtime_path: Optional[str] = None

    @property
    def config_file(self) -> Optional[str]:
        """Get the path to the service configuration file.

        Returns:
            The path to the configuration file, or None, if the
            service has never been started before.
        """
        if not self.runtime_path:
            return None
        return os.path.join(self.runtime_path, SERVICE_CONFIG_FILE_NAME)

    @property
    def log_file(self) -> Optional[str]:
        """Get the path to the log file where the service output is/has been logged.

        Returns:
            The path to the log file, or None, if the service has never been
            started before.
        """
        if not self.runtime_path:
            return None
        return os.path.join(self.runtime_path, SERVICE_LOG_FILE_NAME)

config_file property

Get the path to the service configuration file.

Returns:

| Type | Description |
| --- | --- |
| `Optional[str]` | The path to the configuration file, or None, if the service has never been started before. |

log_file property

Get the path to the log file where the service output is/has been logged.

Returns:

| Type | Description |
| --- | --- |
| `Optional[str]` | The path to the log file, or None, if the service has never been started before. |

HTTPEndpointHealthMonitor

Bases: BaseServiceEndpointHealthMonitor

HTTP service endpoint health monitor.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `config` | `HTTPEndpointHealthMonitorConfig` | health monitor configuration for HTTP endpoint |

Source code in src/zenml/services/service_monitor.py
class HTTPEndpointHealthMonitor(BaseServiceEndpointHealthMonitor):
    """HTTP service endpoint health monitor.

    Attributes:
        config: health monitor configuration for HTTP endpoint
    """

    config: HTTPEndpointHealthMonitorConfig = Field(
        default_factory=HTTPEndpointHealthMonitorConfig
    )

    def get_healthcheck_uri(
        self, endpoint: "BaseServiceEndpoint"
    ) -> Optional[str]:
        """Get the healthcheck URI for the given service endpoint.

        Args:
            endpoint: service endpoint to get the healthcheck URI for

        Returns:
            The healthcheck URI for the given service endpoint or None, if
            the service endpoint doesn't have a healthcheck URI.
        """
        uri = endpoint.status.uri
        if not uri:
            return None
        if not self.config.healthcheck_uri_path:
            return uri
        return (
            f"{uri.rstrip('/')}/{self.config.healthcheck_uri_path.lstrip('/')}"
        )

    def check_endpoint_status(
        self, endpoint: "BaseServiceEndpoint"
    ) -> Tuple[ServiceState, str]:
        """Run a HTTP endpoint API healthcheck.

        Args:
            endpoint: service endpoint to check.

        Returns:
            The operational state of the external HTTP endpoint and an
            optional message describing that state (e.g. an error message,
            if an error is encountered while checking the HTTP endpoint
            status).
        """
        from zenml.services.service_endpoint import ServiceEndpointProtocol

        if endpoint.status.protocol not in [
            ServiceEndpointProtocol.HTTP,
            ServiceEndpointProtocol.HTTPS,
        ]:
            return (
                ServiceState.ERROR,
                "endpoint protocol is not HTTP nor HTTPS.",
            )

        check_uri = self.get_healthcheck_uri(endpoint)
        if not check_uri:
            return ServiceState.ERROR, "no HTTP healthcheck URI available"

        logger.debug("Running HTTP healthcheck for URI: %s", check_uri)

        try:
            if self.config.use_head_request:
                r = requests.head(
                    check_uri,
                    timeout=self.config.http_timeout,
                )
            else:
                r = requests.get(
                    check_uri,
                    timeout=self.config.http_timeout,
                )
            if r.status_code == self.config.http_status_code:
                # the endpoint is healthy
                return ServiceState.ACTIVE, ""
            error = f"HTTP endpoint healthcheck returned unexpected status code: {r.status_code}"
        except requests.ConnectionError as e:
            error = f"HTTP endpoint healthcheck connection error: {str(e)}"
        except requests.Timeout as e:
            error = f"HTTP endpoint healthcheck request timed out: {str(e)}"
        except requests.RequestException as e:
            error = (
                f"unexpected error encountered while running HTTP endpoint "
                f"healthcheck: {str(e)}"
            )

        return ServiceState.ERROR, error

check_endpoint_status(endpoint)

Run a HTTP endpoint API healthcheck.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `endpoint` | `BaseServiceEndpoint` | service endpoint to check. | required |

Returns:

| Type | Description |
| --- | --- |
| `Tuple[ServiceState, str]` | The operational state of the external HTTP endpoint and an optional message describing that state (e.g. an error message, if an error is encountered while checking the HTTP endpoint status). |

Source code in src/zenml/services/service_monitor.py
def check_endpoint_status(
    self, endpoint: "BaseServiceEndpoint"
) -> Tuple[ServiceState, str]:
    """Run a HTTP endpoint API healthcheck.

    Args:
        endpoint: service endpoint to check.

    Returns:
        The operational state of the external HTTP endpoint and an
        optional message describing that state (e.g. an error message,
        if an error is encountered while checking the HTTP endpoint
        status).
    """
    from zenml.services.service_endpoint import ServiceEndpointProtocol

    if endpoint.status.protocol not in [
        ServiceEndpointProtocol.HTTP,
        ServiceEndpointProtocol.HTTPS,
    ]:
        return (
            ServiceState.ERROR,
            "endpoint protocol is not HTTP nor HTTPS.",
        )

    check_uri = self.get_healthcheck_uri(endpoint)
    if not check_uri:
        return ServiceState.ERROR, "no HTTP healthcheck URI available"

    logger.debug("Running HTTP healthcheck for URI: %s", check_uri)

    try:
        if self.config.use_head_request:
            r = requests.head(
                check_uri,
                timeout=self.config.http_timeout,
            )
        else:
            r = requests.get(
                check_uri,
                timeout=self.config.http_timeout,
            )
        if r.status_code == self.config.http_status_code:
            # the endpoint is healthy
            return ServiceState.ACTIVE, ""
        error = f"HTTP endpoint healthcheck returned unexpected status code: {r.status_code}"
    except requests.ConnectionError as e:
        error = f"HTTP endpoint healthcheck connection error: {str(e)}"
    except requests.Timeout as e:
        error = f"HTTP endpoint healthcheck request timed out: {str(e)}"
    except requests.RequestException as e:
        error = (
            f"unexpected error encountered while running HTTP endpoint "
            f"healthcheck: {str(e)}"
        )

    return ServiceState.ERROR, error

get_healthcheck_uri(endpoint)

Get the healthcheck URI for the given service endpoint.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `endpoint` | `BaseServiceEndpoint` | service endpoint to get the healthcheck URI for | required |

Returns:

| Type | Description |
| --- | --- |
| `Optional[str]` | The healthcheck URI for the given service endpoint or None, if the service endpoint doesn't have a healthcheck URI. |

Source code in src/zenml/services/service_monitor.py
def get_healthcheck_uri(
    self, endpoint: "BaseServiceEndpoint"
) -> Optional[str]:
    """Get the healthcheck URI for the given service endpoint.

    Args:
        endpoint: service endpoint to get the healthcheck URI for

    Returns:
        The healthcheck URI for the given service endpoint or None, if
        the service endpoint doesn't have a healthcheck URI.
    """
    uri = endpoint.status.uri
    if not uri:
        return None
    if not self.config.healthcheck_uri_path:
        return uri
    return (
        f"{uri.rstrip('/')}/{self.config.healthcheck_uri_path.lstrip('/')}"
    )
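
To illustrate how the URI is composed (the endpoint URI and path values are illustrative, not ZenML output):

```python
monitor = HTTPEndpointHealthMonitor(
    config=HTTPEndpointHealthMonitorConfig(healthcheck_uri_path="health"),
)
# If endpoint.status.uri == "http://127.0.0.1:8080/", then:
#   monitor.get_healthcheck_uri(endpoint) == "http://127.0.0.1:8080/health"
# With an empty healthcheck_uri_path, the endpoint URI is returned as-is.
```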

HTTPEndpointHealthMonitorConfig

Bases: ServiceEndpointHealthMonitorConfig

HTTP service endpoint health monitor configuration.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `healthcheck_uri_path` | `str` | URI subpath to use to perform service endpoint healthchecks. If not set, the service endpoint URI will be used instead. |
| `use_head_request` | `bool` | set to True to use a HEAD request instead of a GET when calling the healthcheck URI. |
| `http_status_code` | `int` | HTTP status code to expect in the health check response. |
| `http_timeout` | `int` | HTTP health check request timeout in seconds. |

Source code in src/zenml/services/service_monitor.py
class HTTPEndpointHealthMonitorConfig(ServiceEndpointHealthMonitorConfig):
    """HTTP service endpoint health monitor configuration.

    Attributes:
        healthcheck_uri_path: URI subpath to use to perform service endpoint
            healthchecks. If not set, the service endpoint URI will be used
            instead.
        use_head_request: set to True to use a HEAD request instead of a GET
            when calling the healthcheck URI.
        http_status_code: HTTP status code to expect in the health check
            response.
        http_timeout: HTTP health check request timeout in seconds.
    """

    healthcheck_uri_path: str = ""
    use_head_request: bool = False
    http_status_code: int = 200
    http_timeout: int = DEFAULT_HTTP_HEALTHCHECK_TIMEOUT
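
A minimal sketch of configuring an HTTP health monitor with these options, assuming both classes are imported from the `zenml.services.service_monitor` module listed above (the values are illustrative):

```python
from zenml.services.service_monitor import (
    HTTPEndpointHealthMonitor,
    HTTPEndpointHealthMonitorConfig,
)

# check GET <endpoint URI>/health and expect a 200 response within 5 seconds
monitor = HTTPEndpointHealthMonitor(
    config=HTTPEndpointHealthMonitorConfig(
        healthcheck_uri_path="/health",
        use_head_request=False,
        http_status_code=200,
        http_timeout=5,
    )
)
```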

LocalDaemonService

Bases: BaseService

A service represented by a local daemon process.

This class extends the base service class with functionality concerning the life-cycle management and tracking of external services implemented as local daemon processes.

To define a local daemon service, subclass this class and implement the run method. Upon start, the service will spawn a daemon process that ends up calling the run method.

For example,


from zenml.services import ServiceType, LocalDaemonService, LocalDaemonServiceConfig
import time

class SleepingDaemonConfig(LocalDaemonServiceConfig):

    wake_up_after: int

class SleepingDaemon(LocalDaemonService):

    SERVICE_TYPE = ServiceType(
        name="sleeper",
        description="Sleeping daemon",
        type="daemon",
        flavor="sleeping",
    )
    config: SleepingDaemonConfig

    def run(self) -> None:
        time.sleep(self.config.wake_up_after)

daemon = SleepingDaemon(config=SleepingDaemonConfig(wake_up_after=10))
daemon.start()

NOTE: the SleepingDaemon class and its parent module have to be discoverable as part of a ZenML Integration, otherwise the daemon will fail with the following error:

TypeError: Cannot load service with unregistered service type:
name='sleeper' type='daemon' flavor='sleeping' description='Sleeping daemon'

Attributes:

Name Type Description
config LocalDaemonServiceConfig

service configuration

status LocalDaemonServiceStatus

service status

endpoint Optional[LocalDaemonServiceEndpoint]

optional service endpoint

Source code in src/zenml/services/local/local_service.py
class LocalDaemonService(BaseService):
    """A service represented by a local daemon process.

    This class extends the base service class with functionality concerning
    the life-cycle management and tracking of external services implemented as
    local daemon processes.

    To define a local daemon service, subclass this class and implement the
    `run` method. Upon `start`, the service will spawn a daemon process that
    ends up calling the `run` method.

    For example,

    ```python

    from zenml.services import ServiceType, LocalDaemonService, LocalDaemonServiceConfig
    import time

    class SleepingDaemonConfig(LocalDaemonServiceConfig):

        wake_up_after: int

    class SleepingDaemon(LocalDaemonService):

        SERVICE_TYPE = ServiceType(
            name="sleeper",
            description="Sleeping daemon",
            type="daemon",
            flavor="sleeping",
        )
        config: SleepingDaemonConfig

        def run(self) -> None:
            time.sleep(self.config.wake_up_after)

    daemon = SleepingDaemon(config=SleepingDaemonConfig(wake_up_after=10))
    daemon.start()
    ```

    NOTE: the `SleepingDaemon` class and its parent module have to be
    discoverable as part of a ZenML `Integration`, otherwise the daemon will
    fail with the following error:

    ```
    TypeError: Cannot load service with unregistered service type:
    name='sleeper' type='daemon' flavor='sleeping' description='Sleeping daemon'
    ```


    Attributes:
        config: service configuration
        status: service status
        endpoint: optional service endpoint
    """

    config: LocalDaemonServiceConfig = Field(
        default_factory=LocalDaemonServiceConfig
    )
    status: LocalDaemonServiceStatus = Field(
        default_factory=LocalDaemonServiceStatus
    )
    # TODO [ENG-705]: allow multiple endpoints per service
    endpoint: Optional[LocalDaemonServiceEndpoint] = None

    def get_service_status_message(self) -> str:
        """Get a message about the current operational state of the service.

        Returns:
            A message providing information about the current operational
            state of the service.
        """
        msg = super().get_service_status_message()
        pid = self.status.pid
        if pid:
            msg += f"  Daemon PID: `{self.status.pid}`\n"
        if self.status.log_file:
            msg += (
                f"For more information on the service status, please see the "
                f"following log file: {self.status.log_file}\n"
            )
        return msg

    def check_status(self) -> Tuple[ServiceState, str]:
        """Check the the current operational state of the daemon process.

        Returns:
            The operational state of the daemon process and a message
            providing additional information about that state (e.g. a
            description of the error, if one is encountered).
        """
        if not self.status.pid:
            return ServiceState.INACTIVE, "service daemon is not running"

        # the daemon is running
        return ServiceState.ACTIVE, ""

    def _get_daemon_cmd(self) -> Tuple[List[str], Dict[str, str]]:
        """Get the command to run the service daemon.

        The default implementation provided by this class is the following:

          * this LocalDaemonService instance and its configuration
          are serialized as JSON and saved to a temporary file
          * the local_daemon_entrypoint.py script is launched as a subprocess
          and pointed to the serialized service file
          * the entrypoint script re-creates the LocalDaemonService instance
          from the serialized configuration, reconfigures itself as a daemon
          and detaches itself from the parent process, then calls the `run`
          method that must be implemented by the subclass

        Subclasses that need a different command to launch the service daemon
        should override this method.

        Returns:
            Command needed to launch the daemon process and the environment
            variables to set, in the formats accepted by subprocess.Popen.
        """
        # to avoid circular imports, import here
        import zenml.services.local.local_daemon_entrypoint as daemon_entrypoint

        self.status.silent_daemon = self.config.silent_daemon
        # reuse the config file and logfile location from a previous run,
        # if available
        if not self.status.runtime_path or not os.path.exists(
            self.status.runtime_path
        ):
            if self.config.root_runtime_path:
                if self.config.singleton:
                    self.status.runtime_path = self.config.root_runtime_path
                else:
                    self.status.runtime_path = os.path.join(
                        self.config.root_runtime_path,
                        str(self.uuid),
                    )
                create_dir_recursive_if_not_exists(self.status.runtime_path)
            else:
                self.status.runtime_path = tempfile.mkdtemp(
                    prefix="zenml-service-"
                )

        assert self.status.config_file is not None
        assert self.status.pid_file is not None

        with open(self.status.config_file, "w") as f:
            f.write(self.model_dump_json(indent=4))

        # delete the previous PID file, in case a previous daemon process
        # crashed and left a stale PID file
        if os.path.exists(self.status.pid_file):
            os.remove(self.status.pid_file)

        command = [
            sys.executable,
            "-m",
            daemon_entrypoint.__name__,
            "--config-file",
            self.status.config_file,
            "--pid-file",
            self.status.pid_file,
        ]
        if self.status.log_file:
            pathlib.Path(self.status.log_file).touch()
            command += ["--log-file", self.status.log_file]

        command_env = os.environ.copy()

        return command, command_env

    def _start_daemon(self) -> None:
        """Start the service daemon process associated with this service."""
        pid = self.status.pid
        if pid:
            # service daemon is already running
            logger.debug(
                "Daemon process for service '%s' is already running with PID %d",
                self,
                pid,
            )
            return

        logger.debug("Starting daemon for service '%s'...", self)

        if self.endpoint:
            self.endpoint.prepare_for_start()

        command, command_env = self._get_daemon_cmd()
        logger.debug(
            "Running command to start daemon for service '%s': %s",
            self,
            " ".join(command),
        )
        p = subprocess.Popen(command, env=command_env)
        p.wait()
        pid = self.status.pid
        if pid:
            logger.debug(
                "Daemon process for service '%s' started with PID: %d",
                self,
                pid,
            )
        else:
            logger.error(
                "Daemon process for service '%s' failed to start.",
                self,
            )

    def _stop_daemon(self, force: bool = False) -> None:
        """Stop the service daemon process associated with this service.

        Args:
            force: if True, the service daemon will be forcefully stopped
        """
        pid = self.status.pid
        if not pid:
            # service daemon is not running
            logger.debug(
                "Daemon process for service '%s' no longer running",
                self,
            )
            return

        logger.debug("Stopping daemon for service '%s' ...", self)
        try:
            p = psutil.Process(pid)
        except psutil.Error:
            logger.error(
                "Could not find process for service '%s' ...",
                self,
            )
            return
        if force:
            p.kill()
        else:
            p.terminate()

    def provision(self) -> None:
        """Provision the service."""
        self._start_daemon()

    def deprovision(self, force: bool = False) -> None:
        """Deprovision the service.

        Args:
            force: if True, the service daemon will be forcefully stopped
        """
        self._stop_daemon(force)

    def start(self, timeout: int = 0) -> None:
        """Start the service and optionally wait for it to become active.

        Args:
            timeout: amount of time to wait for the service to become active.
                If set to 0, the method will return immediately after checking
                the service status.
        """
        if not self.config.blocking:
            super().start(timeout)
        else:
            self.run()

    def get_logs(
        self, follow: bool = False, tail: Optional[int] = None
    ) -> Generator[str, bool, None]:
        """Retrieve the service logs.

        Args:
            follow: if True, the logs will be streamed as they are written
            tail: only retrieve the last NUM lines of log output.

        Yields:
            A generator that can be accessed to get the service logs.
        """
        if not self.status.log_file or not os.path.exists(
            self.status.log_file
        ):
            return

        with open(self.status.log_file, "r") as f:
            if tail:
                # TODO[ENG-864]: implement a more efficient tailing mechanism that
                #   doesn't read the entire file
                lines = f.readlines()[-tail:]
                for line in lines:
                    yield line.rstrip("\n")
                if not follow:
                    return
            line = ""
            while True:
                partial_line = f.readline()
                if partial_line:
                    line += partial_line
                    if line.endswith("\n"):
                        stop = yield line.rstrip("\n")
                        if stop:
                            break
                        line = ""
                elif follow:
                    time.sleep(1)
                else:
                    break

    @abstractmethod
    def run(self) -> None:
        """Run the service daemon process associated with this service.

        Subclasses must implement this method to provide the service daemon
        functionality. This method will be executed in the context of the
        running daemon, not in the context of the process that calls the
        `start` method.
        """

check_status()

Check the current operational state of the daemon process.

Returns:

Type Description
Tuple[ServiceState, str]

The operational state of the daemon process and a message providing additional information about that state (e.g. a description of the error, if one is encountered).

Source code in src/zenml/services/local/local_service.py
def check_status(self) -> Tuple[ServiceState, str]:
    """Check the the current operational state of the daemon process.

    Returns:
        The operational state of the daemon process and a message
        providing additional information about that state (e.g. a
        description of the error, if one is encountered).
    """
    if not self.status.pid:
        return ServiceState.INACTIVE, "service daemon is not running"

    # the daemon is running
    return ServiceState.ACTIVE, ""

deprovision(force=False)

Deprovision the service.

Parameters:

Name Type Description Default
force bool

if True, the service daemon will be forcefully stopped

False
Source code in src/zenml/services/local/local_service.py
def deprovision(self, force: bool = False) -> None:
    """Deprovision the service.

    Args:
        force: if True, the service daemon will be forcefully stopped
    """
    self._stop_daemon(force)

get_logs(follow=False, tail=None)

Retrieve the service logs.

Parameters:

Name Type Description Default
follow bool

if True, the logs will be streamed as they are written

False
tail Optional[int]

only retrieve the last NUM lines of log output.

None

Yields:

Type Description
str

A generator that can be accessed to get the service logs.

Source code in src/zenml/services/local/local_service.py
def get_logs(
    self, follow: bool = False, tail: Optional[int] = None
) -> Generator[str, bool, None]:
    """Retrieve the service logs.

    Args:
        follow: if True, the logs will be streamed as they are written
        tail: only retrieve the last NUM lines of log output.

    Yields:
        A generator that can be accessed to get the service logs.
    """
    if not self.status.log_file or not os.path.exists(
        self.status.log_file
    ):
        return

    with open(self.status.log_file, "r") as f:
        if tail:
            # TODO[ENG-864]: implement a more efficient tailing mechanism that
            #   doesn't read the entire file
            lines = f.readlines()[-tail:]
            for line in lines:
                yield line.rstrip("\n")
            if not follow:
                return
        line = ""
        while True:
            partial_line = f.readline()
            if partial_line:
                line += partial_line
                if line.endswith("\n"):
                    stop = yield line.rstrip("\n")
                    if stop:
                        break
                    line = ""
            elif follow:
                time.sleep(1)
            else:
                break
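
A brief usage sketch, assuming `daemon` is a started `LocalDaemonService` instance such as the `SleepingDaemon` from the example above:

```python
# print the last 20 lines of the daemon logfile and return
for line in daemon.get_logs(tail=20):
    print(line)

# with follow=True the generator blocks and keeps streaming new lines as they
# are written; it also accepts a True value sent back into it to stop the stream
for line in daemon.get_logs(follow=True):
    print(line)
```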

get_service_status_message()

Get a message about the current operational state of the service.

Returns:

Type Description
str

A message providing information about the current operational state of the service.

Source code in src/zenml/services/local/local_service.py
def get_service_status_message(self) -> str:
    """Get a message about the current operational state of the service.

    Returns:
        A message providing information about the current operational
        state of the service.
    """
    msg = super().get_service_status_message()
    pid = self.status.pid
    if pid:
        msg += f"  Daemon PID: `{self.status.pid}`\n"
    if self.status.log_file:
        msg += (
            f"For more information on the service status, please see the "
            f"following log file: {self.status.log_file}\n"
        )
    return msg

provision()

Provision the service.

Source code in src/zenml/services/local/local_service.py
def provision(self) -> None:
    """Provision the service."""
    self._start_daemon()

run() abstractmethod

Run the service daemon process associated with this service.

Subclasses must implement this method to provide the service daemon functionality. This method will be executed in the context of the running daemon, not in the context of the process that calls the start method.

Source code in src/zenml/services/local/local_service.py
@abstractmethod
def run(self) -> None:
    """Run the service daemon process associated with this service.

    Subclasses must implement this method to provide the service daemon
    functionality. This method will be executed in the context of the
    running daemon, not in the context of the process that calls the
    `start` method.
    """

start(timeout=0)

Start the service and optionally wait for it to become active.

Parameters:

Name Type Description Default
timeout int

amount of time to wait for the service to become active. If set to 0, the method will return immediately after checking the service status.

0
Source code in src/zenml/services/local/local_service.py
def start(self, timeout: int = 0) -> None:
    """Start the service and optionally wait for it to become active.

    Args:
        timeout: amount of time to wait for the service to become active.
            If set to 0, the method will return immediately after checking
            the service status.
    """
    if not self.config.blocking:
        super().start(timeout)
    else:
        self.run()

LocalDaemonServiceConfig

Bases: ServiceConfig

Local daemon service configuration.

Attributes:

Name Type Description
silent_daemon bool

set to True to suppress the output of the daemon (i.e. redirect stdout and stderr to /dev/null). If False, the daemon output will be redirected to a logfile.

root_runtime_path Optional[str]

the root path where the service daemon will store service configuration files

singleton bool

set to True to store the service daemon configuration files directly in the root_runtime_path directory instead of creating a subdirectory for each service instance. Only has effect if the root_runtime_path is also set.

blocking bool

set to True to run the service in the context of the current process and block until the service is stopped, instead of running the service as a daemon process. Useful for operating systems that do not support daemon processes.

Source code in src/zenml/services/local/local_service.py
class LocalDaemonServiceConfig(ServiceConfig):
    """Local daemon service configuration.

    Attributes:
        silent_daemon: set to True to suppress the output of the daemon
            (i.e. redirect stdout and stderr to /dev/null). If False, the
            daemon output will be redirected to a logfile.
        root_runtime_path: the root path where the service daemon will store
            service configuration files
        singleton: set to True to store the service daemon configuration files
            directly in the `root_runtime_path` directory instead of creating
            a subdirectory for each service instance. Only has effect if the
            `root_runtime_path` is also set.
        blocking: set to True to run the service in the context of the current
            process and block until the service is stopped instead of running
            the service as a daemon process. Useful for operating systems
            that do not support daemon processes.
    """

    silent_daemon: bool = False
    root_runtime_path: Optional[str] = None
    singleton: bool = False
    blocking: bool = False
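
A small sketch extending the `SleepingDaemon` example above with these options; the runtime path is hypothetical, and a `name` is set because `ServiceConfig` (documented further below) expects either a `name` or a `model_name`:

```python
config = SleepingDaemonConfig(
    name="sleeper",
    wake_up_after=10,
    silent_daemon=True,                      # discard the daemon output instead of keeping a logfile
    root_runtime_path="/tmp/zenml-daemons",  # hypothetical location for the runtime files
    singleton=False,                         # one subdirectory per service instance
    blocking=False,                          # run as a detached daemon process
)
daemon = SleepingDaemon(config=config)
daemon.start()
```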

LocalDaemonServiceEndpoint

Bases: BaseServiceEndpoint

A service endpoint exposed by a local daemon process.

This class extends the base service endpoint class with functionality concerning the life-cycle management and tracking of endpoints exposed by external services implemented as local daemon processes.

Attributes:

Name Type Description
config LocalDaemonServiceEndpointConfig

service endpoint configuration

status LocalDaemonServiceEndpointStatus

service endpoint status

monitor Optional[Union[HTTPEndpointHealthMonitor, TCPEndpointHealthMonitor]]

optional service endpoint health monitor

Source code in src/zenml/services/local/local_service_endpoint.py
class LocalDaemonServiceEndpoint(BaseServiceEndpoint):
    """A service endpoint exposed by a local daemon process.

    This class extends the base service endpoint class with functionality
    concerning the life-cycle management and tracking of endpoints exposed
    by external services implemented as local daemon processes.

    Attributes:
        config: service endpoint configuration
        status: service endpoint status
        monitor: optional service endpoint health monitor
    """

    config: LocalDaemonServiceEndpointConfig = Field(
        default_factory=LocalDaemonServiceEndpointConfig
    )
    status: LocalDaemonServiceEndpointStatus = Field(
        default_factory=LocalDaemonServiceEndpointStatus
    )
    monitor: Optional[
        Union[HTTPEndpointHealthMonitor, TCPEndpointHealthMonitor]
    ] = Field(..., discriminator="type", union_mode="left_to_right")

    def _lookup_free_port(self) -> int:
        """Search for a free TCP port for the service endpoint.

        If a preferred TCP port value is explicitly requested through the
        endpoint configuration, it will be checked first. If a port was
        previously used the last time the service was running (i.e. as
        indicated in the service endpoint status), it will be checked next for
        availability.

        As a last resort, this call will search for a free TCP port, if
        `allocate_port` is set to True in the endpoint configuration.

        Returns:
            An available TCP port number

        Raises:
            IOError: if the preferred TCP port is busy and `allocate_port` is
                disabled in the endpoint configuration, or if no free TCP port
                could be otherwise allocated.
        """
        # If a port value is explicitly configured, attempt to use it first
        if self.config.port:
            if port_available(self.config.port, self.config.ip_address):
                return self.config.port
            if not self.config.allocate_port:
                raise IOError(f"TCP port {self.config.port} is not available.")

        # Attempt to reuse the port used when the service was last running
        if self.status.port and port_available(self.status.port):
            return self.status.port

        port = scan_for_available_port()
        if port:
            return port
        raise IOError("No free TCP ports found")

    def prepare_for_start(self) -> None:
        """Prepare the service endpoint for starting.

        This method is called before the service is started.
        """
        self.status.protocol = self.config.protocol
        self.status.hostname = self.config.ip_address
        self.status.port = self._lookup_free_port()

prepare_for_start()

Prepare the service endpoint for starting.

This method is called before the service is started.

Source code in src/zenml/services/local/local_service_endpoint.py
def prepare_for_start(self) -> None:
    """Prepare the service endpoint for starting.

    This method is called before the service is started.
    """
    self.status.protocol = self.config.protocol
    self.status.hostname = self.config.ip_address
    self.status.port = self._lookup_free_port()

LocalDaemonServiceEndpointConfig

Bases: ServiceEndpointConfig

Local daemon service endpoint configuration.

Attributes:

Name Type Description
protocol ServiceEndpointProtocol

the TCP protocol implemented by the service endpoint

port Optional[int]

preferred TCP port value for the service endpoint. If the port is in use when the service is started, setting allocate_port to True will also try to allocate a new port value, otherwise an exception will be raised.

ip_address str

the IP address of the service endpoint. If not set, the default localhost IP address will be used.

allocate_port bool

set to True to allocate a free TCP port for the service endpoint automatically.

Source code in src/zenml/services/local/local_service_endpoint.py
class LocalDaemonServiceEndpointConfig(ServiceEndpointConfig):
    """Local daemon service endpoint configuration.

    Attributes:
        protocol: the TCP protocol implemented by the service endpoint
        port: preferred TCP port value for the service endpoint. If the port
            is in use when the service is started, setting `allocate_port` to
            True will also try to allocate a new port value, otherwise an
            exception will be raised.
        ip_address: the IP address of the service endpoint. If not set, the
            default localhost IP address will be used.
        allocate_port: set to True to allocate a free TCP port for the
            service endpoint automatically.
    """

    protocol: ServiceEndpointProtocol = ServiceEndpointProtocol.TCP
    port: Optional[int] = None
    ip_address: str = DEFAULT_LOCAL_SERVICE_IP_ADDRESS
    allocate_port: bool = True
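
A minimal sketch wiring this endpoint configuration to a local daemon endpoint with an HTTP health monitor, assuming the import paths match the source locations listed in this section:

```python
from zenml.services.local.local_service_endpoint import (
    LocalDaemonServiceEndpoint,
    LocalDaemonServiceEndpointConfig,
)
from zenml.services.service_endpoint import ServiceEndpointProtocol
from zenml.services.service_monitor import (
    HTTPEndpointHealthMonitor,
    HTTPEndpointHealthMonitorConfig,
)

endpoint = LocalDaemonServiceEndpoint(
    config=LocalDaemonServiceEndpointConfig(
        protocol=ServiceEndpointProtocol.HTTP,
        port=8080,          # preferred port; a free one is looked up if it is busy
        allocate_port=True,
    ),
    monitor=HTTPEndpointHealthMonitor(
        config=HTTPEndpointHealthMonitorConfig(healthcheck_uri_path="/health"),
    ),
)
```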

LocalDaemonServiceEndpointStatus

Bases: ServiceEndpointStatus

Local daemon service endpoint status.

Source code in src/zenml/services/local/local_service_endpoint.py
class LocalDaemonServiceEndpointStatus(ServiceEndpointStatus):
    """Local daemon service endpoint status."""

LocalDaemonServiceStatus

Bases: ServiceStatus

Local daemon service status.

Attributes:

Name Type Description
runtime_path Optional[str]

the path where the service daemon runtime files (the configuration file used to start the service daemon and the logfile) are located

silent_daemon bool

flag indicating whether the output of the daemon is suppressed (redirected to /dev/null).

Source code in src/zenml/services/local/local_service.py
class LocalDaemonServiceStatus(ServiceStatus):
    """Local daemon service status.

    Attributes:
        runtime_path: the path where the service daemon runtime files (the
            configuration file used to start the service daemon and the
            logfile) are located
        silent_daemon: flag indicating whether the output of the daemon
            is suppressed (redirected to /dev/null).
    """

    runtime_path: Optional[str] = None
    # TODO [ENG-704]: remove field duplication between XServiceStatus and
    #   XServiceConfig (e.g. keep a private reference to the config in the
    #   status)
    silent_daemon: bool = False

    @property
    def config_file(self) -> Optional[str]:
        """Get the path to the configuration file used to start the service daemon.

        Returns:
            The path to the configuration file, or None, if the
            service has never been started before.
        """
        if not self.runtime_path:
            return None
        return os.path.join(self.runtime_path, SERVICE_DAEMON_CONFIG_FILE_NAME)

    @property
    def log_file(self) -> Optional[str]:
        """Get the path to the log file where the service output is/has been logged.

        Returns:
            The path to the log file, or None, if the service has never been
            started before, or if the service daemon output is suppressed.
        """
        if not self.runtime_path or self.silent_daemon:
            return None
        return os.path.join(self.runtime_path, SERVICE_DAEMON_LOG_FILE_NAME)

    @property
    def pid_file(self) -> Optional[str]:
        """Get the path to a daemon PID file.

        This is where the last known PID of the daemon process is stored.

        Returns:
            The path to the PID file, or None, if the service has never been
            started before.
        """
        if not self.runtime_path or self.silent_daemon:
            return None
        return os.path.join(self.runtime_path, SERVICE_DAEMON_PID_FILE_NAME)

    @property
    def pid(self) -> Optional[int]:
        """Return the PID of the currently running daemon.

        Returns:
            The PID of the daemon, or None, if the service has never been
            started before.
        """
        pid_file = self.pid_file
        if not pid_file:
            return None
        if sys.platform == "win32":
            logger.warning(
                "Daemon functionality is currently not supported on Windows."
            )
            return None
        else:
            import zenml.services.local.local_daemon_entrypoint as daemon_entrypoint
            from zenml.utils.daemon import get_daemon_pid_if_running

            logger.debug(f"Checking PID file {pid_file}.")

            pid = get_daemon_pid_if_running(pid_file)

            if not pid:
                logger.debug(
                    f"Process with PID file {pid_file} is no longer running."
                )
                return None

            # let's be extra careful here and check that the PID really
            # belongs to a process that is a local ZenML daemon.
            # this avoids the situation where a PID file is left over from
            # a previous daemon run, but another process is using the same
            # PID.
            try:
                p = psutil.Process(pid)
                cmd_line = p.cmdline()

                # Empty cmd_line implies no process
                if not cmd_line:
                    logger.debug(f"Process with PID {pid} not found!")
                    return None

                config_file = self.config_file
                if (
                    config_file is not None
                    and (
                        daemon_entrypoint.__name__ not in cmd_line
                        or config_file not in cmd_line
                    )
                    and (
                        daemon_entrypoint.__name__ not in cmd_line[0]
                        or config_file not in cmd_line[0]
                    )
                ):
                    logger.warning(
                        f"Process with PID {pid} is not a ZenML local daemon "
                        f"service."
                    )
                return pid
            except NoSuchProcess:
                return None

config_file property

Get the path to the configuration file used to start the service daemon.

Returns:

Type Description
Optional[str]

The path to the configuration file, or None, if the service has never been started before.

log_file property

Get the path to the log file where the service output is/has been logged.

Returns:

Type Description
Optional[str]

The path to the log file, or None, if the service has never been started before, or if the service daemon output is suppressed.

pid property

Return the PID of the currently running daemon.

Returns:

Type Description
Optional[int]

The PID of the daemon, or None, if the service has never been started before.

pid_file property

Get the path to a daemon PID file.

This is where the last known PID of the daemon process is stored.

Returns:

Type Description
Optional[str]

The path to the PID file, or None, if the service has never been started before.
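
A short sketch of how these paths derive from `runtime_path`; the directory is hypothetical and the file names come from the `SERVICE_DAEMON_*_FILE_NAME` constants referenced in the source above:

```python
from zenml.services.local.local_service import LocalDaemonServiceStatus

status = LocalDaemonServiceStatus(runtime_path="/tmp/zenml-service-example")
print(status.config_file)  # <runtime_path>/<SERVICE_DAEMON_CONFIG_FILE_NAME>
print(status.log_file)     # <runtime_path>/<SERVICE_DAEMON_LOG_FILE_NAME>, or None if silent_daemon is set
print(status.pid_file)     # <runtime_path>/<SERVICE_DAEMON_PID_FILE_NAME>
```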

ServiceConfig

Bases: BaseTypedModel

Generic service configuration.

Concrete service classes should extend this class and add additional attributes that they want to see reflected and used in the service configuration.

Attributes:

Name Type Description
name str

name for the service instance

description str

description of the service

pipeline_name str

name of the pipeline that spun up the service

pipeline_step_name str

name of the pipeline step that spun up the service

run_name str

name of the pipeline run that spun up the service.

Source code in src/zenml/services/service.py
class ServiceConfig(BaseTypedModel):
    """Generic service configuration.

    Concrete service classes should extend this class and add additional
    attributes that they want to see reflected and used in the service
    configuration.

    Attributes:
        name: name for the service instance
        description: description of the service
        pipeline_name: name of the pipeline that spun up the service
        pipeline_step_name: name of the pipeline step that spun up the service
        run_name: name of the pipeline run that spun up the service.
    """

    name: str = ""
    description: str = ""
    pipeline_name: str = ""
    pipeline_step_name: str = ""
    model_name: str = ""
    model_version: str = ""
    service_name: str = ""

    # TODO: In Pydantic v2, the `model_` is a protected namespaces for all
    #  fields defined under base models. If not handled, this raises a warning.
    #  It is possible to suppress this warning message with the following
    #  configuration, however the ultimate solution is to rename these fields.
    #  Even though they do not cause any problems right now, if we are not
    #  careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

    def __init__(self, **data: Any):
        """Initialize the service configuration.

        Args:
            **data: keyword arguments.

        Raises:
            ValueError: if neither 'name' nor 'model_name' is set.
        """
        super().__init__(**data)
        if self.name or self.model_name:
            self.service_name = data.get(
                "service_name",
                f"{ZENM_ENDPOINT_PREFIX}{self.name or self.model_name}",
            )
        else:
            raise ValueError("Either 'name' or 'model_name' must be set.")

    def get_service_labels(self) -> Dict[str, str]:
        """Get the service labels.

        Returns:
            a dictionary of service labels.
        """
        labels = {}
        for k, v in self.model_dump().items():
            label = f"zenml_{k}".upper()
            labels[label] = str(v)
        return labels

__init__(**data)

Initialize the service configuration.

Parameters:

Name Type Description Default
**data Any

keyword arguments.

{}

Raises:

Type Description
ValueError

if neither 'name' nor 'model_name' is set.

Source code in src/zenml/services/service.py
def __init__(self, **data: Any):
    """Initialize the service configuration.

    Args:
        **data: keyword arguments.

    Raises:
        ValueError: if neither 'name' nor 'model_name' is set.
    """
    super().__init__(**data)
    if self.name or self.model_name:
        self.service_name = data.get(
            "service_name",
            f"{ZENM_ENDPOINT_PREFIX}{self.name or self.model_name}",
        )
    else:
        raise ValueError("Either 'name' or 'model_name' must be set.")

get_service_labels()

Get the service labels.

Returns:

Type Description
Dict[str, str]

a dictionary of service labels.

Source code in src/zenml/services/service.py
def get_service_labels(self) -> Dict[str, str]:
    """Get the service labels.

    Returns:
        a dictionary of service labels.
    """
    labels = {}
    for k, v in self.model_dump().items():
        label = f"zenml_{k}".upper()
        labels[label] = str(v)
    return labels
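
A small usage sketch; the names are illustrative, and the exact service name prefix comes from the `ZENM_ENDPOINT_PREFIX` constant used in `__init__`:

```python
from zenml.services.service import ServiceConfig

config = ServiceConfig(
    name="sklearn-deployment",
    pipeline_name="training_pipeline",
    pipeline_step_name="model_deployer",
)
print(config.service_name)          # endpoint prefix followed by "sklearn-deployment"
print(config.get_service_labels())  # e.g. {"ZENML_NAME": "sklearn-deployment", ...}

# omitting both `name` and `model_name` raises a ValueError
```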

ServiceEndpointConfig

Bases: BaseTypedModel

Generic service endpoint configuration.

Concrete service classes should extend this class and add additional attributes that they want to see reflected and use in the endpoint configuration.

Attributes:

Name Type Description
name str

unique name for the service endpoint

description str

description of the service endpoint

Source code in src/zenml/services/service_endpoint.py
class ServiceEndpointConfig(BaseTypedModel):
    """Generic service endpoint configuration.

    Concrete service classes should extend this class and add additional
    attributes that they want to see reflected and use in the endpoint
    configuration.

    Attributes:
        name: unique name for the service endpoint
        description: description of the service endpoint
    """

    name: str = ""
    description: str = ""

ServiceEndpointHealthMonitorConfig

Bases: BaseTypedModel

Generic service health monitor configuration.

Concrete service classes should extend this class and add additional attributes that they want to see reflected and use in the health monitor configuration.

Source code in src/zenml/services/service_monitor.py
class ServiceEndpointHealthMonitorConfig(BaseTypedModel):
    """Generic service health monitor configuration.

    Concrete service classes should extend this class and add additional
    attributes that they want to see reflected and use in the health monitor
    configuration.
    """

ServiceEndpointProtocol

Bases: StrEnum

Possible endpoint protocol values.

Source code in src/zenml/services/service_endpoint.py
class ServiceEndpointProtocol(StrEnum):
    """Possible endpoint protocol values."""

    TCP = "tcp"
    HTTP = "http"
    HTTPS = "https"

ServiceEndpointStatus

Bases: ServiceStatus

Status information describing the operational state of a service endpoint.

For example, this could be a HTTP/HTTPS API or generic TCP endpoint exposed by a service. Concrete service classes should extend this class and add additional attributes that make up the operational state of the service endpoint.

Attributes:

Name Type Description
protocol ServiceEndpointProtocol

the TCP protocol used by the service endpoint

hostname Optional[str]

the hostname where the service endpoint is accessible

port Optional[int]

the current TCP port where the service endpoint is accessible

Source code in src/zenml/services/service_endpoint.py
class ServiceEndpointStatus(ServiceStatus):
    """Status information describing the operational state of a service endpoint.

    For example, this could be a HTTP/HTTPS API or generic TCP endpoint exposed
    by a service. Concrete service classes should extend this class and add
    additional attributes that make up the operational state of the service
    endpoint.

    Attributes:
        protocol: the TCP protocol used by the service endpoint
        hostname: the hostname where the service endpoint is accessible
        port: the current TCP port where the service endpoint is accessible
    """

    protocol: ServiceEndpointProtocol = ServiceEndpointProtocol.TCP
    hostname: Optional[str] = None
    port: Optional[int] = None

    @property
    def uri(self) -> Optional[str]:
        """Get the URI of the service endpoint.

        Returns:
            The URI of the service endpoint or None, if the service endpoint
            operational status doesn't have the required information.
        """
        if not self.hostname or not self.port or not self.protocol:
            # the service is not yet in a state in which the endpoint hostname
            # port and protocol are known
            return None

        hostname = self.hostname
        if hostname == "0.0.0.0":  # nosec
            hostname = DEFAULT_LOCAL_SERVICE_IP_ADDRESS

        return f"{self.protocol.value}://{hostname}:{self.port}"

uri property

Get the URI of the service endpoint.

Returns:

Type Description
Optional[str]

The URI of the service endpoint or None, if the service endpoint operational status doesn't have the required information.
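
A quick sketch of the URI composition (values are illustrative):

```python
from zenml.services.service_endpoint import (
    ServiceEndpointProtocol,
    ServiceEndpointStatus,
)

status = ServiceEndpointStatus(
    protocol=ServiceEndpointProtocol.HTTP,
    hostname="127.0.0.1",
    port=8080,
)
print(status.uri)  # -> "http://127.0.0.1:8080"

# a hostname of "0.0.0.0" would be replaced with the default local IP address
```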

ServiceStatus

Bases: BaseTypedModel

Information about the status of a service or process.

This information describes the operational status of an external process or service tracked by ZenML. This could be a process, container, Kubernetes deployment etc.

Concrete service classes should extend this class and add additional attributes that make up the operational state of the service.

Attributes:

Name Type Description
state ServiceState

the current operational state

last_state ServiceState

the operational state prior to the last status update

last_error str

the error encountered during the last status update

Source code in src/zenml/services/service_status.py
class ServiceStatus(BaseTypedModel):
    """Information about the status of a service or process.

    This information describes the operational status of an external process or
    service tracked by ZenML. This could be a process, container, Kubernetes
    deployment etc.

    Concrete service classes should extend this class and add additional
    attributes that make up the operational state of the service.

    Attributes:
        state: the current operational state
        last_state: the operational state prior to the last status update
        last_error: the error encountered during the last status update
    """

    state: ServiceState = ServiceState.INACTIVE
    last_state: ServiceState = ServiceState.INACTIVE
    last_error: str = ""

    def update_state(
        self,
        new_state: Optional[ServiceState] = None,
        error: str = "",
    ) -> None:
        """Update the current operational state to reflect a new state value and/or error.

        Args:
            new_state: new operational state discovered by the last service
                status update
            error: error message describing an operational failure encountered
                during the last service status update
        """
        if new_state and self.state != new_state:
            self.last_state = self.state
            self.state = new_state
        if error:
            self.last_error = error

    def clear_error(self) -> None:
        """Clear the last error message."""
        self.last_error = ""

clear_error()

Clear the last error message.

Source code in src/zenml/services/service_status.py
def clear_error(self) -> None:
    """Clear the last error message."""
    self.last_error = ""

update_state(new_state=None, error='')

Update the current operational state to reflect a new state value and/or error.

Parameters:

Name Type Description Default
new_state Optional[ServiceState]

new operational state discovered by the last service status update

None
error str

error message describing an operational failure encountered during the last service status update

''
Source code in src/zenml/services/service_status.py
def update_state(
    self,
    new_state: Optional[ServiceState] = None,
    error: str = "",
) -> None:
    """Update the current operational state to reflect a new state value and/or error.

    Args:
        new_state: new operational state discovered by the last service
            status update
        error: error message describing an operational failure encountered
            during the last service status update
    """
    if new_state and self.state != new_state:
        self.last_state = self.state
        self.state = new_state
    if error:
        self.last_error = error
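
A short sketch of the state-tracking behavior, assuming `ServiceState` is importable from `zenml.services` alongside `ServiceType` (as in the daemon example earlier in this section):

```python
from zenml.services import ServiceState
from zenml.services.service_status import ServiceStatus

status = ServiceStatus()                   # state defaults to INACTIVE
status.update_state(ServiceState.ACTIVE)
assert status.state == ServiceState.ACTIVE
assert status.last_state == ServiceState.INACTIVE

status.update_state(error="healthcheck failed")  # state unchanged, error recorded
assert status.last_error == "healthcheck failed"
status.clear_error()
```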

TCPEndpointHealthMonitor

Bases: BaseServiceEndpointHealthMonitor

TCP service endpoint health monitor.

Attributes:

Name Type Description
config TCPEndpointHealthMonitorConfig

health monitor configuration for TCP endpoint

Source code in src/zenml/services/service_monitor.py
class TCPEndpointHealthMonitor(BaseServiceEndpointHealthMonitor):
    """TCP service endpoint health monitor.

    Attributes:
        config: health monitor configuration for TCP endpoint
    """

    config: TCPEndpointHealthMonitorConfig

    def check_endpoint_status(
        self, endpoint: "BaseServiceEndpoint"
    ) -> Tuple[ServiceState, str]:
        """Run a TCP endpoint healthcheck.

        Args:
            endpoint: service endpoint to check.

        Returns:
            The operational state of the external TCP endpoint and an
            optional message describing that state (e.g. an error message,
            if an error is encountered while checking the TCP endpoint
            status).
        """
        if not endpoint.status.port or not endpoint.status.hostname:
            return (
                ServiceState.ERROR,
                "TCP port and hostname values are not known",
            )

        logger.debug(
            "Running TCP healthcheck for TCP port: %d", endpoint.status.port
        )

        if port_is_open(endpoint.status.hostname, endpoint.status.port):
            # the endpoint is healthy
            return ServiceState.ACTIVE, ""

        return (
            ServiceState.ERROR,
            "TCP endpoint healthcheck error: TCP port is not "
            "open or not accessible",
        )

check_endpoint_status(endpoint)

Run a TCP endpoint healthcheck.

Parameters:

Name Type Description Default
endpoint BaseServiceEndpoint

service endpoint to check.

required

Returns:

Type Description
Tuple[ServiceState, str]

The operational state of the external TCP endpoint and an optional message describing that state (e.g. an error message, if an error is encountered while checking the TCP endpoint status).

Source code in src/zenml/services/service_monitor.py
def check_endpoint_status(
    self, endpoint: "BaseServiceEndpoint"
) -> Tuple[ServiceState, str]:
    """Run a TCP endpoint healthcheck.

    Args:
        endpoint: service endpoint to check.

    Returns:
        The operational state of the external TCP endpoint and an
        optional message describing that state (e.g. an error message,
        if an error is encountered while checking the TCP endpoint
        status).
    """
    if not endpoint.status.port or not endpoint.status.hostname:
        return (
            ServiceState.ERROR,
            "TCP port and hostname values are not known",
        )

    logger.debug(
        "Running TCP healthcheck for TCP port: %d", endpoint.status.port
    )

    if port_is_open(endpoint.status.hostname, endpoint.status.port):
        # the endpoint is healthy
        return ServiceState.ACTIVE, ""

    return (
        ServiceState.ERROR,
        "TCP endpoint healthcheck error: TCP port is not "
        "open or not accessible",
    )

TCPEndpointHealthMonitorConfig

Bases: ServiceEndpointHealthMonitorConfig

TCP service endpoint health monitor configuration.

Source code in src/zenml/services/service_monitor.py
class TCPEndpointHealthMonitorConfig(ServiceEndpointHealthMonitorConfig):
    """TCP service endpoint health monitor configuration."""

Stack Deployments

ZenML Stack Deployments.

Stack

Initialization of the ZenML Stack.

The stack is essentially all the configuration for the infrastructure of your MLOps platform.

A stack is made up of multiple components. Some examples are:

  • An Artifact Store
  • An Orchestrator
  • A Step Operator (Optional)
  • A Container Registry (Optional)

Flavor

Class for ZenML Flavors.

Source code in src/zenml/stack/flavor.py
class Flavor:
    """Class for ZenML Flavors."""

    @property
    @abstractmethod
    def name(self) -> str:
        """The flavor name.

        Returns:
            The flavor name.
        """

    @property
    def docs_url(self) -> Optional[str]:
        """A url to point at docs explaining this flavor.

        Returns:
            A flavor docs url.
        """
        return None

    @property
    def sdk_docs_url(self) -> Optional[str]:
        """A url to point at SDK docs explaining this flavor.

        Returns:
            A flavor SDK docs url.
        """
        return None

    @property
    def logo_url(self) -> Optional[str]:
        """A url to represent the flavor in the dashboard.

        Returns:
            The flavor logo.
        """
        return None

    @property
    @abstractmethod
    def type(self) -> StackComponentType:
        """The stack component type.

        Returns:
            The stack component type.
        """

    @property
    @abstractmethod
    def implementation_class(self) -> Type[StackComponent]:
        """Implementation class for this flavor.

        Returns:
            The implementation class for this flavor.
        """

    @property
    @abstractmethod
    def config_class(self) -> Type[StackComponentConfig]:
        """Returns `StackComponentConfig` config class.

        Returns:
            The config class.
        """

    @property
    def config_schema(self) -> Dict[str, Any]:
        """The config schema for a flavor.

        Returns:
            The config schema.
        """
        return self.config_class.model_json_schema()

    @property
    def service_connector_requirements(
        self,
    ) -> Optional[ServiceConnectorRequirements]:
        """Service connector resource requirements for service connectors.

        Specifies resource requirements that are used to filter the available
        service connector types that are compatible with this flavor.

        Returns:
            Requirements for compatible service connectors, if a service
            connector is required for this flavor.
        """
        return None

    @classmethod
    def from_model(cls, flavor_model: FlavorResponse) -> "Flavor":
        """Loads a flavor from a model.

        Args:
            flavor_model: The model to load from.

        Raises:
            CustomFlavorImportError: If the custom flavor can't be imported.
            ImportError: If the flavor can't be imported.

        Returns:
            The loaded flavor.
        """
        try:
            flavor = source_utils.load(flavor_model.source)()
        except (ModuleNotFoundError, ImportError, NotImplementedError) as err:
            if flavor_model.is_custom:
                flavor_module, _ = flavor_model.source.rsplit(".", maxsplit=1)
                expected_file_path = os.path.join(
                    source_utils.get_source_root(),
                    flavor_module.replace(".", os.path.sep),
                )
                raise CustomFlavorImportError(
                    f"Couldn't import custom flavor {flavor_model.name}: "
                    f"{err}. Make sure the custom flavor class "
                    f"`{flavor_model.source}` is importable. If it is part of "
                    "a library, make sure it is installed. If "
                    "it is a local code file, make sure it exists at "
                    f"`{expected_file_path}.py`."
                )
            else:
                raise ImportError(
                    f"Couldn't import flavor {flavor_model.name}: {err}"
                )
        return cast(Flavor, flavor)

    def to_model(
        self,
        integration: Optional[str] = None,
        is_custom: bool = True,
    ) -> FlavorRequest:
        """Converts a flavor to a model.

        Args:
            integration: The integration to use for the model.
            is_custom: Whether the flavor is a custom flavor.

        Returns:
            The model.
        """
        connector_requirements = self.service_connector_requirements
        connector_type = (
            connector_requirements.connector_type
            if connector_requirements
            else None
        )
        resource_type = (
            connector_requirements.resource_type
            if connector_requirements
            else None
        )
        resource_id_attr = (
            connector_requirements.resource_id_attr
            if connector_requirements
            else None
        )

        model = FlavorRequest(
            name=self.name,
            type=self.type,
            source=source_utils.resolve(self.__class__).import_path,
            config_schema=self.config_schema,
            connector_type=connector_type,
            connector_resource_type=resource_type,
            connector_resource_id_attr=resource_id_attr,
            integration=integration,
            logo_url=self.logo_url,
            docs_url=self.docs_url,
            sdk_docs_url=self.sdk_docs_url,
            is_custom=is_custom,
        )
        return model

    def generate_default_docs_url(self) -> str:
        """Generate the doc urls for all inbuilt and integration flavors.

        Note that this method is not going to be useful for custom flavors,
        which do not have any docs in the main zenml docs.

        Returns:
            The complete url to the zenml documentation
        """
        from zenml import __version__

        component_type = self.type.plural.replace("_", "-")
        name = self.name.replace("_", "-")

        try:
            is_latest = is_latest_zenml_version()
        except RuntimeError:
            # We assume in error cases that we are on the latest version
            is_latest = True

        if is_latest:
            base = "https://docs.zenml.io"
        else:
            base = f"https://zenml-io.gitbook.io/zenml-legacy-documentation/v/{__version__}"
        return f"{base}/stack-components/{component_type}/{name}"

    def generate_default_sdk_docs_url(self) -> str:
        """Generate SDK docs url for a flavor.

        Returns:
            The complete url to the zenml SDK docs
        """
        from zenml import __version__

        base = f"https://sdkdocs.zenml.io/{__version__}"

        component_type = self.type.plural

        if "zenml.integrations" in self.__module__:
            # Get integration name out of module path which will look something
            #  like this "zenml.integrations.<integration>....
            integration = self.__module__.split(
                "zenml.integrations.", maxsplit=1
            )[1].split(".")[0]

            return (
                f"{base}/integration_code_docs"
                f"/integrations-{integration}/#{self.__module__}"
            )

        else:
            return (
                f"{base}/core_code_docs/core-{component_type}/"
                f"#{self.__module__}"
            )

config_class abstractmethod property

Returns StackComponentConfig config class.

Returns:

Type Description
Type[StackComponentConfig]

The config class.

config_schema property

The config schema for a flavor.

Returns:

Type Description
Dict[str, Any]

The config schema.
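
As a minimal sketch (the import path of `LocalOrchestratorFlavor` is an assumption used purely for illustration), the schema of any concrete flavor can be inspected directly:

from zenml.orchestrators.local.local_orchestrator import LocalOrchestratorFlavor

flavor = LocalOrchestratorFlavor()
schema = flavor.config_schema  # delegates to config_class.model_json_schema()
print(sorted(schema.get("properties", {}).keys()))  # names of the config fields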

docs_url property

A url to point at docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor docs url.

implementation_class abstractmethod property

Implementation class for this flavor.

Returns:

Type Description
Type[StackComponent]

The implementation class for this flavor.

logo_url property

A url to represent the flavor in the dashboard.

Returns:

Type Description
Optional[str]

The flavor logo.

name abstractmethod property

The flavor name.

Returns:

Type Description
str

The flavor name.

sdk_docs_url property

A url to point at SDK docs explaining this flavor.

Returns:

Type Description
Optional[str]

A flavor SDK docs url.

service_connector_requirements property

Service connector resource requirements for service connectors.

Specifies resource requirements that are used to filter the available service connector types that are compatible with this flavor.

Returns:

Type Description
Optional[ServiceConnectorRequirements]

Requirements for compatible service connectors, if a service connector is required for this flavor.
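
A sketch of how a custom flavor might override this property; the base class, import paths and the "docker-registry" resource type are assumptions for illustration (the other abstract members of the flavor are omitted here):

from typing import Optional

from zenml.container_registries import BaseContainerRegistryFlavor  # assumed import path
from zenml.models import ServiceConnectorRequirements  # assumed import path


class MyRegistryFlavor(BaseContainerRegistryFlavor):  # hypothetical custom flavor
    @property
    def service_connector_requirements(
        self,
    ) -> Optional[ServiceConnectorRequirements]:
        # Only service connectors that can provide a Docker registry resource
        # will be listed as compatible with this flavor.
        return ServiceConnectorRequirements(resource_type="docker-registry")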

type abstractmethod property

The stack component type.

Returns:

Type Description
StackComponentType

The stack component type.

from_model(flavor_model) classmethod

Loads a flavor from a model.

Parameters:

Name Type Description Default
flavor_model FlavorResponse

The model to load from.

required

Raises:

Type Description
CustomFlavorImportError

If the custom flavor can't be imported.

ImportError

If the flavor can't be imported.

Returns:

Type Description
Flavor

The loaded flavor.

Source code in src/zenml/stack/flavor.py
@classmethod
def from_model(cls, flavor_model: FlavorResponse) -> "Flavor":
    """Loads a flavor from a model.

    Args:
        flavor_model: The model to load from.

    Raises:
        CustomFlavorImportError: If the custom flavor can't be imported.
        ImportError: If the flavor can't be imported.

    Returns:
        The loaded flavor.
    """
    try:
        flavor = source_utils.load(flavor_model.source)()
    except (ModuleNotFoundError, ImportError, NotImplementedError) as err:
        if flavor_model.is_custom:
            flavor_module, _ = flavor_model.source.rsplit(".", maxsplit=1)
            expected_file_path = os.path.join(
                source_utils.get_source_root(),
                flavor_module.replace(".", os.path.sep),
            )
            raise CustomFlavorImportError(
                f"Couldn't import custom flavor {flavor_model.name}: "
                f"{err}. Make sure the custom flavor class "
                f"`{flavor_model.source}` is importable. If it is part of "
                "a library, make sure it is installed. If "
                "it is a local code file, make sure it exists at "
                f"`{expected_file_path}.py`."
            )
        else:
            raise ImportError(
                f"Couldn't import flavor {flavor_model.name}: {err}"
            )
    return cast(Flavor, flavor)
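
A hedged usage sketch, assuming `flavor_model` is a `FlavorResponse` previously fetched from the ZenML server:

flavor = Flavor.from_model(flavor_model)       # re-imports the class from flavor_model.source
component_class = flavor.implementation_class  # concrete StackComponent subclass
config_class = flavor.config_class             # matching StackComponentConfig subclass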

generate_default_docs_url()

Generate the doc urls for all inbuilt and integration flavors.

Note that this method is not going to be useful for custom flavors, which do not have any docs in the main zenml docs.

Returns:

Type Description
str

The complete url to the zenml documentation

Source code in src/zenml/stack/flavor.py
def generate_default_docs_url(self) -> str:
    """Generate the doc urls for all inbuilt and integration flavors.

    Note that this method is not going to be useful for custom flavors,
    which do not have any docs in the main zenml docs.

    Returns:
        The complete url to the zenml documentation
    """
    from zenml import __version__

    component_type = self.type.plural.replace("_", "-")
    name = self.name.replace("_", "-")

    try:
        is_latest = is_latest_zenml_version()
    except RuntimeError:
        # We assume in error cases that we are on the latest version
        is_latest = True

    if is_latest:
        base = "https://docs.zenml.io"
    else:
        base = f"https://zenml-io.gitbook.io/zenml-legacy-documentation/v/{__version__}"
    return f"{base}/stack-components/{component_type}/{name}"

generate_default_sdk_docs_url()

Generate SDK docs url for a flavor.

Returns:

Type Description
str

The complete url to the zenml SDK docs

Source code in src/zenml/stack/flavor.py
def generate_default_sdk_docs_url(self) -> str:
    """Generate SDK docs url for a flavor.

    Returns:
        The complete url to the zenml SDK docs
    """
    from zenml import __version__

    base = f"https://sdkdocs.zenml.io/{__version__}"

    component_type = self.type.plural

    if "zenml.integrations" in self.__module__:
        # Get integration name out of module path which will look something
        #  like this "zenml.integrations.<integration>....
        integration = self.__module__.split(
            "zenml.integrations.", maxsplit=1
        )[1].split(".")[0]

        return (
            f"{base}/integration_code_docs"
            f"/integrations-{integration}/#{self.__module__}"
        )

    else:
        return (
            f"{base}/core_code_docs/core-{component_type}/"
            f"#{self.__module__}"
        )
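
Continuing the previous sketch (same assumed flavor): built-in flavors resolve to the core_code_docs branch, while flavors whose module lives under zenml.integrations.<name> resolve to the integration_code_docs branch:

url = LocalOrchestratorFlavor().generate_default_sdk_docs_url()
# -> "https://sdkdocs.zenml.io/<installed version>/core_code_docs/core-orchestrators/#zenml.orchestrators.local.local_orchestrator"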

to_model(integration=None, is_custom=True)

Converts a flavor to a model.

Parameters:

Name Type Description Default
integration Optional[str]

The integration to use for the model.

None
is_custom bool

Whether the flavor is a custom flavor.

True

Returns:

Type Description
FlavorRequest

The model.

Source code in src/zenml/stack/flavor.py
def to_model(
    self,
    integration: Optional[str] = None,
    is_custom: bool = True,
) -> FlavorRequest:
    """Converts a flavor to a model.

    Args:
        integration: The integration to use for the model.
        is_custom: Whether the flavor is a custom flavor.

    Returns:
        The model.
    """
    connector_requirements = self.service_connector_requirements
    connector_type = (
        connector_requirements.connector_type
        if connector_requirements
        else None
    )
    resource_type = (
        connector_requirements.resource_type
        if connector_requirements
        else None
    )
    resource_id_attr = (
        connector_requirements.resource_id_attr
        if connector_requirements
        else None
    )

    model = FlavorRequest(
        name=self.name,
        type=self.type,
        source=source_utils.resolve(self.__class__).import_path,
        config_schema=self.config_schema,
        connector_type=connector_type,
        connector_resource_type=resource_type,
        connector_resource_id_attr=resource_id_attr,
        integration=integration,
        logo_url=self.logo_url,
        docs_url=self.docs_url,
        sdk_docs_url=self.sdk_docs_url,
        is_custom=is_custom,
    )
    return model
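
A minimal sketch, assuming `MyOrchestratorFlavor` is a hypothetical custom Flavor subclass that implements the abstract name, type, config_class and implementation_class properties:

request = MyOrchestratorFlavor().to_model(is_custom=True)
print(request.name, request.type)  # flavor name and stack component type
print(request.source)              # resolved import path of the flavor class
print(request.config_schema)       # JSON schema derived from the config class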

Stack

ZenML stack class.

A ZenML stack is a collection of multiple stack components that are required to run ZenML pipelines. Some of these components (orchestrator and artifact store) are required to run any kind of pipeline, while other components, such as the container registry, are only required if other stack components depend on them.

Source code in src/zenml/stack/stack.py
class Stack:
    """ZenML stack class.

    A ZenML stack is a collection of multiple stack components that are
    required to run ZenML pipelines. Some of these components (orchestrator,
    and artifact store) are required to run any kind of
    pipeline, other components like the container registry are only required
    if other stack components depend on them.
    """

    def __init__(
        self,
        id: UUID,
        name: str,
        *,
        orchestrator: "BaseOrchestrator",
        artifact_store: "BaseArtifactStore",
        container_registry: Optional["BaseContainerRegistry"] = None,
        step_operator: Optional["BaseStepOperator"] = None,
        feature_store: Optional["BaseFeatureStore"] = None,
        model_deployer: Optional["BaseModelDeployer"] = None,
        experiment_tracker: Optional["BaseExperimentTracker"] = None,
        alerter: Optional["BaseAlerter"] = None,
        annotator: Optional["BaseAnnotator"] = None,
        data_validator: Optional["BaseDataValidator"] = None,
        image_builder: Optional["BaseImageBuilder"] = None,
        model_registry: Optional["BaseModelRegistry"] = None,
    ):
        """Initializes and validates a stack instance.

        Args:
            id: Unique ID of the stack.
            name: Name of the stack.
            orchestrator: Orchestrator component of the stack.
            artifact_store: Artifact store component of the stack.
            container_registry: Container registry component of the stack.
            step_operator: Step operator component of the stack.
            feature_store: Feature store component of the stack.
            model_deployer: Model deployer component of the stack.
            experiment_tracker: Experiment tracker component of the stack.
            alerter: Alerter component of the stack.
            annotator: Annotator component of the stack.
            data_validator: Data validator component of the stack.
            image_builder: Image builder component of the stack.
            model_registry: Model registry component of the stack.
        """
        self._id = id
        self._name = name
        self._orchestrator = orchestrator
        self._artifact_store = artifact_store
        self._container_registry = container_registry
        self._step_operator = step_operator
        self._feature_store = feature_store
        self._model_deployer = model_deployer
        self._experiment_tracker = experiment_tracker
        self._alerter = alerter
        self._annotator = annotator
        self._data_validator = data_validator
        self._model_registry = model_registry
        self._image_builder = image_builder

    @classmethod
    def from_model(cls, stack_model: "StackResponse") -> "Stack":
        """Creates a Stack instance from a StackModel.

        Args:
            stack_model: The StackModel to create the Stack from.

        Returns:
            The created Stack instance.
        """
        global _STACK_CACHE
        key = (stack_model.id, stack_model.updated)
        if key in _STACK_CACHE:
            return _STACK_CACHE[key]

        from zenml.stack import StackComponent

        # Run a hydrated list call once to avoid one request per component
        component_models = pagination_utils.depaginate(
            Client().list_stack_components,
            stack_id=stack_model.id,
            hydrate=True,
        )

        stack_components = {
            model.type: StackComponent.from_model(model)
            for model in component_models
        }
        stack = Stack.from_components(
            id=stack_model.id,
            name=stack_model.name,
            components=stack_components,
        )
        _STACK_CACHE[key] = stack

        client = Client()
        if stack_model.id == client.active_stack_model.id:
            if stack_model.updated > client.active_stack_model.updated:
                if client._config:
                    client._config.set_active_stack(stack_model)
                else:
                    GlobalConfiguration().set_active_stack(stack_model)

        return stack

    @classmethod
    def from_components(
        cls,
        id: UUID,
        name: str,
        components: Dict[StackComponentType, "StackComponent"],
    ) -> "Stack":
        """Creates a stack instance from a dict of stack components.

        # noqa: DAR402

        Args:
            id: Unique ID of the stack.
            name: The name of the stack.
            components: The components of the stack.

        Returns:
            A stack instance consisting of the given components.

        Raises:
            TypeError: If a required component is missing or a component
                doesn't inherit from the expected base class.
        """
        from zenml.alerter import BaseAlerter
        from zenml.annotators import BaseAnnotator
        from zenml.artifact_stores import BaseArtifactStore
        from zenml.container_registries import BaseContainerRegistry
        from zenml.data_validators import BaseDataValidator
        from zenml.experiment_trackers import BaseExperimentTracker
        from zenml.feature_stores import BaseFeatureStore
        from zenml.image_builders import BaseImageBuilder
        from zenml.model_deployers import BaseModelDeployer
        from zenml.model_registries import BaseModelRegistry
        from zenml.orchestrators import BaseOrchestrator
        from zenml.step_operators import BaseStepOperator

        def _raise_type_error(
            component: Optional["StackComponent"], expected_class: Type[Any]
        ) -> NoReturn:
            """Raises a TypeError that the component has an unexpected type.

            Args:
                component: The component that has an unexpected type.
                expected_class: The expected type of the component.

            Raises:
                TypeError: If the component has an unexpected type.
            """
            raise TypeError(
                f"Unable to create stack: Wrong stack component type "
                f"`{component.__class__.__name__}` (expected: subclass "
                f"of `{expected_class.__name__}`)"
            )

        orchestrator = components.get(StackComponentType.ORCHESTRATOR)
        if not isinstance(orchestrator, BaseOrchestrator):
            _raise_type_error(orchestrator, BaseOrchestrator)

        artifact_store = components.get(StackComponentType.ARTIFACT_STORE)
        if not isinstance(artifact_store, BaseArtifactStore):
            _raise_type_error(artifact_store, BaseArtifactStore)

        container_registry = components.get(
            StackComponentType.CONTAINER_REGISTRY
        )
        if container_registry is not None and not isinstance(
            container_registry, BaseContainerRegistry
        ):
            _raise_type_error(container_registry, BaseContainerRegistry)

        step_operator = components.get(StackComponentType.STEP_OPERATOR)
        if step_operator is not None and not isinstance(
            step_operator, BaseStepOperator
        ):
            _raise_type_error(step_operator, BaseStepOperator)

        feature_store = components.get(StackComponentType.FEATURE_STORE)
        if feature_store is not None and not isinstance(
            feature_store, BaseFeatureStore
        ):
            _raise_type_error(feature_store, BaseFeatureStore)

        model_deployer = components.get(StackComponentType.MODEL_DEPLOYER)
        if model_deployer is not None and not isinstance(
            model_deployer, BaseModelDeployer
        ):
            _raise_type_error(model_deployer, BaseModelDeployer)

        experiment_tracker = components.get(
            StackComponentType.EXPERIMENT_TRACKER
        )
        if experiment_tracker is not None and not isinstance(
            experiment_tracker, BaseExperimentTracker
        ):
            _raise_type_error(experiment_tracker, BaseExperimentTracker)

        alerter = components.get(StackComponentType.ALERTER)
        if alerter is not None and not isinstance(alerter, BaseAlerter):
            _raise_type_error(alerter, BaseAlerter)

        annotator = components.get(StackComponentType.ANNOTATOR)
        if annotator is not None and not isinstance(annotator, BaseAnnotator):
            _raise_type_error(annotator, BaseAnnotator)

        data_validator = components.get(StackComponentType.DATA_VALIDATOR)
        if data_validator is not None and not isinstance(
            data_validator, BaseDataValidator
        ):
            _raise_type_error(data_validator, BaseDataValidator)

        image_builder = components.get(StackComponentType.IMAGE_BUILDER)
        if image_builder is not None and not isinstance(
            image_builder, BaseImageBuilder
        ):
            _raise_type_error(image_builder, BaseImageBuilder)

        model_registry = components.get(StackComponentType.MODEL_REGISTRY)
        if model_registry is not None and not isinstance(
            model_registry, BaseModelRegistry
        ):
            _raise_type_error(model_registry, BaseModelRegistry)

        return Stack(
            id=id,
            name=name,
            orchestrator=orchestrator,
            artifact_store=artifact_store,
            container_registry=container_registry,
            step_operator=step_operator,
            feature_store=feature_store,
            model_deployer=model_deployer,
            experiment_tracker=experiment_tracker,
            alerter=alerter,
            annotator=annotator,
            data_validator=data_validator,
            image_builder=image_builder,
            model_registry=model_registry,
        )

    @property
    def components(self) -> Dict[StackComponentType, "StackComponent"]:
        """All components of the stack.

        Returns:
            A dictionary of all components of the stack.
        """
        return {
            component.type: component
            for component in [
                self.orchestrator,
                self.artifact_store,
                self.container_registry,
                self.step_operator,
                self.feature_store,
                self.model_deployer,
                self.experiment_tracker,
                self.alerter,
                self.annotator,
                self.data_validator,
                self.image_builder,
                self.model_registry,
            ]
            if component is not None
        }

    @property
    def id(self) -> UUID:
        """The ID of the stack.

        Returns:
            The ID of the stack.
        """
        return self._id

    @property
    def name(self) -> str:
        """The name of the stack.

        Returns:
            str: The name of the stack.
        """
        return self._name

    @property
    def orchestrator(self) -> "BaseOrchestrator":
        """The orchestrator of the stack.

        Returns:
            The orchestrator of the stack.
        """
        return self._orchestrator

    @property
    def artifact_store(self) -> "BaseArtifactStore":
        """The artifact store of the stack.

        Returns:
            The artifact store of the stack.
        """
        return self._artifact_store

    @property
    def container_registry(self) -> Optional["BaseContainerRegistry"]:
        """The container registry of the stack.

        Returns:
            The container registry of the stack or None if the stack does not
            have a container registry.
        """
        return self._container_registry

    @property
    def step_operator(self) -> Optional["BaseStepOperator"]:
        """The step operator of the stack.

        Returns:
            The step operator of the stack.
        """
        return self._step_operator

    @property
    def feature_store(self) -> Optional["BaseFeatureStore"]:
        """The feature store of the stack.

        Returns:
            The feature store of the stack.
        """
        return self._feature_store

    @property
    def model_deployer(self) -> Optional["BaseModelDeployer"]:
        """The model deployer of the stack.

        Returns:
            The model deployer of the stack.
        """
        return self._model_deployer

    @property
    def experiment_tracker(self) -> Optional["BaseExperimentTracker"]:
        """The experiment tracker of the stack.

        Returns:
            The experiment tracker of the stack.
        """
        return self._experiment_tracker

    @property
    def alerter(self) -> Optional["BaseAlerter"]:
        """The alerter of the stack.

        Returns:
            The alerter of the stack.
        """
        return self._alerter

    @property
    def annotator(self) -> Optional["BaseAnnotator"]:
        """The annotator of the stack.

        Returns:
            The annotator of the stack.
        """
        return self._annotator

    @property
    def data_validator(self) -> Optional["BaseDataValidator"]:
        """The data validator of the stack.

        Returns:
            The data validator of the stack.
        """
        return self._data_validator

    @property
    def image_builder(self) -> Optional["BaseImageBuilder"]:
        """The image builder of the stack.

        Returns:
            The image builder of the stack.
        """
        return self._image_builder

    @property
    def model_registry(self) -> Optional["BaseModelRegistry"]:
        """The model registry of the stack.

        Returns:
            The model registry of the stack.
        """
        return self._model_registry

    def dict(self) -> Dict[str, str]:
        """Converts the stack into a dictionary.

        Returns:
            A dictionary containing the stack components.
        """
        component_dict = {
            component_type.value: json.dumps(
                component.config.model_dump(mode="json"), sort_keys=True
            )
            for component_type, component in self.components.items()
        }
        component_dict.update({"name": self.name})
        return component_dict

    def requirements(
        self,
        exclude_components: Optional[AbstractSet[StackComponentType]] = None,
    ) -> Set[str]:
        """Set of PyPI requirements for the stack.

        This method combines the requirements of all stack components (except
        the ones specified in `exclude_components`).

        Args:
            exclude_components: Set of component types for which the
                requirements should not be included in the output.

        Returns:
            Set of PyPI requirements.
        """
        exclude_components = exclude_components or set()
        requirements = [
            component.requirements
            for component in self.components.values()
            if component.type not in exclude_components
        ]
        return set.union(*requirements) if requirements else set()

    @property
    def apt_packages(self) -> List[str]:
        """List of APT package requirements for the stack.

        Returns:
            A list of APT package requirements for the stack.
        """
        return [
            package
            for component in self.components.values()
            for package in component.apt_packages
        ]

    def check_local_paths(self) -> bool:
        """Checks if the stack has local paths.

        Returns:
            True if the stack has local paths, False otherwise.

        Raises:
            ValueError: If the stack has local paths that do not conform to
                the convention that all local path must be relative to the
                local stores directory.
        """
        from zenml.config.global_config import GlobalConfiguration

        local_stores_path = GlobalConfiguration().local_stores_path

        # go through all stack components and identify those that advertise
        # a local path where they persist information that they need to be
        # available when running pipelines.
        has_local_paths = False
        for stack_comp in self.components.values():
            local_path = stack_comp.local_path
            if not local_path:
                continue
            # double-check this convention, just in case it wasn't respected
            # as documented in `StackComponent.local_path`
            if not local_path.startswith(local_stores_path):
                raise ValueError(
                    f"Local path {local_path} for component "
                    f"{stack_comp.name} is not in the local stores "
                    f"directory ({local_stores_path})."
                )
            has_local_paths = True

        return has_local_paths

    @property
    def required_secrets(self) -> Set["secret_utils.SecretReference"]:
        """All required secrets for this stack.

        Returns:
            The required secrets of this stack.
        """
        secrets = [
            component.config.required_secrets
            for component in self.components.values()
        ]
        return set.union(*secrets) if secrets else set()

    @property
    def setting_classes(self) -> Dict[str, Type["BaseSettings"]]:
        """Setting classes of all components of this stack.

        Returns:
            All setting classes and their respective keys.
        """
        setting_classes = {}
        for component in self.components.values():
            if component.settings_class:
                key = settings_utils.get_stack_component_setting_key(component)
                setting_classes[key] = component.settings_class
        return setting_classes

    @property
    def requires_remote_server(self) -> bool:
        """If the stack requires a remote ZenServer to run.

        This is the case if any code is getting executed remotely. This is the
        case for both remote orchestrators as well as remote step operators.

        Returns:
            If the stack requires a remote ZenServer to run.
        """
        return self.orchestrator.config.is_remote or (
            self.step_operator is not None
            and self.step_operator.config.is_remote
        )

    def _validate_secrets(self, raise_exception: bool) -> None:
        """Validates that all secrets of the stack exists.

        Args:
            raise_exception: If `True`, raises an exception if a secret is
                missing. Otherwise a warning is logged.

        # noqa: DAR402
        Raises:
            StackValidationError: If a secret is missing.
        """
        env_value = os.getenv(
            ENV_ZENML_SECRET_VALIDATION_LEVEL,
            default=SecretValidationLevel.SECRET_AND_KEY_EXISTS.value,
        )
        secret_validation_level = SecretValidationLevel(env_value)

        required_secrets = self.required_secrets
        if (
            secret_validation_level != SecretValidationLevel.NONE
            and required_secrets
        ):

            def _handle_error(message: str) -> None:
                """Handles the error by raising an exception or logging.

                Args:
                    message: The error message.

                Raises:
                    StackValidationError: If called and `raise_exception` of
                        the outer method is `True`.
                """
                if raise_exception:
                    raise StackValidationError(message)
                else:
                    message += (
                        "\nYou need to solve this issue before running "
                        "a pipeline on this stack."
                    )
                    logger.warning(message)

            client = Client()

            # Attempt to resolve secrets through the secrets store
            for secret_ref in required_secrets.copy():
                try:
                    secret = client.get_secret(secret_ref.name)
                    if (
                        secret_validation_level
                        == SecretValidationLevel.SECRET_AND_KEY_EXISTS
                    ):
                        _ = secret.values[secret_ref.key]
                except (KeyError, NotImplementedError):
                    pass
                else:
                    # Drop this secret from the list of required secrets
                    required_secrets.remove(secret_ref)

            if not required_secrets:
                return

            secrets_msg = ", ".join(
                [
                    f"{secret_ref.name}.{secret_ref.key}"
                    for secret_ref in required_secrets
                ]
            )

            _handle_error(
                f"Some components in the `{self.name}` stack reference secrets "
                f"or secret keys that do not exist in the secret store: "
                f"{secrets_msg}.\nTo register the "
                "missing secrets for this stack, run `zenml stack "
                f"register-secrets {self.name}`\nIf you want to "
                "adjust the degree to which ZenML validates the existence "
                "of secrets in your stack, you can do so by setting the "
                f"environment variable {ENV_ZENML_SECRET_VALIDATION_LEVEL} "
                "to one of the following values: "
                f"{SecretValidationLevel.values()}."
            )

    def validate(
        self,
        fail_if_secrets_missing: bool = False,
    ) -> None:
        """Checks whether the stack configuration is valid.

        To check if a stack configuration is valid, the following criteria must
        be met:
        - the stack must have an image builder if other components require it
        - the `StackValidator` of each stack component has to validate the
            stack to make sure all the components are compatible with each other
        - the required secrets of all components need to exist

        Args:
            fail_if_secrets_missing: If this is `True`, an error will be raised
                if a secret for a component is missing. Otherwise, only a
                warning will be logged.
        """
        if handle_bool_env_var(ENV_ZENML_SKIP_STACK_VALIDATION, default=False):
            logger.debug("Skipping stack validation.")
            return

        self.validate_image_builder()
        for component in self.components.values():
            if component.validator:
                component.validator.validate(stack=self)

        self._validate_secrets(raise_exception=fail_if_secrets_missing)

    def validate_image_builder(self) -> None:
        """Validates that the stack has an image builder if required.

        If the stack requires an image builder, but none is specified, a
        local image builder will be created and assigned to the stack to
        ensure backwards compatibility.
        """
        requires_image_builder = (
            self.orchestrator.flavor != "local"
            or self.step_operator
            or (self.model_deployer and self.model_deployer.flavor != "mlflow")
        )
        skip_default_image_builder = handle_bool_env_var(
            ENV_ZENML_SKIP_IMAGE_BUILDER_DEFAULT, default=False
        )
        if (
            requires_image_builder
            and not skip_default_image_builder
            and not self.image_builder
        ):
            from uuid import uuid4

            from zenml.image_builders import (
                LocalImageBuilder,
                LocalImageBuilderConfig,
                LocalImageBuilderFlavor,
            )

            flavor = LocalImageBuilderFlavor()

            now = utc_now()
            image_builder = LocalImageBuilder(
                id=uuid4(),
                name="temporary_default",
                flavor=flavor.name,
                type=flavor.type,
                config=LocalImageBuilderConfig(),
                user=Client().active_user.id,
                created=now,
                updated=now,
            )

            self._image_builder = image_builder

    def prepare_pipeline_deployment(
        self, deployment: "PipelineDeploymentResponse"
    ) -> None:
        """Prepares the stack for a pipeline deployment.

        This method is called before a pipeline is deployed.

        Args:
            deployment: The pipeline deployment

        Raises:
            RuntimeError: If trying to deploy a pipeline that requires a remote
                ZenML server with a local one.
        """
        self.validate(fail_if_secrets_missing=True)

        if self.requires_remote_server and Client().zen_store.is_local_store():
            raise RuntimeError(
                "Stacks with remote components such as remote orchestrators "
                "and step operators require a remote "
                "ZenML server. To run a pipeline with this stack you need to "
                "connect to a remote ZenML server first. Check out "
                "https://docs.zenml.io/getting-started/deploying-zenml "
                "for more information on how to deploy ZenML."
            )

        for component in self.components.values():
            component.prepare_pipeline_deployment(
                deployment=deployment, stack=self
            )

    def get_docker_builds(
        self, deployment: "PipelineDeploymentBase"
    ) -> List["BuildConfiguration"]:
        """Gets the Docker builds required for the stack.

        Args:
            deployment: The pipeline deployment for which to get the builds.

        Returns:
            The required Docker builds.
        """
        return list(
            itertools.chain.from_iterable(
                component.get_docker_builds(deployment=deployment)
                for component in self.components.values()
            )
        )

    def deploy_pipeline(
        self,
        deployment: "PipelineDeploymentResponse",
        placeholder_run: Optional["PipelineRunResponse"] = None,
    ) -> None:
        """Deploys a pipeline on this stack.

        Args:
            deployment: The pipeline deployment.
            placeholder_run: An optional placeholder run for the deployment.
        """
        self.orchestrator.run(
            deployment=deployment, stack=self, placeholder_run=placeholder_run
        )

    def _get_active_components_for_step(
        self, step_config: "StepConfiguration"
    ) -> Dict[StackComponentType, "StackComponent"]:
        """Gets all the active stack components for a stack.

        Args:
            step_config: Configuration of the step for which to get the active
                components.

        Returns:
            Dictionary of active stack components.
        """

        def _is_active(component: "StackComponent") -> bool:
            """Checks whether a stack component is actively used in the step.

            Args:
                component: The component to check.

            Returns:
                If the component is used in this step.
            """
            if component.type == StackComponentType.STEP_OPERATOR:
                return component.name == step_config.step_operator

            if component.type == StackComponentType.EXPERIMENT_TRACKER:
                return component.name == step_config.experiment_tracker

            return True

        return {
            component_type: component
            for component_type, component in self.components.items()
            if _is_active(component)
        }

    def prepare_step_run(self, info: "StepRunInfo") -> None:
        """Prepares running a step.

        Args:
            info: Info about the step that will be executed.
        """
        for component in self._get_active_components_for_step(
            info.config
        ).values():
            component.prepare_step_run(info=info)

    def get_pipeline_run_metadata(
        self, run_id: UUID
    ) -> Dict[UUID, Dict[str, MetadataType]]:
        """Get general component-specific metadata for a pipeline run.

        Args:
            run_id: ID of the pipeline run.

        Returns:
            A dictionary mapping component IDs to the metadata they created.
        """
        pipeline_run_metadata: Dict[UUID, Dict[str, MetadataType]] = {}
        for component in self.components.values():
            try:
                component_metadata = component.get_pipeline_run_metadata(
                    run_id=run_id
                )
                if component_metadata:
                    pipeline_run_metadata[component.id] = component_metadata
            except Exception as e:
                logger.warning(
                    f"Extracting pipeline run metadata failed for component "
                    f"'{component.name}' of type '{component.type}': {e}"
                )
        return pipeline_run_metadata

    def get_step_run_metadata(
        self, info: "StepRunInfo"
    ) -> Dict[UUID, Dict[str, MetadataType]]:
        """Get component-specific metadata for a step run.

        Args:
            info: Info about the step that was executed.

        Returns:
            A dictionary mapping component IDs to the metadata they created.
        """
        step_run_metadata: Dict[UUID, Dict[str, MetadataType]] = {}
        for component in self._get_active_components_for_step(
            info.config
        ).values():
            try:
                component_metadata = component.get_step_run_metadata(info=info)
                if component_metadata:
                    step_run_metadata[component.id] = component_metadata
            except Exception as e:
                logger.warning(
                    f"Extracting step run metadata failed for component "
                    f"'{component.name}' of type '{component.type}': {e}"
                )
        return step_run_metadata

    def cleanup_step_run(self, info: "StepRunInfo", step_failed: bool) -> None:
        """Cleans up resources after the step run is finished.

        Args:
            info: Info about the step that was executed.
            step_failed: Whether the step failed.
        """
        for component in self._get_active_components_for_step(
            info.config
        ).values():
            component.cleanup_step_run(info=info, step_failed=step_failed)
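
A brief usage sketch, assuming an initialized ZenML client with an active stack (Client().active_stack is expected to return a Stack instance):

from zenml.client import Client

stack = Client().active_stack
print(stack.name)
for component_type, component in stack.components.items():
    print(f"{component_type.value}: {component.name} ({component.flavor})")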

alerter property

The alerter of the stack.

Returns:

Type Description
Optional[BaseAlerter]

The alerter of the stack.

annotator property

The annotator of the stack.

Returns:

Type Description
Optional[BaseAnnotator]

The annotator of the stack.

apt_packages property

List of APT package requirements for the stack.

Returns:

Type Description
List[str]

A list of APT package requirements for the stack.

artifact_store property

The artifact store of the stack.

Returns:

Type Description
BaseArtifactStore

The artifact store of the stack.

components property

All components of the stack.

Returns:

Type Description
Dict[StackComponentType, StackComponent]

A dictionary of all components of the stack.

container_registry property

The container registry of the stack.

Returns:

Type Description
Optional[BaseContainerRegistry]

The container registry of the stack or None if the stack does not have a container registry.

data_validator property

The data validator of the stack.

Returns:

Type Description
Optional[BaseDataValidator]

The data validator of the stack.

experiment_tracker property

The experiment tracker of the stack.

Returns:

Type Description
Optional[BaseExperimentTracker]

The experiment tracker of the stack.

feature_store property

The feature store of the stack.

Returns:

Type Description
Optional[BaseFeatureStore]

The feature store of the stack.

id property

The ID of the stack.

Returns:

Type Description
UUID

The ID of the stack.

image_builder property

The image builder of the stack.

Returns:

Type Description
Optional[BaseImageBuilder]

The image builder of the stack.

model_deployer property

The model deployer of the stack.

Returns:

Type Description
Optional[BaseModelDeployer]

The model deployer of the stack.

model_registry property

The model registry of the stack.

Returns:

Type Description
Optional[BaseModelRegistry]

The model registry of the stack.

name property

The name of the stack.

Returns:

Name Type Description
str str

The name of the stack.

orchestrator property

The orchestrator of the stack.

Returns:

Type Description
BaseOrchestrator

The orchestrator of the stack.

required_secrets property

All required secrets for this stack.

Returns:

Type Description
Set[SecretReference]

The required secrets of this stack.

requires_remote_server property

If the stack requires a remote ZenServer to run.

This is the case if any code is executed remotely, which applies to both remote orchestrators and remote step operators.

Returns:

Type Description
bool

If the stack requires a remote ZenServer to run.
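
For illustration, a minimal sketch assuming `stack` is a Stack instance and a client is available:

from zenml.client import Client

if stack.requires_remote_server and Client().zen_store.is_local_store():
    # Remote orchestrators or step operators cannot be driven from a purely
    # local ZenML deployment.
    print("Connect to a remote ZenML server before running pipelines on this stack.")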

setting_classes property

Setting classes of all components of this stack.

Returns:

Type Description
Dict[str, Type[BaseSettings]]

All setting classes and their respective keys.

step_operator property

The step operator of the stack.

Returns:

Type Description
Optional[BaseStepOperator]

The step operator of the stack.

__init__(id, name, *, orchestrator, artifact_store, container_registry=None, step_operator=None, feature_store=None, model_deployer=None, experiment_tracker=None, alerter=None, annotator=None, data_validator=None, image_builder=None, model_registry=None)

Initializes and validates a stack instance.

Parameters:

Name Type Description Default
id UUID

Unique ID of the stack.

required
name str

Name of the stack.

required
orchestrator BaseOrchestrator

Orchestrator component of the stack.

required
artifact_store BaseArtifactStore

Artifact store component of the stack.

required
container_registry Optional[BaseContainerRegistry]

Container registry component of the stack.

None
step_operator Optional[BaseStepOperator]

Step operator component of the stack.

None
feature_store Optional[BaseFeatureStore]

Feature store component of the stack.

None
model_deployer Optional[BaseModelDeployer]

Model deployer component of the stack.

None
experiment_tracker Optional[BaseExperimentTracker]

Experiment tracker component of the stack.

None
alerter Optional[BaseAlerter]

Alerter component of the stack.

None
annotator Optional[BaseAnnotator]

Annotator component of the stack.

None
data_validator Optional[BaseDataValidator]

Data validator component of the stack.

None
image_builder Optional[BaseImageBuilder]

Image builder component of the stack.

None
model_registry Optional[BaseModelRegistry]

Model registry component of the stack.

None
Source code in src/zenml/stack/stack.py
def __init__(
    self,
    id: UUID,
    name: str,
    *,
    orchestrator: "BaseOrchestrator",
    artifact_store: "BaseArtifactStore",
    container_registry: Optional["BaseContainerRegistry"] = None,
    step_operator: Optional["BaseStepOperator"] = None,
    feature_store: Optional["BaseFeatureStore"] = None,
    model_deployer: Optional["BaseModelDeployer"] = None,
    experiment_tracker: Optional["BaseExperimentTracker"] = None,
    alerter: Optional["BaseAlerter"] = None,
    annotator: Optional["BaseAnnotator"] = None,
    data_validator: Optional["BaseDataValidator"] = None,
    image_builder: Optional["BaseImageBuilder"] = None,
    model_registry: Optional["BaseModelRegistry"] = None,
):
    """Initializes and validates a stack instance.

    Args:
        id: Unique ID of the stack.
        name: Name of the stack.
        orchestrator: Orchestrator component of the stack.
        artifact_store: Artifact store component of the stack.
        container_registry: Container registry component of the stack.
        step_operator: Step operator component of the stack.
        feature_store: Feature store component of the stack.
        model_deployer: Model deployer component of the stack.
        experiment_tracker: Experiment tracker component of the stack.
        alerter: Alerter component of the stack.
        annotator: Annotator component of the stack.
        data_validator: Data validator component of the stack.
        image_builder: Image builder component of the stack.
        model_registry: Model registry component of the stack.
    """
    self._id = id
    self._name = name
    self._orchestrator = orchestrator
    self._artifact_store = artifact_store
    self._container_registry = container_registry
    self._step_operator = step_operator
    self._feature_store = feature_store
    self._model_deployer = model_deployer
    self._experiment_tracker = experiment_tracker
    self._alerter = alerter
    self._annotator = annotator
    self._data_validator = data_validator
    self._model_registry = model_registry
    self._image_builder = image_builder

check_local_paths()

Checks if the stack has local paths.

Returns:

Type Description
bool

True if the stack has local paths, False otherwise.

Raises:

Type Description
ValueError

If the stack has local paths that do not conform to the convention that all local paths must be relative to the local stores directory.

Source code in src/zenml/stack/stack.py
def check_local_paths(self) -> bool:
    """Checks if the stack has local paths.

    Returns:
        True if the stack has local paths, False otherwise.

    Raises:
        ValueError: If the stack has local paths that do not conform to
            the convention that all local path must be relative to the
            local stores directory.
    """
    from zenml.config.global_config import GlobalConfiguration

    local_stores_path = GlobalConfiguration().local_stores_path

    # go through all stack components and identify those that advertise
    # a local path where they persist information that they need to be
    # available when running pipelines.
    has_local_paths = False
    for stack_comp in self.components.values():
        local_path = stack_comp.local_path
        if not local_path:
            continue
        # double-check this convention, just in case it wasn't respected
        # as documented in `StackComponent.local_path`
        if not local_path.startswith(local_stores_path):
            raise ValueError(
                f"Local path {local_path} for component "
                f"{stack_comp.name} is not in the local stores "
                f"directory ({local_stores_path})."
            )
        has_local_paths = True

    return has_local_paths
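
A small usage sketch, assuming `stack` is a Stack instance:

if stack.check_local_paths():
    # At least one component persists data under the local stores directory,
    # so the stack's state is tied to this machine's filesystem.
    print("Stack uses local paths.")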

cleanup_step_run(info, step_failed)

Cleans up resources after the step run is finished.

Parameters:

Name Type Description Default
info StepRunInfo

Info about the step that was executed.

required
step_failed bool

Whether the step failed.

required
Source code in src/zenml/stack/stack.py
def cleanup_step_run(self, info: "StepRunInfo", step_failed: bool) -> None:
    """Cleans up resources after the step run is finished.

    Args:
        info: Info about the step that was executed.
        step_failed: Whether the step failed.
    """
    for component in self._get_active_components_for_step(
        info.config
    ).values():
        component.cleanup_step_run(info=info, step_failed=step_failed)

deploy_pipeline(deployment, placeholder_run=None)

Deploys a pipeline on this stack.

Parameters:

Name Type Description Default
deployment PipelineDeploymentResponse

The pipeline deployment.

required
placeholder_run Optional[PipelineRunResponse]

An optional placeholder run for the deployment.

None
Source code in src/zenml/stack/stack.py
def deploy_pipeline(
    self,
    deployment: "PipelineDeploymentResponse",
    placeholder_run: Optional["PipelineRunResponse"] = None,
) -> None:
    """Deploys a pipeline on this stack.

    Args:
        deployment: The pipeline deployment.
        placeholder_run: An optional placeholder run for the deployment.
    """
    self.orchestrator.run(
        deployment=deployment, stack=self, placeholder_run=placeholder_run
    )

dict()

Converts the stack into a dictionary.

Returns:

Type Description
Dict[str, str]

A dictionary containing the stack components.

Source code in src/zenml/stack/stack.py
def dict(self) -> Dict[str, str]:
    """Converts the stack into a dictionary.

    Returns:
        A dictionary containing the stack components.
    """
    component_dict = {
        component_type.value: json.dumps(
            component.config.model_dump(mode="json"), sort_keys=True
        )
        for component_type, component in self.components.items()
    }
    component_dict.update({"name": self.name})
    return component_dict
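
The resulting shape looks roughly like the following; the keys present depend on the configured components, and the values shown are illustrative assumptions (each config is serialized as a JSON string):

stack.dict()
# {
#     "orchestrator": '{"...": "..."}',
#     "artifact_store": '{"...": "..."}',
#     "name": "my_stack",
# }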

from_components(id, name, components) classmethod

Creates a stack instance from a dict of stack components.

Parameters:

Name Type Description Default
id UUID

Unique ID of the stack.

required
name str

The name of the stack.

required
components Dict[StackComponentType, StackComponent]

The components of the stack.

required

Returns:

Type Description
Stack

A stack instance consisting of the given components.

Raises:

Type Description
TypeError

If a required component is missing or a component doesn't inherit from the expected base class.

Source code in src/zenml/stack/stack.py
@classmethod
def from_components(
    cls,
    id: UUID,
    name: str,
    components: Dict[StackComponentType, "StackComponent"],
) -> "Stack":
    """Creates a stack instance from a dict of stack components.

    # noqa: DAR402

    Args:
        id: Unique ID of the stack.
        name: The name of the stack.
        components: The components of the stack.

    Returns:
        A stack instance consisting of the given components.

    Raises:
        TypeError: If a required component is missing or a component
            doesn't inherit from the expected base class.
    """
    from zenml.alerter import BaseAlerter
    from zenml.annotators import BaseAnnotator
    from zenml.artifact_stores import BaseArtifactStore
    from zenml.container_registries import BaseContainerRegistry
    from zenml.data_validators import BaseDataValidator
    from zenml.experiment_trackers import BaseExperimentTracker
    from zenml.feature_stores import BaseFeatureStore
    from zenml.image_builders import BaseImageBuilder
    from zenml.model_deployers import BaseModelDeployer
    from zenml.model_registries import BaseModelRegistry
    from zenml.orchestrators import BaseOrchestrator
    from zenml.step_operators import BaseStepOperator

    def _raise_type_error(
        component: Optional["StackComponent"], expected_class: Type[Any]
    ) -> NoReturn:
        """Raises a TypeError that the component has an unexpected type.

        Args:
            component: The component that has an unexpected type.
            expected_class: The expected type of the component.

        Raises:
            TypeError: If the component has an unexpected type.
        """
        raise TypeError(
            f"Unable to create stack: Wrong stack component type "
            f"`{component.__class__.__name__}` (expected: subclass "
            f"of `{expected_class.__name__}`)"
        )

    orchestrator = components.get(StackComponentType.ORCHESTRATOR)
    if not isinstance(orchestrator, BaseOrchestrator):
        _raise_type_error(orchestrator, BaseOrchestrator)

    artifact_store = components.get(StackComponentType.ARTIFACT_STORE)
    if not isinstance(artifact_store, BaseArtifactStore):
        _raise_type_error(artifact_store, BaseArtifactStore)

    container_registry = components.get(
        StackComponentType.CONTAINER_REGISTRY
    )
    if container_registry is not None and not isinstance(
        container_registry, BaseContainerRegistry
    ):
        _raise_type_error(container_registry, BaseContainerRegistry)

    step_operator = components.get(StackComponentType.STEP_OPERATOR)
    if step_operator is not None and not isinstance(
        step_operator, BaseStepOperator
    ):
        _raise_type_error(step_operator, BaseStepOperator)

    feature_store = components.get(StackComponentType.FEATURE_STORE)
    if feature_store is not None and not isinstance(
        feature_store, BaseFeatureStore
    ):
        _raise_type_error(feature_store, BaseFeatureStore)

    model_deployer = components.get(StackComponentType.MODEL_DEPLOYER)
    if model_deployer is not None and not isinstance(
        model_deployer, BaseModelDeployer
    ):
        _raise_type_error(model_deployer, BaseModelDeployer)

    experiment_tracker = components.get(
        StackComponentType.EXPERIMENT_TRACKER
    )
    if experiment_tracker is not None and not isinstance(
        experiment_tracker, BaseExperimentTracker
    ):
        _raise_type_error(experiment_tracker, BaseExperimentTracker)

    alerter = components.get(StackComponentType.ALERTER)
    if alerter is not None and not isinstance(alerter, BaseAlerter):
        _raise_type_error(alerter, BaseAlerter)

    annotator = components.get(StackComponentType.ANNOTATOR)
    if annotator is not None and not isinstance(annotator, BaseAnnotator):
        _raise_type_error(annotator, BaseAnnotator)

    data_validator = components.get(StackComponentType.DATA_VALIDATOR)
    if data_validator is not None and not isinstance(
        data_validator, BaseDataValidator
    ):
        _raise_type_error(data_validator, BaseDataValidator)

    image_builder = components.get(StackComponentType.IMAGE_BUILDER)
    if image_builder is not None and not isinstance(
        image_builder, BaseImageBuilder
    ):
        _raise_type_error(image_builder, BaseImageBuilder)

    model_registry = components.get(StackComponentType.MODEL_REGISTRY)
    if model_registry is not None and not isinstance(
        model_registry, BaseModelRegistry
    ):
        _raise_type_error(model_registry, BaseModelRegistry)

    return Stack(
        id=id,
        name=name,
        orchestrator=orchestrator,
        artifact_store=artifact_store,
        container_registry=container_registry,
        step_operator=step_operator,
        feature_store=feature_store,
        model_deployer=model_deployer,
        experiment_tracker=experiment_tracker,
        alerter=alerter,
        annotator=annotator,
        data_validator=data_validator,
        image_builder=image_builder,
        model_registry=model_registry,
    )

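A minimal sketch of assembling a `Stack` directly from component instances. Purely for illustration, the orchestrator and artifact store of the active stack are reused; a missing required component or a wrongly typed one triggers the `TypeError` described above.

```python
from uuid import uuid4

from zenml.client import Client
from zenml.enums import StackComponentType
from zenml.stack import Stack

active = Client().active_stack

# Build an in-memory stack from existing component instances.
stack = Stack.from_components(
    id=uuid4(),
    name="ad-hoc-stack",
    components={
        StackComponentType.ORCHESTRATOR: active.orchestrator,
        StackComponentType.ARTIFACT_STORE: active.artifact_store,
    },
)
print(stack.name, list(stack.components))
```
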
from_model(stack_model) classmethod

Creates a Stack instance from a StackModel.

Parameters:

Name Type Description Default
stack_model StackResponse

The StackModel to create the Stack from.

required

Returns:

Type Description
Stack

The created Stack instance.

Source code in src/zenml/stack/stack.py (lines 144-187)
@classmethod
def from_model(cls, stack_model: "StackResponse") -> "Stack":
    """Creates a Stack instance from a StackModel.

    Args:
        stack_model: The StackModel to create the Stack from.

    Returns:
        The created Stack instance.
    """
    global _STACK_CACHE
    key = (stack_model.id, stack_model.updated)
    if key in _STACK_CACHE:
        return _STACK_CACHE[key]

    from zenml.stack import StackComponent

    # Run a hydrated list call once to avoid one request per component
    component_models = pagination_utils.depaginate(
        Client().list_stack_components,
        stack_id=stack_model.id,
        hydrate=True,
    )

    stack_components = {
        model.type: StackComponent.from_model(model)
        for model in component_models
    }
    stack = Stack.from_components(
        id=stack_model.id,
        name=stack_model.name,
        components=stack_components,
    )
    _STACK_CACHE[key] = stack

    client = Client()
    if stack_model.id == client.active_stack_model.id:
        if stack_model.updated > client.active_stack_model.updated:
            if client._config:
                client._config.set_active_stack(stack_model)
            else:
                GlobalConfiguration().set_active_stack(stack_model)

    return stack

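A sketch of materializing a `Stack` from a `StackResponse` fetched through the client; repeated calls with the same `(id, updated)` pair are served from the module-level cache mentioned in the source.

```python
from zenml.client import Client
from zenml.stack import Stack

# Fetch the model of the active stack and turn it into a Stack instance.
stack_model = Client().active_stack_model
stack = Stack.from_model(stack_model)

print(stack.name, list(stack.components))
```
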
get_docker_builds(deployment)

Gets the Docker builds required for the stack.

Parameters:

Name Type Description Default
deployment PipelineDeploymentBase

The pipeline deployment for which to get the builds.

required

Returns:

Type Description
List[BuildConfiguration]

The required Docker builds.

Source code in src/zenml/stack/stack.py (lines 796-812)
def get_docker_builds(
    self, deployment: "PipelineDeploymentBase"
) -> List["BuildConfiguration"]:
    """Gets the Docker builds required for the stack.

    Args:
        deployment: The pipeline deployment for which to get the builds.

    Returns:
        The required Docker builds.
    """
    return list(
        itertools.chain.from_iterable(
            component.get_docker_builds(deployment=deployment)
            for component in self.components.values()
        )
    )

get_pipeline_run_metadata(run_id)

Get general component-specific metadata for a pipeline run.

Parameters:

Name Type Description Default
run_id UUID

ID of the pipeline run.

required

Returns:

Type Description
Dict[UUID, Dict[str, MetadataType]]

A dictionary mapping component IDs to the metadata they created.

Source code in src/zenml/stack/stack.py (lines 876-900)
def get_pipeline_run_metadata(
    self, run_id: UUID
) -> Dict[UUID, Dict[str, MetadataType]]:
    """Get general component-specific metadata for a pipeline run.

    Args:
        run_id: ID of the pipeline run.

    Returns:
        A dictionary mapping component IDs to the metadata they created.
    """
    pipeline_run_metadata: Dict[UUID, Dict[str, MetadataType]] = {}
    for component in self.components.values():
        try:
            component_metadata = component.get_pipeline_run_metadata(
                run_id=run_id
            )
            if component_metadata:
                pipeline_run_metadata[component.id] = component_metadata
        except Exception as e:
            logger.warning(
                f"Extracting pipeline run metadata failed for component "
                f"'{component.name}' of type '{component.type}': {e}"
            )
    return pipeline_run_metadata

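A sketch of consuming the returned mapping, assuming `run_id` refers to a pipeline run executed on this stack. Components whose metadata extraction fails are only logged as warnings and are simply missing from the result.

```python
from uuid import UUID

from zenml.client import Client


def show_run_metadata(run_id: UUID) -> None:
    """Print component-provided metadata for a pipeline run (sketch)."""
    stack = Client().active_stack
    metadata = stack.get_pipeline_run_metadata(run_id=run_id)
    for component_id, values in metadata.items():
        print(component_id, values)
```
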
get_step_run_metadata(info)

Get component-specific metadata for a step run.

Parameters:

Name Type Description Default
info StepRunInfo

Info about the step that was executed.

required

Returns:

Type Description
Dict[UUID, Dict[str, MetadataType]]

A dictionary mapping component IDs to the metadata they created.

Source code in src/zenml/stack/stack.py (lines 902-926)
def get_step_run_metadata(
    self, info: "StepRunInfo"
) -> Dict[UUID, Dict[str, MetadataType]]:
    """Get component-specific metadata for a step run.

    Args:
        info: Info about the step that was executed.

    Returns:
        A dictionary mapping component IDs to the metadata they created.
    """
    step_run_metadata: Dict[UUID, Dict[str, MetadataType]] = {}
    for component in self._get_active_components_for_step(
        info.config
    ).values():
        try:
            component_metadata = component.get_step_run_metadata(info=info)
            if component_metadata:
                step_run_metadata[component.id] = component_metadata
        except Exception as e:
            logger.warning(
                f"Extracting step run metadata failed for component "
                f"'{component.name}' of type '{component.type}': {e}"
            )
    return step_run_metadata

prepare_pipeline_deployment(deployment)

Prepares the stack for a pipeline deployment.

This method is called before a pipeline is deployed.

Parameters:

Name Type Description Default
deployment PipelineDeploymentResponse

The pipeline deployment

required

Raises:

Type Description
RuntimeError

If trying to deploy a pipeline that requires a remote ZenML server with a local one.

Source code in src/zenml/stack/stack.py (lines 765-794)
def prepare_pipeline_deployment(
    self, deployment: "PipelineDeploymentResponse"
) -> None:
    """Prepares the stack for a pipeline deployment.

    This method is called before a pipeline is deployed.

    Args:
        deployment: The pipeline deployment

    Raises:
        RuntimeError: If trying to deploy a pipeline that requires a remote
            ZenML server with a local one.
    """
    self.validate(fail_if_secrets_missing=True)

    if self.requires_remote_server and Client().zen_store.is_local_store():
        raise RuntimeError(
            "Stacks with remote components such as remote orchestrators "
            "and step operators require a remote "
            "ZenML server. To run a pipeline with this stack you need to "
            "connect to a remote ZenML server first. Check out "
            "https://docs.zenml.io/getting-started/deploying-zenml "
            "for more information on how to deploy ZenML."
        )

    for component in self.components.values():
        component.prepare_pipeline_deployment(
            deployment=deployment, stack=self
        )

prepare_step_run(info)

Prepares running a step.

Parameters:

Name Type Description Default
info StepRunInfo

Info about the step that will be executed.

required
Source code in src/zenml/stack/stack.py (lines 865-874)
def prepare_step_run(self, info: "StepRunInfo") -> None:
    """Prepares running a step.

    Args:
        info: Info about the step that will be executed.
    """
    for component in self._get_active_components_for_step(
        info.config
    ).values():
        component.prepare_step_run(info=info)

requirements(exclude_components=None)

Set of PyPI requirements for the stack.

This method combines the requirements of all stack components (except the ones specified in exclude_components).

Parameters:

Name Type Description Default
exclude_components Optional[AbstractSet[StackComponentType]]

Set of component types for which the requirements should not be included in the output.

None

Returns:

Type Description
Set[str]

Set of PyPI requirements.

Source code in src/zenml/stack/stack.py (lines 496-518)
def requirements(
    self,
    exclude_components: Optional[AbstractSet[StackComponentType]] = None,
) -> Set[str]:
    """Set of PyPI requirements for the stack.

    This method combines the requirements of all stack components (except
    the ones specified in `exclude_components`).

    Args:
        exclude_components: Set of component types for which the
            requirements should not be included in the output.

    Returns:
        Set of PyPI requirements.
    """
    exclude_components = exclude_components or set()
    requirements = [
        component.requirements
        for component in self.components.values()
        if component.type not in exclude_components
    ]
    return set.union(*requirements) if requirements else set()

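A sketch of collecting the stack's PyPI requirements while excluding one component type, e.g. when building an environment that does not need the orchestrator's packages.

```python
from zenml.client import Client
from zenml.enums import StackComponentType

stack = Client().active_stack

# All requirements except the ones contributed by the orchestrator.
reqs = stack.requirements(exclude_components={StackComponentType.ORCHESTRATOR})
print(sorted(reqs))
```
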
validate(fail_if_secrets_missing=False)

Checks whether the stack configuration is valid.

To check if a stack configuration is valid, the following criteria must be met:

- the stack must have an image builder if other components require it
- the StackValidator of each stack component has to validate the stack to make sure all the components are compatible with each other
- the required secrets of all components need to exist

Parameters:

Name Type Description Default
fail_if_secrets_missing bool

If this is True, an error will be raised if a secret for a component is missing. Otherwise, only a warning will be logged.

False
Source code in src/zenml/stack/stack.py (lines 692-719)
def validate(
    self,
    fail_if_secrets_missing: bool = False,
) -> None:
    """Checks whether the stack configuration is valid.

    To check if a stack configuration is valid, the following criteria must
    be met:
    - the stack must have an image builder if other components require it
    - the `StackValidator` of each stack component has to validate the
        stack to make sure all the components are compatible with each other
    - the required secrets of all components need to exist

    Args:
        fail_if_secrets_missing: If this is `True`, an error will be raised
            if a secret for a component is missing. Otherwise, only a
            warning will be logged.
    """
    if handle_bool_env_var(ENV_ZENML_SKIP_STACK_VALIDATION, default=False):
        logger.debug("Skipping stack validation.")
        return

    self.validate_image_builder()
    for component in self.components.values():
        if component.validator:
            component.validator.validate(stack=self)

    self._validate_secrets(raise_exception=fail_if_secrets_missing)

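A sketch of validating the active stack up front; with `fail_if_secrets_missing=True`, missing component secrets raise an error instead of only logging a warning.

```python
from zenml.client import Client

stack = Client().active_stack

# Raises if components are incompatible or required secrets are missing.
stack.validate(fail_if_secrets_missing=True)
```
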
validate_image_builder()

Validates that the stack has an image builder if required.

If the stack requires an image builder, but none is specified, a local image builder will be created and assigned to the stack to ensure backwards compatibility.

Source code in src/zenml/stack/stack.py (lines 721-763)
def validate_image_builder(self) -> None:
    """Validates that the stack has an image builder if required.

    If the stack requires an image builder, but none is specified, a
    local image builder will be created and assigned to the stack to
    ensure backwards compatibility.
    """
    requires_image_builder = (
        self.orchestrator.flavor != "local"
        or self.step_operator
        or (self.model_deployer and self.model_deployer.flavor != "mlflow")
    )
    skip_default_image_builder = handle_bool_env_var(
        ENV_ZENML_SKIP_IMAGE_BUILDER_DEFAULT, default=False
    )
    if (
        requires_image_builder
        and not skip_default_image_builder
        and not self.image_builder
    ):
        from uuid import uuid4

        from zenml.image_builders import (
            LocalImageBuilder,
            LocalImageBuilderConfig,
            LocalImageBuilderFlavor,
        )

        flavor = LocalImageBuilderFlavor()

        now = utc_now()
        image_builder = LocalImageBuilder(
            id=uuid4(),
            name="temporary_default",
            flavor=flavor.name,
            type=flavor.type,
            config=LocalImageBuilderConfig(),
            user=Client().active_user.id,
            created=now,
            updated=now,
        )

        self._image_builder = image_builder

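A sketch of the fallback behaviour: on a stack that needs an image builder but has none registered, calling this method assigns a temporary local image builder, which is then visible via the `image_builder` property.

```python
from zenml.client import Client

stack = Client().active_stack

stack.validate_image_builder()

# If the stack required an image builder but had none, a temporary
# LocalImageBuilder named "temporary_default" is now assigned.
print(stack.image_builder)
```
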
StackComponent

Abstract StackComponent class for all components of a ZenML stack.

Source code in src/zenml/stack/stack_component.py (lines 325-809)
class StackComponent:
    """Abstract StackComponent class for all components of a ZenML stack."""

    def __init__(
        self,
        name: str,
        id: UUID,
        config: StackComponentConfig,
        flavor: str,
        type: StackComponentType,
        user: Optional[UUID],
        created: datetime,
        updated: datetime,
        labels: Optional[Dict[str, Any]] = None,
        connector_requirements: Optional[ServiceConnectorRequirements] = None,
        connector: Optional[UUID] = None,
        connector_resource_id: Optional[str] = None,
        *args: Any,
        **kwargs: Any,
    ):
        """Initializes a StackComponent.

        Args:
            name: The name of the component.
            id: The unique ID of the component.
            config: The config of the component.
            flavor: The flavor of the component.
            type: The type of the component.
            user: The ID of the user who created the component.
            created: The creation time of the component.
            updated: The last update time of the component.
            labels: The labels of the component.
            connector_requirements: The requirements for the connector.
            connector: The ID of a connector linked to the component.
            connector_resource_id: The custom resource ID to access through
                the connector.
            *args: Additional positional arguments.
            **kwargs: Additional keyword arguments.

        Raises:
            ValueError: If a secret reference is passed as name.
        """
        if secret_utils.is_secret_reference(name):
            raise ValueError(
                "Passing the `name` attribute of a stack component as a "
                "secret reference is not allowed."
            )

        self.id = id
        self.name = name
        self._config = config
        self.flavor = flavor
        self.type = type
        self.user = user
        self.created = created
        self.updated = updated
        self.labels = labels
        self.connector_requirements = connector_requirements
        self.connector = connector
        self.connector_resource_id = connector_resource_id
        self._connector_instance: Optional[ServiceConnector] = None

    @classmethod
    def from_model(
        cls, component_model: "ComponentResponse"
    ) -> "StackComponent":
        """Creates a StackComponent from a ComponentModel.

        Args:
            component_model: The ComponentModel to create the StackComponent from.

        Returns:
            The created StackComponent.

        Raises:
            ImportError: If the flavor can't be imported.
        """
        from zenml.stack import Flavor

        flavor_model = component_model.flavor
        flavor = Flavor.from_model(flavor_model)

        configuration = flavor.config_class(**component_model.configuration)

        user_id = component_model.user_id

        try:
            return flavor.implementation_class(
                user=user_id,
                name=component_model.name,
                id=component_model.id,
                config=configuration,
                labels=component_model.labels,
                flavor=component_model.flavor_name,
                type=component_model.type,
                created=component_model.created,
                updated=component_model.updated,
                connector_requirements=flavor.service_connector_requirements,
                connector=component_model.connector.id
                if component_model.connector
                else None,
                connector_resource_id=component_model.connector_resource_id,
            )
        except ImportError as e:
            from zenml.integrations.registry import integration_registry

            integration_requirements = " ".join(
                integration_registry.select_integration_requirements(
                    flavor_model.integration
                )
            )

            if integration_registry.is_installed(flavor_model.integration):
                raise ImportError(
                    f"{e}\n\n"
                    f"Something went wrong while trying to import from the "
                    f"`{flavor_model.integration}` integration. Please make "
                    "sure that all its requirements are installed properly by "
                    "reinstalling the integration either through our CLI: "
                    f"`zenml integration install {flavor_model.integration} "
                    "-y` or by manually installing its requirements: "
                    f"`pip install {integration_requirements}`. If the error "
                    "persists, please contact the ZenML team."
                ) from e
            else:
                raise ImportError(
                    f"{e}\n\n"
                    f"The `{flavor_model.integration}` integration that you "
                    "are trying to use is not installed in your current "
                    "environment. Please make sure that it is installed by "
                    "either using our CLI: `zenml integration install "
                    f"{flavor_model.integration}` or by manually installing "
                    f"its requirements: `pip install "
                    f"{integration_requirements}`"
                ) from e

    @property
    def config(self) -> StackComponentConfig:
        """Returns the configuration of the stack component.

        This should be overwritten by any subclasses that define custom configs
        to return the correct config class.

        Returns:
            The configuration of the stack component.
        """
        return self._config

    @property
    def settings_class(self) -> Optional[Type["BaseSettings"]]:
        """Class specifying available settings for this component.

        Returns:
            Optional settings class.
        """
        return None

    def get_settings(
        self,
        container: Union[
            "Step",
            "StepRunResponse",
            "StepRunInfo",
            "PipelineDeploymentBase",
            "PipelineDeploymentResponse",
            "PipelineRunResponse",
        ],
    ) -> "BaseSettings":
        """Gets settings for this stack component.

        This will return `None` if the stack component doesn't specify a
        settings class or the container doesn't contain runtime
        options for this component.

        Args:
            container: The `Step`, `StepRunInfo` or `PipelineDeployment` from
                which to get the settings.

        Returns:
            Settings for this stack component.

        Raises:
            RuntimeError: If the stack component does not specify a settings
                class.
        """
        if not self.settings_class:
            raise RuntimeError(
                f"Unable to get settings for component {self} because this "
                "component does not have an associated settings class. "
                "Return a settings class from the `@settings_class` property "
                "and try again."
            )

        key = settings_utils.get_stack_component_setting_key(self)

        all_settings = (
            container.config.settings
            if isinstance(
                container,
                (Step, StepRunResponse, StepRunInfo, PipelineRunResponse),
            )
            else container.pipeline_configuration.settings
        )

        # Use the current config as a base
        settings_dict = self.config.model_dump()

        if key in all_settings:
            settings_dict.update(dict(all_settings[key]))

        return self.settings_class.model_validate(settings_dict)

    def connector_has_expired(self) -> bool:
        """Checks whether the connector linked to this stack component has expired.

        Returns:
            Whether the connector linked to this stack component has expired, or isn't linked to a connector.
        """
        if self.connector is None:
            # The stack component isn't linked to a connector
            return False

        if self._connector_instance is None:
            return True

        return self._connector_instance.has_expired()

    def get_connector(self) -> Optional["ServiceConnector"]:
        """Returns the connector linked to this stack component.

        Returns:
            The connector linked to this stack component.

        Raises:
            RuntimeError: If the stack component does not specify connector
                requirements or if the connector linked to the component is not
                compatible or not found.
        """
        from zenml.client import Client

        if self.connector is None:
            return None

        if self._connector_instance is not None:
            # If the connector instance is still valid, return it. Otherwise,
            # we'll try to get a new one.
            if not self._connector_instance.has_expired():
                return self._connector_instance

        if self.connector_requirements is None:
            raise RuntimeError(
                f"Unable to get connector for component {self} because this "
                "component does not declare any connector requirements in its. "
                "flavor specification. Override the "
                "`service_connector_requirements` method in its flavor class "
                "to return a connector requirements specification and try "
                "again."
            )

        if self.connector_requirements.resource_id_attr is not None:
            # Check if an attribute is set in the component configuration
            resource_id = getattr(
                self.config, self.connector_requirements.resource_id_attr
            )
        else:
            # Otherwise, use the resource ID configured in the component
            resource_id = self.connector_resource_id

        client = Client()
        try:
            self._connector_instance = client.get_service_connector_client(
                name_id_or_prefix=self.connector,
                resource_type=self.connector_requirements.resource_type,
                resource_id=resource_id,
            )
        except KeyError:
            raise RuntimeError(
                f"The connector with ID {self.connector} linked "
                f"to the '{self.name}' {self.type} stack component could not "
                f"be found or is not accessible. Please verify that the "
                f"connector exists and that you have access to it."
            )
        except ValueError as e:
            raise RuntimeError(
                f"The connector with ID {self.connector} linked "
                f"to the '{self.name}' {self.type} stack component could not "
                f"be correctly configured: {e}."
            )
        except AuthorizationException as e:
            raise RuntimeError(
                f"The connector with ID {self.connector} linked "
                f"to the '{self.name}' {self.type} stack component could not "
                f"be accessed due to an authorization error: {e}. Please "
                f"verify that you have access to the connector and try again."
            )

        return self._connector_instance

    @property
    def log_file(self) -> Optional[str]:
        """Optional path to a log file for the stack component.

        Returns:
            Optional path to a log file for the stack component.
        """
        # TODO [ENG-136]: Add support for multiple log files for a stack
        #  component. E.g. let each component return a generator that yields
        #  logs instead of specifying a single file path.
        return None

    @property
    def requirements(self) -> Set[str]:
        """Set of PyPI requirements for the component.

        Returns:
            A set of PyPI requirements for the component.
        """
        from zenml.integrations.utils import get_requirements_for_module

        return set(get_requirements_for_module(self.__module__))

    @property
    def apt_packages(self) -> List[str]:
        """List of APT package requirements for the component.

        Returns:
            A list of APT package requirements for the component.
        """
        from zenml.integrations.utils import get_integration_for_module

        integration = get_integration_for_module(self.__module__)
        return integration.APT_PACKAGES if integration else []

    @property
    def local_path(self) -> Optional[str]:
        """Path to a local directory to store persistent information.

        This property should only be implemented by components that need to
        store persistent information in a directory on the local machine and
        also need that information to be available during pipeline runs.

        IMPORTANT: the path returned by this property must always be a path
        that is relative to the ZenML local store's directory. The local
        orchestrators rely on this convention to correctly mount the
        local folders in the containers. This is an example of a valid
        path:

        ```python
        from zenml.config.global_config import GlobalConfiguration

        ...

        @property
        def local_path(self) -> Optional[str]:

            return os.path.join(
                GlobalConfiguration().local_stores_path,
                str(self.uuid),
            )
        ```

        Returns:
            A path to a local directory used by the component to store
            persistent information.
        """
        return None

    def get_docker_builds(
        self, deployment: "PipelineDeploymentBase"
    ) -> List["BuildConfiguration"]:
        """Gets the Docker builds required for the component.

        Args:
            deployment: The pipeline deployment for which to get the builds.

        Returns:
            The required Docker builds.
        """
        return []

    def prepare_pipeline_deployment(
        self,
        deployment: "PipelineDeploymentResponse",
        stack: "Stack",
    ) -> None:
        """Prepares deploying the pipeline.

        This method gets called immediately before a pipeline is deployed.
        Subclasses should override it if they require runtime configuration
        options or if they need to run code before the pipeline deployment.

        Args:
            deployment: The pipeline deployment configuration.
            stack: The stack on which the pipeline will be deployed.
        """

    def get_pipeline_run_metadata(
        self, run_id: UUID
    ) -> Dict[str, "MetadataType"]:
        """Get general component-specific metadata for a pipeline run.

        Args:
            run_id: The ID of the pipeline run.

        Returns:
            A dictionary of metadata.
        """
        return {}

    def prepare_step_run(self, info: "StepRunInfo") -> None:
        """Prepares running a step.

        Args:
            info: Info about the step that will be executed.
        """

    def get_step_run_metadata(
        self, info: "StepRunInfo"
    ) -> Dict[str, "MetadataType"]:
        """Get component- and step-specific metadata after a step ran.

        Args:
            info: Info about the step that was executed.

        Returns:
            A dictionary of metadata.
        """
        return {}

    def cleanup_step_run(self, info: "StepRunInfo", step_failed: bool) -> None:
        """Cleans up resources after the step run is finished.

        Args:
            info: Info about the step that was executed.
            step_failed: Whether the step failed.
        """

    @property
    def post_registration_message(self) -> Optional[str]:
        """Optional message printed after the stack component is registered.

        Returns:
            An optional message.
        """
        return None

    @property
    def validator(self) -> Optional["StackValidator"]:
        """The optional validator of the stack component.

        This validator will be called each time a stack with the stack
        component is initialized. Subclasses should override this property
        and return a `StackValidator` that makes sure they're not included in
        any stack that they're not compatible with.

        Returns:
            An optional `StackValidator` instance.
        """
        return None

    def cleanup(self) -> None:
        """Cleans up the component after it has been used."""
        pass

    def __repr__(self) -> str:
        """String representation of the stack component.

        Returns:
            A string representation of the stack component.
        """
        attribute_representation = ", ".join(
            f"{key}={value}" for key, value in self.config.model_dump().items()
        )
        return (
            f"{self.__class__.__qualname__}(type={self.type}, "
            f"flavor={self.flavor}, {attribute_representation})"
        )

    def __str__(self) -> str:
        """String representation of the stack component.

        Returns:
            A string representation of the stack component.
        """
        return self.__repr__()

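This class is usually not instantiated directly; concrete components subclass it (typically via a flavor's implementation class). Below is a minimal, hypothetical sketch that narrows the `config` property and exposes a settings class. `MyComponentConfig` and `MyComponentSettings` are invented names, not part of ZenML, and the import paths reflect my best understanding of the package layout.

```python
from typing import Optional, Type, cast

from zenml.config.base_settings import BaseSettings
from zenml.stack import StackComponent, StackComponentConfig


class MyComponentSettings(BaseSettings):
    """Hypothetical runtime settings for the component."""

    verbose: bool = False


class MyComponentConfig(StackComponentConfig):
    """Hypothetical static configuration for the component."""

    endpoint: str = "http://localhost:8080"


class MyComponent(StackComponent):
    """Hypothetical component narrowing config and settings types."""

    @property
    def config(self) -> MyComponentConfig:
        # Cast the generic `_config` to the component-specific config class.
        return cast(MyComponentConfig, self._config)

    @property
    def settings_class(self) -> Optional[Type[BaseSettings]]:
        return MyComponentSettings
```
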
apt_packages property

List of APT package requirements for the component.

Returns:

Type Description
List[str]

A list of APT package requirements for the component.

config property

Returns the configuration of the stack component.

This should be overwritten by any subclasses that define custom configs to return the correct config class.

Returns:

Type Description
StackComponentConfig

The configuration of the stack component.

local_path property

Path to a local directory to store persistent information.

This property should only be implemented by components that need to store persistent information in a directory on the local machine and also need that information to be available during pipeline runs.

IMPORTANT: the path returned by this property must always be a path that is relative to the ZenML local store's directory. The local orchestrators rely on this convention to correctly mount the local folders in the containers. This is an example of a valid path:

from zenml.config.global_config import GlobalConfiguration

...

@property
def local_path(self) -> Optional[str]:

    return os.path.join(
        GlobalConfiguration().local_stores_path,
        str(self.uuid),
    )

Returns:

Type Description
Optional[str]

A path to a local directory used by the component to store persistent information.

log_file property

Optional path to a log file for the stack component.

Returns:

Type Description
Optional[str]

Optional path to a log file for the stack component.

post_registration_message property

Optional message printed after the stack component is registered.

Returns:

Type Description
Optional[str]

An optional message.

requirements property

Set of PyPI requirements for the component.

Returns:

Type Description
Set[str]

A set of PyPI requirements for the component.

settings_class property

Class specifying available settings for this component.

Returns:

Type Description
Optional[Type[BaseSettings]]

Optional settings class.

validator property

The optional validator of the stack component.

This validator will be called each time a stack with the stack component is initialized. Subclasses should override this property and return a StackValidator that makes sure they're not included in any stack that they're not compatible with.

Returns:

Type Description
Optional[StackValidator]

An optional StackValidator instance.

__init__(name, id, config, flavor, type, user, created, updated, labels=None, connector_requirements=None, connector=None, connector_resource_id=None, *args, **kwargs)

Initializes a StackComponent.

Parameters:

Name Type Description Default
name str

The name of the component.

required
id UUID

The unique ID of the component.

required
config StackComponentConfig

The config of the component.

required
flavor str

The flavor of the component.

required
type StackComponentType

The type of the component.

required
user Optional[UUID]

The ID of the user who created the component.

required
created datetime

The creation time of the component.

required
updated datetime

The last update time of the component.

required
labels Optional[Dict[str, Any]]

The labels of the component.

None
connector_requirements Optional[ServiceConnectorRequirements]

The requirements for the connector.

None
connector Optional[UUID]

The ID of a connector linked to the component.

None
connector_resource_id Optional[str]

The custom resource ID to access through the connector.

None
*args Any

Additional positional arguments.

()
**kwargs Any

Additional keyword arguments.

{}

Raises:

Type Description
ValueError

If a secret reference is passed as name.

Source code in src/zenml/stack/stack_component.py (lines 328-385)
def __init__(
    self,
    name: str,
    id: UUID,
    config: StackComponentConfig,
    flavor: str,
    type: StackComponentType,
    user: Optional[UUID],
    created: datetime,
    updated: datetime,
    labels: Optional[Dict[str, Any]] = None,
    connector_requirements: Optional[ServiceConnectorRequirements] = None,
    connector: Optional[UUID] = None,
    connector_resource_id: Optional[str] = None,
    *args: Any,
    **kwargs: Any,
):
    """Initializes a StackComponent.

    Args:
        name: The name of the component.
        id: The unique ID of the component.
        config: The config of the component.
        flavor: The flavor of the component.
        type: The type of the component.
        user: The ID of the user who created the component.
        created: The creation time of the component.
        updated: The last update time of the component.
        labels: The labels of the component.
        connector_requirements: The requirements for the connector.
        connector: The ID of a connector linked to the component.
        connector_resource_id: The custom resource ID to access through
            the connector.
        *args: Additional positional arguments.
        **kwargs: Additional keyword arguments.

    Raises:
        ValueError: If a secret reference is passed as name.
    """
    if secret_utils.is_secret_reference(name):
        raise ValueError(
            "Passing the `name` attribute of a stack component as a "
            "secret reference is not allowed."
        )

    self.id = id
    self.name = name
    self._config = config
    self.flavor = flavor
    self.type = type
    self.user = user
    self.created = created
    self.updated = updated
    self.labels = labels
    self.connector_requirements = connector_requirements
    self.connector = connector
    self.connector_resource_id = connector_resource_id
    self._connector_instance: Optional[ServiceConnector] = None

__repr__()

String representation of the stack component.

Returns:

Type Description
str

A string representation of the stack component.

Source code in src/zenml/stack/stack_component.py (lines 789-801)
def __repr__(self) -> str:
    """String representation of the stack component.

    Returns:
        A string representation of the stack component.
    """
    attribute_representation = ", ".join(
        f"{key}={value}" for key, value in self.config.model_dump().items()
    )
    return (
        f"{self.__class__.__qualname__}(type={self.type}, "
        f"flavor={self.flavor}, {attribute_representation})"
    )

__str__()

String representation of the stack component.

Returns:

Type Description
str

A string representation of the stack component.

Source code in src/zenml/stack/stack_component.py (lines 803-809)
def __str__(self) -> str:
    """String representation of the stack component.

    Returns:
        A string representation of the stack component.
    """
    return self.__repr__()

cleanup()

Cleans up the component after it has been used.

Source code in src/zenml/stack/stack_component.py (lines 785-787)
def cleanup(self) -> None:
    """Cleans up the component after it has been used."""
    pass

cleanup_step_run(info, step_failed)

Cleans up resources after the step run is finished.

Parameters:

Name Type Description Default
info StepRunInfo

Info about the step that was executed.

required
step_failed bool

Whether the step failed.

required
Source code in src/zenml/stack/stack_component.py (lines 754-760)
def cleanup_step_run(self, info: "StepRunInfo", step_failed: bool) -> None:
    """Cleans up resources after the step run is finished.

    Args:
        info: Info about the step that was executed.
        step_failed: Whether the step failed.
    """

connector_has_expired()

Checks whether the connector linked to this stack component has expired.

Returns:

Type Description
bool

Whether the connector linked to this stack component has expired, or isn't linked to a connector.

Source code in src/zenml/stack/stack_component.py (lines 537-550)
def connector_has_expired(self) -> bool:
    """Checks whether the connector linked to this stack component has expired.

    Returns:
        Whether the connector linked to this stack component has expired, or isn't linked to a connector.
    """
    if self.connector is None:
        # The stack component isn't linked to a connector
        return False

    if self._connector_instance is None:
        return True

    return self._connector_instance.has_expired()

from_model(component_model) classmethod

Creates a StackComponent from a ComponentModel.

Parameters:

Name Type Description Default
component_model ComponentResponse

The ComponentModel to create the StackComponent from.

required

Returns:

Type Description
StackComponent

The created StackComponent.

Raises:

Type Description
ImportError

If the flavor can't be imported.

Source code in src/zenml/stack/stack_component.py (lines 387-459)
@classmethod
def from_model(
    cls, component_model: "ComponentResponse"
) -> "StackComponent":
    """Creates a StackComponent from a ComponentModel.

    Args:
        component_model: The ComponentModel to create the StackComponent from.

    Returns:
        The created StackComponent.

    Raises:
        ImportError: If the flavor can't be imported.
    """
    from zenml.stack import Flavor

    flavor_model = component_model.flavor
    flavor = Flavor.from_model(flavor_model)

    configuration = flavor.config_class(**component_model.configuration)

    user_id = component_model.user_id

    try:
        return flavor.implementation_class(
            user=user_id,
            name=component_model.name,
            id=component_model.id,
            config=configuration,
            labels=component_model.labels,
            flavor=component_model.flavor_name,
            type=component_model.type,
            created=component_model.created,
            updated=component_model.updated,
            connector_requirements=flavor.service_connector_requirements,
            connector=component_model.connector.id
            if component_model.connector
            else None,
            connector_resource_id=component_model.connector_resource_id,
        )
    except ImportError as e:
        from zenml.integrations.registry import integration_registry

        integration_requirements = " ".join(
            integration_registry.select_integration_requirements(
                flavor_model.integration
            )
        )

        if integration_registry.is_installed(flavor_model.integration):
            raise ImportError(
                f"{e}\n\n"
                f"Something went wrong while trying to import from the "
                f"`{flavor_model.integration}` integration. Please make "
                "sure that all its requirements are installed properly by "
                "reinstalling the integration either through our CLI: "
                f"`zenml integration install {flavor_model.integration} "
                "-y` or by manually installing its requirements: "
                f"`pip install {integration_requirements}`. If the error "
                "persists, please contact the ZenML team."
            ) from e
        else:
            raise ImportError(
                f"{e}\n\n"
                f"The `{flavor_model.integration}` integration that you "
                "are trying to use is not installed in your current "
                "environment. Please make sure that it is installed by "
                "either using our CLI: `zenml integration install "
                f"{flavor_model.integration}` or by manually installing "
                f"its requirements: `pip install "
                f"{integration_requirements}`"
            ) from e

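A sketch of materializing a single component from its client-side model. The lookup call and its argument names follow the ZenML client as I understand it and should be treated as an assumption; the flavor's implementation class is imported lazily, so a missing integration surfaces as the `ImportError` described above.

```python
from zenml.client import Client
from zenml.enums import StackComponentType
from zenml.stack import StackComponent

# Assumption: fetch a registered artifact store named "default".
component_model = Client().get_stack_component(
    component_type=StackComponentType.ARTIFACT_STORE,
    name_id_or_prefix="default",
)
component = StackComponent.from_model(component_model)
print(component)
```
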
get_connector()

Returns the connector linked to this stack component.

Returns:

Type Description
Optional[ServiceConnector]

The connector linked to this stack component.

Raises:

Type Description
RuntimeError

If the stack component does not specify connector requirements or if the connector linked to the component is not compatible or not found.

Source code in src/zenml/stack/stack_component.py (lines 552-621)
def get_connector(self) -> Optional["ServiceConnector"]:
    """Returns the connector linked to this stack component.

    Returns:
        The connector linked to this stack component.

    Raises:
        RuntimeError: If the stack component does not specify connector
            requirements or if the connector linked to the component is not
            compatible or not found.
    """
    from zenml.client import Client

    if self.connector is None:
        return None

    if self._connector_instance is not None:
        # If the connector instance is still valid, return it. Otherwise,
        # we'll try to get a new one.
        if not self._connector_instance.has_expired():
            return self._connector_instance

    if self.connector_requirements is None:
        raise RuntimeError(
            f"Unable to get connector for component {self} because this "
            "component does not declare any connector requirements in its. "
            "flavor specification. Override the "
            "`service_connector_requirements` method in its flavor class "
            "to return a connector requirements specification and try "
            "again."
        )

    if self.connector_requirements.resource_id_attr is not None:
        # Check if an attribute is set in the component configuration
        resource_id = getattr(
            self.config, self.connector_requirements.resource_id_attr
        )
    else:
        # Otherwise, use the resource ID configured in the component
        resource_id = self.connector_resource_id

    client = Client()
    try:
        self._connector_instance = client.get_service_connector_client(
            name_id_or_prefix=self.connector,
            resource_type=self.connector_requirements.resource_type,
            resource_id=resource_id,
        )
    except KeyError:
        raise RuntimeError(
            f"The connector with ID {self.connector} linked "
            f"to the '{self.name}' {self.type} stack component could not "
            f"be found or is not accessible. Please verify that the "
            f"connector exists and that you have access to it."
        )
    except ValueError as e:
        raise RuntimeError(
            f"The connector with ID {self.connector} linked "
            f"to the '{self.name}' {self.type} stack component could not "
            f"be correctly configured: {e}."
        )
    except AuthorizationException as e:
        raise RuntimeError(
            f"The connector with ID {self.connector} linked "
            f"to the '{self.name}' {self.type} stack component could not "
            f"be accessed due to an authorization error: {e}. Please "
            f"verify that you have access to the connector and try again."
        )

    return self._connector_instance

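A hypothetical sketch of how a component implementation might use its linked connector, e.g. to obtain an authenticated client inside one of its own methods. The `connect()` call is part of the service connector interface; what it returns depends on the connector type.

```python
from typing import Any, Optional

from zenml.stack import StackComponent


class MyComponent(StackComponent):
    """Hypothetical component that uses its linked service connector."""

    def _get_authenticated_client(self) -> Optional[Any]:
        connector = self.get_connector()
        if connector is None:
            # No connector linked to this component: fall back to whatever
            # local/implicit credentials are available.
            return None
        # Returns a connector-specific client object (assumption: e.g. a
        # boto3 session or a Kubernetes API client, depending on the type).
        return connector.connect()
```
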
get_docker_builds(deployment)

Gets the Docker builds required for the component.

Parameters:

Name Type Description Default
deployment PipelineDeploymentBase

The pipeline deployment for which to get the builds.

required

Returns:

Type Description
List[BuildConfiguration]

The required Docker builds.

Source code in src/zenml/stack/stack_component.py (lines 692-703)
def get_docker_builds(
    self, deployment: "PipelineDeploymentBase"
) -> List["BuildConfiguration"]:
    """Gets the Docker builds required for the component.

    Args:
        deployment: The pipeline deployment for which to get the builds.

    Returns:
        The required Docker builds.
    """
    return []

get_pipeline_run_metadata(run_id)

Get general component-specific metadata for a pipeline run.

Parameters:

Name Type Description Default
run_id UUID

The ID of the pipeline run.

required

Returns:

Type Description
Dict[str, MetadataType]

A dictionary of metadata.

Source code in src/zenml/stack/stack_component.py (lines 721-732)
def get_pipeline_run_metadata(
    self, run_id: UUID
) -> Dict[str, "MetadataType"]:
    """Get general component-specific metadata for a pipeline run.

    Args:
        run_id: The ID of the pipeline run.

    Returns:
        A dictionary of metadata.
    """
    return {}

get_settings(container)

Gets settings for this stack component.

This will return None if the stack component doesn't specify a settings class or the container doesn't contain runtime options for this component.

Parameters:

Name Type Description Default
container Union[Step, StepRunResponse, StepRunInfo, PipelineDeploymentBase, PipelineDeploymentResponse, PipelineRunResponse]

The Step, StepRunInfo or PipelineDeployment from which to get the settings.

required

Returns:

Type Description
BaseSettings

Settings for this stack component.

Raises:

Type Description
RuntimeError

If the stack component does not specify a settings class.

Source code in src/zenml/stack/stack_component.py (lines 482-535)
def get_settings(
    self,
    container: Union[
        "Step",
        "StepRunResponse",
        "StepRunInfo",
        "PipelineDeploymentBase",
        "PipelineDeploymentResponse",
        "PipelineRunResponse",
    ],
) -> "BaseSettings":
    """Gets settings for this stack component.

    This will return `None` if the stack component doesn't specify a
    settings class or the container doesn't contain runtime
    options for this component.

    Args:
        container: The `Step`, `StepRunInfo` or `PipelineDeployment` from
            which to get the settings.

    Returns:
        Settings for this stack component.

    Raises:
        RuntimeError: If the stack component does not specify a settings
            class.
    """
    if not self.settings_class:
        raise RuntimeError(
            f"Unable to get settings for component {self} because this "
            "component does not have an associated settings class. "
            "Return a settings class from the `@settings_class` property "
            "and try again."
        )

    key = settings_utils.get_stack_component_setting_key(self)

    all_settings = (
        container.config.settings
        if isinstance(
            container,
            (Step, StepRunResponse, StepRunInfo, PipelineRunResponse),
        )
        else container.pipeline_configuration.settings
    )

    # Use the current config as a base
    settings_dict = self.config.model_dump()

    if key in all_settings:
        settings_dict.update(dict(all_settings[key]))

    return self.settings_class.model_validate(settings_dict)

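A hypothetical sketch of calling this from a component's step-level hook: the static config supplies defaults and any step- or pipeline-level settings configured for this component override them. The component and its settings class are invented for illustration; the `StepRunInfo` import path reflects my best understanding of the package layout.

```python
from zenml.config.step_run_info import StepRunInfo
from zenml.stack import StackComponent


class MyComponent(StackComponent):
    """Hypothetical component reading merged settings at step runtime."""

    def prepare_step_run(self, info: StepRunInfo) -> None:
        # Assumes `settings_class` is overridden to return a settings class,
        # as in the subclassing sketch earlier; otherwise this call raises.
        settings = self.get_settings(info)
        print(settings)
```
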
get_step_run_metadata(info)

Get component- and step-specific metadata after a step ran.

Parameters:

Name Type Description Default
info StepRunInfo

Info about the step that was executed.

required

Returns:

Type Description
Dict[str, MetadataType]

A dictionary of metadata.

Source code in src/zenml/stack/stack_component.py (lines 741-752)
def get_step_run_metadata(
    self, info: "StepRunInfo"
) -> Dict[str, "MetadataType"]:
    """Get component- and step-specific metadata after a step ran.

    Args:
        info: Info about the step that was executed.

    Returns:
        A dictionary of metadata.
    """
    return {}

prepare_pipeline_deployment(deployment, stack)

Prepares deploying the pipeline.

This method gets called immediately before a pipeline is deployed. Subclasses should override it if they require runtime configuration options or if they need to run code before the pipeline deployment.

Parameters:

Name Type Description Default
deployment PipelineDeploymentResponse

The pipeline deployment configuration.

required
stack Stack

The stack on which the pipeline will be deployed.

required
Source code in src/zenml/stack/stack_component.py (lines 705-719)
def prepare_pipeline_deployment(
    self,
    deployment: "PipelineDeploymentResponse",
    stack: "Stack",
) -> None:
    """Prepares deploying the pipeline.

    This method gets called immediately before a pipeline is deployed.
    Subclasses should override it if they require runtime configuration
    options or if they need to run code before the pipeline deployment.

    Args:
        deployment: The pipeline deployment configuration.
        stack: The stack on which the pipeline will be deployed.
    """

prepare_step_run(info)

Prepares running a step.

Parameters:

Name Type Description Default
info StepRunInfo

Info about the step that will be executed.

required
Source code in src/zenml/stack/stack_component.py (lines 734-739)
def prepare_step_run(self, info: "StepRunInfo") -> None:
    """Prepares running a step.

    Args:
        info: Info about the step that will be executed.
    """

StackComponentConfig

Bases: BaseModel, ABC

Base class for all ZenML stack component configs.

Source code in src/zenml/stack/stack_component.py (lines 58-322)
class StackComponentConfig(BaseModel, ABC):
    """Base class for all ZenML stack component configs."""

    def __init__(
        self, warn_about_plain_text_secrets: bool = False, **kwargs: Any
    ) -> None:
        """Ensures that secret references don't clash with pydantic validation.

        StackComponents allow the specification of all their string attributes
        using secret references of the form `{{secret_name.key}}`. This however
        is only possible when the stack component does not perform any explicit
        validation of this attribute using pydantic validators. If this were
        the case, the validation would run on the secret reference and would
        fail or in the worst case, modify the secret reference and lead to
        unexpected behavior. This method ensures that no attributes that require
        custom pydantic validation are set as secret references.

        Args:
            warn_about_plain_text_secrets: If true, then warns about using
                plain-text secrets.
            **kwargs: Arguments to initialize this stack component.

        Raises:
            ValueError: If an attribute that requires custom pydantic validation
                is passed as a secret reference, or if the `name` attribute
                was passed as a secret reference.
        """
        for key, value in kwargs.items():
            try:
                field = self.__class__.model_fields[key]
            except KeyError:
                # Value for a private attribute or non-existing field, this
                # will fail during the upcoming pydantic validation
                continue

            if value is None:
                continue

            if not secret_utils.is_secret_reference(value):
                if (
                    secret_utils.is_secret_field(field)
                    and warn_about_plain_text_secrets
                ):
                    logger.warning(
                        "You specified a plain-text value for the sensitive "
                        f"attribute `{key}` for a `{self.__class__.__name__}` "
                        "stack component. This is currently only a warning, "
                        "but future versions of ZenML will require you to pass "
                        "in sensitive information as secrets. Check out the "
                        "documentation on how to configure your stack "
                        "components with secrets here: "
                        "https://docs.zenml.io/deploying-zenml/deploying-zenml/secret-management"
                    )
                continue

            if pydantic_utils.has_validators(
                pydantic_class=self.__class__, field_name=key
            ):
                raise ValueError(
                    f"Passing the stack component attribute `{key}` as a "
                    "secret reference is not allowed as additional validation "
                    "is required for this attribute."
                )

        super().__init__(**kwargs)

    @property
    def required_secrets(self) -> Set[secret_utils.SecretReference]:
        """All required secrets for this stack component.

        Returns:
            The required secrets of this stack component.
        """
        return {
            secret_utils.parse_secret_reference(v)
            for v in self.model_dump().values()
            if secret_utils.is_secret_reference(v)
        }

    @property
    def is_remote(self) -> bool:
        """Checks if this stack component is running remotely.

        Concrete stack component configuration classes should override this
        method to return True if the stack component is running in a remote
        location, and it needs to access the ZenML database.

        This designation is used to determine if the stack component can be
        used with a local ZenML database or if it requires a remote ZenML
        server.

        Examples:
          * Orchestrators that are running pipelines in the cloud or in a
          location other than the local host
          * Step Operators that are running steps in the cloud or in a location
          other than the local host

        Returns:
            True if this config is for a remote component, False otherwise.
        """
        return False

    @property
    def is_valid(self) -> bool:
        """Checks if the stack component configurations are valid.

        Concrete stack component configuration classes should override this
        method to return False if the stack component configurations are invalid.

        Returns:
            True if the stack component config is valid, False otherwise.
        """
        return True

    @property
    def is_local(self) -> bool:
        """Checks if this stack component is running locally.

        Concrete stack component configuration classes should override this
        method to return True if the stack component is relying on local
        resources or capabilities (e.g. local filesystem, local database or
        other services).

        Examples:
          * Artifact Stores that store artifacts in the local filesystem
          * Orchestrators that are connected to local orchestration runtime
          services (e.g. local Kubernetes clusters, Docker containers etc).

        Returns:
            True if this config is for a local component, False otherwise.
        """
        return False

    def __custom_getattribute__(self, key: str) -> Any:
        """Returns the (potentially resolved) attribute value for the given key.

        An attribute value may be either specified directly, or as a secret
        reference. In case of a secret reference, this method resolves the
        reference and returns the secret value instead.

        Args:
            key: The key for which to get the attribute value.

        Raises:
            KeyError: If the secret or secret key don't exist.

        Returns:
            The (potentially resolved) attribute value.
        """
        from zenml.client import Client

        value = super().__getattribute__(key)

        if not secret_utils.is_secret_reference(value):
            return value

        secret_ref = secret_utils.parse_secret_reference(value)

        # Try to resolve the secret using the secret store
        try:
            secret = Client().get_secret_by_name_and_private_status(
                name=secret_ref.name,
            )
        except (KeyError, NotImplementedError):
            raise KeyError(
                f"Failed to resolve secret reference for attribute {key} "
                f"of stack component `{self}`: The secret "
                f"{secret_ref.name} does not exist."
            )

        if secret_ref.key not in secret.values:
            raise KeyError(
                f"Failed to resolve secret reference for attribute {key} "
                f"of stack component `{self}`. "
                f"The secret {secret_ref.name} does not contain a value "
                f"for key {secret_ref.key}. Available keys: "
                f"{set(secret.values.keys())}."
            )

        return secret.secret_values[secret_ref.key]

    def _is_part_of_active_stack(self) -> bool:
        """Checks if this config belongs to a component in the active stack.

        Returns:
            True if this config belongs to a component in the active stack,
            False otherwise.
        """
        from zenml.client import Client

        for component in Client().active_stack.components.values():
            if component.config == self:
                return True
        return False

    if not TYPE_CHECKING:
        # When defining __getattribute__, mypy allows accessing non-existent
        # attributes without failing
        # (see https://github.com/python/mypy/issues/13319).
        __getattribute__ = __custom_getattribute__

    @model_validator(mode="before")
    @classmethod
    @pydantic_utils.before_validator_handler
    def _convert_json_strings(cls, data: Dict[str, Any]) -> Dict[str, Any]:
        """Converts potential JSON strings.

        Args:
            data: The model data.

        Returns:
            The potentially converted data.

        Raises:
            ValueError: If any of the values is an invalid JSON string.
        """
        for key, field in cls.model_fields.items():
            if not field.annotation:
                continue

            value = data.get(key, None)

            if isinstance(value, str):
                if typing_utils.is_optional(field.annotation):
                    args = list(typing_utils.get_args(field.annotation))
                    if str in args:
                        # Don't do any type coercion in case str is in the
                        # possible types of the field
                        continue

                    # Remove `NoneType` from the arguments
                    NoneType = type(None)
                    if NoneType in args:
                        args.remove(NoneType)

                    # We just choose the first arg and match against this
                    annotation = args[0]
                else:
                    annotation = field.annotation

                if typing_utils.get_origin(annotation) in {
                    dict,
                    list,
                    Mapping,
                    Sequence,
                }:
                    try:
                        data[key] = json.loads(value)
                    except json.JSONDecodeError as e:
                        raise ValueError(
                            f"Invalid json string '{value}'"
                        ) from e
                elif isclass(annotation) and issubclass(annotation, BaseModel):
                    data[key] = annotation.model_validate_json(
                        value
                    ).model_dump()

        return data

    model_config = ConfigDict(
        # public attributes are immutable
        frozen=True,
        # prevent extra attributes during model initialization
        extra="ignore",
    )

is_local property

Checks if this stack component is running locally.

Concrete stack component configuration classes should override this method to return True if the stack component is relying on local resources or capabilities (e.g. local filesystem, local database or other services).

Examples:

  • Artifact Stores that store artifacts in the local filesystem
  • Orchestrators that are connected to local orchestration runtime services (e.g. local Kubernetes clusters, Docker containers etc).

Returns:

Type Description
bool

True if this config is for a local component, False otherwise.

is_remote property

Checks if this stack component is running remotely.

Concrete stack component configuration classes should override this method to return True if the stack component is running in a remote location, and it needs to access the ZenML database.

This designation is used to determine if the stack component can be used with a local ZenML database or if it requires a remote ZenML server.

Examples:

  • Orchestrators that are running pipelines in the cloud or in a location other than the local host
  • Step Operators that are running steps in the cloud or in a location other than the local host

Returns:

Type Description
bool

True if this config is for a remote component, False otherwise.
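
As a sketch (hypothetical config with an assumed path attribute), a concrete config might derive these flags from its own attributes:

from zenml.stack import StackComponentConfig


class MyArtifactStoreConfig(StackComponentConfig):
    """Hypothetical config for an artifact store with a configurable path."""

    path: str

    @property
    def is_local(self) -> bool:
        # Treat anything that is not an object-store URI as local storage.
        return not self.path.startswith(("s3://", "gs://", "azure://"))

    @property
    def is_remote(self) -> bool:
        return not self.is_local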

is_valid property

Checks if the stack component configurations are valid.

Concrete stack component configuration classes should override this method to return False if the stack component configurations are invalid.

Returns:

Type Description
bool

True if the stack component config is valid, False otherwise.

required_secrets property

All required secrets for this stack component.

Returns:

Type Description
Set[SecretReference]

The required secrets of this stack component.

__custom_getattribute__(key)

Returns the (potentially resolved) attribute value for the given key.

An attribute value may be either specified directly, or as a secret reference. In case of a secret reference, this method resolves the reference and returns the secret value instead.

Parameters:

Name Type Description Default
key str

The key for which to get the attribute value.

required

Raises:

Type Description
KeyError

If the secret or secret key don't exist.

Returns:

Type Description
Any

The (potentially resolved) attribute value.

Source code in src/zenml/stack/stack_component.py (lines 191-237)
def __custom_getattribute__(self, key: str) -> Any:
    """Returns the (potentially resolved) attribute value for the given key.

    An attribute value may be either specified directly, or as a secret
    reference. In case of a secret reference, this method resolves the
    reference and returns the secret value instead.

    Args:
        key: The key for which to get the attribute value.

    Raises:
        KeyError: If the secret or secret key don't exist.

    Returns:
        The (potentially resolved) attribute value.
    """
    from zenml.client import Client

    value = super().__getattribute__(key)

    if not secret_utils.is_secret_reference(value):
        return value

    secret_ref = secret_utils.parse_secret_reference(value)

    # Try to resolve the secret using the secret store
    try:
        secret = Client().get_secret_by_name_and_private_status(
            name=secret_ref.name,
        )
    except (KeyError, NotImplementedError):
        raise KeyError(
            f"Failed to resolve secret reference for attribute {key} "
            f"of stack component `{self}`: The secret "
            f"{secret_ref.name} does not exist."
        )

    if secret_ref.key not in secret.values:
        raise KeyError(
            f"Failed to resolve secret reference for attribute {key} "
            f"of stack component `{self}`. "
            f"The secret {secret_ref.name} does not contain a value "
            f"for key {secret_ref.key}. Available keys: "
            f"{set(secret.values.keys())}."
        )

    return secret.secret_values[secret_ref.key]

__init__(warn_about_plain_text_secrets=False, **kwargs)

Ensures that secret references don't clash with pydantic validation.

StackComponents allow the specification of all their string attributes using secret references of the form {{secret_name.key}}. This however is only possible when the stack component does not perform any explicit validation of this attribute using pydantic validators. If this were the case, the validation would run on the secret reference and would fail or in the worst case, modify the secret reference and lead to unexpected behavior. This method ensures that no attributes that require custom pydantic validation are set as secret references.

Parameters:

Name Type Description Default
warn_about_plain_text_secrets bool

If true, then warns about using plain-text secrets.

False
**kwargs Any

Arguments to initialize this stack component.

{}

Raises:

Type Description
ValueError

If an attribute that requires custom pydantic validation is passed as a secret reference, or if the name attribute was passed as a secret reference.

Source code in src/zenml/stack/stack_component.py (lines 61-122)
def __init__(
    self, warn_about_plain_text_secrets: bool = False, **kwargs: Any
) -> None:
    """Ensures that secret references don't clash with pydantic validation.

    StackComponents allow the specification of all their string attributes
    using secret references of the form `{{secret_name.key}}`. This however
    is only possible when the stack component does not perform any explicit
    validation of this attribute using pydantic validators. If this were
    the case, the validation would run on the secret reference and would
    fail or in the worst case, modify the secret reference and lead to
    unexpected behavior. This method ensures that no attributes that require
    custom pydantic validation are set as secret references.

    Args:
        warn_about_plain_text_secrets: If true, then warns about using
            plain-text secrets.
        **kwargs: Arguments to initialize this stack component.

    Raises:
        ValueError: If an attribute that requires custom pydantic validation
            is passed as a secret reference, or if the `name` attribute
            was passed as a secret reference.
    """
    for key, value in kwargs.items():
        try:
            field = self.__class__.model_fields[key]
        except KeyError:
            # Value for a private attribute or non-existing field, this
            # will fail during the upcoming pydantic validation
            continue

        if value is None:
            continue

        if not secret_utils.is_secret_reference(value):
            if (
                secret_utils.is_secret_field(field)
                and warn_about_plain_text_secrets
            ):
                logger.warning(
                    "You specified a plain-text value for the sensitive "
                    f"attribute `{key}` for a `{self.__class__.__name__}` "
                    "stack component. This is currently only a warning, "
                    "but future versions of ZenML will require you to pass "
                    "in sensitive information as secrets. Check out the "
                    "documentation on how to configure your stack "
                    "components with secrets here: "
                    "https://docs.zenml.io/deploying-zenml/deploying-zenml/secret-management"
                )
            continue

        if pydantic_utils.has_validators(
            pydantic_class=self.__class__, field_name=key
        ):
            raise ValueError(
                f"Passing the stack component attribute `{key}` as a "
                "secret reference is not allowed as additional validation "
                "is required for this attribute."
            )

    super().__init__(**kwargs)
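
A minimal sketch (assumed config class, attribute and secret names, and assuming zenml.utils.secret_utils.SecretField marks the attribute as sensitive) of the difference between a plain-text value and a secret reference of the form {{secret_name.key}}:

from zenml.stack import StackComponentConfig
from zenml.utils import secret_utils


class MyAlerterConfig(StackComponentConfig):
    """Hypothetical config with a sensitive attribute."""

    slack_token: str = secret_utils.SecretField()


# Plain-text value: allowed, but triggers the warning described above when
# warn_about_plain_text_secrets=True is passed.
config = MyAlerterConfig(slack_token="xoxb-plain-text-token")

# Secret reference: resolved lazily against the secret store on attribute
# access, so the token never lives in the stack configuration itself.
config = MyAlerterConfig(slack_token="{{slack_secret.token}}")
print(config.required_secrets)  # {SecretReference(name='slack_secret', key='token')}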

StackValidator

A StackValidator is used to validate a stack configuration.

Each StackComponent can provide a StackValidator to make sure it is compatible with all components of the stack. The KubeflowOrchestrator for example will always require the stack to have a container registry in order to push the docker images that are required to run a pipeline in Kubeflow Pipelines.

Source code in src/zenml/stack/stack_validator.py (lines 28-82)
class StackValidator:
    """A `StackValidator` is used to validate a stack configuration.

    Each `StackComponent` can provide a `StackValidator` to make sure it is
    compatible with all components of the stack. The `KubeflowOrchestrator`
    for example will always require the stack to have a container registry
    in order to push the docker images that are required to run a pipeline
    in Kubeflow Pipelines.
    """

    def __init__(
        self,
        required_components: Optional[AbstractSet[StackComponentType]] = None,
        custom_validation_function: Optional[
            Callable[["Stack"], Tuple[bool, str]]
        ] = None,
    ):
        """Initializes a `StackValidator` instance.

        Args:
            required_components: Optional set of stack components that must
                exist in the stack.
            custom_validation_function: Optional function that returns whether
                a stack is valid and an error message to show if not valid.
        """
        self._required_components = required_components or set()
        self._custom_validation_function = custom_validation_function

    def validate(self, stack: "Stack") -> None:
        """Validates the given stack.

        Checks if the stack contains all the required components and passes
        the custom validation function of the validator.

        Args:
            stack: The stack to validate.

        Raises:
            StackValidationError: If the stack does not meet all the
                validation criteria.
        """
        missing_components = self._required_components - set(stack.components)
        if missing_components:
            raise StackValidationError(
                f"Missing stack components {missing_components} for "
                f"stack: {stack.name}"
            )

        if self._custom_validation_function:
            valid, err_msg = self._custom_validation_function(stack)
            if not valid:
                raise StackValidationError(
                    f"Custom validation function failed to validate "
                    f"stack '{stack.name}': {err_msg}"
                )
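
For illustration, a minimal sketch (hypothetical requirement and custom check) of constructing such a validator and what validating a stack with it does:

from typing import Tuple

from zenml.enums import StackComponentType
from zenml.stack import Stack, StackValidator


def _require_remote_artifact_store(stack: Stack) -> Tuple[bool, str]:
    # Hypothetical custom check: reject stacks with a local artifact store.
    if stack.artifact_store.config.is_local:
        return False, "A remote artifact store is required."
    return True, ""


validator = StackValidator(
    required_components={StackComponentType.CONTAINER_REGISTRY},
    custom_validation_function=_require_remote_artifact_store,
)
# validator.validate(stack) raises a StackValidationError if the stack is
# missing a container registry or uses a local artifact store.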

__init__(required_components=None, custom_validation_function=None)

Initializes a StackValidator instance.

Parameters:

Name Type Description Default
required_components Optional[AbstractSet[StackComponentType]]

Optional set of stack components that must exist in the stack.

None
custom_validation_function Optional[Callable[[Stack], Tuple[bool, str]]]

Optional function that returns whether a stack is valid and an error message to show if not valid.

None
Source code in src/zenml/stack/stack_validator.py (lines 38-54)
def __init__(
    self,
    required_components: Optional[AbstractSet[StackComponentType]] = None,
    custom_validation_function: Optional[
        Callable[["Stack"], Tuple[bool, str]]
    ] = None,
):
    """Initializes a `StackValidator` instance.

    Args:
        required_components: Optional set of stack components that must
            exist in the stack.
        custom_validation_function: Optional function that returns whether
            a stack is valid and an error message to show if not valid.
    """
    self._required_components = required_components or set()
    self._custom_validation_function = custom_validation_function

validate(stack)

Validates the given stack.

Checks if the stack contains all the required components and passes the custom validation function of the validator.

Parameters:

Name Type Description Default
stack Stack

The stack to validate.

required

Raises:

Type Description
StackValidationError

If the stack does not meet all the validation criteria.

Source code in src/zenml/stack/stack_validator.py (lines 56-82)
def validate(self, stack: "Stack") -> None:
    """Validates the given stack.

    Checks if the stack contains all the required components and passes
    the custom validation function of the validator.

    Args:
        stack: The stack to validate.

    Raises:
        StackValidationError: If the stack does not meet all the
            validation criteria.
    """
    missing_components = self._required_components - set(stack.components)
    if missing_components:
        raise StackValidationError(
            f"Missing stack components {missing_components} for "
            f"stack: {stack.name}"
        )

    if self._custom_validation_function:
        valid, err_msg = self._custom_validation_function(stack)
        if not valid:
            raise StackValidationError(
                f"Custom validation function failed to validate "
                f"stack '{stack.name}': {err_msg}"
            )

Step Operators

Step operators allow you to run steps on custom infrastructure.

While an orchestrator defines how and where your entire pipeline runs, a step operator defines how and where an individual step runs. This is useful in a variety of scenarios, for example when a single step (such as a trainer step) needs to run in a separate environment equipped with a GPU, as sketched below.
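
As a sketch (the step operator name "gpu-operator" is assumed to be registered in the active stack), routing only the trainer step to the step operator looks like this:

from zenml import pipeline, step


@step(step_operator="gpu-operator")
def train_model() -> None:
    # Heavy training logic that should run on the GPU-backed environment.
    ...


@step
def evaluate_model() -> None:
    # Runs wherever the orchestrator executes steps by default.
    ...


@pipeline
def training_pipeline() -> None:
    train_model()
    evaluate_model()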

BaseStepOperator

Bases: StackComponent, ABC

Base class for all ZenML step operators.

Source code in src/zenml/step_operators/base_step_operator.py (lines 37-81)
class BaseStepOperator(StackComponent, ABC):
    """Base class for all ZenML step operators."""

    @property
    def config(self) -> BaseStepOperatorConfig:
        """Returns the config of the step operator.

        Returns:
            The config of the step operator.
        """
        return cast(BaseStepOperatorConfig, self._config)

    @property
    def entrypoint_config_class(
        self,
    ) -> Type[StepOperatorEntrypointConfiguration]:
        """Returns the entrypoint configuration class for this step operator.

        Concrete step operator implementations may override this property
        to return a custom entrypoint configuration class if they need to
        customize the entrypoint configuration.

        Returns:
            The entrypoint configuration class for this step operator.
        """
        return StepOperatorEntrypointConfiguration

    @abstractmethod
    def launch(
        self,
        info: "StepRunInfo",
        entrypoint_command: List[str],
        environment: Dict[str, str],
    ) -> None:
        """Abstract method to execute a step.

        Subclasses must implement this method and launch a **synchronous**
        job that executes the `entrypoint_command`.

        Args:
            info: Information about the step run.
            entrypoint_command: Command that executes the step.
            environment: Environment variables to set in the step operator
                environment.
        """

config property

Returns the config of the step operator.

Returns:

Type Description
BaseStepOperatorConfig

The config of the step operator.

entrypoint_config_class property

Returns the entrypoint configuration class for this step operator.

Concrete step operator implementations may override this property to return a custom entrypoint configuration class if they need to customize the entrypoint configuration.

Returns:

Type Description
Type[StepOperatorEntrypointConfiguration]

The entrypoint configuration class for this step operator.

launch(info, entrypoint_command, environment) abstractmethod

Abstract method to execute a step.

Subclasses must implement this method and launch a synchronous job that executes the entrypoint_command.

Parameters:

Name Type Description Default
info StepRunInfo

Information about the step run.

required
entrypoint_command List[str]

Command that executes the step.

required
environment Dict[str, str]

Environment variables to set in the step operator environment.

required
Source code in src/zenml/step_operators/base_step_operator.py (lines 64-81)
@abstractmethod
def launch(
    self,
    info: "StepRunInfo",
    entrypoint_command: List[str],
    environment: Dict[str, str],
) -> None:
    """Abstract method to execute a step.

    Subclasses must implement this method and launch a **synchronous**
    job that executes the `entrypoint_command`.

    Args:
        info: Information about the step run.
        entrypoint_command: Command that executes the step.
        environment: Environment variables to set in the step operator
            environment.
    """

BaseStepOperatorConfig

Bases: StackComponentConfig

Base config for step operators.

Source code in src/zenml/step_operators/base_step_operator.py (lines 33-34)
class BaseStepOperatorConfig(StackComponentConfig):
    """Base config for step operators."""

BaseStepOperatorFlavor

Bases: Flavor

Base class for all ZenML step operator flavors.

Source code in src/zenml/step_operators/base_step_operator.py (lines 84-112)
class BaseStepOperatorFlavor(Flavor):
    """Base class for all ZenML step operator flavors."""

    @property
    def type(self) -> StackComponentType:
        """Returns the flavor type.

        Returns:
            The type of the flavor.
        """
        return StackComponentType.STEP_OPERATOR

    @property
    def config_class(self) -> Type[BaseStepOperatorConfig]:
        """Returns the config class for this flavor.

        Returns:
            The config class for this flavor.
        """
        return BaseStepOperatorConfig

    @property
    @abstractmethod
    def implementation_class(self) -> Type[BaseStepOperator]:
        """Returns the implementation class for this flavor.

        Returns:
            The implementation class for this flavor.
        """

config_class property

Returns the config class for this flavor.

Returns:

Type Description
Type[BaseStepOperatorConfig]

The config class for this flavor.

implementation_class abstractmethod property

Returns the implementation class for this flavor.

Returns:

Type Description
Type[BaseStepOperator]

The implementation class for this flavor.

type property

Returns the flavor type.

Returns:

Type Description
StackComponentType

The type of the flavor.
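
Tying the pieces together, a sketch (all names hypothetical) of a custom flavor that wires a config class and an implementation class into a step operator flavor:

from typing import Dict, List, Type

from zenml.step_operators import (
    BaseStepOperator,
    BaseStepOperatorConfig,
    BaseStepOperatorFlavor,
)


class LocalProcessStepOperatorConfig(BaseStepOperatorConfig):
    """Hypothetical config for the hypothetical operator below."""


class LocalProcessStepOperator(BaseStepOperator):
    """Hypothetical operator; see the launch() sketch above for a body."""

    def launch(
        self,
        info: "StepRunInfo",
        entrypoint_command: List[str],
        environment: Dict[str, str],
    ) -> None:
        ...


class LocalProcessStepOperatorFlavor(BaseStepOperatorFlavor):
    @property
    def name(self) -> str:
        return "local_process"

    @property
    def config_class(self) -> Type[BaseStepOperatorConfig]:
        return LocalProcessStepOperatorConfig

    @property
    def implementation_class(self) -> Type[BaseStepOperator]:
        return LocalProcessStepOperator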

Steps

Initializer for ZenML steps.

A step is a single piece or stage of a ZenML pipeline. Think of each step as being one of the nodes of a Directed Acyclic Graph (or DAG). Steps are responsible for one aspect of processing or interacting with the data / artifacts in the pipeline.

Conceptually, a step is a discrete and independent part of a pipeline that is responsible for one particular aspect of data manipulation.

Steps can be created by subclassing the BaseStep class, or via our @step decorator (see the sketch below).
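
A minimal sketch (toy data, no external dependencies) of defining steps with the @step decorator and composing them into a pipeline:

from typing import List, Tuple

from zenml import pipeline, step


@step
def load_data() -> Tuple[List[float], List[int]]:
    # Toy stand-in for a real data loading step.
    return [0.1, 0.4, 0.9], [0, 0, 1]


@step
def train(features: List[float], labels: List[int]) -> float:
    # Toy "training" that just returns a fake accuracy.
    return sum(labels) / len(labels)


@pipeline
def toy_pipeline() -> None:
    features, labels = load_data()
    train(features=features, labels=labels)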

BaseStep

Abstract base class for all ZenML steps.

Source code in src/zenml/steps/base_step.py (lines 97-1137)
class BaseStep:
    """Abstract base class for all ZenML steps."""

    def __init__(
        self,
        name: Optional[str] = None,
        enable_cache: Optional[bool] = None,
        enable_artifact_metadata: Optional[bool] = None,
        enable_artifact_visualization: Optional[bool] = None,
        enable_step_logs: Optional[bool] = None,
        experiment_tracker: Optional[str] = None,
        step_operator: Optional[str] = None,
        parameters: Optional[Dict[str, Any]] = None,
        output_materializers: Optional[
            "OutputMaterializersSpecification"
        ] = None,
        settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
        extra: Optional[Dict[str, Any]] = None,
        on_failure: Optional["HookSpecification"] = None,
        on_success: Optional["HookSpecification"] = None,
        model: Optional["Model"] = None,
        retry: Optional[StepRetryConfig] = None,
        substitutions: Optional[Dict[str, str]] = None,
    ) -> None:
        """Initializes a step.

        Args:
            name: The name of the step.
            enable_cache: If caching should be enabled for this step.
            enable_artifact_metadata: If artifact metadata should be enabled
                for this step.
            enable_artifact_visualization: If artifact visualization should be
                enabled for this step.
            enable_step_logs: Enable step logs for this step.
            experiment_tracker: The experiment tracker to use for this step.
            step_operator: The step operator to use for this step.
            parameters: Function parameters for this step
            output_materializers: Output materializers for this step. If
                given as a dict, the keys must be a subset of the output names
                of this step. If a single value (type or string) is given, the
                materializer will be used for all outputs.
            settings: settings for this step.
            extra: Extra configurations for this step.
            on_failure: Callback function in event of failure of the step. Can
                be a function with a single argument of type `BaseException`, or
                a source path to such a function (e.g. `module.my_function`).
            on_success: Callback function in event of success of the step. Can
                be a function with no arguments, or a source path to such a
                function (e.g. `module.my_function`).
            model: configuration of the model version in the Model Control Plane.
            retry: Configuration for retrying the step in case of failure.
            substitutions: Extra placeholders to use in the name template.
        """
        from zenml.config.step_configurations import PartialStepConfiguration

        self.entrypoint_definition = validate_entrypoint_function(
            self.entrypoint,
            reserved_arguments=["after", "id"],
        )

        name = name or self.__class__.__name__

        logger.debug(
            "Step `%s`: Caching %s.",
            name,
            "enabled" if enable_cache is not False else "disabled",
        )
        logger.debug(
            "Step `%s`: Artifact metadata %s.",
            name,
            "enabled" if enable_artifact_metadata is not False else "disabled",
        )
        logger.debug(
            "Step `%s`: Artifact visualization %s.",
            name,
            "enabled"
            if enable_artifact_visualization is not False
            else "disabled",
        )
        logger.debug(
            "Step `%s`: logs %s.",
            name,
            "enabled" if enable_step_logs is not False else "disabled",
        )
        if model is not None:
            logger.debug(
                "Step `%s`: Is in Model context %s.",
                name,
                {
                    "model": model.name,
                    "version": model.version,
                },
            )

        self._configuration = PartialStepConfiguration(
            name=name,
            enable_cache=enable_cache,
            enable_artifact_metadata=enable_artifact_metadata,
            enable_artifact_visualization=enable_artifact_visualization,
            enable_step_logs=enable_step_logs,
        )
        self.configure(
            experiment_tracker=experiment_tracker,
            step_operator=step_operator,
            output_materializers=output_materializers,
            parameters=parameters,
            settings=settings,
            extra=extra,
            on_failure=on_failure,
            on_success=on_success,
            model=model,
            retry=retry,
            substitutions=substitutions,
        )

        notebook_utils.try_to_save_notebook_cell_code(self.source_object)

    @abstractmethod
    def entrypoint(self, *args: Any, **kwargs: Any) -> Any:
        """Abstract method for core step logic.

        Args:
            *args: Positional arguments passed to the step.
            **kwargs: Keyword arguments passed to the step.

        Returns:
            The output of the step.
        """

    @classmethod
    def load_from_source(cls, source: Union[Source, str]) -> "BaseStep":
        """Loads a step from source.

        Args:
            source: The path to the step source.

        Returns:
            The loaded step.

        Raises:
            ValueError: If the source is not a valid step source.
        """
        obj = source_utils.load(source)

        if isinstance(obj, BaseStep):
            return obj
        elif isinstance(obj, type) and issubclass(obj, BaseStep):
            return obj()
        else:
            raise ValueError("Invalid step source.")

    def resolve(self) -> Source:
        """Resolves the step.

        Returns:
            The step source.
        """
        return source_utils.resolve(self.__class__)

    @property
    def source_object(self) -> Any:
        """The source object of this step.

        Returns:
            The source object of this step.
        """
        return self.__class__

    @property
    def source_code(self) -> str:
        """The source code of this step.

        Returns:
            The source code of this step.
        """
        return inspect.getsource(self.source_object)

    @property
    def docstring(self) -> Optional[str]:
        """The docstring of this step.

        Returns:
            The docstring of this step.
        """
        return self.__doc__

    @property
    def caching_parameters(self) -> Dict[str, Any]:
        """Caching parameters for this step.

        Returns:
            A dictionary containing the caching parameters
        """
        parameters = {
            CODE_HASH_PARAMETER_NAME: source_code_utils.get_hashed_source_code(
                self.source_object
            )
        }
        for name, output in self.configuration.outputs.items():
            if output.materializer_source:
                key = f"{name}_materializer_source"
                hash_ = hashlib.md5()  # nosec

                for source in output.materializer_source:
                    materializer_class = source_utils.load(source)
                    code_hash = source_code_utils.get_hashed_source_code(
                        materializer_class
                    )
                    hash_.update(code_hash.encode())

                parameters[key] = hash_.hexdigest()

        return parameters

    def _parse_call_args(
        self, *args: Any, **kwargs: Any
    ) -> Tuple[
        Dict[str, "StepArtifact"],
        Dict[str, Union["ExternalArtifact", "ArtifactVersionResponse"]],
        Dict[str, "ModelVersionDataLazyLoader"],
        Dict[str, "ClientLazyLoader"],
        Dict[str, Any],
        Dict[str, Any],
    ]:
        """Parses the call args for the step entrypoint.

        Args:
            *args: Entrypoint function arguments.
            **kwargs: Entrypoint function keyword arguments.

        Raises:
            StepInterfaceError: If invalid function arguments were passed.

        Returns:
            The artifacts, external artifacts, model version artifacts/metadata and parameters for the step.
        """
        from zenml.artifacts.external_artifact import ExternalArtifact
        from zenml.metadata.lazy_load import LazyRunMetadataResponse
        from zenml.model.lazy_load import ModelVersionDataLazyLoader
        from zenml.models.v2.core.artifact_version import (
            ArtifactVersionResponse,
            LazyArtifactVersionResponse,
        )

        signature = inspect.signature(self.entrypoint, follow_wrapped=True)

        try:
            bound_args = signature.bind_partial(*args, **kwargs)
        except TypeError as e:
            raise StepInterfaceError(
                f"Wrong arguments when calling step '{self.name}': {e}"
            ) from e

        artifacts = {}
        external_artifacts: Dict[
            str, Union["ExternalArtifact", "ArtifactVersionResponse"]
        ] = {}
        model_artifacts_or_metadata = {}
        client_lazy_loaders = {}
        parameters = {}
        default_parameters = {}

        for key, value in bound_args.arguments.items():
            self.entrypoint_definition.validate_input(key=key, value=value)

            if isinstance(value, StepArtifact):
                artifacts[key] = value
                if key in self.configuration.parameters:
                    logger.warning(
                        "Got duplicate value for step input %s, using value "
                        "provided as artifact.",
                        key,
                    )
            elif isinstance(value, ExternalArtifact):
                external_artifacts[key] = value
                if not value.id:
                    # If the external artifact references a fixed artifact by
                    # ID, caching behaves as expected.
                    logger.warning(
                        "Using an external artifact as step input currently "
                        "invalidates caching for the step and all downstream "
                        "steps. Future releases will introduce hashing of "
                        "artifacts which will improve this behavior."
                    )
            elif isinstance(value, LazyArtifactVersionResponse):
                model_artifacts_or_metadata[key] = ModelVersionDataLazyLoader(
                    model_name=value.lazy_load_model_name,
                    model_version=value.lazy_load_model_version,
                    artifact_name=value.lazy_load_name,
                    artifact_version=value.lazy_load_version,
                    metadata_name=None,
                )
            elif isinstance(value, ArtifactVersionResponse):
                external_artifacts[key] = value
            elif isinstance(value, LazyRunMetadataResponse):
                model_artifacts_or_metadata[key] = ModelVersionDataLazyLoader(
                    model_name=value.lazy_load_model_name,
                    model_version=value.lazy_load_model_version,
                    artifact_name=value.lazy_load_artifact_name,
                    artifact_version=value.lazy_load_artifact_version,
                    metadata_name=value.lazy_load_metadata_name,
                )
            elif isinstance(value, ClientLazyLoader):
                client_lazy_loaders[key] = value
            else:
                parameters[key] = value

        # Above we iterated over the provided arguments which should overwrite
        # any parameters previously defined on the step instance. Now we apply
        # the default values on the entrypoint function and add those as
        # parameters for any argument that has no value yet. If we were to do
        # that in the above loop, we would overwrite previously configured
        # parameters with the default values.
        bound_args.apply_defaults()
        for key, value in bound_args.arguments.items():
            self.entrypoint_definition.validate_input(key=key, value=value)
            if (
                key not in artifacts
                and key not in external_artifacts
                and key not in model_artifacts_or_metadata
                and key not in self.configuration.parameters
                and key not in client_lazy_loaders
            ):
                default_parameters[key] = value

        return (
            artifacts,
            external_artifacts,
            model_artifacts_or_metadata,
            client_lazy_loaders,
            parameters,
            default_parameters,
        )

    def __call__(
        self,
        *args: Any,
        id: Optional[str] = None,
        after: Union[
            str, StepArtifact, Sequence[Union[str, StepArtifact]], None
        ] = None,
        **kwargs: Any,
    ) -> Any:
        """Handle a call of the step.

        This method does one of two things:
        * If there is an active pipeline context, it adds an invocation of the
          step instance to the pipeline.
        * If no pipeline is active, it calls the step entrypoint function.

        Args:
            *args: Entrypoint function arguments.
            id: Invocation ID to use.
            after: Upstream steps for the invocation.
            **kwargs: Entrypoint function keyword arguments.

        Returns:
            The outputs of the entrypoint function call.
        """
        from zenml.pipelines.pipeline_definition import Pipeline

        if not Pipeline.ACTIVE_PIPELINE:
            from zenml import constants, get_step_context

            # If the environment variable was set to explicitly not run on the
            # stack, we do that.
            run_without_stack = handle_bool_env_var(
                ENV_ZENML_RUN_SINGLE_STEPS_WITHOUT_STACK, default=False
            )
            if run_without_stack:
                return self.call_entrypoint(*args, **kwargs)

            try:
                get_step_context()
            except RuntimeError:
                pass
            else:
                # We're currently inside the execution of a different step
                # -> We don't want to launch another single step pipeline here,
                # but instead just call the step function
                return self.call_entrypoint(*args, **kwargs)

            if constants.SHOULD_PREVENT_PIPELINE_EXECUTION:
                logger.info(
                    "Preventing execution of step '%s'.",
                    self.name,
                )
                return

            return run_as_single_step_pipeline(self, *args, **kwargs)

        (
            input_artifacts,
            external_artifacts,
            model_artifacts_or_metadata,
            client_lazy_loaders,
            parameters,
            default_parameters,
        ) = self._parse_call_args(*args, **kwargs)

        upstream_steps = {
            artifact.invocation_id for artifact in input_artifacts.values()
        }
        if isinstance(after, str):
            upstream_steps.add(after)
        elif isinstance(after, StepArtifact):
            upstream_steps.add(after.invocation_id)
        elif isinstance(after, Sequence):
            for item in after:
                if isinstance(item, str):
                    upstream_steps.add(item)
                elif isinstance(item, StepArtifact):
                    upstream_steps.add(item.invocation_id)

        invocation_id = Pipeline.ACTIVE_PIPELINE.add_step_invocation(
            step=self,
            input_artifacts=input_artifacts,
            external_artifacts=external_artifacts,
            model_artifacts_or_metadata=model_artifacts_or_metadata,
            client_lazy_loaders=client_lazy_loaders,
            parameters=parameters,
            default_parameters=default_parameters,
            upstream_steps=upstream_steps,
            custom_id=id,
            allow_id_suffix=not id,
        )

        outputs = []
        for key, annotation in self.entrypoint_definition.outputs.items():
            output = StepArtifact(
                invocation_id=invocation_id,
                output_name=key,
                annotation=annotation,
                pipeline=Pipeline.ACTIVE_PIPELINE,
            )
            outputs.append(output)
        return outputs[0] if len(outputs) == 1 else outputs

    def call_entrypoint(self, *args: Any, **kwargs: Any) -> Any:
        """Calls the entrypoint function of the step.

        Args:
            *args: Entrypoint function arguments.
            **kwargs: Entrypoint function keyword arguments.

        Returns:
            The return value of the entrypoint function.

        Raises:
            StepInterfaceError: If the arguments to the entrypoint function are
                invalid.
        """
        try:
            validated_args = pydantic_utils.validate_function_args(
                self.entrypoint,
                ConfigDict(arbitrary_types_allowed=True),
                *args,
                **kwargs,
            )
        except ValidationError as e:
            raise StepInterfaceError(
                "Invalid step function entrypoint arguments. Check out the "
                "pydantic error above for more details."
            ) from e

        return self.entrypoint(**validated_args)

    @property
    def name(self) -> str:
        """The name of the step.

        Returns:
            The name of the step.
        """
        return self.configuration.name

    @property
    def enable_cache(self) -> Optional[bool]:
        """If caching is enabled for the step.

        Returns:
            If caching is enabled for the step.
        """
        return self.configuration.enable_cache

    @property
    def configuration(self) -> "PartialStepConfiguration":
        """The configuration of the step.

        Returns:
            The configuration of the step.
        """
        return self._configuration

    def configure(
        self: T,
        enable_cache: Optional[bool] = None,
        enable_artifact_metadata: Optional[bool] = None,
        enable_artifact_visualization: Optional[bool] = None,
        enable_step_logs: Optional[bool] = None,
        experiment_tracker: Optional[str] = None,
        step_operator: Optional[str] = None,
        parameters: Optional[Dict[str, Any]] = None,
        output_materializers: Optional[
            "OutputMaterializersSpecification"
        ] = None,
        settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
        extra: Optional[Dict[str, Any]] = None,
        on_failure: Optional["HookSpecification"] = None,
        on_success: Optional["HookSpecification"] = None,
        model: Optional["Model"] = None,
        retry: Optional[StepRetryConfig] = None,
        substitutions: Optional[Dict[str, str]] = None,
        merge: bool = True,
    ) -> T:
        """Configures the step.

        Configuration merging example:
        * `merge==True`:
            step.configure(extra={"key1": 1})
            step.configure(extra={"key2": 2}, merge=True)
            step.configuration.extra # {"key1": 1, "key2": 2}
        * `merge==False`:
            step.configure(extra={"key1": 1})
            step.configure(extra={"key2": 2}, merge=False)
            step.configuration.extra # {"key2": 2}

        Args:
            enable_cache: If caching should be enabled for this step.
            enable_artifact_metadata: If artifact metadata should be enabled
                for this step.
            enable_artifact_visualization: If artifact visualization should be
                enabled for this step.
            enable_step_logs: If step logs should be enabled for this step.
            experiment_tracker: The experiment tracker to use for this step.
            step_operator: The step operator to use for this step.
            parameters: Function parameters for this step
            output_materializers: Output materializers for this step. If
                given as a dict, the keys must be a subset of the output names
                of this step. If a single value (type or string) is given, the
                materializer will be used for all outputs.
            settings: settings for this step.
            extra: Extra configurations for this step.
            on_failure: Callback function in event of failure of the step. Can
                be a function with a single argument of type `BaseException`, or
                a source path to such a function (e.g. `module.my_function`).
            on_success: Callback function in event of success of the step. Can
                be a function with no arguments, or a source path to such a
                function (e.g. `module.my_function`).
            model: Model to use for this step.
            retry: Configuration for retrying the step in case of failure.
            substitutions: Extra placeholders to use in the name template.
            merge: If `True`, will merge the given dictionary configurations
                like `parameters` and `settings` with existing
                configurations. If `False` the given configurations will
                overwrite all existing ones. See the general description of this
                method for an example.

        Returns:
            The step instance that this method was called on.
        """
        from zenml.config.step_configurations import StepConfigurationUpdate
        from zenml.hooks.hook_validators import resolve_and_validate_hook

        def _resolve_if_necessary(
            value: Union[str, Source, Type[Any]],
        ) -> Source:
            if isinstance(value, str):
                return Source.from_import_path(value)
            elif isinstance(value, Source):
                return value
            else:
                return source_utils.resolve(value)

        def _convert_to_tuple(value: Any) -> Tuple[Source, ...]:
            if isinstance(value, str) or not isinstance(value, Sequence):
                return (_resolve_if_necessary(value),)
            else:
                return tuple(_resolve_if_necessary(v) for v in value)

        outputs: Dict[str, Dict[str, Tuple[Source, ...]]] = defaultdict(dict)
        allowed_output_names = set(self.entrypoint_definition.outputs)

        if output_materializers:
            if not isinstance(output_materializers, Mapping):
                sources = _convert_to_tuple(output_materializers)
                output_materializers = {
                    output_name: sources
                    for output_name in allowed_output_names
                }

            for output_name, materializer in output_materializers.items():
                sources = _convert_to_tuple(materializer)
                outputs[output_name]["materializer_source"] = sources

        failure_hook_source = None
        if on_failure:
            # string of on_failure hook function to be used for this step
            failure_hook_source = resolve_and_validate_hook(on_failure)

        success_hook_source = None
        if on_success:
            # string of on_success hook function to be used for this step
            success_hook_source = resolve_and_validate_hook(on_success)

        values = dict_utils.remove_none_values(
            {
                "enable_cache": enable_cache,
                "enable_artifact_metadata": enable_artifact_metadata,
                "enable_artifact_visualization": enable_artifact_visualization,
                "enable_step_logs": enable_step_logs,
                "experiment_tracker": experiment_tracker,
                "step_operator": step_operator,
                "parameters": parameters,
                "settings": settings,
                "outputs": outputs or None,
                "extra": extra,
                "failure_hook_source": failure_hook_source,
                "success_hook_source": success_hook_source,
                "model": model,
                "retry": retry,
                "substitutions": substitutions,
            }
        )
        config = StepConfigurationUpdate(**values)
        self._apply_configuration(config, merge=merge)
        return self

    def with_options(
        self,
        enable_cache: Optional[bool] = None,
        enable_artifact_metadata: Optional[bool] = None,
        enable_artifact_visualization: Optional[bool] = None,
        enable_step_logs: Optional[bool] = None,
        experiment_tracker: Optional[str] = None,
        step_operator: Optional[str] = None,
        parameters: Optional[Dict[str, Any]] = None,
        output_materializers: Optional[
            "OutputMaterializersSpecification"
        ] = None,
        settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
        extra: Optional[Dict[str, Any]] = None,
        on_failure: Optional["HookSpecification"] = None,
        on_success: Optional["HookSpecification"] = None,
        model: Optional["Model"] = None,
        retry: Optional[StepRetryConfig] = None,
        substitutions: Optional[Dict[str, str]] = None,
        merge: bool = True,
    ) -> "BaseStep":
        """Copies the step and applies the given configurations.

        Args:
            enable_cache: If caching should be enabled for this step.
            enable_artifact_metadata: If artifact metadata should be enabled
                for this step.
            enable_artifact_visualization: If artifact visualization should be
                enabled for this step.
            enable_step_logs: If step logs should be enabled for this step.
            experiment_tracker: The experiment tracker to use for this step.
            step_operator: The step operator to use for this step.
            parameters: Function parameters for this step
            output_materializers: Output materializers for this step. If
                given as a dict, the keys must be a subset of the output names
                of this step. If a single value (type or string) is given, the
                materializer will be used for all outputs.
            settings: settings for this step.
            extra: Extra configurations for this step.
            on_failure: Callback function in event of failure of the step. Can
                be a function with a single argument of type `BaseException`, or
                a source path to such a function (e.g. `module.my_function`).
            on_success: Callback function in event of success of the step. Can
                be a function with no arguments, or a source path to such a
                function (e.g. `module.my_function`).
            model: Model to use for this step.
            retry: Configuration for retrying the step in case of failure.
            substitutions: Extra placeholders for the step name.
            merge: If `True`, will merge the given dictionary configurations
                like `parameters` and `settings` with existing
                configurations. If `False` the given configurations will
                overwrite all existing ones. See the general description of this
                method for an example.

        Returns:
            The copied step instance.
        """
        step_copy = self.copy()
        step_copy.configure(
            enable_cache=enable_cache,
            enable_artifact_metadata=enable_artifact_metadata,
            enable_artifact_visualization=enable_artifact_visualization,
            enable_step_logs=enable_step_logs,
            experiment_tracker=experiment_tracker,
            step_operator=step_operator,
            parameters=parameters,
            output_materializers=output_materializers,
            settings=settings,
            extra=extra,
            on_failure=on_failure,
            on_success=on_success,
            model=model,
            retry=retry,
            substitutions=substitutions,
            merge=merge,
        )
        return step_copy

    def copy(self) -> "BaseStep":
        """Copies the step.

        Returns:
            The step copy.
        """
        return copy.deepcopy(self)

    def _apply_configuration(
        self,
        config: "StepConfigurationUpdate",
        merge: bool = True,
        runtime_parameters: Dict[str, Any] = {},
    ) -> None:
        """Applies an update to the step configuration.

        Args:
            config: The configuration update.
            merge: Whether to merge the updates with the existing configuration
                or not. See the `BaseStep.configure(...)` method for a detailed
                explanation.
            runtime_parameters: Dictionary of parameters passed to the step at
                runtime.
        """
        self._validate_configuration(config, runtime_parameters)

        self._configuration = pydantic_utils.update_model(
            self._configuration, update=config, recursive=merge
        )

        logger.debug("Updated step configuration:")
        logger.debug(self._configuration)

    def _validate_configuration(
        self,
        config: "StepConfigurationUpdate",
        runtime_parameters: Dict[str, Any],
    ) -> None:
        """Validates a configuration update.

        Args:
            config: The configuration update to validate.
            runtime_parameters: Dictionary of parameters passed to the step at
                runtime.
        """
        settings_utils.validate_setting_keys(list(config.settings))
        self._validate_function_parameters(
            parameters=config.parameters, runtime_parameters=runtime_parameters
        )
        self._validate_outputs(outputs=config.outputs)

    def _validate_function_parameters(
        self,
        parameters: Dict[str, Any],
        runtime_parameters: Dict[str, Any],
    ) -> None:
        """Validates step function parameters.

        Args:
            parameters: The parameters to validate.
            runtime_parameters: Dictionary of parameters passed to the step at
                runtime.

        Raises:
            StepInterfaceError: If the step requires no function parameters but
                parameters were configured.
            RuntimeError: If step parameters configured in the configuration
                file conflict with parameters passed in code.
        """
        if not parameters:
            return

        conflicting_parameters = {}
        for key, value in parameters.items():
            if key in runtime_parameters:
                runtime_value = runtime_parameters[key]
                if runtime_value != value:
                    conflicting_parameters[key] = (value, runtime_value)
            if key in self.entrypoint_definition.inputs:
                self.entrypoint_definition.validate_input(key=key, value=value)
            else:
                raise StepInterfaceError(
                    f"Unable to find parameter '{key}' in step function "
                    "signature."
                )
        if conflicting_parameters:
            is_plural = "s" if len(conflicting_parameters) > 1 else ""
            msg = (
                f"Configured parameter{is_plural} for the step "
                f"'{self.name}' conflict{'' if is_plural else 's'} with "
                f"parameter{is_plural} passed at runtime:\n"
            )
            for key, values in conflicting_parameters.items():
                msg += (
                    f"`{key}`: config=`{values[0]}` | runtime=`{values[1]}`\n"
                )
            msg += """This happens, if you define values for step parameters in configuration file and pass same parameters from the code. Example:
```
# config.yaml

steps:
    step_name:
        parameters:
            param_name: value1


# pipeline.py

@pipeline
def pipeline_():
    step_name(param_name="other_value")
```
To avoid this, set step parameters in only one place (config or code).
"""
            raise RuntimeError(msg)

    def _validate_outputs(
        self, outputs: Mapping[str, "PartialArtifactConfiguration"]
    ) -> None:
        """Validates the step output configuration.

        Args:
            outputs: The configured step outputs.

        Raises:
            StepInterfaceError: If an output for a non-existent name is
                configured or an output artifact/materializer source does not
                resolve to the correct class.
        """
        allowed_output_names = set(self.entrypoint_definition.outputs)
        for output_name, output in outputs.items():
            if output_name not in allowed_output_names:
                raise StepInterfaceError(
                    f"Got unexpected materializers for non-existent "
                    f"output '{output_name}' in step '{self.name}'. "
                    f"Only materializers for the outputs "
                    f"{allowed_output_names} of this step can"
                    f" be registered."
                )

            if output.materializer_source:
                for source in output.materializer_source:
                    if not source_utils.validate_source_class(
                        source, expected_class=BaseMaterializer
                    ):
                        raise StepInterfaceError(
                            f"Materializer source `{source}` "
                            f"for output '{output_name}' of step '{self.name}' "
                            "does not resolve to a `BaseMaterializer` subclass."
                        )

    def _validate_inputs(
        self,
        input_artifacts: Dict[str, "StepArtifact"],
        external_artifacts: Dict[str, "ExternalArtifactConfiguration"],
        model_artifacts_or_metadata: Dict[str, "ModelVersionDataLazyLoader"],
        client_lazy_loaders: Dict[str, "ClientLazyLoader"],
    ) -> None:
        """Validates the step inputs.

        This method makes sure that all inputs are provided either as an
        artifact or parameter.

        Args:
            input_artifacts: The input artifacts.
            external_artifacts: The external input artifacts.
            model_artifacts_or_metadata: The model artifacts or metadata.
            client_lazy_loaders: The client lazy loaders.

        Raises:
            StepInterfaceError: If an entrypoint input is missing.
        """
        for key in self.entrypoint_definition.inputs.keys():
            if (
                key in input_artifacts
                or key in self.configuration.parameters
                or key in external_artifacts
                or key in model_artifacts_or_metadata
                or key in client_lazy_loaders
            ):
                continue
            raise StepInterfaceError(
                f"Missing entrypoint input '{key}' in step '{self.name}'."
            )

    def _finalize_configuration(
        self,
        input_artifacts: Dict[str, "StepArtifact"],
        external_artifacts: Dict[str, "ExternalArtifactConfiguration"],
        model_artifacts_or_metadata: Dict[str, "ModelVersionDataLazyLoader"],
        client_lazy_loaders: Dict[str, "ClientLazyLoader"],
    ) -> "StepConfiguration":
        """Finalizes the configuration after the step was called.

        Once the step was called, we know the outputs of previous steps
        and that no additional user configurations will be made. That means
        we can now collect the remaining artifact and materializer types
        as well as check for the completeness of the step function parameters.

        Args:
            input_artifacts: The input artifacts of this step.
            external_artifacts: The external artifacts of this step.
            model_artifacts_or_metadata: The model artifacts or metadata of
                this step.
            client_lazy_loaders: The client lazy loaders of this step.

        Raises:
            StepInterfaceError: If explicit materializers were specified for an
                output but they do not work for the data type(s) defined by
                the type annotation.

        Returns:
            The finalized step configuration.
        """
        from zenml.config.step_configurations import (
            PartialArtifactConfiguration,
            StepConfiguration,
            StepConfigurationUpdate,
        )

        outputs: Dict[str, Dict[str, Any]] = defaultdict(dict)

        for (
            output_name,
            output_annotation,
        ) in self.entrypoint_definition.outputs.items():
            output = self._configuration.outputs.get(
                output_name, PartialArtifactConfiguration()
            )
            if artifact_config := output_annotation.artifact_config:
                outputs[output_name]["artifact_config"] = artifact_config

            if output.materializer_source:
                # The materializer source was configured by the user. We
                # validate that their configured materializer supports the
                # output type. If the output annotation is a Union, we check
                # that at least one of the specified materializers works with at
                # least one of the types in the Union. If that's not the case,
                # it would be a guaranteed failure at runtime and we fail early
                # here.
                if output_annotation.resolved_annotation is Any:
                    continue

                materializer_classes: List[Type["BaseMaterializer"]] = [
                    source_utils.load(materializer_source)
                    for materializer_source in output.materializer_source
                ]

                for data_type in output_annotation.get_output_types():
                    try:
                        materializer_utils.select_materializer(
                            data_type=data_type,
                            materializer_classes=materializer_classes,
                        )
                        break
                    except RuntimeError:
                        pass
                else:
                    materializer_strings = [
                        materializer_source.import_path
                        for materializer_source in output.materializer_source
                    ]
                    raise StepInterfaceError(
                        "Invalid materializers specified for output "
                        f"{output_name} of step {self.name}. None of the "
                        f"materializers ({materializer_strings}) are "
                        "able to save or load data of the type that is defined "
                        "for the output "
                        f"({output_annotation.resolved_annotation})."
                    )
            else:
                if output_annotation.resolved_annotation is Any:
                    outputs[output_name]["materializer_source"] = ()
                    outputs[output_name]["default_materializer_source"] = (
                        source_utils.resolve(
                            materializer_registry.get_default_materializer()
                        )
                    )
                    continue

                materializer_sources = []

                for output_type in output_annotation.get_output_types():
                    materializer_class = materializer_registry[output_type]
                    materializer_sources.append(
                        source_utils.resolve(materializer_class)
                    )

                outputs[output_name]["materializer_source"] = tuple(
                    materializer_sources
                )

        parameters = self._finalize_parameters()
        self.configure(parameters=parameters, merge=False)
        self._validate_inputs(
            input_artifacts=input_artifacts,
            external_artifacts=external_artifacts,
            model_artifacts_or_metadata=model_artifacts_or_metadata,
            client_lazy_loaders=client_lazy_loaders,
        )

        values = dict_utils.remove_none_values({"outputs": outputs or None})
        config = StepConfigurationUpdate(**values)
        self._apply_configuration(config)

        self._configuration = self._configuration.model_copy(
            update={
                "caching_parameters": self.caching_parameters,
                "external_input_artifacts": external_artifacts,
                "model_artifacts_or_metadata": model_artifacts_or_metadata,
                "client_lazy_loaders": client_lazy_loaders,
            }
        )

        return StepConfiguration.model_validate(
            self._configuration.model_dump()
        )

    def _finalize_parameters(self) -> Dict[str, Any]:
        """Finalizes the config parameters for running this step.

        Returns:
            All parameter values for running this step.
        """
        params = {}
        for key, value in self.configuration.parameters.items():
            if key not in self.entrypoint_definition.inputs:
                continue

            annotation = self.entrypoint_definition.inputs[key].annotation
            annotation = resolve_type_annotation(annotation)
            if inspect.isclass(annotation) and issubclass(
                annotation, BaseModel
            ):
                # Make sure we have all necessary values to instantiate the
                # pydantic model later
                model = annotation(**value)
                params[key] = model.model_dump()
            else:
                params[key] = value

        return params
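
The parameter finalization above means that a parameter annotated with a pydantic model can be configured as a plain dictionary; the dictionary is validated by instantiating the model before the step runs. A minimal sketch, assuming the usual `@step` decorator and a hypothetical `TrainerConfig` model:

```
from pydantic import BaseModel

from zenml import step


class TrainerConfig(BaseModel):
    lr: float = 0.001
    epochs: int = 10


@step
def trainer(config: TrainerConfig) -> None:
    print(config.lr, config.epochs)


# Configuring the parameter as a plain dict works because the dict is
# validated by instantiating the pydantic model before the step runs.
trainer.configure(parameters={"config": {"lr": 0.01, "epochs": 5}})
```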

caching_parameters property

Caching parameters for this step.

Returns:

Type Description
Dict[str, Any]

A dictionary containing the caching parameters

configuration property

The configuration of the step.

Returns:

Type Description
PartialStepConfiguration

The configuration of the step.

docstring property

The docstring of this step.

Returns:

Type Description
Optional[str]

The docstring of this step.

enable_cache property

If caching is enabled for the step.

Returns:

Type Description
Optional[bool]

If caching is enabled for the step.

name property

The name of the step.

Returns:

Type Description
str

The name of the step.

source_code property

The source code of this step.

Returns:

Type Description
str

The source code of this step.

source_object property

The source object of this step.

Returns:

Type Description
Any

The source object of this step.

__call__(*args, id=None, after=None, **kwargs)

Handle a call of the step.

This method does one of two things:

* If there is an active pipeline context, it adds an invocation of the step instance to the pipeline.
* If no pipeline is active, it calls the step entrypoint function.

Parameters:

Name Type Description Default
*args Any

Entrypoint function arguments.

()
id Optional[str]

Invocation ID to use.

None
after Union[str, StepArtifact, Sequence[Union[str, StepArtifact]], None]

Upstream steps for the invocation.

None
**kwargs Any

Entrypoint function keyword arguments.

{}

Returns:

Type Description
Any

The outputs of the entrypoint function call.

Source code in src/zenml/steps/base_step.py
431
432
433
434
435
436
437
438
439
440
441
442
443
444
445
446
447
448
449
450
451
452
453
454
455
456
457
458
459
460
461
462
463
464
465
466
467
468
469
470
471
472
473
474
475
476
477
478
479
480
481
482
483
484
485
486
487
488
489
490
491
492
493
494
495
496
497
498
499
500
501
502
503
504
505
506
507
508
509
510
511
512
513
514
515
516
517
518
519
520
521
522
523
524
525
526
527
528
529
530
531
532
533
def __call__(
    self,
    *args: Any,
    id: Optional[str] = None,
    after: Union[
        str, StepArtifact, Sequence[Union[str, StepArtifact]], None
    ] = None,
    **kwargs: Any,
) -> Any:
    """Handle a call of the step.

    This method does one of two things:
    * If there is an active pipeline context, it adds an invocation of the
      step instance to the pipeline.
    * If no pipeline is active, it calls the step entrypoint function.

    Args:
        *args: Entrypoint function arguments.
        id: Invocation ID to use.
        after: Upstream steps for the invocation.
        **kwargs: Entrypoint function keyword arguments.

    Returns:
        The outputs of the entrypoint function call.
    """
    from zenml.pipelines.pipeline_definition import Pipeline

    if not Pipeline.ACTIVE_PIPELINE:
        from zenml import constants, get_step_context

        # If the environment variable was set to explicitly not run on the
        # stack, we do that.
        run_without_stack = handle_bool_env_var(
            ENV_ZENML_RUN_SINGLE_STEPS_WITHOUT_STACK, default=False
        )
        if run_without_stack:
            return self.call_entrypoint(*args, **kwargs)

        try:
            get_step_context()
        except RuntimeError:
            pass
        else:
            # We're currently inside the execution of a different step
            # -> We don't want to launch another single step pipeline here,
            # but instead just call the step function
            return self.call_entrypoint(*args, **kwargs)

        if constants.SHOULD_PREVENT_PIPELINE_EXECUTION:
            logger.info(
                "Preventing execution of step '%s'.",
                self.name,
            )
            return

        return run_as_single_step_pipeline(self, *args, **kwargs)

    (
        input_artifacts,
        external_artifacts,
        model_artifacts_or_metadata,
        client_lazy_loaders,
        parameters,
        default_parameters,
    ) = self._parse_call_args(*args, **kwargs)

    upstream_steps = {
        artifact.invocation_id for artifact in input_artifacts.values()
    }
    if isinstance(after, str):
        upstream_steps.add(after)
    elif isinstance(after, StepArtifact):
        upstream_steps.add(after.invocation_id)
    elif isinstance(after, Sequence):
        for item in after:
            if isinstance(item, str):
                upstream_steps.add(item)
            elif isinstance(item, StepArtifact):
                upstream_steps.add(item.invocation_id)

    invocation_id = Pipeline.ACTIVE_PIPELINE.add_step_invocation(
        step=self,
        input_artifacts=input_artifacts,
        external_artifacts=external_artifacts,
        model_artifacts_or_metadata=model_artifacts_or_metadata,
        client_lazy_loaders=client_lazy_loaders,
        parameters=parameters,
        default_parameters=default_parameters,
        upstream_steps=upstream_steps,
        custom_id=id,
        allow_id_suffix=not id,
    )

    outputs = []
    for key, annotation in self.entrypoint_definition.outputs.items():
        output = StepArtifact(
            invocation_id=invocation_id,
            output_name=key,
            annotation=annotation,
            pipeline=Pipeline.ACTIVE_PIPELINE,
        )
        outputs.append(output)
    return outputs[0] if len(outputs) == 1 else outputs
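
A minimal sketch of both call modes, using the standard `@step` and `@pipeline` decorators; the step and pipeline names here are hypothetical:

```
from zenml import pipeline, step


@step
def load_data() -> int:
    return 42


@step
def train_model(data: int) -> None:
    print(f"Training on {data}")


@pipeline
def my_pipeline():
    # Inside the pipeline context, calling a step only records an
    # invocation; `id` sets the invocation ID and `after` adds extra
    # upstream dependencies.
    data = load_data(id="loader")
    train_model(data, after="loader")


# With no active pipeline, calling the step executes it immediately
# (by default as a single-step pipeline on the active stack).
load_data()
```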

__init__(name=None, enable_cache=None, enable_artifact_metadata=None, enable_artifact_visualization=None, enable_step_logs=None, experiment_tracker=None, step_operator=None, parameters=None, output_materializers=None, settings=None, extra=None, on_failure=None, on_success=None, model=None, retry=None, substitutions=None)

Initializes a step.

Parameters:

Name Type Description Default
name Optional[str]

The name of the step.

None
enable_cache Optional[bool]

If caching should be enabled for this step.

None
enable_artifact_metadata Optional[bool]

If artifact metadata should be enabled for this step.

None
enable_artifact_visualization Optional[bool]

If artifact visualization should be enabled for this step.

None
enable_step_logs Optional[bool]

Enable step logs for this step.

None
experiment_tracker Optional[str]

The experiment tracker to use for this step.

None
step_operator Optional[str]

The step operator to use for this step.

None
parameters Optional[Dict[str, Any]]

Function parameters for this step

None
output_materializers Optional[OutputMaterializersSpecification]

Output materializers for this step. If given as a dict, the keys must be a subset of the output names of this step. If a single value (type or string) is given, the materializer will be used for all outputs.

None
settings Optional[Mapping[str, SettingsOrDict]]

settings for this step.

None
extra Optional[Dict[str, Any]]

Extra configurations for this step.

None
on_failure Optional[HookSpecification]

Callback function in event of failure of the step. Can be a function with a single argument of type BaseException, or a source path to such a function (e.g. module.my_function).

None
on_success Optional[HookSpecification]

Callback function in event of success of the step. Can be a function with no arguments, or a source path to such a function (e.g. module.my_function).

None
model Optional[Model]

configuration of the model version in the Model Control Plane.

None
retry Optional[StepRetryConfig]

Configuration for retrying the step in case of failure.

None
substitutions Optional[Dict[str, str]]

Extra placeholders to use in the name template.

None
Source code in src/zenml/steps/base_step.py
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
140
141
142
143
144
145
146
147
148
149
150
151
152
153
154
155
156
157
158
159
160
161
162
163
164
165
166
167
168
169
170
171
172
173
174
175
176
177
178
179
180
181
182
183
184
185
186
187
188
189
190
191
192
193
194
195
196
197
198
199
200
201
202
203
204
205
206
207
208
209
210
211
212
def __init__(
    self,
    name: Optional[str] = None,
    enable_cache: Optional[bool] = None,
    enable_artifact_metadata: Optional[bool] = None,
    enable_artifact_visualization: Optional[bool] = None,
    enable_step_logs: Optional[bool] = None,
    experiment_tracker: Optional[str] = None,
    step_operator: Optional[str] = None,
    parameters: Optional[Dict[str, Any]] = None,
    output_materializers: Optional[
        "OutputMaterializersSpecification"
    ] = None,
    settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
    extra: Optional[Dict[str, Any]] = None,
    on_failure: Optional["HookSpecification"] = None,
    on_success: Optional["HookSpecification"] = None,
    model: Optional["Model"] = None,
    retry: Optional[StepRetryConfig] = None,
    substitutions: Optional[Dict[str, str]] = None,
) -> None:
    """Initializes a step.

    Args:
        name: The name of the step.
        enable_cache: If caching should be enabled for this step.
        enable_artifact_metadata: If artifact metadata should be enabled
            for this step.
        enable_artifact_visualization: If artifact visualization should be
            enabled for this step.
        enable_step_logs: Enable step logs for this step.
        experiment_tracker: The experiment tracker to use for this step.
        step_operator: The step operator to use for this step.
        parameters: Function parameters for this step
        output_materializers: Output materializers for this step. If
            given as a dict, the keys must be a subset of the output names
            of this step. If a single value (type or string) is given, the
            materializer will be used for all outputs.
        settings: settings for this step.
        extra: Extra configurations for this step.
        on_failure: Callback function in event of failure of the step. Can
            be a function with a single argument of type `BaseException`, or
            a source path to such a function (e.g. `module.my_function`).
        on_success: Callback function in event of success of the step. Can
            be a function with no arguments, or a source path to such a
            function (e.g. `module.my_function`).
        model: configuration of the model version in the Model Control Plane.
        retry: Configuration for retrying the step in case of failure.
        substitutions: Extra placeholders to use in the name template.
    """
    from zenml.config.step_configurations import PartialStepConfiguration

    self.entrypoint_definition = validate_entrypoint_function(
        self.entrypoint,
        reserved_arguments=["after", "id"],
    )

    name = name or self.__class__.__name__

    logger.debug(
        "Step `%s`: Caching %s.",
        name,
        "enabled" if enable_cache is not False else "disabled",
    )
    logger.debug(
        "Step `%s`: Artifact metadata %s.",
        name,
        "enabled" if enable_artifact_metadata is not False else "disabled",
    )
    logger.debug(
        "Step `%s`: Artifact visualization %s.",
        name,
        "enabled"
        if enable_artifact_visualization is not False
        else "disabled",
    )
    logger.debug(
        "Step `%s`: logs %s.",
        name,
        "enabled" if enable_step_logs is not False else "disabled",
    )
    if model is not None:
        logger.debug(
            "Step `%s`: Is in Model context %s.",
            name,
            {
                "model": model.name,
                "version": model.version,
            },
        )

    self._configuration = PartialStepConfiguration(
        name=name,
        enable_cache=enable_cache,
        enable_artifact_metadata=enable_artifact_metadata,
        enable_artifact_visualization=enable_artifact_visualization,
        enable_step_logs=enable_step_logs,
    )
    self.configure(
        experiment_tracker=experiment_tracker,
        step_operator=step_operator,
        output_materializers=output_materializers,
        parameters=parameters,
        settings=settings,
        extra=extra,
        on_failure=on_failure,
        on_success=on_success,
        model=model,
        retry=retry,
        substitutions=substitutions,
    )

    notebook_utils.try_to_save_notebook_cell_code(self.source_object)

call_entrypoint(*args, **kwargs)

Calls the entrypoint function of the step.

Parameters:

Name Type Description Default
*args Any

Entrypoint function arguments.

()
**kwargs Any

Entrypoint function keyword arguments.

{}

Returns:

Type Description
Any

The return value of the entrypoint function.

Raises:

Type Description
StepInterfaceError

If the arguments to the entrypoint function are invalid.

Source code in src/zenml/steps/base_step.py
535
536
537
538
539
540
541
542
543
544
545
546
547
548
549
550
551
552
553
554
555
556
557
558
559
560
561
562
def call_entrypoint(self, *args: Any, **kwargs: Any) -> Any:
    """Calls the entrypoint function of the step.

    Args:
        *args: Entrypoint function arguments.
        **kwargs: Entrypoint function keyword arguments.

    Returns:
        The return value of the entrypoint function.

    Raises:
        StepInterfaceError: If the arguments to the entrypoint function are
            invalid.
    """
    try:
        validated_args = pydantic_utils.validate_function_args(
            self.entrypoint,
            ConfigDict(arbitrary_types_allowed=True),
            *args,
            **kwargs,
        )
    except ValidationError as e:
        raise StepInterfaceError(
            "Invalid step function entrypoint arguments. Check out the "
            "pydantic error above for more details."
        ) from e

    return self.entrypoint(**validated_args)
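
A small usage sketch for `call_entrypoint`, assuming a hypothetical `add` step defined with the `@step` decorator:

```
from zenml import step


@step
def add(a: int, b: int) -> int:
    return a + b


# Runs the plain Python function, skipping orchestration entirely;
# the arguments are validated with pydantic first.
assert add.call_entrypoint(a=1, b=2) == 3

# Invalid arguments raise a StepInterfaceError that wraps the
# underlying pydantic ValidationError.
try:
    add.call_entrypoint(a=1, b="not a number")
except Exception as exc:
    print(exc)
```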

configure(enable_cache=None, enable_artifact_metadata=None, enable_artifact_visualization=None, enable_step_logs=None, experiment_tracker=None, step_operator=None, parameters=None, output_materializers=None, settings=None, extra=None, on_failure=None, on_success=None, model=None, retry=None, substitutions=None, merge=True)

Configures the step.

Configuration merging example:

* merge==True:
    step.configure(extra={"key1": 1})
    step.configure(extra={"key2": 2}, merge=True)
    step.configuration.extra # {"key1": 1, "key2": 2}
* merge==False:
    step.configure(extra={"key1": 1})
    step.configure(extra={"key2": 2}, merge=False)
    step.configuration.extra # {"key2": 2}

Parameters:

Name Type Description Default
enable_cache Optional[bool]

If caching should be enabled for this step.

None
enable_artifact_metadata Optional[bool]

If artifact metadata should be enabled for this step.

None
enable_artifact_visualization Optional[bool]

If artifact visualization should be enabled for this step.

None
enable_step_logs Optional[bool]

If step logs should be enabled for this step.

None
experiment_tracker Optional[str]

The experiment tracker to use for this step.

None
step_operator Optional[str]

The step operator to use for this step.

None
parameters Optional[Dict[str, Any]]

Function parameters for this step

None
output_materializers Optional[OutputMaterializersSpecification]

Output materializers for this step. If given as a dict, the keys must be a subset of the output names of this step. If a single value (type or string) is given, the materializer will be used for all outputs.

None
settings Optional[Mapping[str, SettingsOrDict]]

settings for this step.

None
extra Optional[Dict[str, Any]]

Extra configurations for this step.

None
on_failure Optional[HookSpecification]

Callback function in event of failure of the step. Can be a function with a single argument of type BaseException, or a source path to such a function (e.g. module.my_function).

None
on_success Optional[HookSpecification]

Callback function in event of success of the step. Can be a function with no arguments, or a source path to such a function (e.g. module.my_function).

None
model Optional[Model]

Model to use for this step.

None
retry Optional[StepRetryConfig]

Configuration for retrying the step in case of failure.

None
substitutions Optional[Dict[str, str]]

Extra placeholders to use in the name template.

None
merge bool

If True, will merge the given dictionary configurations like parameters and settings with existing configurations. If False the given configurations will overwrite all existing ones. See the general description of this method for an example.

True

Returns:

Type Description
T

The step instance that this method was called on.

Source code in src/zenml/steps/base_step.py
591
592
593
594
595
596
597
598
599
600
601
602
603
604
605
606
607
608
609
610
611
612
613
614
615
616
617
618
619
620
621
622
623
624
625
626
627
628
629
630
631
632
633
634
635
636
637
638
639
640
641
642
643
644
645
646
647
648
649
650
651
652
653
654
655
656
657
658
659
660
661
662
663
664
665
666
667
668
669
670
671
672
673
674
675
676
677
678
679
680
681
682
683
684
685
686
687
688
689
690
691
692
693
694
695
696
697
698
699
700
701
702
703
704
705
706
707
708
709
710
711
712
713
714
715
716
717
718
719
720
721
722
723
def configure(
    self: T,
    enable_cache: Optional[bool] = None,
    enable_artifact_metadata: Optional[bool] = None,
    enable_artifact_visualization: Optional[bool] = None,
    enable_step_logs: Optional[bool] = None,
    experiment_tracker: Optional[str] = None,
    step_operator: Optional[str] = None,
    parameters: Optional[Dict[str, Any]] = None,
    output_materializers: Optional[
        "OutputMaterializersSpecification"
    ] = None,
    settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
    extra: Optional[Dict[str, Any]] = None,
    on_failure: Optional["HookSpecification"] = None,
    on_success: Optional["HookSpecification"] = None,
    model: Optional["Model"] = None,
    retry: Optional[StepRetryConfig] = None,
    substitutions: Optional[Dict[str, str]] = None,
    merge: bool = True,
) -> T:
    """Configures the step.

    Configuration merging example:
    * `merge==True`:
        step.configure(extra={"key1": 1})
        step.configure(extra={"key2": 2}, merge=True)
        step.configuration.extra # {"key1": 1, "key2": 2}
    * `merge==False`:
        step.configure(extra={"key1": 1})
        step.configure(extra={"key2": 2}, merge=False)
        step.configuration.extra # {"key2": 2}

    Args:
        enable_cache: If caching should be enabled for this step.
        enable_artifact_metadata: If artifact metadata should be enabled
            for this step.
        enable_artifact_visualization: If artifact visualization should be
            enabled for this step.
        enable_step_logs: If step logs should be enabled for this step.
        experiment_tracker: The experiment tracker to use for this step.
        step_operator: The step operator to use for this step.
        parameters: Function parameters for this step
        output_materializers: Output materializers for this step. If
            given as a dict, the keys must be a subset of the output names
            of this step. If a single value (type or string) is given, the
            materializer will be used for all outputs.
        settings: settings for this step.
        extra: Extra configurations for this step.
        on_failure: Callback function in event of failure of the step. Can
            be a function with a single argument of type `BaseException`, or
            a source path to such a function (e.g. `module.my_function`).
        on_success: Callback function in event of success of the step. Can
            be a function with no arguments, or a source path to such a
            function (e.g. `module.my_function`).
        model: Model to use for this step.
        retry: Configuration for retrying the step in case of failure.
        substitutions: Extra placeholders to use in the name template.
        merge: If `True`, will merge the given dictionary configurations
            like `parameters` and `settings` with existing
            configurations. If `False` the given configurations will
            overwrite all existing ones. See the general description of this
            method for an example.

    Returns:
        The step instance that this method was called on.
    """
    from zenml.config.step_configurations import StepConfigurationUpdate
    from zenml.hooks.hook_validators import resolve_and_validate_hook

    def _resolve_if_necessary(
        value: Union[str, Source, Type[Any]],
    ) -> Source:
        if isinstance(value, str):
            return Source.from_import_path(value)
        elif isinstance(value, Source):
            return value
        else:
            return source_utils.resolve(value)

    def _convert_to_tuple(value: Any) -> Tuple[Source, ...]:
        if isinstance(value, str) or not isinstance(value, Sequence):
            return (_resolve_if_necessary(value),)
        else:
            return tuple(_resolve_if_necessary(v) for v in value)

    outputs: Dict[str, Dict[str, Tuple[Source, ...]]] = defaultdict(dict)
    allowed_output_names = set(self.entrypoint_definition.outputs)

    if output_materializers:
        if not isinstance(output_materializers, Mapping):
            sources = _convert_to_tuple(output_materializers)
            output_materializers = {
                output_name: sources
                for output_name in allowed_output_names
            }

        for output_name, materializer in output_materializers.items():
            sources = _convert_to_tuple(materializer)
            outputs[output_name]["materializer_source"] = sources

    failure_hook_source = None
    if on_failure:
        # string of on_failure hook function to be used for this step
        failure_hook_source = resolve_and_validate_hook(on_failure)

    success_hook_source = None
    if on_success:
        # string of on_success hook function to be used for this step
        success_hook_source = resolve_and_validate_hook(on_success)

    values = dict_utils.remove_none_values(
        {
            "enable_cache": enable_cache,
            "enable_artifact_metadata": enable_artifact_metadata,
            "enable_artifact_visualization": enable_artifact_visualization,
            "enable_step_logs": enable_step_logs,
            "experiment_tracker": experiment_tracker,
            "step_operator": step_operator,
            "parameters": parameters,
            "settings": settings,
            "outputs": outputs or None,
            "extra": extra,
            "failure_hook_source": failure_hook_source,
            "success_hook_source": success_hook_source,
            "model": model,
            "retry": retry,
            "substitutions": substitutions,
        }
    )
    config = StepConfigurationUpdate(**values)
    self._apply_configuration(config, merge=merge)
    return self
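
A short sketch of the merge semantics described above, assuming a hypothetical `trainer` step defined with the `@step` decorator:

```
from zenml import step


@step
def trainer(lr: float = 0.001) -> None:
    ...


# Dictionary-like options are merged across calls by default ...
trainer.configure(extra={"team": "ml"})
trainer.configure(extra={"owner": "alice"}, merge=True)
print(trainer.configuration.extra)  # {'team': 'ml', 'owner': 'alice'}

# ... while merge=False overwrites the existing dictionary configurations.
trainer.configure(extra={"owner": "bob"}, merge=False)
print(trainer.configuration.extra)  # {'owner': 'bob'}

# Scalar options and parameters are set the same way.
trainer.configure(enable_cache=False, parameters={"lr": 0.01})
```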

copy()

Copies the step.

Returns:

Type Description
BaseStep

The step copy.

Source code in src/zenml/steps/base_step.py
803
804
805
806
807
808
809
def copy(self) -> "BaseStep":
    """Copies the step.

    Returns:
        The step copy.
    """
    return copy.deepcopy(self)

entrypoint(*args, **kwargs) abstractmethod

Abstract method for core step logic.

Parameters:

Name Type Description Default
*args Any

Positional arguments passed to the step.

()
**kwargs Any

Keyword arguments passed to the step.

{}

Returns:

Type Description
Any

The output of the step.

Source code in src/zenml/steps/base_step.py
214
215
216
217
218
219
220
221
222
223
224
@abstractmethod
def entrypoint(self, *args: Any, **kwargs: Any) -> Any:
    """Abstract method for core step logic.

    Args:
        *args: Positional arguments passed to the step.
        **kwargs: Keyword arguments passed to the step.

    Returns:
        The output of the step.
    """

load_from_source(source) classmethod

Loads a step from source.

Parameters:

Name Type Description Default
source Union[Source, str]

The path to the step source.

required

Returns:

Type Description
BaseStep

The loaded step.

Raises:

Type Description
ValueError

If the source is not a valid step source.

Source code in src/zenml/steps/base_step.py
226
227
228
229
230
231
232
233
234
235
236
237
238
239
240
241
242
243
244
245
246
@classmethod
def load_from_source(cls, source: Union[Source, str]) -> "BaseStep":
    """Loads a step from source.

    Args:
        source: The path to the step source.

    Returns:
        The loaded step.

    Raises:
        ValueError: If the source is not a valid step source.
    """
    obj = source_utils.load(source)

    if isinstance(obj, BaseStep):
        return obj
    elif isinstance(obj, type) and issubclass(obj, BaseStep):
        return obj()
    else:
        raise ValueError("Invalid step source.")

resolve()

Resolves the step.

Returns:

Type Description
Source

The step source.

Source code in src/zenml/steps/base_step.py
248
249
250
251
252
253
254
def resolve(self) -> Source:
    """Resolves the step.

    Returns:
        The step source.
    """
    return source_utils.resolve(self.__class__)

with_options(enable_cache=None, enable_artifact_metadata=None, enable_artifact_visualization=None, enable_step_logs=None, experiment_tracker=None, step_operator=None, parameters=None, output_materializers=None, settings=None, extra=None, on_failure=None, on_success=None, model=None, retry=None, substitutions=None, merge=True)

Copies the step and applies the given configurations.

Parameters:

Name Type Description Default
enable_cache Optional[bool]

If caching should be enabled for this step.

None
enable_artifact_metadata Optional[bool]

If artifact metadata should be enabled for this step.

None
enable_artifact_visualization Optional[bool]

If artifact visualization should be enabled for this step.

None
enable_step_logs Optional[bool]

If step logs should be enabled for this step.

None
experiment_tracker Optional[str]

The experiment tracker to use for this step.

None
step_operator Optional[str]

The step operator to use for this step.

None
parameters Optional[Dict[str, Any]]

Function parameters for this step

None
output_materializers Optional[OutputMaterializersSpecification]

Output materializers for this step. If given as a dict, the keys must be a subset of the output names of this step. If a single value (type or string) is given, the materializer will be used for all outputs.

None
settings Optional[Mapping[str, SettingsOrDict]]

settings for this step.

None
extra Optional[Dict[str, Any]]

Extra configurations for this step.

None
on_failure Optional[HookSpecification]

Callback function in event of failure of the step. Can be a function with a single argument of type BaseException, or a source path to such a function (e.g. module.my_function).

None
on_success Optional[HookSpecification]

Callback function in event of success of the step. Can be a function with no arguments, or a source path to such a function (e.g. module.my_function).

None
model Optional[Model]

Model to use for this step.

None
retry Optional[StepRetryConfig]

Configuration for retrying the step in case of failure.

None
substitutions Optional[Dict[str, str]]

Extra placeholders for the step name.

None
merge bool

If True, will merge the given dictionary configurations like parameters and settings with existing configurations. If False the given configurations will overwrite all existing ones. See the general description of this method for an example.

True

Returns:

Type Description
BaseStep

The copied step instance.

Source code in src/zenml/steps/base_step.py
725
726
727
728
729
730
731
732
733
734
735
736
737
738
739
740
741
742
743
744
745
746
747
748
749
750
751
752
753
754
755
756
757
758
759
760
761
762
763
764
765
766
767
768
769
770
771
772
773
774
775
776
777
778
779
780
781
782
783
784
785
786
787
788
789
790
791
792
793
794
795
796
797
798
799
800
801
def with_options(
    self,
    enable_cache: Optional[bool] = None,
    enable_artifact_metadata: Optional[bool] = None,
    enable_artifact_visualization: Optional[bool] = None,
    enable_step_logs: Optional[bool] = None,
    experiment_tracker: Optional[str] = None,
    step_operator: Optional[str] = None,
    parameters: Optional[Dict[str, Any]] = None,
    output_materializers: Optional[
        "OutputMaterializersSpecification"
    ] = None,
    settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
    extra: Optional[Dict[str, Any]] = None,
    on_failure: Optional["HookSpecification"] = None,
    on_success: Optional["HookSpecification"] = None,
    model: Optional["Model"] = None,
    retry: Optional[StepRetryConfig] = None,
    substitutions: Optional[Dict[str, str]] = None,
    merge: bool = True,
) -> "BaseStep":
    """Copies the step and applies the given configurations.

    Args:
        enable_cache: If caching should be enabled for this step.
        enable_artifact_metadata: If artifact metadata should be enabled
            for this step.
        enable_artifact_visualization: If artifact visualization should be
            enabled for this step.
        enable_step_logs: If step logs should be enabled for this step.
        experiment_tracker: The experiment tracker to use for this step.
        step_operator: The step operator to use for this step.
        parameters: Function parameters for this step
        output_materializers: Output materializers for this step. If
            given as a dict, the keys must be a subset of the output names
            of this step. If a single value (type or string) is given, the
            materializer will be used for all outputs.
        settings: settings for this step.
        extra: Extra configurations for this step.
        on_failure: Callback function in event of failure of the step. Can
            be a function with a single argument of type `BaseException`, or
            a source path to such a function (e.g. `module.my_function`).
        on_success: Callback function in event of success of the step. Can
            be a function with no arguments, or a source path to such a
            function (e.g. `module.my_function`).
        model: Model to use for this step.
        retry: Configuration for retrying the step in case of failure.
        substitutions: Extra placeholders for the step name.
        merge: If `True`, will merge the given dictionary configurations
            like `parameters` and `settings` with existing
            configurations. If `False` the given configurations will
            overwrite all existing ones. See the general description of this
            method for an example.

    Returns:
        The copied step instance.
    """
    step_copy = self.copy()
    step_copy.configure(
        enable_cache=enable_cache,
        enable_artifact_metadata=enable_artifact_metadata,
        enable_artifact_visualization=enable_artifact_visualization,
        enable_step_logs=enable_step_logs,
        experiment_tracker=experiment_tracker,
        step_operator=step_operator,
        parameters=parameters,
        output_materializers=output_materializers,
        settings=settings,
        extra=extra,
        on_failure=on_failure,
        on_success=on_success,
        model=model,
        retry=retry,
        substitutions=substitutions,
        merge=merge,
    )
    return step_copy
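
A short sketch showing how `with_options` is typically used to reuse one step with different configurations inside a pipeline; the step and pipeline names are hypothetical:

```
from zenml import pipeline, step


@step
def trainer(epochs: int = 1) -> None:
    ...


# The original step stays untouched; each call returns a configured copy.
smoke_trainer = trainer.with_options(parameters={"epochs": 1}, enable_cache=False)
full_trainer = trainer.with_options(parameters={"epochs": 50})


@pipeline
def training_pipeline():
    smoke_trainer(id="smoke_test")
    full_trainer(id="full_training")
```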

ResourceSettings

Bases: BaseSettings

Hardware resource settings.

Attributes:

Name Type Description
cpu_count Optional[PositiveFloat]

The number of CPU cores that should be configured.

gpu_count Optional[NonNegativeInt]

The number of GPUs that should be configured.

memory Optional[str]

The amount of memory that should be configured.

Source code in src/zenml/config/resource_settings.py
 63
 64
 65
 66
 67
 68
 69
 70
 71
 72
 73
 74
 75
 76
 77
 78
 79
 80
 81
 82
 83
 84
 85
 86
 87
 88
 89
 90
 91
 92
 93
 94
 95
 96
 97
 98
 99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
class ResourceSettings(BaseSettings):
    """Hardware resource settings.

    Attributes:
        cpu_count: The number of CPU cores that should be configured.
        gpu_count: The number of GPUs that should be configured.
        memory: The amount of memory that should be configured.
    """

    cpu_count: Optional[PositiveFloat] = None
    gpu_count: Optional[NonNegativeInt] = None
    memory: Optional[str] = Field(pattern=MEMORY_REGEX, default=None)

    @property
    def empty(self) -> bool:
        """Returns if this object is "empty" (=no values configured) or not.

        Returns:
            `True` if no values were configured, `False` otherwise.
        """
        # To detect whether this config is empty (= no values specified), we
        # check if there are any attributes which are explicitly set to any
        # value other than `None`.
        return len(self.model_dump(exclude_unset=True, exclude_none=True)) == 0

    def get_memory(
        self, unit: Union[str, ByteUnit] = ByteUnit.GB
    ) -> Optional[float]:
        """Gets the memory configuration in a specific unit.

        Args:
            unit: The unit to which the memory should be converted.

        Raises:
            ValueError: If the memory string is invalid.

        Returns:
            The memory configuration converted to the requested unit, or None
            if no memory was configured.
        """
        if not self.memory:
            return None

        if isinstance(unit, str):
            unit = ByteUnit(unit)

        memory = self.memory
        for memory_unit in ByteUnit:
            if memory.endswith(memory_unit.value):
                memory_value = int(memory[: -len(memory_unit.value)])
                return memory_value * memory_unit.byte_value / unit.byte_value
        else:
            # Should never happen due to the regex validation
            raise ValueError(f"Unable to parse memory unit from '{memory}'.")

    model_config = SettingsConfigDict(
        # public attributes are immutable
        frozen=True,
        # prevent extra attributes during model initialization
        extra="ignore",
    )

empty property

Returns whether this object is "empty" (=no values configured).

Returns:

Type Description
bool

True if no values were configured, False otherwise.

get_memory(unit=ByteUnit.GB)

Gets the memory configuration in a specific unit.

Parameters:

Name Type Description Default
unit Union[str, ByteUnit]

The unit to which the memory should be converted.

GB

Raises:

Type Description
ValueError

If the memory string is invalid.

Returns:

Type Description
Optional[float]

The memory configuration converted to the requested unit, or None

Optional[float]

if no memory was configured.

Source code in src/zenml/config/resource_settings.py
 88
 89
 90
 91
 92
 93
 94
 95
 96
 97
 98
 99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
def get_memory(
    self, unit: Union[str, ByteUnit] = ByteUnit.GB
) -> Optional[float]:
    """Gets the memory configuration in a specific unit.

    Args:
        unit: The unit to which the memory should be converted.

    Raises:
        ValueError: If the memory string is invalid.

    Returns:
        The memory configuration converted to the requested unit, or None
        if no memory was configured.
    """
    if not self.memory:
        return None

    if isinstance(unit, str):
        unit = ByteUnit(unit)

    memory = self.memory
    for memory_unit in ByteUnit:
        if memory.endswith(memory_unit.value):
            memory_value = int(memory[: -len(memory_unit.value)])
            return memory_value * memory_unit.byte_value / unit.byte_value
    else:
        # Should never happen due to the regex validation
        raise ValueError(f"Unable to parse memory unit from '{memory}'.")

StepContext

Provides additional context inside a step function.

This singleton class is used to access information about the current run, step run, or its outputs inside a step function.

Usage example:

from zenml.steps import get_step_context

@step
def my_trainer_step() -> Any:
    context = get_step_context()

    # get info about the current pipeline run
    current_pipeline_run = context.pipeline_run

    # get info about the current step run
    current_step_run = context.step_run

    # get info about the future output artifacts of this step
    output_artifact_uri = context.get_output_artifact_uri()

    ...
Source code in src/zenml/steps/step_context.py (lines 64-404)
class StepContext(metaclass=SingletonMetaClass):
    """Provides additional context inside a step function.

    This singleton class is used to access information about the current run,
    step run, or its outputs inside a step function.

    Usage example:

    ```python
    from zenml.steps import get_step_context

    @step
    def my_trainer_step() -> Any:
        context = get_step_context()

        # get info about the current pipeline run
        current_pipeline_run = context.pipeline_run

        # get info about the current step run
        current_step_run = context.step_run

        # get info about the future output artifacts of this step
        output_artifact_uri = context.get_output_artifact_uri()

        ...
    ```
    """

    def __init__(
        self,
        pipeline_run: "PipelineRunResponse",
        step_run: "StepRunResponse",
        output_materializers: Mapping[str, Sequence[Type["BaseMaterializer"]]],
        output_artifact_uris: Mapping[str, str],
        output_artifact_configs: Mapping[str, Optional["ArtifactConfig"]],
    ) -> None:
        """Initialize the context of the currently running step.

        Args:
            pipeline_run: The model of the current pipeline run.
            step_run: The model of the current step run.
            output_materializers: The output materializers of the step that
                this context is used in.
            output_artifact_uris: The output artifacts of the step that this
                context is used in.
            output_artifact_configs: The outputs' ArtifactConfigs of the step that this
                context is used in.

        Raises:
            StepContextError: If the keys of the output materializers and
                output artifacts do not match.
        """
        from zenml.client import Client

        try:
            pipeline_run = Client().get_pipeline_run(pipeline_run.id)
        except KeyError:
            pass
        self.pipeline_run = pipeline_run
        try:
            step_run = Client().get_run_step(step_run.id)
        except KeyError:
            pass
        self.step_run = step_run
        self.model_version = (
            step_run.model_version or pipeline_run.model_version
        )

        self.step_name = self.step_run.name

        # set outputs
        if output_materializers.keys() != output_artifact_uris.keys():
            raise StepContextError(
                f"Mismatched keys in output materializers and output artifact "
                f"URIs for step `{self.step_name}`. Output materializer "
                f"keys: {set(output_materializers)}, output artifact URI "
                f"keys: {set(output_artifact_uris)}"
            )
        self._outputs = {
            key: StepContextOutput(
                materializer_classes=output_materializers[key],
                artifact_uri=output_artifact_uris[key],
                artifact_config=output_artifact_configs[key],
            )
            for key in output_materializers.keys()
        }
        self._cleanup_registry = CallbackRegistry()

    @property
    def pipeline(self) -> "PipelineResponse":
        """Returns the current pipeline.

        Returns:
            The current pipeline or None.

        Raises:
            StepContextError: If the pipeline run does not have a pipeline.
        """
        if self.pipeline_run.pipeline:
            return self.pipeline_run.pipeline
        raise StepContextError(
            f"Unable to get pipeline in step `{self.step_name}` of pipeline "
            f"run '{self.pipeline_run.id}': This pipeline run does not have "
            f"a pipeline associated with it."
        )

    @property
    def model(self) -> "Model":
        """Returns configured Model.

        Order of resolution to search for Model is:
            1. Model from the step context
            2. Model from the pipeline context

        Returns:
            The `Model` object associated with the current step.

        Raises:
            StepContextError: If no `Model` object was specified for the step
                or pipeline.
        """
        if not self.model_version:
            raise StepContextError(
                f"Unable to get Model in step `{self.step_name}` of pipeline "
                f"run '{self.pipeline_run.id}': No model has been specified "
                "the step or pipeline."
            )

        return self.model_version.to_model_class()

    @property
    def inputs(self) -> Dict[str, "StepRunInputResponse"]:
        """Returns the input artifacts of the current step.

        Returns:
            The input artifacts of the current step.
        """
        return self.step_run.regular_inputs

    def _get_output(
        self, output_name: Optional[str] = None
    ) -> "StepContextOutput":
        """Returns the materializer and artifact URI for a given step output.

        Args:
            output_name: Optional name of the output for which to get the
                materializer and URI.

        Returns:
            Tuple containing the materializer and artifact URI for the
                given output.

        Raises:
            StepContextError: If the step has no outputs, no output for
                the given `output_name` or if no `output_name` was given but
                the step has multiple outputs.
        """
        output_count = len(self._outputs)
        if output_count == 0:
            raise StepContextError(
                f"Unable to get step output for step `{self.step_name}`: "
                f"This step does not have any outputs."
            )

        if not output_name and output_count > 1:
            raise StepContextError(
                f"Unable to get step output for step `{self.step_name}`: "
                f"This step has multiple outputs ({set(self._outputs)}), "
                f"please specify which output to return."
            )

        if output_name:
            if output_name not in self._outputs:
                raise StepContextError(
                    f"Unable to get step output '{output_name}' for "
                    f"step `{self.step_name}`. This step does not have an "
                    f"output with the given name, please specify one of the "
                    f"available outputs: {set(self._outputs)}."
                )
            return self._outputs[output_name]
        else:
            return next(iter(self._outputs.values()))

    def get_output_materializer(
        self,
        output_name: Optional[str] = None,
        custom_materializer_class: Optional[Type["BaseMaterializer"]] = None,
        data_type: Optional[Type[Any]] = None,
    ) -> "BaseMaterializer":
        """Returns a materializer for a given step output.

        Args:
            output_name: Optional name of the output for which to get the
                materializer. If no name is given and the step only has a
                single output, the materializer of this output will be
                returned. If the step has multiple outputs, an exception
                will be raised.
            custom_materializer_class: If given, this `BaseMaterializer`
                subclass will be initialized with the output artifact instead
                of the materializer that was registered for this step output.
            data_type: If the output annotation is of type `Union` and the step
                therefore has multiple materializers configured, you can provide
                a data type for the output which will be used to select the
                correct materializer. If not provided, the first materializer
                will be used.

        Returns:
            A materializer initialized with the output artifact for
            the given output.
        """
        from zenml.utils import materializer_utils

        output = self._get_output(output_name)
        materializer_classes = output.materializer_classes
        artifact_uri = output.artifact_uri

        if custom_materializer_class:
            materializer_class = custom_materializer_class
        elif len(materializer_classes) == 1 or not data_type:
            materializer_class = materializer_classes[0]
        else:
            materializer_class = materializer_utils.select_materializer(
                data_type=data_type, materializer_classes=materializer_classes
            )

        return materializer_class(artifact_uri)

    def get_output_artifact_uri(
        self, output_name: Optional[str] = None
    ) -> str:
        """Returns the artifact URI for a given step output.

        Args:
            output_name: Optional name of the output for which to get the URI.
                If no name is given and the step only has a single output,
                the URI of this output will be returned. If the step has
                multiple outputs, an exception will be raised.

        Returns:
            Artifact URI for the given output.
        """
        return self._get_output(output_name).artifact_uri

    def get_output_metadata(
        self, output_name: Optional[str] = None
    ) -> Dict[str, "MetadataType"]:
        """Returns the metadata for a given step output.

        Args:
            output_name: Optional name of the output for which to get the
                metadata. If no name is given and the step only has a single
                output, the metadata of this output will be returned. If the
                step has multiple outputs, an exception will be raised.

        Returns:
            Metadata for the given output.
        """
        output = self._get_output(output_name)
        custom_metadata = output.run_metadata or {}
        if output.artifact_config:
            custom_metadata.update(
                **(output.artifact_config.run_metadata or {})
            )
        return custom_metadata

    def get_output_tags(self, output_name: Optional[str] = None) -> List[str]:
        """Returns the tags for a given step output.

        Args:
            output_name: Optional name of the output for which to get the
                metadata. If no name is given and the step only has a single
                output, the metadata of this output will be returned. If the
                step has multiple outputs, an exception will be raised.

        Returns:
            Tags for the given output.
        """
        output = self._get_output(output_name)
        custom_tags = set(output.tags or [])
        if output.artifact_config:
            return list(
                set(output.artifact_config.tags or []).union(custom_tags)
            )
        return list(custom_tags)

    def add_output_metadata(
        self,
        metadata: Dict[str, "MetadataType"],
        output_name: Optional[str] = None,
    ) -> None:
        """Adds metadata for a given step output.

        Args:
            metadata: The metadata to add.
            output_name: Optional name of the output for which to add the
                metadata. If no name is given and the step only has a single
                output, the metadata of this output will be added. If the
                step has multiple outputs, an exception will be raised.
        """
        output = self._get_output(output_name)
        if not output.run_metadata:
            output.run_metadata = {}
        output.run_metadata.update(**metadata)

    def add_output_tags(
        self,
        tags: List[str],
        output_name: Optional[str] = None,
    ) -> None:
        """Adds tags for a given step output.

        Args:
            tags: The tags to add.
            output_name: Optional name of the output for which to add the
                tags. If no name is given and the step only has a single
                output, the tags of this output will be added. If the
                step has multiple outputs, an exception will be raised.
        """
        output = self._get_output(output_name)
        if not output.tags:
            output.tags = []
        output.tags += tags

    def remove_output_tags(
        self,
        tags: List[str],
        output_name: Optional[str] = None,
    ) -> None:
        """Removes tags for a given step output.

        Args:
            tags: The tags to remove.
            output_name: Optional name of the output for which to remove the
                tags. If no name is given and the step only has a single
                output, the tags of this output will be removed. If the
                step has multiple outputs, an exception will be raised.
        """
        output = self._get_output(output_name)
        if not output.tags:
            return
        output.tags = [tag for tag in output.tags if tag not in tags]

inputs property

Returns the input artifacts of the current step.

Returns:

Type Description
Dict[str, StepRunInputResponse]

The input artifacts of the current step.

model property

Returns the configured Model.

The order of resolution used to search for the Model is:
  1. Model from the step context
  2. Model from the pipeline context

Returns:

Type Description
Model

The Model object associated with the current step.

Raises:

Type Description
StepContextError

If no Model object was specified for the step or pipeline.

pipeline property

Returns the current pipeline.

Returns:

Type Description
PipelineResponse

The current pipeline.

Raises:

Type Description
StepContextError

If the pipeline run does not have a pipeline.

__init__(pipeline_run, step_run, output_materializers, output_artifact_uris, output_artifact_configs)

Initialize the context of the currently running step.

Parameters:

Name Type Description Default
pipeline_run PipelineRunResponse

The model of the current pipeline run.

required
step_run StepRunResponse

The model of the current step run.

required
output_materializers Mapping[str, Sequence[Type[BaseMaterializer]]]

The output materializers of the step that this context is used in.

required
output_artifact_uris Mapping[str, str]

The output artifacts of the step that this context is used in.

required
output_artifact_configs Mapping[str, Optional[ArtifactConfig]]

The outputs' ArtifactConfigs of the step that this context is used in.

required

Raises:

Type Description
StepContextError

If the keys of the output materializers and output artifacts do not match.

Source code in src/zenml/steps/step_context.py (lines 92-150)
def __init__(
    self,
    pipeline_run: "PipelineRunResponse",
    step_run: "StepRunResponse",
    output_materializers: Mapping[str, Sequence[Type["BaseMaterializer"]]],
    output_artifact_uris: Mapping[str, str],
    output_artifact_configs: Mapping[str, Optional["ArtifactConfig"]],
) -> None:
    """Initialize the context of the currently running step.

    Args:
        pipeline_run: The model of the current pipeline run.
        step_run: The model of the current step run.
        output_materializers: The output materializers of the step that
            this context is used in.
        output_artifact_uris: The output artifacts of the step that this
            context is used in.
        output_artifact_configs: The outputs' ArtifactConfigs of the step that this
            context is used in.

    Raises:
        StepContextError: If the keys of the output materializers and
            output artifacts do not match.
    """
    from zenml.client import Client

    try:
        pipeline_run = Client().get_pipeline_run(pipeline_run.id)
    except KeyError:
        pass
    self.pipeline_run = pipeline_run
    try:
        step_run = Client().get_run_step(step_run.id)
    except KeyError:
        pass
    self.step_run = step_run
    self.model_version = (
        step_run.model_version or pipeline_run.model_version
    )

    self.step_name = self.step_run.name

    # set outputs
    if output_materializers.keys() != output_artifact_uris.keys():
        raise StepContextError(
            f"Mismatched keys in output materializers and output artifact "
            f"URIs for step `{self.step_name}`. Output materializer "
            f"keys: {set(output_materializers)}, output artifact URI "
            f"keys: {set(output_artifact_uris)}"
        )
    self._outputs = {
        key: StepContextOutput(
            materializer_classes=output_materializers[key],
            artifact_uri=output_artifact_uris[key],
            artifact_config=output_artifact_configs[key],
        )
        for key in output_materializers.keys()
    }
    self._cleanup_registry = CallbackRegistry()

add_output_metadata(metadata, output_name=None)

Adds metadata for a given step output.

Parameters:

Name Type Description Default
metadata Dict[str, MetadataType]

The metadata to add.

required
output_name Optional[str]

Optional name of the output for which to add the metadata. If no name is given and the step only has a single output, the metadata of this output will be added. If the step has multiple outputs, an exception will be raised.

None
Source code in src/zenml/steps/step_context.py (lines 349-366)
def add_output_metadata(
    self,
    metadata: Dict[str, "MetadataType"],
    output_name: Optional[str] = None,
) -> None:
    """Adds metadata for a given step output.

    Args:
        metadata: The metadata to add.
        output_name: Optional name of the output for which to add the
            metadata. If no name is given and the step only has a single
            output, the metadata of this output will be added. If the
            step has multiple outputs, an exception will be raised.
    """
    output = self._get_output(output_name)
    if not output.run_metadata:
        output.run_metadata = {}
    output.run_metadata.update(**metadata)
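
A brief, hedged example of attaching metadata from inside a step (the metric names and values are illustrative):

from zenml import step
from zenml.steps import get_step_context

@step
def evaluate() -> float:
    accuracy = 0.92  # placeholder value for illustration
    context = get_step_context()
    # Attach custom metadata to this step's single output artifact.
    context.add_output_metadata(metadata={"accuracy": accuracy, "dataset": "validation"})
    return accuracy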

add_output_tags(tags, output_name=None)

Adds tags for a given step output.

Parameters:

Name Type Description Default
tags List[str]

The tags to add.

required
output_name Optional[str]

Optional name of the output for which to add the tags. If no name is given and the step only has a single output, the tags of this output will be added. If the step has multiple outputs, an exception will be raised.

None
Source code in src/zenml/steps/step_context.py (lines 368-385)
def add_output_tags(
    self,
    tags: List[str],
    output_name: Optional[str] = None,
) -> None:
    """Adds tags for a given step output.

    Args:
        tags: The tags to add.
        output_name: Optional name of the output for which to add the
            tags. If no name is given and the step only has a single
            output, the tags of this output will be added. If the
            step has multiple outputs, an exception will be raised.
    """
    output = self._get_output(output_name)
    if not output.tags:
        output.tags = []
    output.tags += tags

get_output_artifact_uri(output_name=None)

Returns the artifact URI for a given step output.

Parameters:

Name Type Description Default
output_name Optional[str]

Optional name of the output for which to get the URI. If no name is given and the step only has a single output, the URI of this output will be returned. If the step has multiple outputs, an exception will be raised.

None

Returns:

Type Description
str

Artifact URI for the given output.

Source code in src/zenml/steps/step_context.py (lines 291-305)
def get_output_artifact_uri(
    self, output_name: Optional[str] = None
) -> str:
    """Returns the artifact URI for a given step output.

    Args:
        output_name: Optional name of the output for which to get the URI.
            If no name is given and the step only has a single output,
            the URI of this output will be returned. If the step has
            multiple outputs, an exception will be raised.

    Returns:
        Artifact URI for the given output.
    """
    return self._get_output(output_name).artifact_uri
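
A small, hedged sketch of using the URI inside a step (the step name is illustrative; for a single-output step no output_name is needed):

from zenml import step
from zenml.steps import get_step_context

@step
def export_report() -> str:
    context = get_step_context()
    # URI under which this step's (single) output artifact will be stored.
    uri = context.get_output_artifact_uri()
    print(f"Output artifact will be stored under: {uri}")
    return "done"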

get_output_materializer(output_name=None, custom_materializer_class=None, data_type=None)

Returns a materializer for a given step output.

Parameters:

Name Type Description Default
output_name Optional[str]

Optional name of the output for which to get the materializer. If no name is given and the step only has a single output, the materializer of this output will be returned. If the step has multiple outputs, an exception will be raised.

None
custom_materializer_class Optional[Type[BaseMaterializer]]

If given, this BaseMaterializer subclass will be initialized with the output artifact instead of the materializer that was registered for this step output.

None
data_type Optional[Type[Any]]

If the output annotation is of type Union and the step therefore has multiple materializers configured, you can provide a data type for the output which will be used to select the correct materializer. If not provided, the first materializer will be used.

None

Returns:

Type Description
BaseMaterializer

A materializer initialized with the output artifact for the given output.

Source code in src/zenml/steps/step_context.py (lines 247-289)
def get_output_materializer(
    self,
    output_name: Optional[str] = None,
    custom_materializer_class: Optional[Type["BaseMaterializer"]] = None,
    data_type: Optional[Type[Any]] = None,
) -> "BaseMaterializer":
    """Returns a materializer for a given step output.

    Args:
        output_name: Optional name of the output for which to get the
            materializer. If no name is given and the step only has a
            single output, the materializer of this output will be
            returned. If the step has multiple outputs, an exception
            will be raised.
        custom_materializer_class: If given, this `BaseMaterializer`
            subclass will be initialized with the output artifact instead
            of the materializer that was registered for this step output.
        data_type: If the output annotation is of type `Union` and the step
            therefore has multiple materializers configured, you can provide
            a data type for the output which will be used to select the
            correct materializer. If not provided, the first materializer
            will be used.

    Returns:
        A materializer initialized with the output artifact for
        the given output.
    """
    from zenml.utils import materializer_utils

    output = self._get_output(output_name)
    materializer_classes = output.materializer_classes
    artifact_uri = output.artifact_uri

    if custom_materializer_class:
        materializer_class = custom_materializer_class
    elif len(materializer_classes) == 1 or not data_type:
        materializer_class = materializer_classes[0]
    else:
        materializer_class = materializer_utils.select_materializer(
            data_type=data_type, materializer_classes=materializer_classes
        )

    return materializer_class(artifact_uri)
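
A hedged sketch of inspecting the materializer that will be used for a step's output (the step name and the printed attributes are illustrative):

from zenml import step
from zenml.steps import get_step_context

@step
def save_custom() -> dict:
    context = get_step_context()
    # Materializer instance already initialized with the output artifact URI.
    materializer = context.get_output_materializer()
    print(type(materializer).__name__, materializer.uri)
    return {"status": "ok"}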

get_output_metadata(output_name=None)

Returns the metadata for a given step output.

Parameters:

Name Type Description Default
output_name Optional[str]

Optional name of the output for which to get the metadata. If no name is given and the step only has a single output, the metadata of this output will be returned. If the step has multiple outputs, an exception will be raised.

None

Returns:

Type Description
Dict[str, MetadataType]

Metadata for the given output.

Source code in src/zenml/steps/step_context.py (lines 307-327)
def get_output_metadata(
    self, output_name: Optional[str] = None
) -> Dict[str, "MetadataType"]:
    """Returns the metadata for a given step output.

    Args:
        output_name: Optional name of the output for which to get the
            metadata. If no name is given and the step only has a single
            output, the metadata of this output will be returned. If the
            step has multiple outputs, an exception will be raised.

    Returns:
        Metadata for the given output.
    """
    output = self._get_output(output_name)
    custom_metadata = output.run_metadata or {}
    if output.artifact_config:
        custom_metadata.update(
            **(output.artifact_config.run_metadata or {})
        )
    return custom_metadata

get_output_tags(output_name=None)

Returns the tags for a given step output.

Parameters:

Name Type Description Default
output_name Optional[str]

Optional name of the output for which to get the tags. If no name is given and the step only has a single output, the tags of this output will be returned. If the step has multiple outputs, an exception will be raised.

None

Returns:

Type Description
List[str]

Tags for the given output.

Source code in src/zenml/steps/step_context.py (lines 329-347)
def get_output_tags(self, output_name: Optional[str] = None) -> List[str]:
    """Returns the tags for a given step output.

    Args:
        output_name: Optional name of the output for which to get the
            metadata. If no name is given and the step only has a single
            output, the metadata of this output will be returned. If the
            step has multiple outputs, an exception will be raised.

    Returns:
        Tags for the given output.
    """
    output = self._get_output(output_name)
    custom_tags = set(output.tags or [])
    if output.artifact_config:
        return list(
            set(output.artifact_config.tags or []).union(custom_tags)
        )
    return list(custom_tags)

remove_output_tags(tags, output_name=None)

Removes tags for a given step output.

Parameters:

Name Type Description Default
tags List[str]

The tags to remove.

required
output_name Optional[str]

Optional name of the output for which to remove the tags. If no name is given and the step only has a single output, the tags of this output will be removed. If the step has multiple outputs, an exception will be raised.

None
Source code in src/zenml/steps/step_context.py (lines 387-404)
def remove_output_tags(
    self,
    tags: List[str],
    output_name: Optional[str] = None,
) -> None:
    """Removes tags for a given step output.

    Args:
        tags: The tags to remove.
        output_name: Optional name of the output for which to remove the
            tags. If no name is given and the step only has a single
            output, the tags of this output will be removed. If the
            step has multiple outputs, an exception will be raised.
    """
    output = self._get_output(output_name)
    if not output.tags:
        return
    output.tags = [tag for tag in output.tags if tag not in tags]
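
A hedged sketch of managing output tags from within a step (tag values and the step name are illustrative):

from zenml import step
from zenml.steps import get_step_context

@step
def train_model() -> str:
    context = get_step_context()
    # Tag the single output artifact of this step ...
    context.add_output_tags(tags=["candidate", "nightly"])
    # ... and remove a tag again before the step finishes.
    context.remove_output_tags(tags=["nightly"])
    return "model-v1"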

get_step_context()

Get the context of the currently running step.

Returns:

Type Description
StepContext

The context of the currently running step.

Raises:

Type Description
RuntimeError

If no step is currently running.

Source code in src/zenml/steps/step_context.py (lines 48-61)
def get_step_context() -> "StepContext":
    """Get the context of the currently running step.

    Returns:
        The context of the currently running step.

    Raises:
        RuntimeError: If no step is currently running.
    """
    if StepContext._exists():
        return StepContext()  # type: ignore
    raise RuntimeError(
        "The step context is only available inside a step function."
    )
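
Because the context only exists while a step is executing, calling the helper anywhere else raises a RuntimeError. A minimal sketch, assuming an otherwise configured pipeline:

from zenml import step
from zenml.steps import get_step_context

@step
def my_step() -> None:
    context = get_step_context()  # works: a step is currently running
    print(context.pipeline_run.name, context.step_run.name)

try:
    get_step_context()  # outside a step: raises
except RuntimeError as e:
    print(f"No active step: {e}")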

step(_func=None, *, name=None, enable_cache=None, enable_artifact_metadata=None, enable_artifact_visualization=None, enable_step_logs=None, experiment_tracker=None, step_operator=None, output_materializers=None, settings=None, extra=None, on_failure=None, on_success=None, model=None, retry=None, substitutions=None)

step(_func: F) -> BaseStep
step(
    *,
    name: Optional[str] = None,
    enable_cache: Optional[bool] = None,
    enable_artifact_metadata: Optional[bool] = None,
    enable_artifact_visualization: Optional[bool] = None,
    enable_step_logs: Optional[bool] = None,
    experiment_tracker: Optional[str] = None,
    step_operator: Optional[str] = None,
    output_materializers: Optional[
        OutputMaterializersSpecification
    ] = None,
    settings: Optional[Dict[str, SettingsOrDict]] = None,
    extra: Optional[Dict[str, Any]] = None,
    on_failure: Optional[HookSpecification] = None,
    on_success: Optional[HookSpecification] = None,
    model: Optional[Model] = None,
    retry: Optional[StepRetryConfig] = None,
    substitutions: Optional[Dict[str, str]] = None,
) -> Callable[[F], BaseStep]

Decorator to create a ZenML step.

Parameters:

Name Type Description Default
_func Optional[F]

The decorated function.

None
name Optional[str]

The name of the step. If left empty, the name of the decorated function will be used as a fallback.

None
enable_cache Optional[bool]

Specify whether caching is enabled for this step. If no value is passed, caching is enabled by default.

None
enable_artifact_metadata Optional[bool]

Specify whether metadata is enabled for this step. If no value is passed, metadata is enabled by default.

None
enable_artifact_visualization Optional[bool]

Specify whether visualization is enabled for this step. If no value is passed, visualization is enabled by default.

None
enable_step_logs Optional[bool]

Specify whether step logs are enabled for this step.

None
experiment_tracker Optional[str]

The experiment tracker to use for this step.

None
step_operator Optional[str]

The step operator to use for this step.

None
output_materializers Optional[OutputMaterializersSpecification]

Output materializers for this step. If given as a dict, the keys must be a subset of the output names of this step. If a single value (type or string) is given, the materializer will be used for all outputs.

None
settings Optional[Dict[str, SettingsOrDict]]

Settings for this step.

None
extra Optional[Dict[str, Any]]

Extra configurations for this step.

None
on_failure Optional[HookSpecification]

Callback function in event of failure of the step. Can be a function with a single argument of type BaseException, or a source path to such a function (e.g. module.my_function).

None
on_success Optional[HookSpecification]

Callback function in event of success of the step. Can be a function with no arguments, or a source path to such a function (e.g. module.my_function).

None
model Optional[Model]

Configuration of the model in the Model Control Plane.

None
retry Optional[StepRetryConfig]

Configuration of step retries in case of step failure.

None
substitutions Optional[Dict[str, str]]

Extra placeholders for the step name.

None

Returns:

Type Description
Union[BaseStep, Callable[[F], BaseStep]]

The step instance.

Source code in src/zenml/steps/step_decorator.py (lines 80-171)
def step(
    _func: Optional["F"] = None,
    *,
    name: Optional[str] = None,
    enable_cache: Optional[bool] = None,
    enable_artifact_metadata: Optional[bool] = None,
    enable_artifact_visualization: Optional[bool] = None,
    enable_step_logs: Optional[bool] = None,
    experiment_tracker: Optional[str] = None,
    step_operator: Optional[str] = None,
    output_materializers: Optional["OutputMaterializersSpecification"] = None,
    settings: Optional[Dict[str, "SettingsOrDict"]] = None,
    extra: Optional[Dict[str, Any]] = None,
    on_failure: Optional["HookSpecification"] = None,
    on_success: Optional["HookSpecification"] = None,
    model: Optional["Model"] = None,
    retry: Optional["StepRetryConfig"] = None,
    substitutions: Optional[Dict[str, str]] = None,
) -> Union["BaseStep", Callable[["F"], "BaseStep"]]:
    """Decorator to create a ZenML step.

    Args:
        _func: The decorated function.
        name: The name of the step. If left empty, the name of the decorated
            function will be used as a fallback.
        enable_cache: Specify whether caching is enabled for this step. If no
            value is passed, caching is enabled by default.
        enable_artifact_metadata: Specify whether metadata is enabled for this
            step. If no value is passed, metadata is enabled by default.
        enable_artifact_visualization: Specify whether visualization is enabled
            for this step. If no value is passed, visualization is enabled by
            default.
        enable_step_logs: Specify whether step logs are enabled for this step.
        experiment_tracker: The experiment tracker to use for this step.
        step_operator: The step operator to use for this step.
        output_materializers: Output materializers for this step. If
            given as a dict, the keys must be a subset of the output names
            of this step. If a single value (type or string) is given, the
            materializer will be used for all outputs.
        settings: Settings for this step.
        extra: Extra configurations for this step.
        on_failure: Callback function in event of failure of the step. Can be a
            function with a single argument of type `BaseException`, or a source
            path to such a function (e.g. `module.my_function`).
        on_success: Callback function in event of success of the step. Can be a
            function with no arguments, or a source path to such a function
            (e.g. `module.my_function`).
        model: configuration of the model in the Model Control Plane.
        retry: configuration of step retry in case of step failure.
        substitutions: Extra placeholders for the step name.

    Returns:
        The step instance.
    """

    def inner_decorator(func: "F") -> "BaseStep":
        from zenml.steps.decorated_step import _DecoratedStep

        class_: Type["BaseStep"] = type(
            func.__name__,
            (_DecoratedStep,),
            {
                "entrypoint": staticmethod(func),
                "__module__": func.__module__,
                "__doc__": func.__doc__,
            },
        )

        step_instance = class_(
            name=name or func.__name__,
            enable_cache=enable_cache,
            enable_artifact_metadata=enable_artifact_metadata,
            enable_artifact_visualization=enable_artifact_visualization,
            enable_step_logs=enable_step_logs,
            experiment_tracker=experiment_tracker,
            step_operator=step_operator,
            output_materializers=output_materializers,
            settings=settings,
            extra=extra,
            on_failure=on_failure,
            on_success=on_success,
            model=model,
            retry=retry,
            substitutions=substitutions,
        )

        return step_instance

    if _func is None:
        return inner_decorator
    else:
        return inner_decorator(_func)
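
A short, hedged example of the decorator in use (the step name, hook, and URL parameter are illustrative):

from zenml import step

def notify_on_failure(exception: BaseException) -> None:
    # Illustrative failure hook; in practice this might post to an alerter.
    print(f"Step failed: {exception}")

@step(name="ingest_data", enable_cache=False, on_failure=notify_on_failure)
def ingest(url: str) -> str:
    # The decorated function body becomes the step's entrypoint.
    return f"downloaded:{url}"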

Types

Custom ZenML types.

CSVString

Bases: str

Special string class to indicate a CSV string.

Source code in src/zenml/types.py (lines 34-35)
class CSVString(str):
    """Special string class to indicate a CSV string."""

HTMLString

Bases: str

Special string class to indicate an HTML string.

Source code in src/zenml/types.py (lines 26-27)
class HTMLString(str):
    """Special string class to indicate an HTML string."""

JSONString

Bases: str

Special string class to indicate a JSON string.

Source code in src/zenml/types.py (lines 38-39)
class JSONString(str):
    """Special string class to indicate a JSON string."""

MarkdownString

Bases: str

Special string class to indicate a Markdown string.

Source code in src/zenml/types.py (lines 30-31)
class MarkdownString(str):
    """Special string class to indicate a Markdown string."""

Utils

Initialization of the utils module.

The utils module contains utility functions handling analytics, reading and writing YAML data, as well as other general-purpose functions.

Zen Server

ZenML Server Implementation.

The ZenML Server is a centralized service meant for use in a collaborative setting in which stacks, stack components, flavors, pipelines and pipeline runs can be shared over the network with other users.

You can use the zenml server up command to spin up ZenML server instances that run locally as daemon processes or Docker containers, or to deploy a ZenML server remotely on a managed cloud platform. The other CLI commands in the same zenml server group can be used to manage the server instances deployed from your local machine.

To connect the local ZenML client to one of the managed ZenML servers, call zenml server connect with the name of the server you want to connect to.

Zen Stores

ZenStores define ways to store ZenML-relevant data locally or remotely.